UPDATED 16:49 EST / MARCH 01 2023

AI

OpenAI makes its ChatGPT and Whisper models available via cloud APIs

Developers can now integrate OpenAI LLC’s ChatGPT and Whisper models into their software thanks to new application programming interfaces that became available today.

An application programming interface, or API, is a channel through which programs exchange data and instructions with one another. It allows one application to have another workload perform a task when certain conditions are met. A marketing application, for example, can ask an artificial intelligence tool to translate ad copy into a different language.

The first new API that OpenAI debuted today is for its ChatGPT model. Until now, ChatGPT was primarily accessible through a graphical interface, which meant developers had no simple way to connect it to their applications. The new API removes that limitation.
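The request shape can be sketched as follows. This is a minimal illustration based on OpenAI's published Chat Completions format, using only the standard library to build the JSON body; the system prompt text is an arbitrary placeholder, and a real call would send this payload, with an API key, to the chat completions endpoint.

```python
import json

def build_chat_request(user_message: str) -> dict:
    """Build a single-turn ChatGPT API request body."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            # The system message sets overall behavior; the user message
            # carries the actual prompt.
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
    }

payload = build_chat_request("Translate this ad copy into French: 'Shop smarter.'")
print(json.dumps(payload, indent=2))
```

The response would arrive as JSON containing the model's reply in an assistant-role message.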

Since ChatGPT made its debut last November, OpenAI’s researchers have made optimizations that allow the neural network to run using less hardware. The result is a significant reduction in costs. The ChatGPT API will be priced 10 times lower than OpenAI’s existing GPT-3.5 models, the startup detailed today.

OpenAI hosts ChatGPT in Microsoft Corp.’s Azure public cloud on multitenant infrastructure. That means multiple users’ deployments run on the same hardware. For customers with advanced requirements, OpenAI is adding the ability to deploy its new ChatGPT API on dedicated infrastructure not shared with other users.

Using dedicated infrastructure enables customers to customize the ChatGPT API’s settings to a greater extent than otherwise possible. A company could, for example, trade off some response generation speed for throughput or vice versa if an application so requires. Additionally, OpenAI says, the dedicated hosting option enables developers to optimize their ChatGPT deployments’ performance in a more fine-grained manner.

“The ChatGPT model family we are releasing today, gpt-3.5-turbo, is the same model used in the ChatGPT product,” OpenAI executives and researchers wrote in a blog post today. “It is priced at $0.002 per 1k tokens, which is 10x cheaper than our existing GPT-3.5 models. It’s also our best model for many non-chat use cases.” 
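To put the quoted $0.002-per-1,000-tokens price in perspective, a quick back-of-the-envelope calculation (the per-token rounding behavior is an assumption, not something the post specifies):

```python
# Price quoted in OpenAI's announcement: $0.002 per 1,000 tokens.
PRICE_PER_1K_TOKENS = 0.002

def chat_cost(tokens: int) -> float:
    """Dollar cost of processing the given token count at the quoted rate."""
    return tokens / 1000 * PRICE_PER_1K_TOKENS

# One million tokens -- on the order of 750,000 English words -- costs $2.
print(chat_cost(1_000_000))  # 2.0
```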

The second API that OpenAI introduced today will enable developers to access a managed version of the startup’s Whisper transcription model. 

The model, which was first detailed last September, is available under an open-source license. As a result, companies can theoretically create an in-house deployment of Whisper instead of using OpenAI’s API. But according to the startup, Whisper is technically challenging to deploy manually, an adoption barrier that the API tackles.

It enables developers to implement automated transcription features in one of two ways. Whisper can transcribe spoken words in their original language, or translate them to English. The original version of Whisper that debuted last year was trained on 680,000 hours of audio sourced from the web, about a third of which was non-English.

“We’ve now made the large-v2 model available through our API, which gives convenient on-demand access priced at $0.006/minute,” OpenAI staffers explained in today’s blog post. “In addition, our highly-optimized serving stack ensures faster performance compared to other services.”
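The two Whisper modes and the quoted price can be sketched as below. The endpoint paths follow OpenAI's API reference for audio transcription and translation; the billing granularity (treating partial minutes proportionally) is an assumption.

```python
# Whisper API endpoints: same-language transcription vs. translation to English.
TRANSCRIBE_URL = "https://api.openai.com/v1/audio/transcriptions"
TRANSLATE_URL = "https://api.openai.com/v1/audio/translations"

# Price quoted in OpenAI's announcement: $0.006 per minute of audio.
PRICE_PER_MINUTE = 0.006

def endpoint_for(task: str) -> str:
    """Return the endpoint for 'transcribe' (original language) or 'translate' (to English)."""
    return {"transcribe": TRANSCRIBE_URL, "translate": TRANSLATE_URL}[task]

def transcription_cost(audio_seconds: float) -> float:
    """Dollar cost for audio of the given length at the quoted rate."""
    return audio_seconds / 60 * PRICE_PER_MINUTE

print(endpoint_for("translate"))
print(round(transcription_cost(3600), 2))  # one hour of audio: $0.36
```

An audio file would be uploaded to the chosen endpoint as a multipart form field along with the model name (`whisper-1` maps to the large-v2 model mentioned in the post).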

OpenAI is rolling out its two new machine learning APIs alongside an update to its terms of service. The startup will no longer use data submitted to its models for AI training purposes unless customers opt in. Additionally, the company is simplifying the text of its terms of service document and improving its products’ developer documentation.

Image: OpenAI
