UPDATED 13:51 EST / NOVEMBER 08 2023

AI

Amazon reportedly building ‘Olympus’ language model with 2 trillion parameters

Amazon.com Inc. engineers are developing a large language model with 2 trillion parameters, Reuters reported this morning.

The model is believed to be known as Olympus internally. Amazon is reportedly spending “millions” of dollars on the project. OpenAI LP’s GPT-4 model, which is estimated to have a parameter count similar to Olympus’, cost more than $100 million to train.

Parameters are configuration settings that determine how an artificial intelligence processes data. Unlike hyperparameters, another type of setting found in neural networks, parameters are not set by an AI’s developers but rather learned by the model itself during the training process.

Among the configuration details a neural network sets during training are connection weights. Those are parameters that determine the extent to which each piece of input data an AI receives influences its output. In general, the more parameters a model has, the more patterns it can capture and the wider the range of tasks it can perform.
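For readers who want to see the distinction concretely, the short Python sketch below (using the open-source PyTorch library, not any Amazon code) trains a tiny one-layer model: the connection weights are parameters updated during training, while the learning rate is a hyperparameter chosen by the developer.

```python
# Minimal illustration with PyTorch, unrelated to Amazon's Olympus:
# connection weights are parameters learned during training, while values
# such as the learning rate are hyperparameters set by the developer.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Linear(8, 1)          # 8 input features -> 1 output
learning_rate = 0.01             # hyperparameter: chosen by the developer
optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate)

x = torch.randn(32, 8)           # toy input batch
y = torch.randn(32, 1)           # toy targets

loss = F.mse_loss(model(x), y)
loss.backward()
optimizer.step()                 # training updates the weights (parameters)

# 8 weights + 1 bias = 9 learned parameters in this tiny model
print(sum(p.numel() for p in model.parameters()))
```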

Amazon’s push to build a language model with 2 trillion parameters is reportedly led by Rohit Prasad, the company’s senior vice president and head scientist for artificial general intelligence. Prasad earlier led the business unit that developed Alexa. The executive has reportedly assembled a team of AI experts from the Alexa unit and Amazon Science, the company’s research division, to lead the development of Olympus.

According to today’s report, Amazon’s goal with the project is to enhance the value proposition of Amazon Web Services. That suggests Olympus could become available to other companies through AWS. It’s unclear if the online retail and cloud giant also plans to use Olympus internally.

Earlier this year, AWS launched a cloud service called Amazon Bedrock through which it offers access to internally developed and third-party generative AI models. The service spares customers the hassle of managing the underlying infrastructure. The AI models on offer are accessible through an application programming interface that developers can integrate into their applications.
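In practice, calling a Bedrock-hosted model looks like any other AWS API call. The following Python sketch uses the AWS SDK for Python, boto3; the model identifier and request body shown are illustrative examples rather than details confirmed for any future Olympus offering.

```python
# Hypothetical sketch of invoking a Bedrock-hosted model through its API
# with boto3; the model ID and request body shape are illustrative only.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.invoke_model(
    modelId="amazon.titan-text-express-v1",   # example Titan model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps({"inputText": "Summarize the benefits of managed AI services."}),
)

result = json.loads(response["body"].read())
print(result)
```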

The neural networks in Bedrock include a lineup of customizable language models, the Titan series, that Amazon has developed in-house. The series’ main highlight is a pair of large language models that can process up to 4,000 and 8,000 tokens per prompt, respectively. A token is a unit of data that corresponds to a few characters or numbers.
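As a rough illustration of how text maps to tokens, the snippet below counts tokens with the open-source tiktoken library. Titan uses its own tokenizer, so the counts here are indicative only.

```python
# Rough illustration of tokenization using the open-source tiktoken library;
# Amazon's Titan models use their own tokenizer, so counts are indicative only.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")
text = "Amazon Bedrock offers access to generative AI models through an API."
tokens = encoding.encode(text)

print(len(tokens), "tokens")   # roughly a few characters of text per token
```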

If Amazon makes Olympus available through AWS, the model might be pricier than the Titan models. The more parameters there are in a neural network, the more hardware is required to perform inference, which raises costs. It’s also possible AWS will allow users to customize Olympus by training it on custom datasets, a feature the Titan series already provides.
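A back-of-the-envelope calculation shows why parameter count drives hardware costs: simply holding the weights of a 2-trillion-parameter model in 16-bit precision would take roughly 4 terabytes of accelerator memory, before accounting for activations or batching. The figures below are illustrative estimates, not reported specifications for Olympus or Titan.

```python
# Back-of-the-envelope memory estimate for serving a model's weights.
# Illustrative only; these are not reported specs for Olympus or Titan.
def weight_memory_tb(parameters: float, bytes_per_param: int = 2) -> float:
    """Terabytes needed to hold the weights at the given precision."""
    return parameters * bytes_per_param / 1e12

print(weight_memory_tb(2e12))     # ~4.0 TB for 2T params at 16-bit (2 bytes each)
print(weight_memory_tb(2e12, 1))  # ~2.0 TB with 8-bit quantization
```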

If and when it becomes available to customers, Olympus could create more competition for major AI developers such as OpenAI and Anthropic PBC. Amazon is a major investor in the latter startup. In September, Anthropic secured a funding round worth up to $4 billion from the company and named AWS as its preferred cloud provider. 

