UPDATED 14:45 EST / NOVEMBER 28 2024

Amazon reportedly develops new multimodal language model

Amazon.com Inc. has reportedly developed a multimodal large language model that could debut as early as next week.

The Information on Wednesday cited sources as saying that the algorithm is known as Olympus internally. Last November, Reuters reported that Amazon was spending millions of dollars to train an LLM called Olympus with 2 trillion parameters. It’s unclear whether the algorithm detailed in The Information’s report is the same LLM, a new version of the original Olympus or an entirely different system.

The upcoming model is believed to be capable of processing not only text but also images and videos. According to the report, Olympus will make it possible to search video repositories for specific clips using natural language prompts. The LLM also reportedly supports use cases such as helping energy companies analyze geological data.

The Information’s sources expect Amazon to debut Olympus as early as next week during AWS re:Invent. That the model could be announced at the event suggests the plan may be to offer it through Amazon Web Services, possibly via Amazon Bedrock. Introduced in April 2023, Bedrock is a managed service that provides access to cloud-hosted foundation models.
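For developers, browsing the Bedrock catalog takes only a few API calls. The snippet below is a minimal sketch of how the Amazon-built entries could be listed with the boto3 SDK; the region and the printed fields are illustrative assumptions.

```python
# Minimal sketch: list the Amazon-built foundation models that Bedrock exposes
# in one region, using the boto3 "bedrock" control-plane client.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")  # region is an assumption

# byProvider narrows the catalog to models published by Amazon itself.
response = bedrock.list_foundation_models(byProvider="Amazon")

for model in response["modelSummaries"]:
    print(model["modelId"], model.get("inputModalities"), model.get("outputModalities"))
```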

The service already offers more than a half-dozen Amazon-developed models. The most advanced LLM in the series, Amazon Titan Text Premier, supports prompts with up to 32,000 tokens. It can generate text and code as well as perform chain-of-thought reasoning, a process whereby an AI model breaks a complex task into smaller intermediate steps that it works through before producing an answer.
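Prompting such a model through Bedrock’s runtime API is straightforward. The sketch below shows one way step-by-step reasoning could be elicited from Titan Text Premier; the model identifier, generation settings and example prompt are assumptions for illustration rather than an official recipe.

```python
# Minimal sketch: ask Titan Text Premier, via the Bedrock runtime API, to reason
# step by step before answering. Model ID and settings are assumptions.
import json
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "inputText": (
        "A data center has 4 racks holding 42 servers each. "
        "Think step by step, then state the total number of servers."
    ),
    "textGenerationConfig": {"maxTokenCount": 512, "temperature": 0.2},
}

response = runtime.invoke_model(
    modelId="amazon.titan-text-premier-v1:0",  # assumed Titan Text Premier model ID
    body=json.dumps(body),
)

result = json.loads(response["body"].read())
print(result["results"][0]["outputText"])  # generated answer, including intermediate steps
```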

Bedrock also includes three Amazon-developed models for generating embeddings, the numerical vector representations in which machine learning applications encode text, images and other data so it can be compared and searched. One of the models can generate embeddings from multimodal data, a capability that could potentially make it easier for customers to use Olympus’ rumored multimodal features.
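The sketch below illustrates how that multimodal embeddings model might be called to embed an image and a text description into the same vector space, for example to index video frames for natural-language search. The model identifier, image file and request fields are assumptions based on the Titan multimodal embedding request format.

```python
# Minimal sketch: produce a joint text-and-image embedding with Amazon's
# multimodal Titan embeddings model on Bedrock. Model ID and inputs are assumptions.
import base64
import json
import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

with open("warehouse_frame.png", "rb") as f:  # hypothetical video frame on disk
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

body = {
    "inputText": "a robot moving packages in a warehouse",
    "inputImage": image_b64,
}

response = runtime.invoke_model(
    modelId="amazon.titan-embed-image-v1",  # assumed multimodal embeddings model ID
    body=json.dumps(body),
)

vector = json.loads(response["body"].read())["embedding"]
print(len(vector))  # dimensionality of the returned embedding
```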

Besides Amazon-developed models, Bedrock also provides access to LLMs from other companies. One of those companies is Anthropic PBC, which has raised $8 billion in funding from the online retail and cloud computing giant. The most recent $4 billion tranche was announced last week.

According to The Information, the upcoming Olympus model could be a way for Amazon to reduce its reliance on Anthropic. Other tech giants are also working to bring more of their AI stacks in-house. Meta Platforms Inc. is reportedly developing a search engine in a bid to reduce its Meta AI chatbot’s reliance on search technologies from Microsoft Corp. and Google LLC.

Amazon’s AI strategy encompasses not only software but also hardware. The company has developed two chip lineups, AWS Trainium and AWS Inferentia, that are optimized for training and inference workloads, respectively. Last week, Anthropic said it will collaborate with the cloud giant to enhance the Trainium series.

Photo: Amazon
