Anthropic releases affordable, high-speed Claude 3 Haiku model
Anthropic PBC, an artificial intelligence startup that builds trustworthy AI models rivaling OpenAI’s GPT-4, on Wednesday released Claude 3 Haiku, the newest addition to its Claude 3 family of models, designed for speed and affordability.
Anthropic introduced the Claude 3 family of large language models earlier in March with three models. The company says the most advanced, Claude 3 Opus, rivals best-in-class models from industry giants such as OpenAI and Google LLC. Its sibling, Claude 3 Sonnet, balances speed and cost.
The company says that Haiku processes most workloads three times faster than its peers, making it well-suited to applications that demand sheer speed and low latency, such as customer service, fieldwork and question-and-answer systems, where quick responses matter most.
“Speed is essential for our enterprise users who need to quickly analyze large datasets and generate timely output for tasks like customer support,” the company said in the announcement. “It also generates swift output, enabling responsive, engaging chat experiences and the execution of many small tasks in tandem.”
According to Anthropic, Haiku is capable of processing up to 21,000 tokens, or around 30 pages of text, per second for prompts under 32,000 tokens.
Like the rest of the models in the Claude 3 family, Haiku is capable of responding to basic questions and requests. It has a maximum prompt size of 200,000 tokens, which is around 150,000 words, or more than 500 pages of material. The company said that all three models have enhanced capabilities when it comes to content creation, code generation and analysis, as well as improved fluency in non-English languages such as Spanish, Japanese and French.
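The throughput and context-window figures above can be sanity-checked with some quick arithmetic. This is a back-of-the-envelope sketch; the words-per-token ratio is a rough rule-of-thumb assumption, not a figure from Anthropic.

```python
# Figures reported in the announcement
TOKENS_PER_SECOND = 21_000     # Haiku's stated rate for prompts under 32K tokens
MAX_CONTEXT_TOKENS = 200_000   # Claude 3 family maximum prompt size

# Rough rule of thumb for English text (assumption, not an Anthropic figure)
WORDS_PER_TOKEN = 0.75

# How long would a maximal "fast path" prompt take to ingest?
prompt_tokens = 32_000
seconds_to_process = prompt_tokens / TOKENS_PER_SECOND  # about 1.5 seconds

# The 200K-token window works out to roughly the quoted 150,000 words
approx_words = MAX_CONTEXT_TOKENS * WORDS_PER_TOKEN

print(f"{seconds_to_process:.1f} s to ingest {prompt_tokens:,} tokens")
print(f"~{approx_words:,.0f} words fit in the context window")
```

At these rates, even the largest prompt eligible for the quoted speed clears in well under two seconds, which is the property Anthropic highlights for chat and customer-support workloads.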
The company also put particular focus on making the model affordable, pricing input and output tokens at a 1:5 ratio for enterprise workloads where longer prompts are common. Businesses often rely on LLMs to digest and analyze extremely large documents, which can drive up costs. Anthropic said that the model could analyze 400 Supreme Court cases or 2,500 images for just $1.
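The 1:5 input-to-output ratio can be illustrated with a small cost calculator. The per-million-token prices below are assumptions for illustration (they match Haiku's launch pricing as publicly listed, but verify against Anthropic's current pricing page before relying on them).

```python
# Assumed Haiku launch pricing, in USD per million tokens (verify before use)
INPUT_PRICE_PER_MTOK = 0.25
OUTPUT_PRICE_PER_MTOK = 1.25   # note the 1:5 input-to-output ratio

def cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Estimated cost of one request under the assumed pricing."""
    return (input_tokens * INPUT_PRICE_PER_MTOK
            + output_tokens * OUTPUT_PRICE_PER_MTOK) / 1_000_000

# Example: a 10,000-token quarterly filing summarized into a 500-token answer
print(f"${cost_usd(10_000, 500):.6f}")  # $0.003125
```

Because long documents are dominated by input tokens, the cheap input side of the 1:5 ratio is what makes document-heavy enterprise workloads inexpensive.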
“Businesses can rely on Haiku to quickly analyze large volumes of documents, such as quarterly filings, contracts, or legal cases, for half the cost of other models in its performance tier,” the company said.
As part of the announcement, Anthropic said that Haiku is joining Sonnet on Amazon Web Services Inc.’s public cloud through Amazon Bedrock, a managed service that provides access to AI foundation models from AWS and other companies. The company said that the model will also be coming soon to Google Cloud Vertex AI, a platform from Google LLC for training and deploying generative AI models.
Customers and developers can also use Haiku through the company’s application programming interface or with a Claude Pro subscription via claude.ai.
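For developers, a request to the API takes roughly the following shape. This is a hypothetical sketch of the request body only (nothing is sent over the network here); the model identifier, header names and API version string are assumptions based on Anthropic's public documentation and should be checked against the current API reference.

```python
import json

# Sketch of a Messages API request body for Claude 3 Haiku (assumed model name)
payload = {
    "model": "claude-3-haiku-20240307",
    "max_tokens": 256,
    "messages": [
        {"role": "user", "content": "Summarize this contract clause: ..."},
    ],
}

# Headers the API expects; the key value here is a placeholder, not a real key
headers = {
    "x-api-key": "<ANTHROPIC_API_KEY>",
    "anthropic-version": "2023-06-01",
    "content-type": "application/json",
}

body = json.dumps(payload)
print(body)
```

In practice, developers would send this body via Anthropic's official SDK or an HTTP POST rather than constructing it by hand.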