UPDATED 14:30 EDT / OCTOBER 16 2024


Mistral introduces Ministral 3B and 8B AI computing models for phones and laptops

Mistral AI, a Paris-based artificial intelligence startup, today introduced two new large language models, Ministral 3B and Ministral 8B, designed for on-device and edge computing thanks to their small size.

The company calls the new model family “les Ministraux” because both models fall in the sub-10-billion-parameter category, which makes them small enough to run on platforms such as smartphones, tablets and internet-of-things devices. Mistral said the new frontier models can be tuned for common use cases, including specialist tasks, and can work as AI agents thanks to function-calling capabilities.
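In practice, function calling means the model is handed a list of tool schemas and, when appropriate, returns a structured call instead of free-form text. The sketch below shows the general shape of that flow using the mistralai Python client; the `get_weather` tool, the `ministral-8b-latest` model identifier and the exact response fields are illustrative assumptions rather than details confirmed in the announcement.

```python
import json
import os

from mistralai import Mistral

# Hypothetical tool schema: the model is told what it may call and with which arguments.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",  # placeholder tool for illustration only
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])
response = client.chat.complete(
    model="ministral-8b-latest",  # assumed API id for Ministral 8B
    messages=[{"role": "user", "content": "Do I need an umbrella in Paris today?"}],
    tools=tools,
    tool_choice="auto",
)

# If the model decides a tool is needed, it returns the call as structured data
# (a function name plus JSON-encoded arguments) rather than a plain-text answer.
call = response.choices[0].message.tool_calls[0]
print(call.function.name, json.loads(call.function.arguments))
```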

Customers and partners have increasingly been asking for “local, privacy-first inference for critical applications such as on-device translation, internet-less smart assistants, local analytics, and autonomous robotics,” the company said in the announcement. Les Ministraux are aimed at providing a compute-efficient, low-latency solution for those scenarios.

These smaller models can also be paired with larger models such as Mistral Large, acting as intermediaries in multistep agentic workflows that handle input parsing, task routing and API calling, which helps reduce costs.
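As a rough illustration of that intermediary pattern, the sketch below has a small on-device model parse a request and decide whether it can be answered locally or should be escalated to a larger hosted model. The `call_small_model` and `call_large_model` functions are stubs standing in for whatever inference stack is actually used; nothing here is a Mistral API.

```python
import json

# Prompt asking the small model to classify the request and reply with JSON only.
ROUTER_PROMPT = (
    "Classify the user request and reply with JSON only, e.g. "
    '{"task": "translate", "needs_large_model": false}. Request: '
)

def call_small_model(prompt: str) -> str:
    """Placeholder for an on-device Ministral 3B/8B call; stubbed so the sketch runs."""
    if prompt.startswith(ROUTER_PROMPT):
        return json.dumps({"task": "translate", "needs_large_model": False})
    return f"[small-model answer to: {prompt}]"

def call_large_model(prompt: str) -> str:
    """Placeholder for a call to a larger hosted model such as Mistral Large."""
    return f"[large-model answer to: {prompt}]"

def handle_request(user_request: str) -> str:
    # Step 1: the small model parses the input and decides how to route it.
    routing = json.loads(call_small_model(ROUTER_PROMPT + user_request))

    # Step 2: simple, common tasks stay on the small local model; only the
    # rest is escalated to the larger, more expensive model.
    if routing.get("needs_large_model"):
        return call_large_model(user_request)
    return call_small_model(user_request)

print(handle_request("Translate 'bonjour' to English."))
```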

The company said both models support a context length of up to 128,000 tokens, putting them in line with OpenAI’s GPT-4 Turbo in terms of how much data can be taken as input. Ministral 8B also uses a special “sliding window attention pattern” that allows faster, more memory-efficient inference.
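Sliding-window attention restricts each token to attending only to a fixed number of recent tokens, so per-token attention cost grows with the window size rather than with the full 128,000-token context. The NumPy sketch below builds that kind of causal mask; the window size is arbitrary, and the exact pattern Mistral uses in Ministral 8B is not spelled out in the announcement.

```python
import numpy as np

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean mask: position i may attend to positions j with i - window < j <= i."""
    i = np.arange(seq_len)[:, None]   # query positions
    j = np.arange(seq_len)[None, :]   # key positions
    causal = j <= i                   # never attend to future tokens
    local = j > i - window            # only the last `window` tokens
    return causal & local

# Each row has at most `window` True entries, so attention cost per token is
# O(window) instead of O(seq_len), which is what makes long contexts feasible
# on memory-constrained devices.
print(sliding_window_mask(6, 3).astype(int))
```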

The release of Ministral 3B and 8B comes a year after that of Mistral 7B, an LLM the company touted as a significant advancement in model architecture. The “8B” and “3B” refer to the number of parameters in each model, 8 billion and 3 billion respectively, and the company says the smaller model, Ministral 3B, already outperforms Mistral 7B on most benchmarks.

According to the company’s benchmarks, the pretrained Ministral 3B beat Google LLC’s Gemma 2 2B and Meta Platforms Inc.’s Llama 3.2 3B on the Massive Multitask Language Understanding evaluation, scoring 60.9 compared with 52.4 and 56.2, respectively. Ministral 8B also edged out Llama 8B with a score of 65.0 versus 64.7.

The les Ministraux family follows closely on Mistral’s introduction of Pixtral 12B last month, the first of the company’s models capable of vision encoding, making it possible to process both images and text.

Image: Pixabay
