Introducing “Les Ministraux”: Mistral's new AI models for efficient on-device computing

AI company Mistral has launched two new models, Ministral 3B and Ministral 8B, designed for on-device computing and edge use cases. Known collectively as "les Ministraux", these models expand capabilities in the sub-10B-parameter category, excelling in knowledge, commonsense reasoning, function-calling, and efficiency. Both support context lengths of up to 128k tokens (currently 32k when served on vLLM), and Ministral 8B uses an interleaved sliding-window attention pattern for faster, more memory-efficient inference.
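
To make the attention pattern concrete, here is a minimal NumPy sketch of interleaved sliding-window attention: some layers restrict each token to a fixed local window of recent keys instead of the full causal context, which caps per-token attention cost. The window size of 4 and the even/odd layer alternation below are illustrative assumptions, not Ministral 8B's actual configuration.

```python
import numpy as np

def causal_mask(seq_len: int, window: int | None = None) -> np.ndarray:
    """Boolean mask: True where query position i may attend to key j."""
    i = np.arange(seq_len)[:, None]   # query positions (rows)
    j = np.arange(seq_len)[None, :]   # key positions (columns)
    mask = j <= i                     # causal: never attend to the future
    if window is not None:            # sliding window: only the last `window` keys
        mask &= (i - j) < window
    return mask

# Interleaving (assumed pattern): even layers see the full causal context,
# odd layers only a local window, trading some context reach for speed
# and a smaller KV-cache footprint.
layer_masks = [
    causal_mask(8, window=None if layer % 2 == 0 else 4)
    for layer in range(4)
]
print(layer_masks[1].astype(int))  # windowed layer: at most 4 ones per row
```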

Les Ministraux aim to deliver compute-efficient, low-latency solutions for privacy-focused applications such as on-device translation, internet-less smart assistants, local analytics, and autonomous robotics. Paired with larger models like Mistral Large, they also serve as efficient intermediaries for function-calling in multi-step workflows, handling input parsing, task routing, and API calls across multiple contexts at minimal latency and cost (see the sketch below). Both models are available now.
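
As a rough illustration of that intermediary role, the sketch below asks a Ministral model to parse a user request and route it to a tool. It assumes the v1 `mistralai` Python client and the `ministral-8b-latest` API alias; the `get_weather` tool and its handler are hypothetical stand-ins for a real API.

```python
import json
import os

from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Hypothetical tool schema the small model can choose to call.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Fetch the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # stub standing in for a real API call

# The small model parses the request and decides which tool to invoke.
response = client.chat.complete(
    model="ministral-8b-latest",  # assumed API alias for Ministral 8B
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

for call in response.choices[0].message.tool_calls or []:
    args = call.function.arguments
    if isinstance(args, str):       # arguments may arrive as a JSON string
        args = json.loads(args)
    if call.function.name == "get_weather":
        print(get_weather(**args))
```

In a multi-step workflow, the tool result would be appended to the conversation and either answered directly by the small model or escalated to a larger model such as Mistral Large.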

by Paul
