Mistral Small 4 blends reasoning, coding, and multimodal AI into one open-source model

Mistral has announced the release of Mistral Small 4, a unified model that combines advanced reasoning, multimodal understanding, and coding capabilities. This release marks a shift in the Mistral Small lineup by eliminating the need to choose between a fast instruct model, a capable reasoning engine, and a multimodal assistant. Users get one model that consolidates the flagship strengths of Magistral, Pixtral, and Devstral, while maintaining configurable reasoning effort and high efficiency.

Building on this integration, Small 4 is engineered as a hybrid solution supporting broad workflows such as general chat, agentic tasks, programming, and complex reasoning. Its architecture enables both text and image input, extending application options across use cases from virtual assistants to research to software development. This adaptability means users no longer need to change models for different tasks, as Small 4 accommodates shifting needs within a single deployment.
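As a sketch of what a single multimodal deployment looks like from the caller's side, the snippet below composes one chat request that mixes text and an image and sets a reasoning-effort level. The model id `mistral-small-4`, the `reasoning_effort` field, and the exact payload schema are assumptions for illustration, not confirmed API details; check the Mistral API documentation for the real identifiers.

```python
# Sketch: building one multimodal chat request for a unified model.
# Model id and field names are assumptions, not official API details.

def build_request(prompt, image_url=None, reasoning_effort="medium"):
    """Compose a single chat request mixing text and an optional image."""
    content = [{"type": "text", "text": prompt}]
    if image_url is not None:
        # Same request shape handles vision input; no separate model needed.
        content.append({"type": "image_url", "image_url": image_url})
    return {
        "model": "mistral-small-4",            # assumed model id
        "reasoning_effort": reasoning_effort,  # assumed effort knob
        "messages": [{"role": "user", "content": content}],
    }

req = build_request("Describe this chart.",
                    image_url="https://example.com/chart.png")
print(len(req["messages"][0]["content"]))  # → 2 (text part + image part)
```

The point of the sketch is that chat, vision, and reasoning-heavy tasks all go through the same request path, so switching tasks means changing the payload, not the deployment.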

In terms of accessibility, Mistral Small 4 is released fully open source under the Apache 2.0 license, continuing Mistral’s approach to transparency and customization. The model is available for immediate use through the Mistral API, Mistral AI Studio, and the Hugging Face repository, facilitating deployment and fine-tuning for specialized or general-purpose applications.

by Paul

Comments

UserPower
0

The base requirement is 4 x H100, which can be rented for less than $10/hour (or bought outright for less than $100k), and the price drops every quarter (and every time a new data center comes online).

It's much cheaper than GPT/Claude API calls, especially for a medium-sized business that won't need it half the time, but the setup may be burdensome for personal use.
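The cost claim can be made concrete with a quick back-of-envelope calculation. The figures below are the commenter's (roughly $10/hour for the 4 x H100 rental, 50% utilization), not official pricing:

```python
# Back-of-envelope self-hosting cost, using the comment's assumed figures.
RENTAL_PER_HOUR = 10.0   # assumed: 4 x H100 rented, total $/hour
HOURS_PER_MONTH = 730    # average hours in a month
UTILIZATION = 0.5        # "won't need it half the time"

monthly_rental = RENTAL_PER_HOUR * HOURS_PER_MONTH
effective_per_used_hour = RENTAL_PER_HOUR / UTILIZATION

print(f"Monthly rental (always on): ${monthly_rental:,.0f}")       # → $7,300
print(f"Effective cost per utilized hour: ${effective_per_used_hour:.2f}")  # → $20.00
```

An always-on cluster at these rates runs about $7,300/month, and idle time doubles the effective per-hour cost, which is why the trade-off favors businesses with steady load over occasional personal use.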

Gu