
OpenAI's o3-mini model is now available on Microsoft Azure and GitHub Models
OpenAI's o3-mini model is now accessible through Microsoft's Azure OpenAI Service, allowing developers to use it via Azure AI Foundry. The cost-effective o3-mini delivers performance comparable to OpenAI's o1 model in math, coding, and science while offering improved reasoning, responsiveness, and efficiency. It supports function calling, Structured Outputs, streaming, and developer messages through the OpenAI APIs, and adds a reasoning effort control that lets developers balance response quality against latency and cost.
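As a rough illustration of how those features fit together, the sketch below calls an o3-mini deployment through the Azure OpenAI Service using the `openai` Python SDK, passing a developer message and the reasoning effort parameter. The endpoint, key, API version, and deployment name are placeholders, not values from the announcement.

```python
"""Minimal sketch: calling an o3-mini deployment on Azure OpenAI.

Assumes the `openai` Python SDK (1.x) and a deployment named "o3-mini";
the endpoint, key, and API version below are hypothetical placeholders.
"""
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # your resource URL
    api_key="<your-api-key>",
    api_version="2024-12-01-preview",
)

response = client.chat.completions.create(
    model="o3-mini",            # name of your Azure deployment
    reasoning_effort="medium",  # trade quality against latency/cost: low, medium, high
    messages=[
        # o-series models take "developer" messages in place of system prompts
        {"role": "developer", "content": "You are a concise coding assistant."},
        {"role": "user", "content": "Explain binary search in two sentences."},
    ],
)

print(response.choices[0].message.content)
```

The same request shape works against OpenAI's own API by swapping `AzureOpenAI` for the standard `OpenAI` client; Structured Outputs and streaming are enabled through the `response_format` and `stream` arguments of the same call.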
GitHub has integrated o3-mini into GitHub Copilot and GitHub Models, giving developers higher-quality output than o1-mini. The model is available to GitHub Copilot Pro, Business, and Enterprise users via the model picker in Visual Studio Code and in chat on github.com.
GitHub plans to extend o3-mini support to Microsoft Visual Studio and JetBrains IDEs soon. Copilot subscribers are limited to 50 o3-mini messages every 12 hours. Copilot Business and Enterprise administrators can enable access for members of their organizations through their admin settings.