Raycast 1.99 adds support for local AI models through new integration with Ollama
Raycast version 1.99 brings major enhancements for macOS users who want local artificial intelligence features. With this release, Raycast introduces support for running local AI models through a new integration with Ollama. The integration lets users run more than 100 open-source large language models on their own machines, ranging from compact 135-million-parameter models to expansive 671-billion-parameter versions.
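Outside of Raycast itself, the same locally hosted models can be exercised through Ollama's HTTP API, which by default listens on `localhost:11434`. The sketch below builds a request body for Ollama's `/api/generate` endpoint; the model tag `smollm:135m` is only an illustrative choice of a compact model, not something the release notes specify.

```python
import json

# Ollama serves a local HTTP API on http://localhost:11434 by default.
# Its /api/generate endpoint accepts a JSON body with at least a model
# name and a prompt; "stream": False requests a single JSON response
# instead of a stream of partial chunks.
def generate_payload(model: str, prompt: str, stream: bool = False) -> dict:
    return {"model": model, "prompt": prompt, "stream": stream}

# "smollm:135m" is an illustrative tag for a compact ~135M-parameter model;
# any model fetched with `ollama pull <name>` would be addressed the same way.
payload = generate_payload("smollm:135m", "Say hello in one sentence.")
print(json.dumps(payload))
```

Sending this body as a POST request to `http://localhost:11434/api/generate` (for example with `curl` or `urllib.request`) returns the model's completion from the local Ollama server.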
Building on its AI feature set, Raycast now offers experimental support for AI Extensions running on local models. However, some aspects — such as tool choice and streaming for tool calls — are not yet supported by Ollama, making the extension experience less reliable. Users interested in testing this feature can activate it via the AI settings panel.
Alongside these AI improvements, updates to the Model Context Protocol (MCP) support provide clearer error reporting when standard input/output (stdio) servers fail and improve compatibility with server JSON schemas. A new “Copy to Clipboard” action in the server management settings further streamlines workflows. The update also delivers several minor enhancements and bug fixes for overall stability and usability.
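For context on the stdio servers mentioned above: across MCP clients, a stdio server is typically declared as a command plus arguments that the client launches and talks to over standard input/output. The JSON below is a hypothetical sketch in the common `mcpServers` shape used by several MCP clients; the server name and the filesystem server package are illustrative placeholders, not a confirmed Raycast configuration format.

```json
{
  "mcpServers": {
    "example-files": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```

If such a command exits or prints malformed output, the client has to surface that failure, which is the kind of stdio error reporting this release improves.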
