Native, Apple Silicon–only local LLM server. Similar to Ollama, but built on Apple's MLX for maximum performance on M-series chips. SwiftUI app + SwiftNIO server with OpenAI-compatible endpoints.
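Because the server exposes OpenAI-compatible endpoints, any standard OpenAI client can talk to it by pointing at the local base URL. The sketch below builds a request body for `POST /v1/chat/completions`; the port and model name are assumptions, so substitute whatever the server actually reports.

```typescript
// Minimal sketch of a chat-completion request against a local
// OpenAI-compatible server. Port and model name are placeholders.
const BASE_URL = "http://localhost:8080/v1"; // hypothetical local port

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body for POST /v1/chat/completions.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    model,
    messages,
    stream: false, // set true for server-sent-event streaming
  };
}

const body = buildChatRequest("mlx-community/Llama-3-8B", [
  { role: "user", content: "Hello!" },
]);

// A real client would then send it, e.g.:
// const res = await fetch(`${BASE_URL}/chat/completions`, {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(body),
// });

console.log(JSON.stringify(body));
```

Because the wire format matches OpenAI's, existing SDKs work unchanged once their base URL is overridden.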

Jellybox is described as "Run AI models locally and entirely offline" and is a large language model (LLM) tool in the AI Tools & Services category. There are more than 25 alternatives to Jellybox across platforms including Mac, Linux, Windows, web-based, and self-hosted apps. The best Jellybox alternative is Ollama, which is both free and open source. Other strong alternatives include GPT4ALL, Jan.ai, Open WebUI, and AnythingLLM.

Dione makes installing complex applications as simple as clicking a button — no terminal or technical knowledge needed. For developers, Dione offers a zero-friction way to distribute apps using just a JSON file. App installation has never been this effortless.
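Dione's actual manifest format isn't shown here; the JSON below is a purely hypothetical illustration of what a one-file app definition could look like, with every field name invented for this sketch.

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "description": "Illustrative only; all field names are hypothetical",
  "installation": [
    { "run": "git clone https://example.com/my-app.git" },
    { "run": "npm install" }
  ],
  "start": { "run": "npm start" }
}
```

The appeal of this style of distribution is that the developer ships a single declarative file and the installer handles cloning, dependencies, and launch.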

This project aims to provide a user-friendly interface for accessing and using various LLM models across a wide range of tasks. Whether you need help with writing, coding, organizing data, generating images, or answering questions, LoLLMS WebUI has you covered.

Cloudflare Workers AI is a serverless platform for running AI models on GPUs across Cloudflare's network, with no infrastructure to manage. It offers access to over 50 open-source models, application-level control through AI Gateway, and global deployment alongside tools like Vectorize, R2, and D1.

Cloudflare Workers AI is the most popular SaaS alternative to Jellybox.
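Workers AI models can be invoked either from inside a Worker via an AI binding (roughly `env.AI.run(model, input)`) or over Cloudflare's REST API. The runnable part of this sketch only assembles the REST endpoint; the account ID is a placeholder, and the model ID is one example from the public catalog.

```typescript
// Assemble the Workers AI REST endpoint for a given account and model.
// Account ID below is a placeholder, not a real account.
function buildRunUrl(accountId: string, model: string): string {
  return `https://api.cloudflare.com/client/v4/accounts/${accountId}/ai/run/${model}`;
}

const url = buildRunUrl("ACCOUNT_ID", "@cf/meta/llama-3.1-8b-instruct");

// POST to this URL with an "Authorization: Bearer <token>" header and a
// JSON body such as { messages: [...] } to run the model serverlessly.
console.log(url);
```

Inside a deployed Worker, the AI binding avoids the HTTP round trip entirely; the REST route is handy for calling the same models from outside Cloudflare's runtime.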
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
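Generation-level schema enforcement works by compiling a JSON Schema into a grammar that constrains which tokens the model may emit, so the output is valid JSON by construction rather than by post-hoc retry. The node-llama-cpp calls below are shown as comments because they require a downloaded model, and their exact signatures may differ by version; the schema itself is plain JSON Schema.

```typescript
// A JSON Schema describing the shape we want the model to produce.
const recipeSchema = {
  type: "object",
  properties: {
    name: { type: "string" },
    servings: { type: "number" },
  },
  required: ["name", "servings"],
} as const;

// With node-llama-cpp (sketch; treat the API details as assumptions):
// const llama = await getLlama();
// const grammar = await llama.createGrammarForJsonSchema(recipeSchema);
// const answer = await session.prompt("Give me a pancake recipe", { grammar });
// const recipe = grammar.parse(answer); // schema-valid object by construction

// Because the grammar forbids non-conforming tokens, the raw output
// always parses as ordinary JSON matching the schema:
const sample = JSON.parse('{"name":"Pancakes","servings":4}');
console.log(sample.name);
```

The practical payoff is that downstream code can consume model output without defensive parsing or retry loops for malformed JSON.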
