The Swiss Army Knife of offline AI. Chat, speak, and generate images. Privacy first, zero internet. Download an LLM and use it on your mobile device. No data ever leaves your phone.
Cost / License
- Free
- Open Source (MIT)
Platforms
- Android


LM Studio is described as 'Discover, download, and run local LLMs' and is an AI chatbot in the AI Tools & Services category. There are more than 50 alternatives to LM Studio across a variety of platforms, including Mac, Windows, Linux, Android, and Self-Hosted apps. The best LM Studio alternative is Ollama, which is both free and Open Source. Other great apps like LM Studio are GPT4ALL, Jan.ai, Open WebUI, and AnythingLLM.

AI00 RWKV Server is an inference API server for the RWKV language model, built on the web-rwkv inference engine.
This project aims to eliminate the barriers to using large language models by automating everything for you. All you need is a lightweight executable program of just a few megabytes. Additionally, the project provides an interface compatible with the OpenAI API, which means...
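In practice, OpenAI-API compatibility means any standard HTTP client can talk to the local server using the Chat Completions request shape. A minimal sketch of building such a request follows; the host/port (`localhost:65530`) and model name (`rwkv`) are assumptions for illustration, so check the server's own documentation for its actual defaults.

```python
import json

def build_chat_request(prompt, model="rwkv", host="http://localhost:65530"):
    """Return (url, headers, body) for an OpenAI-style chat-completions call.

    The endpoint path and payload shape follow the OpenAI Chat Completions
    API; the host, port, and model name here are placeholder assumptions.
    """
    url = f"{host}/v1/chat/completions"
    headers = {"Content-Type": "application/json"}
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    })
    return url, headers, body

url, headers, body = build_chat_request("Hello!")
# Send with any HTTP client once the server is running, e.g.:
#   requests.post(url, headers=headers, data=body)
```

Because the wire format matches OpenAI's, existing tooling (SDKs, chat front-ends) can usually be pointed at the local server just by changing the base URL.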
Experience the power of RWKV models directly on your device. Completely offline, privacy-first, and efficient. No internet required.
A modern web interface for managing and interacting with vLLM servers (www.github.com/vllm-project/vllm). Supports both GPU and CPU modes, with special optimizations for macOS Apple Silicon and enterprise deployment on OpenShift/Kubernetes.
Dione makes installing complex applications as simple as clicking a button — no terminal or technical knowledge needed. For developers, Dione offers a zero-friction way to distribute apps using just a JSON file. App installation has never been this effortless.
Shinkai is a two-click-install app that lets you create local AI agents in 5 minutes or less using a simple UI. Supports MCPs, remote and local AI, crypto, and payments.
Run frontier LLMs and VLMs with day-0 model support across GPU, NPU, and CPU, with comprehensive runtime coverage for PC (Python/C++), mobile (Android & iOS), and Linux/IoT (Arm64 & x86 Docker). Supported models include OpenAI GPT-OSS, IBM Granite-4, Qwen-3-VL, Gemma-3n, Ministral-3, and more.
This project aims to provide a user-friendly interface to access and utilize various LLM models for a wide range of tasks. Whether you need help with writing, coding, organizing data, generating images, or seeking answers to your questions, LoLLMS WebUI has got you covered.