
LLM Hub

LLM Hub is an open-source Android app for on-device LLM chat and image generation. It's optimized for mobile usage (CPU/GPU/NPU acceleration) and supports multiple model formats so you can run powerful models locally and privately.


Cost / License

  • Free
  • Open Source (MIT)

Platforms

  • Android (8.0+)
  • Android Tablet (8.0+)


Properties

  1.  Privacy focused

Features

  1.  Hardware Accelerated
  2.  OCR
  3.  No Tracking
  4.  Ad-free
  5.  No registration required
  6.  Works Offline
  7.  Dark Mode
  8.  Text to Speech
  9.  Text to Image Generation
  10.  GPU Acceleration
  11.  Speech Transcription
  12.  AI-Powered
  13.  Speech Recognition
  14.  Speech to text
  15.  Translator
  16.  Offline
  17.  Material design

Tags

  • on-device-ai
  • asr
  • phi4
  • gemma3
  • npu
  • qnn
  • Stable Diffusion
  • transcription
  • gguf
  • stt
  • RAG
  • mediapipe
  • litert
  • AI
  • retrieval-augmented-generation
  • llama
  • mnn
  • multimodal
  • llm-inference
  • local-ai
  • onnx
  • scamdetector
  • gemma3n

LLM Hub information

  • Developed by

    Timmy Qian / Yuan Qian (timmyy123 / timmy boy), Australia
  • Licensing

    Open Source (MIT) and Free product.
  • Alternatives

    20 alternatives listed
  • Supported Languages

    • English
    • Arabic
    • German
    • Spanish
    • Persian
    • French
    • Hebrew
    • Indonesian
    • Italian
    • Japanese
    • Korean
    • Polish
    • Portuguese
    • Russian
    • Turkish
    • Ukrainian

AlternativeTo Categories

AI Tools & Services, System & Hardware, Audio & Music, Education & Reference, Office & Productivity, Photos & Graphics

GitHub repository

  •  95 Stars
  •  25 Forks
  •  11 Open Issues
LLM Hub was added to AlternativeTo by bugmenot.

What is LLM Hub?


SIX AI TOOLS

📝 CHAT: Multi-turn conversations with RAG memory, web search, TTS auto-readout, and multimodal input (text, images, audio)

WRITING AID: Summarize, expand, rewrite, improve grammar, or generate code from descriptions

🎨 IMAGE GENERATOR: Create images from text prompts using Stable Diffusion 1.5, with a swipeable gallery for variations

🌍 TRANSLATOR: Translate text, images (OCR), and audio across 50+ languages; works offline

🎙️ TRANSCRIBER: Convert speech to text with on-device processing

🛡️ SCAM DETECTOR: Analyze messages and images for phishing, with risk assessment
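As a toy illustration of the risk-assessment idea behind the scam detector (the app itself analyzes content with an on-device LLM; the cue list and scoring below are invented for this sketch):

```python
# Toy phishing risk scorer in the spirit of the Scam Detector tool.
# The real app uses an on-device LLM; this keyword cue list and the
# threshold scoring are purely illustrative assumptions.
CUES = ["verify your account", "urgent", "gift card", "wire transfer", "click this link"]

def risk_level(message: str) -> str:
    """Return a coarse risk level based on how many phishing cues appear."""
    hits = sum(cue in message.lower() for cue in CUES)
    return "high" if hits >= 2 else "medium" if hits == 1 else "low"

print(risk_level("URGENT: please verify your account"))  # high
```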

🔐 PRIVACY & SECURITY

• 100% on-device processing - no internet required for inference
• Zero data collection - conversations never leave your device
• No accounts, no tracking - completely private
• Open source - fully transparent

ADVANCED FEATURES

• GPU/NPU acceleration for fast performance
• Text-to-Speech with auto-readout
• RAG with global memory for enhanced responses
• Import custom models (.task, .litertlm, .mnn, .gguf)
• Direct downloads from HuggingFace
• 16 language interfaces
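The RAG "global memory" feature boils down to retrieve-then-prompt: rank stored notes against the user's query and prepend the best match to the prompt. A minimal sketch using bag-of-words cosine similarity (real pipelines use learned embeddings; all names and data here are illustrative, not the app's implementation):

```python
import math
from collections import Counter

def vec(text: str) -> Counter:
    """Bag-of-words term counts (a crude stand-in for a learned embedding)."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(memory: list[str], query: str) -> str:
    """Pick the stored note most similar to the query."""
    return max(memory, key=lambda note: cosine(vec(note), vec(query)))

memory = ["user prefers metric units", "the user's dog is named Rex"]
query = "what is my dog called"
context = retrieve(memory, query)
prompt = f"Context: {context}\nQuestion: {query}"
```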

Quick Start

  1. Download from Google Play or build from source
  2. Open Settings → Download Models → Download or Import a model
  3. Select a model and start chatting or generating images

Supported Model Families (summary)

• Gemma (LiteRT Task)
• Llama (Task + GGUF variants)
• Phi (LiteRT LM)
• LiquidAI LFM (LFM 2.5 1.2B + LFM VL 1.6B, vision-enabled)
• Ministral / Mistral family (GGUF / ONNX)
• IBM Granite (GGUF)

Model Formats

• Task / LiteRT (.task): MediaPipe/LiteRT-optimized models (GPU/NPU capable)
• LiteRT LM (.litertlm): LiteRT language models
• GGUF (.gguf): quantized models; CPU inference powered by Nexa SDK. Some vision-capable GGUF models require an additional mmproj vision projector file
• ONNX (.onnx): cross-platform model runtime
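The format list above amounts to an extension-to-runtime dispatch. A sketch of that idea (the mapping strings and function are illustrative; the app's real dispatch lives in its Kotlin source):

```python
from pathlib import Path

# Extension -> runtime mapping mirroring the formats listed above.
# Purely illustrative; not the app's actual code or API.
RUNTIMES = {
    ".task": "MediaPipe/LiteRT (GPU/NPU capable)",
    ".litertlm": "LiteRT LM",
    ".gguf": "Nexa SDK (CPU)",
    ".onnx": "ONNX Runtime",
    ".mnn": "MNN",
}

def pick_runtime(model_file: str) -> str:
    """Choose an inference backend from the model file's extension."""
    ext = Path(model_file).suffix.lower()
    if ext not in RUNTIMES:
        raise ValueError(f"unsupported model format: {ext!r}")
    return RUNTIMES[ext]

print(pick_runtime("gemma-3n.task"))  # MediaPipe/LiteRT (GPU/NPU capable)
```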

Importing models

• Settings → Download Models → Import Model → choose .task, .litertlm, .mnn, .gguf, or .onnx
• The full model list and download links live in app/src/.../data/ModelData.kt (do not exhaustively list variants in the README)

Technology

• Kotlin + Jetpack Compose (Material 3)
• LLM runtime: MediaPipe, LiteRT, Nexa SDK
• Image generation: MNN / Qualcomm QNN
• Quantization: INT4/INT8
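The INT4/INT8 quantization mentioned above stores each weight as a small integer plus a shared scale factor. A minimal symmetric per-tensor sketch of the idea (illustrative only; real runtimes use more elaborate per-channel or per-group schemes, and INT4 is analogous with the range [-8, 7]):

```python
# Symmetric per-tensor INT8 quantization sketch (an illustration of the
# general technique, not any particular runtime's scheme).
def quantize_int8(weights):
    """Map float weights to int8 values plus one shared scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights."""
    return [v * scale for v in q]

w = [0.5, -1.0, 0.25, 0.9]
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
# Reconstruction error is bounded by the quantization step size.
assert max(abs(a - b) for a, b in zip(w, w_hat)) <= scale
```

The payoff is storage: 8 bits (or 4) per weight instead of 32, which is what makes multi-billion-parameter models fit in a phone's memory.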

Acknowledgments

• Nexa SDK: GGUF model inference support (credited in the in-app About screen)
• Google, Meta, Microsoft, IBM, LiquidAI, Mistral, HuggingFace: model and tooling contributions

Official Links