Local Dream icon

Local Dream

Run Stable Diffusion on Android devices with Snapdragon NPU acceleration. CPU and GPU inference are also supported.

Local Dream screenshot 1

Cost / License

  • Free
  • Open Source

Platforms

  • Android (NSFW content is filtered in the Google Play build; the unfiltered build is available from GitHub Releases)


Local Dream information

  • Developed by

    xororz (China)
  • Licensing

    Open source and free.
  • Alternatives

    8 alternatives listed
  • Supported Languages

    • English
    • Chinese

GitHub repository

  • 1,808 Stars
  • 111 Forks
  • 60 Open Issues

View on GitHub
Local Dream was added to AlternativeTo by bugmenot.

Featured in Lists

Wake up the NPU on your device

List by bugmenot with 46 apps.

What is Local Dream?

Stable Diffusion for Android with Snapdragon NPU acceleration. CPU and GPU inference are also supported.

Features

  • 🎨 txt2img - Generate images from text descriptions
  • 🖼️ img2img - Transform existing images
  • 🎭 inpaint - Redraw selected areas of an image
  • custom models - Import your own SD1.5 models for CPU (in-app) or NPU (follow the conversion guide). Pre-converted models are available from xororz/sd-qnn or Mr-J-369
  • lora support - Add LoRA weights to custom CPU models when importing
  • prompt weights - Emphasize certain words in prompts, e.g. (masterpiece:1.5); same format as Automatic1111
  • embeddings - Support for custom embeddings such as EasyNegative. SafeTensor format is required, so convert .pt embeddings to .safetensors before importing
  • upscalers - 4x upscaling with realesrgan_x4plus_anime_6b and 4x-UltraSharpV2_Lite
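The (word:weight) prompt syntax mentioned above can be illustrated with a small parser. This is a hypothetical sketch of the Automatic1111-style format, not Local Dream's actual implementation:

```python
import re

# Matches Automatic1111-style weighted spans such as "(masterpiece:1.5)".
WEIGHTED = re.compile(r"\(([^():]+):([0-9]*\.?[0-9]+)\)")

def parse_prompt(prompt: str) -> list[tuple[str, float]]:
    """Split a prompt into (text, weight) pairs; unweighted text gets 1.0."""
    parts, pos = [], 0
    for m in WEIGHTED.finditer(prompt):
        if m.start() > pos:                      # plain text before the match
            parts.append((prompt[pos:m.start()], 1.0))
        parts.append((m.group(1), float(m.group(2))))
        pos = m.end()
    if pos < len(prompt):                        # trailing plain text
        parts.append((prompt[pos:], 1.0))
    return parts
```

For example, `parse_prompt("(masterpiece:1.5), best quality")` yields `[("masterpiece", 1.5), (", best quality", 1.0)]`; the weights would then scale the corresponding token embeddings during conditioning.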

Technical Implementation

NPU Acceleration

  • SDK: Qualcomm QNN SDK leveraging Hexagon NPU
  • Quantization: W8A16 static quantization for optimal performance
  • Resolution: Fixed 512×512 model shape
  • Performance: Extremely fast inference speed

CPU/GPU Inference

  • Framework: Powered by MNN framework
  • Quantization: W8 dynamic quantization
  • Resolution: Flexible sizes (128×128, 256×256, 384×384, 512×512)
  • Performance: Moderate speed with high compatibility

NPU High Resolution Support

Please note that quantized high-resolution (>768×768) models may produce images with poor composition. We recommend first generating at 512×512 (optionally upscaling the result), then running the high-resolution model in img2img mode, which is essentially Highres.fix. A denoise_strength of around 0.8 is suggested for the img2img pass. This yields images with better layout and detail.
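In a typical img2img implementation (this describes general Stable Diffusion behavior, not Local Dream's confirmed internals), denoise_strength controls how far into the noise schedule the input image is pushed, and therefore how many denoising steps actually run:

```python
def img2img_steps(num_inference_steps: int, denoise_strength: float) -> tuple[int, int]:
    """Return (start_step, steps_run) for an img2img pass.

    strength=1.0 behaves like txt2img (every step runs);
    strength=0.0 returns the input image essentially unchanged.
    """
    steps_run = min(int(num_inference_steps * denoise_strength), num_inference_steps)
    start_step = num_inference_steps - steps_run
    return start_step, steps_run

# With 20 steps and the suggested strength of 0.8: img2img_steps(20, 0.8) == (4, 16)
```

At strength 0.8 most of the schedule re-runs, so the high-resolution model can add detail, while the early steps it skips preserve the composition established by the 512×512 pass.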

Device Compatibility

NPU Acceleration Support

Compatible with devices featuring:

  • Snapdragon 8 Gen 1/8+ Gen 1
  • Snapdragon 8 Gen 2
  • Snapdragon 8 Gen 3
  • Snapdragon 8 Elite
  • Snapdragon 8 Elite Gen 5/8 Gen 5
  • Non-flagship chips with Hexagon V68 or above (e.g., Snapdragon 7 Gen 1, 8s Gen 3)

Note: other devices cannot download NPU models.

CPU/GPU Support

  • RAM Requirement: ~2GB available memory
  • Compatibility: Most Android devices from recent years

Available Models

The following models are built-in and can be downloaded directly in the app:

| Model            | Type  | CPU/GPU | NPU | Clip Skip | Source  |
|------------------|-------|---------|-----|-----------|---------|
| AnythingV5       | SD1.5 | ✓       | ✓   | 2         | CivitAI |
| ChilloutMix      | SD1.5 | ✓       | ✓   | 1         | CivitAI |
| Absolute Reality | SD1.5 | ✓       | ✓   | 2         | CivitAI |
| QteaMix          | SD1.5 | ✓       | ✓   | 2         | CivitAI |
| CuteYukiMix      | SD1.5 | ✓       | ✓   | 2         | CivitAI |

🎲 Seed Settings

Custom seed support for reproducible image generation:

  • CPU Mode: Seeds guarantee identical results across different devices with same parameters
  • GPU Mode: Results may differ from CPU mode and can vary between different devices
  • NPU Mode: Seeds ensure consistent results only on devices with identical chipsets
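The determinism described above comes down to seeding the initial latent noise: the same seed yields the same starting noise, and with identical parameters and deterministic math the same image follows. GPU and NPU runs diverge across devices because their floating-point kernels differ. A stdlib-only sketch of the seeding idea (function names here are illustrative, not Local Dream's API):

```python
import random

def initial_latent(seed: int, size: int = 8) -> list[float]:
    """Deterministic Gaussian 'latent' noise from a seed (illustrative only)."""
    rng = random.Random(seed)            # independent generator per call
    return [rng.gauss(0.0, 1.0) for _ in range(size)]

# Identical seeds reproduce the same latent; different seeds diverge.
a = initial_latent(1234)
b = initial_latent(1234)
c = initial_latent(5678)
```

This is why CPU mode can guarantee cross-device reproducibility: integer-seeded pseudo-random generation and CPU float arithmetic are bit-exact across devices, whereas accelerator kernels are not.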

Official Links