This Chrome extension is powered by Ollama. All inference runs on your local machine, so nothing is sent to an external server. However, because of security constraints in the Chrome extension platform, the extension cannot run the LLM itself; it relies on a locally running Ollama server.
Cost / License
- Free
- Open Source

Application type
- Self-Hosted

Platforms
- Google Chrome
- Docker
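Because the extension depends on a local Ollama server, that server must accept requests from the extension's origin. A minimal setup sketch, assuming the stock Ollama HTTP API on its default port (the model name is a placeholder):

```shell
# Chrome extensions make cross-origin requests, and Ollama rejects
# unknown origins unless OLLAMA_ORIGINS allows them. Permit any
# Chrome extension origin and start the local server:
OLLAMA_ORIGINS="chrome-extension://*" ollama serve

# The extension then calls the local HTTP API, roughly like this:
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```

No traffic leaves the machine; the "server support" the entry mentions is only this localhost process.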