You need Ollama running on your localhost with at least one model available. Once Ollama is running, a model can be pulled either from Follamac or from the command line. From the command line, type something like: `ollama pull llama3`. If you wish to pull from Follamac, you can type llama3 into "Model name to...
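As a minimal sketch, assuming Ollama is installed and its server is running on localhost (its default), pulling a model from the command line looks like this; `llama3` is just an example model name:

```shell
# Download the llama3 model from the Ollama registry.
# Requires the Ollama daemon to be running (e.g. started
# as a service, or manually via `ollama serve`).
ollama pull llama3

# List locally available models to confirm the pull succeeded.
ollama list
```

Once the model shows up in `ollama list`, it should also be selectable from Follamac.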
Cost / License
- Free
- Open Source
Platforms
- Linux
  - Flathub
  - Flatpak
- Mac
- Windows
- BSD