

Follamac
You need Ollama running on your localhost with at least one model available. Once Ollama is running, a model can be pulled either from Follamac or from the command line. From the command line, type something like: ollama pull llama3. If you wish to pull from Follamac, you can write llama3 into "Model name to...
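The command-line route above can be sketched as a short script. This is a minimal sketch, not part of Follamac itself: it assumes the Ollama CLI is installed, and uses "llama3" (the model named in the description) as the example model.

```shell
# Sketch: pulling a model for Follamac to use. "llama3" is just an
# example; substitute any model name from the Ollama library.
if command -v ollama >/dev/null 2>&1; then
    # Download the model (a no-op if it is already present locally).
    ollama pull llama3
    # List locally available models to confirm the pull succeeded.
    ollama list
else
    echo "ollama not found in PATH; install it from https://ollama.com first"
fi
```

By default the Ollama server listens on http://localhost:11434, so a quick way to confirm Follamac can reach it is `curl http://localhost:11434/api/tags`, which returns the locally installed models as JSON.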
License model
- Free • Open Source
Platforms
- Linux
- Flathub
- Flatpak
- Mac
- Windows
- BSD
Follamac News & Activities
Recent activities
- Danilo_Venom added Follamac as alternative to Cloudflare Workers AI
- Danilo_Venom added Follamac as alternative to node-llama-cpp
- POX added Follamac as alternative to Together Chat
- POX added Follamac as alternative to v0 by Vercel
- K0RR added Follamac as alternative to Ollama App
Follamac information
AlternativeTo Category
AI Tools & Services
GitHub repository
- 19 Stars
- 1 Fork
- 1 Open Issue
- Updated Feb 23, 2025