

LocalPilot
Use GitHub Copilot locally on your MacBook with one click.
Features
- Offline
- AI-Powered
- Code Completion
- VSCode
Tags
- vscode-extension
- AI Code Generator
LocalPilot News & Activities
Recent activities
- ccocks added LocalPilot as alternative to Agentica
- SageBowSystems added LocalPilot as alternative to SageBow
- lamjed001 added LocalPilot as alternative to nao
- POX added LocalPilot as alternative to BLACKBOX.AI
- pranshu_traycer added LocalPilot as alternative to Traycer AI
- doomeron added LocalPilot as alternative to Dyad
- POX added LocalPilot as alternative to opencode
- POX added LocalPilot as alternative to hi - AI Assistant
- Maoholguin added LocalPilot as alternative to Gemini CLI
- POX added LocalPilot as alternative to opcode
LocalPilot information
What is LocalPilot?
Use GitHub Copilot locally on your MacBook with one-click.
Is the code as good as GitHub Copilot?
For simple line completions, yes. For simple function completions, mostly. For complex functions... maybe.
Is it as fast as GitHub Copilot?
On my MacBook Pro with an Apple M2 Max, the 7b models are roughly as fast; the 34b models are not. Please consider this repo a demonstration of a very inefficient implementation. I'm sure it can be made faster; please do submit a pull request if you'd like to help. For example, I think we need a debouncer, because llama.cpp/GGML sometimes isn't fast at interrupting itself when a newer request comes in.
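The debouncer idea above could look something like this: a minimal trailing-edge debounce sketch (a hypothetical helper, not part of LocalPilot's actual code) that collapses a burst of completion requests so only the most recent one reaches the local model after a short quiet period.

```typescript
// Hypothetical sketch: debounce completion requests so that only the latest
// prompt in a rapid burst of keystrokes is forwarded to the local model.
function debounce<T>(fn: (arg: T) => void, delayMs: number): (arg: T) => void {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (arg: T) => {
    // A newer request arrived: cancel the pending one instead of letting
    // llama.cpp/GGML start work it would only have to interrupt.
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(arg), delayMs);
  };
}

// Example usage: `requestCompletion` stands in for whatever function actually
// sends the prompt to the llama.cpp server (name assumed for illustration).
const sent: string[] = [];
const requestCompletion = (prompt: string) => sent.push(prompt);
const debouncedRequest = debounce(requestCompletion, 30);

// Three quick edits; only the last prompt should be sent.
debouncedRequest("def fib(");
debouncedRequest("def fib(n");
debouncedRequest("def fib(n):");
```

A real extension would also want to cancel the in-flight HTTP request to the model server, not just the pending timer, but the timer alone already avoids queuing stale prompts.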






