AI-powered coding assistant with a beautiful web interface, powered by the free Groq API.
- 🚀 Lightning Fast - Powered by Groq's ultra-fast inference
- 💰 Completely FREE - Uses Groq's free tier (Llama 3.3 70B)
- 🎨 Beautiful UI - Modern, responsive web interface
- 💬 Chat Interface - Natural conversation with context awareness
- 🔍 Code Analysis - Explain, review, optimize, fix, and document code
- ⚡ Code Generation - Generate code from natural language descriptions
- 🔄 Multiple Models - Choose from various free models
```bash
pip install ameck-copilot
```
Then get a free Groq API key:
- Visit console.groq.com/keys
- Sign up for a free account
- Create a new API key

```bash
ameck-copilot
```
On first run, you'll be prompted to enter your API key. It will be saved securely for future use.
The app will automatically open at http://127.0.0.1:8000
```bash
# Start the server (default)
ameck-copilot

# Start on a specific port
ameck-copilot run --port 3000

# Configure API key
ameck-copilot setup

# Show current configuration
ameck-copilot config
```
You can also set the API key via an environment variable:
```bash
export GROQ_API_KEY=gsk_your_api_key_here
ameck-copilot
```
Have natural conversations about code. Ask questions, get explanations, and receive coding help.
The assistant now supports multiple modes to tailor behavior:
- Ask — General Q&A and conversational assistance (default)
- Agent — Proposes prioritized actions, outlines goals, and asks clarifying questions when needed
- Edit — Produces edits/patches or unified diffs for code and text
- Plan — Generates concise, actionable plans with numbered steps and acceptance criteria
- Explain — Get detailed explanations of code
- Review — Get code quality feedback
- Optimize — Get performance improvements
- Fix — Identify and fix bugs
- Document — Add documentation
- Test — Generate unit tests
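A common way to implement such actions is one prompt template per action. The templates below are illustrative assumptions; the prompts ameck-copilot actually sends to the model are not shown in this README:

```python
# Illustrative prompt templates -- not the tool's real prompts.
ACTION_PROMPTS = {
    "explain": "Explain what the following code does, step by step:\n\n{code}",
    "review": "Review this code for quality, style, and potential issues:\n\n{code}",
    "optimize": "Suggest performance improvements for this code:\n\n{code}",
    "fix": "Identify and fix any bugs in this code:\n\n{code}",
    "document": "Add clear docstrings and comments to this code:\n\n{code}",
    "test": "Write unit tests for this code:\n\n{code}",
}

def build_prompt(action: str, code: str) -> str:
    """Render the prompt for one of the supported analysis actions."""
    return ACTION_PROMPTS[action].format(code=code)
```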
Describe what you want to build and get production-ready code with proper error handling.
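Under the hood, a generation request to Groq's OpenAI-compatible chat completions endpoint could be assembled like this. This is a generic sketch of such a request body, not ameck-copilot's actual code, and the system prompt is an assumption:

```python
import json

# Groq exposes an OpenAI-compatible chat completions endpoint.
GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

def generation_request(description: str,
                       model: str = "llama-3.3-70b-versatile") -> str:
    """Build the JSON body for a code-generation request (not sent here)."""
    return json.dumps({
        "model": model,
        "messages": [
            # Assumed system prompt, for illustration only.
            {"role": "system",
             "content": "You are a coding assistant. Return production-ready "
                        "code with proper error handling."},
            {"role": "user", "content": description},
        ],
    })
```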
This project can be published through three distribution channels:
- PyPI (Python package) — installable via `pip install ameck-copilot`.
- VS Code Marketplace — a minimal extension is included in `vscode-extension/` that opens the local web UI or starts the server.
- GitHub — recommended repo: https://github.com/QuantBender/ameck-copilot
See RELEASE.md for step-by-step publishing instructions and how to add required secrets.
| Model | Description |
|---|---|
| Llama 3.3 70B | Best all-around (default) |
| Llama 3.1 8B | Fast & lightweight |
| GPT-OSS 120B | OpenAI's open model |
| GPT-OSS 20B | Smaller OpenAI model |
| Llama 4 Scout 17B | Latest Llama 4 |
| Qwen 3 32B | Alibaba's model |
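The display names in the table correspond to Groq model IDs. The IDs below reflect Groq's public naming at the time of writing and are assumptions to verify against console.groq.com, since model IDs change over time:

```python
# Display name -> assumed Groq model ID (verify against console.groq.com).
MODELS = {
    "Llama 3.3 70B": "llama-3.3-70b-versatile",   # default
    "Llama 3.1 8B": "llama-3.1-8b-instant",
    "GPT-OSS 120B": "openai/gpt-oss-120b",
    "GPT-OSS 20B": "openai/gpt-oss-20b",
    "Llama 4 Scout 17B": "meta-llama/llama-4-scout-17b-16e-instruct",
    "Qwen 3 32B": "qwen/qwen3-32b",
}
DEFAULT_MODEL = MODELS["Llama 3.3 70B"]
```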
```bash
git clone https://github.com/ameck/ameck-copilot.git
cd ameck-copilot
pip install -e .
```
Run the development server:
```bash
ameck-copilot run --host 0.0.0.0 --port 8000
```
Project structure:
```
ameck-copilot/
├── src/
│   └── ameck_copilot/
│       ├── __init__.py
│       ├── cli.py            # CLI entry point
│       ├── app/
│       │   ├── main.py       # FastAPI application
│       │   ├── config.py     # Configuration
│       │   ├── models.py     # Pydantic models
│       │   ├── routes/       # API routes
│       │   └── services/     # Business logic
│       └── static/           # Frontend files
├── pyproject.toml
└── README.md
```
MIT License - see LICENSE for details.
- Groq - For providing free, fast AI inference
- Meta - For Llama models
- FastAPI - For the amazing web framework
Made with ❤️ by Ameck