100% FREE | Powered by Groq API | Real-time streaming | Llama 3.3 70B | ZIP/TAR support
Download Node.js from nodejs.org — v18 or higher is required
- Go to console.groq.com
- Sign up (free, no credit card)
- Click API Keys → Create API Key
- Copy the key (starts with `gsk_...`)
```bash
# Copy env file
cp .env.example .env
```
Open `.env` and paste your key:
```bash
GROQ_API_KEY=gsk_your_actual_key_here
```
Install and run:
```bash
npm install
npm start
```
Then open http://localhost:3000
| Model | Speed | Smarts | Best For |
|---|---|---|---|
| `llama-3.3-70b-versatile` ⭐ | Fast | 🔥🔥🔥 | All-round debugging (default) |
| `deepseek-r1-distill-llama-70b` | Medium | 🧠🧠🧠 | Complex logic bugs |
| `llama-3.1-8b-instant` | Ultra fast | 🔥🔥 | Quick checks |
| `mixtral-8x7b-32768` | Fast | 🔥🔥🔥 | Large files (32K context) |
| `gemma2-9b-it` | Fast | 🔥🔥 | Lightweight use |
Change the model in `.env`:
```bash
GROQ_MODEL=deepseek-r1-distill-llama-70b
```
- 📦 ZIP, TAR.GZ, TAR, GZ — extract all code files automatically
- 🤖 Groq AI Debugging — Llama 3.3 70B finds every bug type
- ⚡ Real-time Streaming — see analysis live via WebSocket
- 🔀 Split Diff Viewer — side-by-side original vs fixed
- 🔁 Batch Processing — 3 files at once (configurable)
- ⏸ Pause / Resume — mid-batch control
- 📅 Auto Scheduling — run every X minutes/hours
- ⬇ Export ZIP — all fixed files + markdown report
- 🛡 Security Scanning — SQLi, XSS, path traversal, etc.
- 30+ Languages supported
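The real-time streaming feature above relies on Groq's OpenAI-compatible streaming endpoint, which emits `data: {json}` SSE lines ending with `data: [DONE]`. A minimal sketch of how those chunks could be parsed before forwarding over a WebSocket (`extractDeltas` is a hypothetical helper, not the repo's actual code):

```javascript
// Sketch: collect the text deltas from Groq's SSE-style streaming
// chunks. Assumes the OpenAI-compatible shape
// {"choices":[{"delta":{"content":"..."}}]} per "data:" line.
function extractDeltas(sseText) {
  const deltas = [];
  for (const line of sseText.split('\n')) {
    if (!line.startsWith('data: ')) continue;
    const payload = line.slice(6).trim();
    if (payload === '[DONE]') break;        // end-of-stream marker
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (delta) deltas.push(delta);
  }
  return deltas.join('');
}
```

In a server, each delta would typically be pushed to connected WebSocket clients as it arrives rather than joined at the end.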
| Key | Action |
|---|---|
| `Ctrl+O` | Upload files |
| `Ctrl+Enter` | Run debug batch |
| `Ctrl+D` | Download fix |
| `Ctrl+L` | Clear terminal |
```
ai-debug-agent/
├── server.js          ← Node.js backend (Groq API + WebSocket)
├── public/index.html  ← Full frontend UI
├── package.json
├── .env               ← Your config (create from .env.example)
└── README.md
```
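Pulling together the variables mentioned in this README, a `.env` might look like the following (values shown are placeholders/defaults, not authoritative):

```
# Hypothetical full .env — only these three variables appear in this README
GROQ_API_KEY=gsk_your_actual_key_here
GROQ_MODEL=llama-3.3-70b-versatile
BATCH_CONCURRENCY=3
```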
"GROQ_API_KEY not configured"
→ Create .env from .env.example, paste your key from console.groq.com
Rate limit errors
→ Groq free tier has limits. Lower BATCH_CONCURRENCY=1 in .env
Large file issues
→ Use mixtral-8x7b-32768 model (supports 32K tokens)