Use LLMs to generate terminal commands from natural language queries.
Click the image above to watch the tutorial for setting up AI CLI.
- Navigate to the `ai_cli` directory
- Install the required packages: `pip install -r requirements.txt`
- Make the script executable: `chmod +x ai_cli.py`
- Set up environment variables (see the example below):
  - Set `GROQ_API_KEY` for the Groq service
  - Set `FAST_OLLAMA_MODEL` to your preferred fast Ollama model (optional)
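For example, these variables can be exported in the current shell session before running the script; the key and model name below are placeholders, not values taken from this project:

```bash
# Placeholder values -- substitute your own Groq API key and preferred Ollama model.
export GROQ_API_KEY="your-groq-api-key"
export FAST_OLLAMA_MODEL="llama3"   # optional; example model name, assumed here
```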
Run the script with:

```bash
python ai_cli.py [input] [options]
```
- `--service {ollama,groq}`: Select the AI service to use (default: ollama)
- `--model MODEL`: Specify a custom model to use (optional)
python ai_cli.py "List all files in the current directory"
python ai_cli.py "Create a new directory named 'test'" --service groq
python ai_cli.py "Show system information" --model codestral
- The script displays the generated command and asks for confirmation before executing it.
- Press Enter to execute the command, or type 'n' to cancel (a shell sketch of this confirm-then-run pattern is shown below).
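The following is a minimal shell sketch of that confirm-then-run pattern; it is an illustration only, not the actual logic inside `ai_cli.py`, and the generated command shown is a stand-in:

```bash
#!/usr/bin/env bash
# Illustration of the confirm-then-run pattern; ai_cli.py implements this flow in Python.
cmd='ls -la'   # stand-in for the command generated by the LLM

echo "Generated command: $cmd"
read -r -p "Press Enter to execute, or 'n' to cancel: " reply
if [ "$reply" = "n" ]; then
    echo "Cancelled."
else
    eval "$cmd"   # run the approved command in the current shell
fi
```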