Empowering Tunisian developers with AI-powered shell command assistance
Installation • Usage • Features • Documentation • Contributing • Support
BourguibaGPT is an innovative, AI-powered shell command assistant specifically designed for Tunisian developers and system administrators. Named after Tunisia's founding father, this tool bridges the gap between natural language and shell commands, making terminal operations more accessible and safer for users of all skill levels.
- Tunisian Heritage: Built with pride for the Tunisian tech community
- AI-Powered: Leverages Ollama's powerful language models
- Safety First: Advanced command validation prevents dangerous operations
- Beautiful Interface: Animated banners and rich terminal UI
- Educational: Learn shell commands through natural language interaction
- Intelligent Command Generation: Convert natural language to precise shell commands
- Advanced Safety Validation: Multi-layer command validation system
- Command History & Feedback: Track usage patterns and improve recommendations
- Cross-Platform Support: Works seamlessly on Linux, macOS, and Windows
- Performance Optimized: Smart model selection based on system resources
- Animated Welcome Banner: Dynamic startup experience
- Rich Terminal UI: Beautiful, colorful output using the Rich library
- Interactive Prompts: Intuitive conversation-style interface
- Real-time Feedback: Immediate command validation and suggestions
- Usage Analytics: Track your command generation patterns
- Command Whitelist: Only safe, pre-approved commands are executed
- Argument Validation: Deep inspection of command parameters
- Dangerous Command Blocking: Prevents destructive operations
- Execution Logging: Complete audit trail of all operations
- Sandboxed Execution: Isolated command execution environment
```
bourguibagpt/
├── src/
│   └── bourguibagpt/
│       ├── main.py          # Application entry point
│       ├── config.py        # Configuration management
│       ├── validators.py    # Command validation logic
│       ├── windows.py       # Windows-specific functions
│       ├── models/          # AI model configurations
│       ├── utils/           # Utility functions
│       └── tests/           # Test suite
├── docs/                    # Documentation
├── requirements.txt         # Python dependencies
├── setup.py                 # Package setup
├── pyproject.toml           # Modern Python packaging
└── README.md                # This file
```
| File | Purpose | Key Functions |
|---|---|---|
| `main.py` | Application core | Banner display, command generation, Ollama management |
| `config.py` | Settings management | Model configuration, user preferences, OS detection |
| `validators.py` | Security layer | Command whitelist, argument validation, safety checks |
| `windows.py` | Windows support | Ollama installation, service management, Windows-specific features |
- Python 3.7+ (Python 3.9+ recommended)
- 4GB RAM minimum (8GB+ recommended for larger models)
- Internet connection (for initial Ollama setup)
```bash
# Clone the repository
git clone https://github.com/yourusername/bourguibagpt.git
cd bourguibagpt

# Install dependencies
pip install -r requirements.txt

# Run the application
python -m src.bourguibagpt.main
```

```bash
# Install from PyPI and run
pip install bourguibagpt
bourguibagpt
```

```bash
# Install in a conda environment
conda create -n bourguibagpt python=3.9
conda activate bourguibagpt
pip install bourguibagpt
```

```bash
# Install from source (editable)
git clone https://github.com/yourusername/bourguibagpt.git
cd bourguibagpt
pip install -e .
```

BourguibaGPT automatically detects and installs Ollama if it is not present:
- Windows: Uses winget or direct installer download
- macOS: Uses Homebrew or direct download
- Linux: Uses curl installation script
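The detection-and-install flow above can be sketched in Python. This is an illustrative sketch, not the project's actual code: the function names are mine, and the `winget` package ID is an assumption.

```python
import platform
import shutil
import subprocess


def ollama_installed() -> bool:
    """Return True if the `ollama` binary is already on PATH."""
    return shutil.which("ollama") is not None


def install_command() -> list:
    """Pick a platform-appropriate install command (illustrative)."""
    system = platform.system()
    if system == "Windows":
        # Package ID is an assumption; a direct installer download is the fallback.
        return ["winget", "install", "Ollama.Ollama"]
    if system == "Darwin":
        return ["brew", "install", "ollama"]
    # Linux: Ollama's official convenience script
    return ["sh", "-c", "curl -fsSL https://ollama.com/install.sh | sh"]


def ensure_ollama() -> None:
    """Install Ollama only when it is missing."""
    if not ollama_installed():
        subprocess.run(install_command(), check=True)
```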
```bash
# Start BourguibaGPT
python -m src.bourguibagpt.main
```

```
# Example interactions
> "list all files in the current directory"
Generated: ls -la
Execute this command? (y/n): y

> "find all Python files modified in the last 7 days"
Generated: find . -name "*.py" -mtime -7
Execute this command? (y/n): y
```

| Command | Description | Example |
|---|---|---|
| `help` | Show help information | `help` |
| `history` | Display command history | `history` |
| `execute <cmd>` | Execute a specific command | `execute ls -la` |
| `model` | Change AI model | `model` |
| `sibourguiba` | Model selection alias | `sibourguiba` |
| `config` | Show configuration | `config` |
| `stats` | Usage statistics | `stats` |
| `export` | Export command history | `export history.json` |
| `clear` | Clear screen | `clear` |
| `exit` / `quit` | Exit application | `exit` |
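Under the hood, command generation amounts to a prompt sent to the local Ollama server (default `localhost:11434`). A minimal sketch, assuming Ollama's standard `/api/generate` endpoint; the prompt wording and function names are illustrative, not the project's actual code:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

# Illustrative system prompt; the real one lives in the application.
SYSTEM_PROMPT = (
    "You are a shell assistant. Reply with exactly one safe shell command "
    "for the user's request, and nothing else."
)


def build_request(task: str, model: str = "llama3.1:8b") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": f"{SYSTEM_PROMPT}\n\nRequest: {task}\nCommand:",
        "stream": False,  # return one complete response instead of chunks
    }


def generate_command(task: str, model: str = "llama3.1:8b") -> str:
    """Ask a locally running Ollama server (requires `ollama serve`)."""
    data = json.dumps(build_request(task, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())["response"].strip()
```

The generated string would then be shown to the user for confirmation before any execution.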
```
> model

Available models:
1. llama3.2:1b (Fast, 1GB RAM)
2. llama3.1:8b (Balanced, 8GB RAM)
3. codellama:13b (Code-focused, 16GB RAM)

Choose model [1-3]: 2
```

```
> history --filter "git"
> history --export json
> history --clear
```

```
> config safety --level strict
> config whitelist --add "custom-command"
> config validation --enable-deep-scan
```

Configuration file location:
- Linux/macOS: `~/.config/bourguibagpt/settings.json`
- Windows: `%APPDATA%\bourguibagpt\settings.json`
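A small helper can resolve this platform-dependent path. This mirrors the locations listed above; it is a sketch, not the project's actual `config.py` API:

```python
import os
import platform
from pathlib import Path


def settings_path() -> Path:
    """Return the per-user settings file location for the current OS."""
    if platform.system() == "Windows":
        # %APPDATA% normally points at ...\AppData\Roaming
        base = Path(os.environ.get("APPDATA", Path.home() / "AppData" / "Roaming"))
        return base / "bourguibagpt" / "settings.json"
    # Linux and macOS share the ~/.config location shown above
    return Path.home() / ".config" / "bourguibagpt" / "settings.json"
```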
```json
{
  "model": {
    "preferred": "llama3.1:8b",
    "fallback": "llama3.2:1b",
    "auto_select": true
  },
  "safety": {
    "validation_level": "strict",
    "allow_sudo": false,
    "enable_whitelist": true,
    "log_commands": true
  },
  "ui": {
    "show_banner": true,
    "color_scheme": "tunisia",
    "animation_speed": "normal"
  },
  "ollama": {
    "host": "localhost",
    "port": 11434,
    "timeout": 30
  }
}
```

```bash
export BOURGUIBA_MODEL="llama3.1:8b"
export BOURGUIBA_SAFETY_LEVEL="strict"
export OLLAMA_HOST="localhost:11434"
export BOURGUIBA_LOG_LEVEL="INFO"
```

- Whitelist Checking: Commands must be in the approved list
- Argument Validation: Parameters are sanitized and validated
- Destructive Operation Detection: Dangerous commands are blocked
- User Confirmation: Interactive approval for all executions
- Execution Logging: Complete audit trail
- File deletion commands (`rm -rf`, `del`)
- System modification commands (`format`, `fdisk`)
- Network-based attacks (`curl` to suspicious URLs)
- Privilege escalation without confirmation
- Recursive operations without limits
- File operations (list, copy, move)
- Text processing (grep, sed, awk)
- System information (ps, top, df)
- Development tools (git, docker, npm)
- Archive operations (tar, zip)
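The validation layers described above can be sketched as a whitelist-plus-blocklist check. The command sets and regex patterns here are illustrative subsets, not the project's actual rules in `validators.py`:

```python
import re
import shlex

# Illustrative subset of the whitelisted command categories above.
SAFE_COMMANDS = {
    "ls", "cp", "mv", "cat", "find",      # file operations
    "grep", "sed", "awk",                 # text processing
    "ps", "top", "df",                    # system information
    "git", "docker", "npm",               # development tools
    "tar", "zip",                         # archive operations
}

# Illustrative dangerous patterns, matching the blocked categories above.
DANGEROUS_PATTERNS = [
    r"\brm\s+-\w*r\w*f|\brm\s+-\w*f\w*r",  # rm -rf and variants
    r"\b(format|fdisk|mkfs)\b",            # disk/system modification
    r"\bcurl\b.*\|\s*(sh|bash)\b",         # piping downloads into a shell
    r"\bsudo\b",                           # privilege escalation
]


def validate(command: str):
    """Return (allowed, reason) for a generated command."""
    for pattern in DANGEROUS_PATTERNS:
        if re.search(pattern, command):
            return False, f"blocked: matches dangerous pattern {pattern!r}"
    try:
        argv = shlex.split(command)
    except ValueError as exc:
        return False, f"blocked: unparseable command ({exc})"
    if not argv or argv[0] not in SAFE_COMMANDS:
        return False, "blocked: command is not whitelisted"
    return True, "ok"
```

Blocklist patterns run first so that a dangerous flag on an otherwise whitelisted command (e.g. `rm -rf` smuggled via `find -exec`) can still be caught by pattern matching on the raw string.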
- RAM: 4GB
- Storage: 2GB free space
- CPU: Dual-core processor
- Network: Stable internet for initial setup
- RAM: 8GB+ (for larger models)
- Storage: 10GB+ (for model storage)
- CPU: Quad-core processor
- GPU: NVIDIA GPU (optional, for faster inference)
| Model | Size | RAM Usage | Speed | Accuracy |
|---|---|---|---|---|
| llama3.2:1b | 1GB | 2GB | ⚡⚡⚡ | ⭐⭐⭐ |
| llama3.1:8b | 8GB | 10GB | ⚡⚡ | ⭐⭐⭐⭐ |
| codellama:13b | 13GB | 16GB | ⚡ | ⭐⭐⭐⭐⭐ |
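Automatic model selection can follow this table directly: pick the largest model whose RAM requirement fits. A sketch (the catalog mirrors the table above; in a real implementation, available RAM could come from `psutil.virtual_memory()`):

```python
# Model catalog mirrors the performance table: (name, approx. RAM needed in GB),
# ordered from largest to smallest.
MODEL_CATALOG = [
    ("codellama:13b", 16),
    ("llama3.1:8b", 10),
    ("llama3.2:1b", 2),
]


def pick_model(available_ram_gb: float) -> str:
    """Choose the largest model that fits in the available RAM."""
    for name, ram_needed in MODEL_CATALOG:
        if available_ram_gb >= ram_needed:
            return name
    return MODEL_CATALOG[-1][0]  # fall back to the smallest model
```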
```bash
# Check Ollama status
ollama list

# Manually start Ollama
ollama serve

# Restart BourguibaGPT
python -m src.bourguibagpt.main
```

```bash
# Verify model availability
ollama list

# Pull missing model
ollama pull llama3.1:8b

# Check system resources
bourguibagpt --check-system
```

```bash
# Linux/macOS
chmod +x $(which ollama)
sudo chown $USER ~/.ollama

# Windows (run as administrator)
icacls "%USERPROFILE%\.ollama" /grant %USERNAME%:F
```

```bash
# Run with verbose logging
python -m src.bourguibagpt.main --debug --log-level DEBUG
```

Log locations:
- Linux/macOS: `~/.config/bourguibagpt/logs/`
- Windows: `%APPDATA%\bourguibagpt\logs\`
We welcome contributions from the Tunisian developer community and beyond!
- Report bugs and issues
- Suggest new features
- Improve documentation
- Write tests
- Add internationalization
- Enhance UI/UX
```bash
# Fork the repository
git clone https://github.com/yourusername/bourguibagpt.git
cd bourguibagpt

# Create development environment
python -m venv venv
source venv/bin/activate  # Linux/macOS
# or
venv\Scripts\activate     # Windows

# Install development dependencies
pip install -r requirements-dev.txt

# Run tests
pytest

# Install pre-commit hooks
pre-commit install
```

- Follow PEP 8 style guidelines
- Write comprehensive docstrings
- Add type hints where appropriate
- Include unit tests for new features
- Update documentation
This project is licensed under the MIT License - see the LICENSE file for details.
MIT License
Copyright (c) 2024 BourguibaGPT Contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
If BourguibaGPT has helped streamline your terminal workflow, consider supporting continued development:
| Platform | ID |
|---|---|
| RedotPay | 1951109247 |
| Binance | 1104913076 |
Your support helps keep BourguibaGPT free and continuously improving!
Your contributions directly support:
- Development Time: New features and bug fixes
- Infrastructure: Server costs and CI/CD
- Documentation: Better guides and tutorials
- Testing: Comprehensive test coverage
- Community: Events and workshops in Tunisia
- Education: Free coding workshops for students
- Habib Bourguiba - Inspiration for the project name
- Ollama Team - Amazing local AI inference
- Rich Library - Beautiful terminal interfaces
- Tunisian Developer Community - Continuous support and feedback
This project was created with pride in Tunisia, inspired by our rich history of innovation and technological advancement. We dedicate this work to all Tunisian developers pushing the boundaries of technology.
- Email: azizbahloul3@gmail.com
"The best way to predict the future is to create it" - Habib Bourguiba