
🐍 AI Python Code Assistant

An intelligent Python coding assistant powered by Groq API with multiple specialized LLM models. Fix errors, improve code quality, generate new code from prompts, and get plain-English explanations — all in one sleek dark UI.

Streamlit App Python 3.9+ Groq API


📸 Features

| Feature | Description | Model Used |
|---------|-------------|------------|
| 🔧 Fix Error | Diagnose bugs + get corrected code | `llama-3.3-70b-versatile` |
| Improve Code | Optimize, refactor, apply best practices | `openai/gpt-oss-20b` |
| Generate Code | Describe → clarify → get complete code | `llama-3.1-8b-instant` |
| 📖 Explain Code | Step-by-step plain-English breakdown | `openai/gpt-oss-20b` |

🏗️ Architecture

Python_code_assistant/
│
├── app.py                    # Entry point — Streamlit UI + page routing
│
├── modules/                   # Core feature modules
│   ├── __init__.py
│   ├── groq_client.py         # Groq API client + model definitions
│   ├── prompt_engineering.py  # All prompts (system + user)
│   ├── error_fixer.py         # Fix Error feature logic
│   ├── code_improver.py       # Improve Code feature logic
│   ├── code_generator.py      # Generate Code (2-stage: clarify → generate)
│   └── code_explainer.py      # Explain Code feature logic
│
├── utils/                     # Helper utilities
│   ├── __init__.py
│   ├── helpers.py             # Session state, history, model info
│   └── formatting.py         # Styled UI components (boxes, badges)
│
├── .streamlit/
│   └── secrets.toml           # API key (local only, gitignored)
│
├── requirements.txt           # Dependencies
├── .gitignore                 # Git exclusions
└── README.md                  # This file
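
The two-stage flow noted next to `code_generator.py` (clarify → generate) can be sketched as below. The function name and prompt text are illustrative assumptions, not the repo's actual API; `chat` stands in for the shared Groq client call:

```python
# Illustrative sketch of the 2-stage Generate Code flow (clarify -> generate).
# `chat` is any callable that takes a list of messages and returns a string.
def generate_code(description: str, chat) -> str:
    # Stage 1: ask the model which details are missing from the request.
    questions = chat([{
        "role": "user",
        "content": f"List clarifying questions for this request: {description}",
    }])
    # Stage 2: generate code with the clarifications folded into the prompt.
    return chat([{
        "role": "user",
        "content": f"Write Python code for: {description}\nConsider: {questions}",
    }])
```

In the app, the user answers the stage-1 questions before stage 2 runs; this sketch simply pipes them straight through.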

Data Flow

User Input
    │
    ▼
app.py (routes to correct page)
    │
    ▼
modules/[feature].py (validates input, calls API)
    │
    ├── modules/prompt_engineering.py (builds prompt)
    │
    └── modules/groq_client.py (calls Groq API)
                    │
                    ▼
              Groq LLM Response
                    │
                    ▼
         utils/formatting.py (renders UI)
                    │
                    ▼
              User sees output
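
One hop of that flow, in code, might look like the following. The function name and prompt wording are illustrative; the real signatures live in `modules/prompt_engineering.py`:

```python
# Hypothetical prompt builder in the style of prompt_engineering.py.
def build_fix_error_messages(code: str, error: str) -> list:
    """Assemble the chat messages the Fix Error page sends to Groq."""
    system = "You are a Python debugging assistant. Return corrected code."
    user = f"Code:\n{code}\n\nError:\n{error}"
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

messages = build_fix_error_messages(
    "print(x)", "NameError: name 'x' is not defined"
)
```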

Model Selection Rationale

| Model | Task | Why |
|-------|------|-----|
| `llama-3.3-70b-versatile` | Error fixing | Largest model; best at complex logic debugging |
| `llama-3.1-8b-instant` | Code generation | Fast, with a large context window for long generation tasks |
| `openai/gpt-oss-20b` | Explain + Improve | Fast, instruction-tuned; ideal for clear explanations |

⚙️ Setup in PyCharm

Step 1: Clone or create the project

# Option A: Clone from GitHub
git clone https://github.com/jamshaid-develop/ai-python-assistant.git
cd ai-python-assistant

# Option B: Create project folder in PyCharm
# File → New Project → name it Python_code_assistant

Step 2: Create a virtual environment

# In PyCharm terminal (bottom panel):
python -m venv venv

# Activate it:
# Windows:
venv\Scripts\activate

# Mac/Linux:
source venv/bin/activate

Step 3: Install dependencies

pip install -r requirements.txt

Step 4: Add your Groq API key

  1. Get your free API key at console.groq.com
  2. Open .streamlit/secrets.toml
  3. Replace your_groq_api_key_here with your actual key:
GROQ_API_KEY = "gsk_xxxxxxxxxxxxxxxxxxxxxxxxxxxx"

⚠️ Never commit secrets.toml to GitHub. It's already in .gitignore.
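
If you ever need the key outside a running Streamlit app (for example, in a quick test script), a small helper can fall back to an environment variable. `st.secrets` is the documented Streamlit API; the fallback is just a convenience sketch:

```python
import os

def get_groq_key():
    """Read GROQ_API_KEY from st.secrets, else from the environment."""
    try:
        import streamlit as st
        return st.secrets["GROQ_API_KEY"]
    except Exception:
        # No secrets.toml (or Streamlit not installed): use the env var.
        return os.environ.get("GROQ_API_KEY")
```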

Step 5: Run the app

streamlit run app.py

The app opens automatically at http://localhost:8501


🚀 Deploy on Streamlit Cloud (Free)

Step 1: Push to GitHub

# Initialize git (if not already)
git init

# Add all files
git add .

# Commit
git commit -m "Initial commit: AI Python Assistant"

# Add your GitHub remote
git remote add origin https://github.com/jamshaid-develop/ai-python-assistant.git

# Push
git branch -M main
git push -u origin main

✅ Make sure .gitignore is committed — it excludes secrets.toml automatically.

Step 2: Deploy on Streamlit Cloud

  1. Go to share.streamlit.io
  2. Click "New app"
  3. Connect your GitHub account
  4. Select your repository and set:
    • Main file path: app.py
    • Branch: main
  5. Click "Advanced settings"
  6. Under Secrets, add:
    GROQ_API_KEY = "gsk_xxxxxxxxxxxxxxxxxxxxxxxxxxxx"
  7. Click "Deploy!"

Your app will be live at https://ai-python-assistant-ihspm34axrboflhdcmbeqc.streamlit.app/ 🎉


🔑 Environment Variables

| Variable | Description | Where to set |
|----------|-------------|--------------|
| `GROQ_API_KEY` | Your Groq API key | `.streamlit/secrets.toml` (local) or Streamlit Cloud secrets |

🛠️ Development

Adding a new feature

  1. Create modules/your_feature.py — add your logic function + sample inputs
  2. Add prompts to modules/prompt_engineering.py
  3. Add a new page case in app.py's radio nav options
  4. Add a history entry with add_to_history()
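
A new feature module following those steps might look like this minimal sketch. Everything here (file name, prompt text, the injected `chat` callable) is an illustrative assumption, not the repo's actual code:

```python
# Hypothetical modules/docstring_writer.py built from the steps above.
SAMPLE_INPUT = "def add(a, b): return a + b"  # step 1: ship a sample input

def build_messages(code: str) -> list:
    # Step 2: in the real app this prompt would live in prompt_engineering.py.
    return [
        {"role": "system", "content": "Add docstrings to this Python code."},
        {"role": "user", "content": code},
    ]

def run_feature(code: str, chat) -> str:
    """Validate input, then call the shared Groq client (injected here)."""
    if not code.strip():
        return "Please paste some Python code first."
    return chat(build_messages(code))
```

Steps 3 and 4 then wire `run_feature` into the radio nav in `app.py` and record each result with `add_to_history()`.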

Changing a model

Edit modules/groq_client.py — update the MODEL_* constants at the top of the file.
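
A minimal sketch of what `groq_client.py` might contain; the constant names follow the README's `MODEL_*` convention, but the real file may differ. The `Groq` client usage matches the official `groq` SDK:

```python
MODEL_FIX = "llama-3.3-70b-versatile"     # Fix Error
MODEL_GENERATE = "llama-3.1-8b-instant"   # Generate Code
MODEL_EXPLAIN = "openai/gpt-oss-20b"      # Explain + Improve

def chat(messages, model=MODEL_FIX, api_key=None):
    """Send a chat completion request to Groq and return the reply text."""
    from groq import Groq  # lazy import so the constants load without the SDK
    client = Groq(api_key=api_key)
    resp = client.chat.completions.create(model=model, messages=messages)
    return resp.choices[0].message.content
```

Swapping a model for a feature is then a one-line change to the relevant constant.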


📦 Dependencies

streamlit>=1.32.0   # UI framework
groq>=0.5.0         # Groq API client

No heavy ML libraries — runs on Streamlit Cloud free tier.


📄 License

MIT License — free to use, modify, and distribute.


Built with ❤️ using Python, Streamlit, and Groq API

About

A mini AI assistant built in Python, for Python only. It fixes, improves, and explains your code from a prompt, and generates working Python code from a plain-English description.
