
SmartCode Assistant

A Python-based AI coding assistant that provides intelligent code suggestions and programming help through a user-friendly Gradio web interface. The application connects to a local Ollama server running the "codeguru" model to generate contextual responses for coding queries.
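The request flow is small: the app POSTs a prompt to Ollama's /api/generate endpoint and reads the generated text back. A minimal sketch of that flow (the function names here are illustrative, not the actual app.py API):

```python
import requests

# Endpoint from this README; "codeguru" is the model pulled during setup.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="codeguru"):
    """JSON body for a non-streaming /api/generate request."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt):
    """Send a coding question to the local Ollama server, return its answer."""
    resp = requests.post(OLLAMA_URL, json=build_payload(prompt), timeout=120)
    resp.raise_for_status()
    return resp.json()["response"]
```

With stream set to False, Ollama returns one JSON object whose "response" field holds the full generated text.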

Features

  • AI-Powered Code Assistant: Leverages the "codeguru" model for intelligent coding suggestions
  • Conversation History: Maintains context across multiple interactions
  • Web Interface: Clean and intuitive Gradio-based UI
  • Real-time Responses: Instant feedback for your coding queries
  • Local Processing: Runs entirely on your local machine for privacy

Prerequisites

Before running the SmartCode Assistant, ensure you have the following installed:

Required Software

  • Python 3 - To run the application
  • Ollama - Local model server (see the installation steps below)

Required Python Packages

  • requests - For HTTP API calls
  • gradio - For the web interface

The json module is also used for data serialization, but it ships with Python, so there is nothing extra to install.

Installation

  1. Clone or download the project files

    git clone https://github.com/Atish019/SmartCode-Assistant.git
    cd SmartCode-Assistant
  2. Install Python dependencies

    pip install -r requirements.txt

    Or install manually:

    pip install requests gradio
  3. Install and setup Ollama

    • Download and install Ollama from ollama.ai
    • Pull the codeguru model:
      ollama pull codeguru

Usage

  1. Start the Ollama server

    ollama serve

    This will start the Ollama API server on http://localhost:11434

  2. Run the SmartCode Assistant

    python app.py
  3. Access the web interface

    • The Gradio interface will launch automatically
    • Open your browser and navigate to the provided local URL (typically http://127.0.0.1:7860)
  4. Start coding with AI assistance

    • Enter your coding questions, problems, or requests in the text area
    • Click submit to get AI-generated responses
    • The assistant maintains conversation history for better context
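One simple way to give the model context across turns (a sketch of the history mechanism, not necessarily how app.py implements it) is to fold prior exchanges back into each new prompt:

```python
def format_prompt(history, new_message):
    """Flatten prior (user, assistant) turns into one prompt string
    so the model sees the whole conversation as context."""
    lines = []
    for user, assistant in history:
        lines.append(f"User: {user}")
        lines.append(f"Assistant: {assistant}")
    lines.append(f"User: {new_message}")
    lines.append("Assistant:")
    return "\n".join(lines)

if __name__ == "__main__":
    # Hypothetical Gradio wiring: gr.ChatInterface passes (message, history)
    # pairs to the callback and renders the chat log for you.
    import gradio as gr

    def respond(message, history):
        prompt = format_prompt(history, message)
        return "(send `prompt` to Ollama here)"  # placeholder for the API call

    gr.ChatInterface(respond).launch()
```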

Project Structure

SmartCode-Assistant/
├── venv/                   # Virtual environment (optional)
├── app.py                  # Main application file
├── modelfile              # Ollama model configuration
├── requirements.txt       # Python dependencies
└── README.md             # This file

Configuration

API Endpoint

The application is configured to connect to Ollama at http://localhost:11434/api/generate. If your Ollama server runs on a different port or address, modify the url variable in app.py:

url = "http://your-ollama-host:port/api/generate"
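If you would rather not edit the source, one common pattern (an assumption here, not something app.py currently does; OLLAMA_URL is our own variable name) is to read the endpoint from the environment:

```python
import os

# Fall back to the default endpoint when the variable is unset.
url = os.environ.get("OLLAMA_URL", "http://localhost:11434/api/generate")
```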

Example Queries

  • "How do I implement a binary search algorithm in Python?"
  • "Explain the difference between lists and tuples"
  • "Write a function to validate email addresses"
  • "Help me debug this sorting algorithm"
  • "What's the best way to handle exceptions in Python?"

Troubleshooting

Common Issues

  • Ollama server not running: start it with ollama serve before launching app.py
  • Model not found: pull it with ollama pull codeguru (or create it from the included modelfile)

Error Messages

  • Responses beginning with "error:": Check the console output for detailed error information
  • HTTP 404: Verify the Ollama API endpoint is correct
  • HTTP 500: Check if the specified model exists and is accessible
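The failure modes above can be surfaced as readable messages instead of tracebacks. A sketch of that idea (safe_generate is an illustrative helper, not part of app.py):

```python
import requests

def safe_generate(payload, url="http://localhost:11434/api/generate"):
    """Return the model's text, or an 'error: ...' string on failure."""
    try:
        resp = requests.post(url, json=payload, timeout=120)
        resp.raise_for_status()  # raises HTTPError on 404/500
        return resp.json().get("response", "")
    except requests.exceptions.ConnectionError:
        return "error: cannot reach Ollama -- is `ollama serve` running?"
    except requests.exceptions.HTTPError as exc:
        # 404 usually means a wrong endpoint path; 500 a missing/broken model
        return f"error: HTTP {exc.response.status_code} from Ollama"
    except requests.exceptions.RequestException as exc:
        return f"error: {exc}"
```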

Happy Coding with SmartCode-Assistant!

About

CodeGuru AI is an intelligent, AI-powered multi-language coding assistant built using CodeLlama, LangChain, and Gradio.
