Kartik-001/ai-qa-bot

🚀 AI Q&A Bot

A simple AI-powered Question Answering Bot, built as part of my intern assignment. It started as a command-line app and later evolved into a Streamlit web app, deployed on Hugging Face Spaces.


📌 Why I Chose This Project

I chose AI Q&A Bot because I’ve always been interested in having an AI of my own for gathering and summarizing information. Initially, I wanted to connect it to the internet for real-time answers, but for this assignment, I focused on building the foundation step by step.


🛠️ Development Journey

Step 1: First Run Attempt

  • I started with OpenAI’s openai.ChatCompletion.create().
  • Error I got:
openai.lib._old_api.APIRemovedInV1:
You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0

Fix: Migrated to the new SDK (client.chat.completions.create) instead of downgrading.
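The migration can be sketched like this (the model name and environment-variable handling are illustrative, not from the original script):

```python
import os

def build_messages(question: str) -> list:
    """Build the chat payload; its shape is the same in the old and new SDKs."""
    return [{"role": "user", "content": question}]

def ask(question: str) -> str:
    """Query the model via the v1+ SDK.

    Old, removed style:  openai.ChatCompletion.create(model=..., messages=...)
    New style:           client.chat.completions.create(...)
    """
    from openai import OpenAI
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])  # key assumed in env
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model choice
        messages=build_messages(question),
    )
    return resp.choices[0].message.content
```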


Step 2: Fixing ImportError

  • Next, from openai import OpenAI raised an ImportError.
  • Cause: the installed openai library version was too old for the new import path.
  • Solution: Upgraded the library (to v2.0.1), reinstalled, and confirmed the import worked.

Step 3: Quota Issue

  • Eventually hit this error:
openai.RateLimitError: You exceeded your current quota
  • My OpenAI free credits expired.
  • Solution: Instead of adding billing, I switched to Hugging Face models that run locally for free.

Step 4: Switching to Hugging Face

  • Model chosen: google/flan-t5-small

  • Why?

    • Lightweight (~80M parameters).
    • Runs on CPU (no GPU needed).
    • Can handle Q&A, summarization, and general instructions.

Step 5: Transformer Import Error

  • When using Hugging Face pipeline, I got:
ModuleNotFoundError: Could not import module 'pipeline'
RuntimeError: operator torchvision::nms does not exist

Fix:

  • Uninstalled torchvision (not needed for text models).
  • Installed CPU-only torch.
  • Used AutoTokenizer + AutoModelForSeq2SeqLM instead of pipeline.
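A minimal pipeline-free sketch of that setup (the model name matches the repo; the function name and generation settings are my own illustrative choices):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "google/flan-t5-small"

def answer(question: str, max_new_tokens: int = 64) -> str:
    """Generate an answer on CPU without using transformers.pipeline."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)
    inputs = tokenizer(question, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```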

Step 6: Minimum Working Bot

Finally, I got the bot running locally! 🎉

Example Run

🤖 Welcome to the Local AI Q&A Bot (Flan-T5 Small)!
Type 'exit' to quit.

You: tell me about yourself
Bot: if you are a sexy person, you may want to know about your sexy personality.
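The REPL above can be sketched as a small loop (the function name and the injectable input_fn/output_fn hooks are my own, added so the loop is easy to test):

```python
def chat_loop(answer_fn, input_fn=input, output_fn=print):
    """Run a simple read-answer loop until the user types 'exit'."""
    output_fn("🤖 Welcome to the Local AI Q&A Bot (Flan-T5 Small)!")
    output_fn("Type 'exit' to quit.")
    while True:
        user = input_fn("You: ").strip()
        if user.lower() == "exit":
            break
        output_fn(f"Bot: {answer_fn(user)}")
```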

Reflection

  • The bot works, but small models sometimes give funny/off-topic answers.
  • The assignment goal was effort, resourcefulness, and creativity — and I achieved that.

Step 7: Adding a Streamlit UI

To go beyond the minimum, I built a simple web interface using Streamlit.

How to Run Locally

pip install streamlit
streamlit run app_streamlit.py

UI Screenshot

Reflection

  • Made the bot more polished and user-friendly.
  • Even though the model is small, the UI makes it easier to interact with.

Step 8: Deployment on Hugging Face Spaces

To make the project accessible anywhere, I deployed it for free on Hugging Face Spaces.

Deployment Steps

  1. Created a Hugging Face account.
  2. Created a Space (Streamlit SDK).
  3. Uploaded app_streamlit.py, requirements.txt, and README.md.
  4. Hugging Face auto-built and launched the app.
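A minimal requirements.txt for the Space might list just these (unpinned, illustrative):

```
streamlit
transformers
torch
```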

Result 👉 AI Q&A Bot on Hugging Face Spaces


⚡ Future Improvements

  • Add conversation history in the UI.
  • Allow choosing between multiple models (flan-t5-small, distilbert, etc.).
  • Connect to the internet for real-time answers.
  • Fine-tune or train the model with reinforcement learning (RL) to improve accuracy.
