A simple AI-powered Question Answering Bot, built as part of my intern assignment. It started as a command-line app and later evolved into a Streamlit web app, deployed on Hugging Face Spaces.
I chose an AI Q&A bot because I’ve always been interested in having an AI of my own for gathering and summarizing information. Initially, I wanted to connect it to the internet for real-time answers, but for this assignment I focused on building the foundation step by step.
- I started with OpenAI’s `openai.ChatCompletion.create()`.
- Error I got:

  ```
  openai.lib._old_api.APIRemovedInV1: You tried to access openai.ChatCompletion, but this is no longer supported in openai>=1.0.0
  ```

- Fix: migrated to the new SDK (`client.chat.completions.create()`) instead of downgrading.
- Next, an error with `from openai import OpenAI`.
  - Cause: my installed OpenAI library version was mismatched.
  - Solution: upgraded to OpenAI v2.0.1, reinstalled, and confirmed the import worked.
- Eventually hit this error:

  ```
  openai.RateLimitError: You exceeded your current quota
  ```

  - Cause: my OpenAI free credits had expired.
  - Solution: instead of adding billing, I switched to Hugging Face models that run locally for free.
- Model chosen: `google/flan-t5-small`
- Why?
  - Lightweight (~80 MB).
  - Runs on CPU (no GPU needed).
  - Can handle Q&A, summarization, and general instructions.
- When using the Hugging Face `pipeline`, I got:

  ```
  ModuleNotFoundError: Could not import module 'pipeline'
  RuntimeError: operator torchvision::nms does not exist
  ```

- Fix:
  - Uninstalled `torchvision` (not needed for text models).
  - Installed CPU-only `torch`.
  - Used `AutoTokenizer` + `AutoModelForSeq2SeqLM` instead of `pipeline`.
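Put together, the working local setup looks roughly like this (a sketch of the approach; the prompt handling and `max_new_tokens` value are choices, not requirements):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "google/flan-t5-small"

# Loading the tokenizer and model classes directly avoids the `pipeline`
# import that was pulling in torchvision.
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def answer(question: str) -> str:
    # Flan-T5 is an instruction-tuned seq2seq model: encode the question,
    # generate on CPU, then decode the output tokens back into text.
    inputs = tokenizer(question, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=100)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Wrapping `answer()` in a simple `input()` loop gives the command-line bot shown in the example run.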
Finally, I got the bot running locally! 🎉
## Example Run

```
🤖 Welcome to the Local AI Q&A Bot (Flan-T5 Small)!
Type 'exit' to quit.
You: tell me about yourself
Bot: if you are a sexy person, you may want to know about your sexy personality.
```
## Reflection

- The bot works, but small models sometimes give funny or off-topic answers.
- The assignment goal was effort, resourcefulness, and creativity, and I achieved that.
To go beyond the minimum, I built a simple web interface using Streamlit.
## How to Run Locally

```bash
pip install streamlit
streamlit run app_streamlit.py
```
## Reflection

- Made the bot more polished and user-friendly.
- Even though the model is small, the UI makes it easier to interact with.
To make the project accessible anywhere, I deployed it for free on Hugging Face Spaces.
## Deployment Steps

1. Created a Hugging Face account.
2. Created a Space (Streamlit SDK).
3. Uploaded `app_streamlit.py`, `requirements.txt`, and `README.md`.
4. Hugging Face auto-built and launched the app.
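For reference, the `requirements.txt` for a Space like this can be as small as the packages the app imports (listed here unpinned; exact versions are up to you):

```
streamlit
transformers
torch
```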
Result 👉 AI Q&A Bot on Hugging Face Spaces
## Future Improvements

- Add conversation history in the UI.
- Allow choosing between multiple models (`flan-t5-small`, `distilbert`, etc.).
- Connect to the internet for real-time answers.
- Fine-tune the model, or train it with reinforcement learning (RL), to improve accuracy.