# Iris Classifier with FastAPI, Streamlit, and Docker

This project demonstrates how to serve a trained Iris species classifier with a FastAPI backend and an interactive Streamlit UI, all containerized with Docker and Docker Compose.
## Run locally

```bash
uvicorn main:app --reload
```

Open the API docs at http://127.0.0.1:8000/docs.
## Run with Docker

Build and run the API image on its own:

```bash
docker build -t myfastapi:dev .
docker run --rm -p 8000:8000 myfastapi:dev
```

Or start the full stack (API + UI) with Docker Compose:

```bash
docker compose up --build
```
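The `docker compose` commands read a `docker-compose.yml` at the project root. A sketch of what it might contain is below; the service names, build contexts, and port mappings are assumptions rather than the project's actual file:

```yaml
# Hypothetical docker-compose.yml; service names, build contexts,
# and ports are assumptions, not taken from the project.
services:
  api:
    build: .
    ports:
      - "8000:8000"
  ui:
    build: ./streamlit
    ports:
      - "8501:8501"
    depends_on:
      - api
```

The `depends_on` entry only orders container startup; the UI should still handle the API being briefly unreachable while it boots.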
To run in the background (detached mode):

```bash
docker compose up --build -d
```

Stop and remove the containers with:

```bash
docker compose down
```

Once the stack is up, the services are available at:

- API Docs: http://localhost:8000/docs
- Health Check: http://localhost:8000/health
- Streamlit UI: http://localhost:8501
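Besides the Swagger docs and the Streamlit UI, the API can be called programmatically. The sketch below uses only the standard library; the `/predict` path and the JSON field names are assumptions about the API's schema, not taken from the project:

```python
# Hypothetical client; the /predict path and field names are assumptions
# about the API's request schema.
import json
from urllib import request

def make_payload(sepal_length, sepal_width, petal_length, petal_width):
    """Serialize the four Iris measurements as a JSON request body."""
    return json.dumps({
        "sepal_length": sepal_length,
        "sepal_width": sepal_width,
        "petal_length": petal_length,
        "petal_width": petal_width,
    })

def predict(payload, url="http://localhost:8000/predict"):
    """POST the payload to the running API and return the decoded response."""
    req = request.Request(
        url,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req, timeout=5) as resp:
        return json.loads(resp.read())

payload = make_payload(5.1, 3.5, 1.4, 0.2)
# predict(payload) requires the stack to be running (docker compose up).
```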
## Retrain the model

Retrain inside the running API container:

```bash
docker compose exec api python IrisWorking/train_iris.py
```

## Tech stack

- FastAPI → API backend
- Streamlit → Frontend UI
- scikit-learn → Machine Learning model
- Docker + Docker Compose → Containerization & orchestration
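The retraining command above runs `IrisWorking/train_iris.py` inside the container. A sketch of what such a script might contain is below; the model type, split parameters, and output filename are assumptions, not the project's actual script:

```python
# Hypothetical sketch of a train_iris.py script; model choice and the
# saved filename (model.joblib) are assumptions.
import joblib
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load the built-in Iris dataset: 150 samples, 4 features, 3 species.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

accuracy = model.score(X_test, y_test)
print(f"Test accuracy: {accuracy:.3f}")

# Persist the model so the FastAPI service can load it at startup.
joblib.dump(model, "model.joblib")
```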
✨ With this setup, you can interact with the Iris ML model through the Swagger docs at `/docs` or use the Streamlit web UI to visualize predictions.