# ServingMLFastCelery

A working example of serving an ML model using FastAPI and Celery.

## Usage

Install the requirements:

```shell
pip install -r requirements.txt
```

Set the following environment variables:

- `MODEL_PATH`: path to the pickled machine learning model
- `BROKER_URI`: URI of the message broker used by Celery (e.g., RabbitMQ)
- `BACKEND_URI`: URI of the Celery result backend (e.g., Redis)

```shell
export MODEL_PATH=...
export BROKER_URI=...
export BACKEND_URI=...
```
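The application presumably picks these variables up at startup. A minimal sketch of how that might look in Python (the fallback values are illustrative assumptions, not the project's actual defaults):

```python
import os

# Read the configuration described above; the fallback values here
# are illustrative assumptions, not the project's defaults.
MODEL_PATH = os.environ.get("MODEL_PATH", "model.pkl")
BROKER_URI = os.environ.get("BROKER_URI", "amqp://localhost")    # e.g., RabbitMQ
BACKEND_URI = os.environ.get("BACKEND_URI", "redis://localhost") # e.g., Redis
```

Exporting the variables before starting the API and the worker ensures both processes agree on the broker, backend, and model location.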

Start the API:

```shell
uvicorn app:app
```

Start a worker node:

```shell
celery -A celery_task_app:worker worker -l info
```
