I developed this end-to-end MLOps project using a modern MLOps approach and tech stack. Its primary objective is twofold: end users can obtain a churn prediction by posting the required data to an endpoint, while the Data Science team can monitor the model through a purpose-built app served on a separate endpoint.
The project consists of two parts: the first covers the entire model development and deployment process, implemented in Python; the second is a Shiny dashboard developed in R to monitor model drift and other aspects of the deployed model.
Note: Each service is deployed in a Docker container.
- A POST request is sent to the project's API running on an AWS instance.
- The prediction is computed and returned from the same endpoint.
- All relevant data is written to a MySQL database running on the same instance.
- The monitoring application, running on a separate instance, connects to the MySQL database, fetches the live data, and displays it in real time.
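The request → prediction → storage flow above can be sketched as follows. This is a minimal, stdlib-only illustration, not the project's actual code: the payload fields, table layout, and scoring heuristic are hypothetical, and an in-memory SQLite database stands in for the MySQL instance (in the real service, FastAPI, pydantic, a trained sklearn model, and SQLAlchemy handle these steps).

```python
import json
import sqlite3

# Hypothetical subset of the churn payload; the real schema is
# defined with pydantic in the service.
REQUIRED_FIELDS = {"customer_id", "tenure", "monthly_charges"}


def predict_churn(payload: dict) -> float:
    """Stand-in for the trained model. Toy heuristic only:
    short tenure and high charges raise the churn score."""
    score = 0.5
    score += 0.3 if payload["tenure"] < 12 else -0.2
    score += 0.2 if payload["monthly_charges"] > 70 else 0.0
    return max(0.0, min(1.0, score))


def handle_post(body: str, db: sqlite3.Connection) -> dict:
    """Validate the posted JSON, compute a prediction, and
    persist both so the monitoring app can fetch them later."""
    payload = json.loads(body)
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        return {"error": f"missing fields: {sorted(missing)}"}
    churn_prob = predict_churn(payload)
    db.execute(
        "INSERT INTO predictions (customer_id, churn_prob) VALUES (?, ?)",
        (payload["customer_id"], churn_prob),
    )
    db.commit()
    return {"customer_id": payload["customer_id"], "churn_prob": churn_prob}


if __name__ == "__main__":
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE predictions (customer_id TEXT, churn_prob REAL)")
    resp = handle_post(
        json.dumps({"customer_id": "C1", "tenure": 5, "monthly_charges": 80.0}),
        con,
    )
    print(resp)
```

The key design point mirrored here is that every request and its prediction are written to the same database the monitoring dashboard reads from, so live traffic becomes the drift-monitoring data source.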
- Docker
- Linux command line
- Git
- AWS
- S3
- EC2
- Python
- FastAPI
- SQLAlchemy
- boto3
- pydantic
- scikit-learn
- R
- shinydashboard
- tidyverse
- RMySQL
- highcharter
- formattable