A logistic regression model that predicts a person's chance of suffering a heart attack from their age, sex, resting blood pressure, cholesterol level, blood sugar level, maximum heart rate, and chest pain type.
The model predicts either a "low chance" or a "high chance" of a heart attack for a patient, given the results of their clinical reports.
Three models were trained for this project: logistic regression, k-nearest neighbors, and support vector machines. Their hyperparameters can be changed via the `params.yaml` file.
```yaml
# logistic regression
lr:
  solver: ['lbfgs', 'liblinear', 'newton-cg']
  max_iter: [25, 50, 100]

# k-nearest neighbors
knn:
  neighbors: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

# support vector machines
svm:
  C: [0.05, 0.1, 0.5, 1, 10]
  kernel: ['poly', 'rbf', 'sigmoid']
```
For hyperparameter tuning, grid search with cross-validation was used for the SVM and logistic regression models. To find an optimal k for KNN, the elbow plot method was used.
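The grid-search step for logistic regression might look roughly like the sketch below, using scikit-learn's `GridSearchCV` with the same grid as the `lr` section of `params.yaml`. The synthetic data and variable names are placeholders, not the project's actual code.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

# Stand-in for the real clinic data (7 features, like the README describes).
X, y = make_classification(n_samples=200, n_features=7, random_state=42)

# The same grid as the lr section of params.yaml.
param_grid = {
    "solver": ["lbfgs", "liblinear", "newton-cg"],
    "max_iter": [25, 50, 100],
}

# 5-fold cross-validated grid search over the parameter combinations.
search = GridSearchCV(LogisticRegression(), param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_)
```

The same pattern applies to the SVM grid by swapping in `SVC()` and the `svm` parameters.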
Use the Jupyter notebook inside the notebook directory. The latest versions of the trained models are automatically saved inside the models directory.
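A save/load round trip for a trained model could be sketched as follows, assuming `joblib` is used for persistence (the notebook's actual persistence code and filenames may differ):

```python
import os
import tempfile

import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Stand-in training data and model.
X, y = make_classification(n_samples=100, n_features=7, random_state=0)
model = LogisticRegression(max_iter=200).fit(X, y)

# Hypothetical path; the project writes into its models directory instead.
path = os.path.join(tempfile.mkdtemp(), "lr_model.joblib")
joblib.dump(model, path)

# The saved model can be reloaded later (e.g. by the FastAPI app).
loaded = joblib.load(path)
```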
Run the following command from your terminal:
$ uvicorn app:app
Go to the link displayed in the terminal to open FastAPI's Swagger UI. You'll see something like this.
Build the image with the following command, replacing my-app-name with the name you'd like for your image:
$ docker build -t my-app-name .
Once the image is built, run a container with:
$ docker run -p 8000:8000 my-app-name
You'll see this in the terminal. Open the link and you'll find the app running inside a Docker container.
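The repository's Dockerfile isn't shown here, but a typical one for a FastAPI app served by uvicorn might look like this sketch (the base image and requirements filename are assumptions):

```dockerfile
FROM python:3.10-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code and model artifacts.
COPY . .

EXPOSE 8000

# Bind to 0.0.0.0 so the app is reachable from outside the container.
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "8000"]
```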
The best logistic regression model had an accuracy of 84%, which is quite good given the small size of the dataset. The KNN and SVM models both had accuracies close to 82%.
Using the FastAPI Swagger UI, we can feed the model input values to get a prediction. Under /predict, click "Try it out", enter the input values, and hit "Execute". The result will be displayed below.
The dataset can be downloaded from here.