A Bachelor's Thesis project analyzing and comparing classifiers for breast cancer detection using fine needle aspiration biopsies. Includes Jupyter Notebooks for model training and evaluation, and a LaTeX document detailing the methodology and results. Features SHAP for explainable AI analysis.
Python/Jupyter notebooks accompanying my Bachelor's thesis in Computer Science. Explains the contributions of features that are not part of a machine learning model, using transfer learning and Shapley values (SHAP).
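The Shapley-value attributions these projects rely on can be illustrated without any library: for each feature, average its marginal contribution over all coalitions of the remaining features, with absent features held at a baseline. The toy linear model, feature names, and baseline below are purely illustrative, not taken from any of the repositories; a minimal sketch:

```python
from itertools import combinations
from math import factorial

# Toy linear "model" standing in for a trained classifier's score function.
weights = {"radius": 2.0, "texture": -1.0, "smoothness": 0.5}

def model(x):
    return sum(weights[f] * v for f, v in x.items())

baseline = {"radius": 1.0, "texture": 1.0, "smoothness": 1.0}  # reference input
instance = {"radius": 3.0, "texture": 2.0, "smoothness": 1.0}  # input to explain

def coalition_value(present):
    # Features outside the coalition are replaced by their baseline value.
    x = {f: (instance[f] if f in present else baseline[f]) for f in weights}
    return model(x)

def shapley_values():
    names = list(weights)
    n = len(names)
    phi = {}
    for i in names:
        others = [f for f in names if f != i]
        total = 0.0
        for r in range(n):
            for S in combinations(others, r):
                # Shapley weight |S|! * (n - |S| - 1)! / n!
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += w * (coalition_value(set(S) | {i}) - coalition_value(set(S)))
        phi[i] = total
    return phi

phi = shapley_values()
# For a linear model this recovers w_i * (x_i - baseline_i),
# and the attributions sum to f(instance) - f(baseline).
```

This exhaustive enumeration is exponential in the number of features; SHAP's practical explainers (KernelSHAP, TreeSHAP) approximate or shortcut this computation for real models.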
This repository introduces different Explainable AI approaches and demonstrates how they can be implemented with PyTorch and torchvision. The approaches covered are Class Activation Mappings, LIME, and SHapley Additive exPlanations (SHAP).
This repository contains the prerequisite notebooks for Kaggle's Machine Learning Explainability course, completed for my internship as a Machine Learning Application Developer at Technocolabs.
This notebook is inspired by the AIX360 HELOC Credit Approval Tutorial, which demonstrates different explainability methods for a credit approval process. Here XGBoost is used for classification, achieving better accuracy than most of the models used in that notebook. Feature importance methods are then shown, to be compared with the Data Scientist exp…
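The train-then-inspect pattern used above (fit a gradient-boosted classifier, measure accuracy, then rank features by importance) can be sketched as follows. This uses scikit-learn's GradientBoostingClassifier as a stand-in for XGBoost, on a synthetic dataset; the sizes and random seeds are illustrative, not from the original notebook:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data standing in for the HELOC table.
X, y = make_classification(
    n_samples=500, n_features=8, n_informative=4, random_state=0
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Fit a gradient-boosted tree ensemble and evaluate held-out accuracy.
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)

# Impurity-based feature importances, normalized to sum to 1.
importances = clf.feature_importances_
ranking = sorted(enumerate(importances), key=lambda t: t[1], reverse=True)
```

With XGBoost itself the shape is the same: `xgboost.XGBClassifier` exposes a compatible `fit`/`score` interface and its own `feature_importances_` attribute.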