JakesShell/AI-Model-Performance-Dashboard

AI Model Performance Dashboard

Overview

AI Model Performance Dashboard is a Streamlit-based analytics project for reviewing and comparing AI model evaluation metrics in an interactive browser interface.

The project is intended as a recruiter-ready portfolio piece for data visualization and machine learning reporting. It lets users compare model accuracy, precision, recall, and F1 score through a clean dashboard with metric cards, charts, and summary tables.

Real-World Business Use Case

This project maps to practical analytics and machine learning workflows used by:

  • Data Analysts
  • Machine Learning Engineers
  • AI Product Teams
  • Technical Stakeholders Reviewing Model Quality
  • Students Building AI and Data Portfolios

A team may need to answer questions such as:

  • Which model performs best overall?
  • How do precision and recall differ across models?
  • Which metric should be prioritized for a given use case?
  • How can model performance be presented clearly to decision-makers?

This dashboard is useful for model review, reporting, portfolio presentation, and lightweight experiment comparison.
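The "best overall" question above can be answered with a simple Pandas ranking across the four metrics the dashboard tracks. The scores below are illustrative placeholders, not data from the project:

```python
import pandas as pd

# Illustrative evaluation metrics (not the project's real data).
metrics = pd.DataFrame(
    {
        "accuracy": [0.91, 0.88, 0.93],
        "precision": [0.89, 0.90, 0.92],
        "recall": [0.90, 0.85, 0.91],
        "f1": [0.895, 0.874, 0.915],
    },
    index=["Model A", "Model B", "Model C"],
)

# Rank models by the unweighted mean of all four metrics.
# A real review might weight precision or recall more heavily
# depending on the use case.
overall = metrics.mean(axis=1).sort_values(ascending=False)
best_model = overall.idxmax()
print(overall)
print("Best overall:", best_model)
```

An unweighted mean is only one reasonable definition of "best overall"; the dashboard's metric-selection sidebar exists precisely because different use cases prioritize different metrics.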

Key Features

  • Interactive Model Filter
  • Metric Selection Sidebar
  • Metric Summary Cards
  • Cross-Model Comparison Chart
  • Selected Model Detail Table
  • Overall Performance Table
  • Accuracy Snapshot Chart
  • Key Observation Summary
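The cross-model comparison chart can be sketched with Pandas and Matplotlib (the project's stack), again using illustrative scores rather than the dashboard's real data:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt
import pandas as pd

# Illustrative scores only; the real dashboard loads its own data.
scores = pd.DataFrame(
    {
        "accuracy": [0.91, 0.88, 0.93],
        "precision": [0.89, 0.90, 0.92],
        "recall": [0.90, 0.85, 0.91],
        "f1": [0.895, 0.874, 0.915],
    },
    index=["Model A", "Model B", "Model C"],
)

# Grouped bar chart: one group per model, one bar per metric.
ax = scores.plot.bar(rot=0)
ax.set_ylabel("score")
ax.set_ylim(0, 1)
ax.set_title("Cross-model comparison")
plt.tight_layout()
plt.savefig("comparison.png")
```

In the Streamlit app, a figure like this would be rendered with `st.pyplot` instead of saved to disk.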

Tech Stack

  • Python
  • Streamlit
  • Pandas
  • Matplotlib

Repository Contents

  • Dashboard.py
  • requirements.txt
  • README.md
  • .gitignore

How To Run The Dashboard

1. Create And Activate A Virtual Environment

python -m venv .venv
.\.venv\Scripts\Activate.ps1

The activation command above is for Windows PowerShell; on macOS or Linux, use source .venv/bin/activate instead.

2. Install Dependencies

pip install -r requirements.txt

3. Launch The Dashboard

streamlit run Dashboard.py

Streamlit serves the dashboard locally (by default at http://localhost:8501) and opens it in the browser.

