At-Walid/clusterxplain

Repository files navigation

ClusterXplain: Clustering-based Pipelines for DNN Explanation

This project provides a tool to explain Deep Neural Network (DNN) errors by clustering failure-inducing inputs and generating insights into model weaknesses. It leverages transfer learning, clustering algorithms, and dimensionality reduction techniques.


Features

  • Transfer learning-based feature extraction using pre-trained models.
  • Clustering using algorithms like DBSCAN, K-means, and HDBSCAN.
  • Dimensionality reduction for visualization (PCA, UMAP).
  • Interactive web-based interface powered by Gradio.
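The overall pipeline can be sketched as follows. This is an illustrative example using scikit-learn, not the repository's actual code: the synthetic data, the PCA step (UMAP is analogous), and the K-means step stand in for the project's own feature-extraction and clustering functions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Stand-in for deep features of failure-inducing inputs: two synthetic
# "failure modes" in a 64-dimensional feature space.
features = np.vstack([
    rng.normal(loc=0.0, scale=0.5, size=(50, 64)),
    rng.normal(loc=3.0, scale=0.5, size=(50, 64)),
])

# Reduce dimensionality for clustering and visualization (PCA here;
# the tool also supports UMAP).
coords = PCA(n_components=2, random_state=0).fit_transform(features)

# Cluster the reduced features; each cluster is a candidate group of
# related failures, i.e. a model weakness to inspect.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coords)
print(sorted(set(labels.tolist())))  # → [0, 1]
```

In the tool itself, DBSCAN or HDBSCAN can be substituted for K-means when the number of failure clusters is not known in advance.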


Requirements

  • Python 3.7.x.
  • Docker.
  • Compatible operating systems: macOS, Linux, and Windows.

Project Files

The project directory structure is as follows:

project-folder/
├── app/
│   ├── run_tool.py        # Main script to run the application
│   ├── functionsTL.py     # Core functions for feature extraction and clustering
│   ├── runClustering.py   # Clustering algorithms
│   └── runFE.py           # Feature-extraction methods
├── Dockerfile             # Docker configuration file for containerized deployment
├── requirements.txt       # List of Python dependencies
└── README.md              # Project documentation

File Descriptions

  • app/run_tool.py: The main entry point of the application. It initializes the Gradio interface and runs the app.
  • app/functionsTL.py: Contains core functions for transfer learning, clustering, and dimensionality reduction.
  • app/runClustering.py: Implements the clustering algorithms.
  • app/runFE.py: Implements the feature-extraction methods.
  • Dockerfile: Configuration file to build the Docker container for the application.
  • requirements.txt: Specifies the Python packages required for the project.
  • README.md: Documentation for the project, including setup instructions and troubleshooting.

Setup Instructions

Option 1: Using Docker with a Pre-built Image (.tar file)

Step 1: Download the Pre-built Docker Image

Download the docker-image-pipeline-tool.tar file from Zenodo.
Note that the Docker image was built for the amd64 architecture.

Step 2: Load the Docker Image

  1. Load the Docker image into your Docker environment:
    docker load < path/to/docker-image-pipeline-tool.tar

Step 3: Run the Docker Container

  1. Start the application with the following command:
    docker run -p 7860:7860 pipelines-tool

Step 4: Access the Web Interface

  1. Open your browser and navigate to:
     http://0.0.0.0:7860 or http://localhost:7860/
    

Option 2: Building the Docker Image from Source

Step 1: Download the Repository

Download the repository from Zenodo and change into the "pipelines-tool" directory in your terminal.

Step 2: Build the Docker Image

  1. Build the Docker image (the tag must match the image name used in the next step):
    docker build -t pipelines-tool .

Step 3: Run the Docker Container

  1. Start the application with the following command:
    docker run -p 7860:7860 pipelines-tool

Step 4: Access the Web Interface

  1. Open your browser and navigate to:
    http://0.0.0.0:7860 or http://localhost:7860/
    

Option 3: Setting up from Scratch with Python 3.7

Step 1: Download the Repository

  1. Download the repository from Zenodo.

Step 2: Install Python 3.7

  1. Install Python 3.7 if it is not already available. For Ubuntu:
    sudo apt update
    sudo apt install python3.7 python3.7-venv python3.7-dev

Step 3: Set Up a Virtual Environment

  1. Create and activate a virtual environment:
    python3.7 -m venv env
    source env/bin/activate  # For macOS/Linux
    .\env\Scripts\activate   # For Windows

Step 4: Install Dependencies

  1. Install the required Python packages:
    pip install -r requirements.txt

Step 5: Run the Application

  1. Start the Gradio application:
    python app/run_tool.py

Step 6: Access the Web Interface

  1. Open your browser and navigate to:
    http://0.0.0.0:7860 or http://localhost:7860/
    
