# Sparse Neural Network on CIFAR-10

"Efficient Training of Sparse Neural Networks" is a deep learning project that explores Sparse Evolutionary Training (SET) for convolutional neural networks (CNNs) on the CIFAR-10 dataset. The goal is to improve performance and reduce computation by keeping only the most important connections in the network.
## Features

- **Sparse Evolutionary Training (SET):** Applies SET to reduce the number of connections in the neural network while preserving or improving accuracy.
- **Lightweight Architecture:** Introduces adaptive pruning and regrowth of weights to maintain a sparse yet efficient network during training.
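The prune-and-regrow cycle described above can be sketched for a single weight matrix. This is a minimal NumPy illustration, not the project's implementation; the prune fraction `zeta` and the regrowth initialization scale are assumptions.

```python
import numpy as np

def set_step(weights, mask, zeta=0.3, rng=None):
    """One SET update: prune the smallest-magnitude active weights,
    then regrow the same number of connections at random inactive
    positions, keeping overall sparsity constant."""
    rng = rng or np.random.default_rng(0)
    active = np.flatnonzero(mask)
    n_prune = int(zeta * active.size)
    # Prune: drop the zeta fraction of active weights closest to zero.
    order = np.argsort(np.abs(weights.ravel()[active]))
    pruned = active[order[:n_prune]]
    mask.ravel()[pruned] = 0
    weights.ravel()[pruned] = 0.0
    # Regrow: activate the same number of currently inactive connections.
    inactive = np.flatnonzero(mask.ravel() == 0)
    regrown = rng.choice(inactive, size=n_prune, replace=False)
    mask.ravel()[regrown] = 1
    weights.ravel()[regrown] = rng.normal(0, 0.01, size=n_prune)
    return weights, mask
```

Because pruning and regrowth move the same number of connections, the number of active weights stays fixed across training, which is the property that keeps the network's memory and compute budget constant.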
- **Accuracy & F1 Score Tracking:** Tracks accuracy, loss, F1 score, and class-wise performance using confusion matrices and classification reports.
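The metrics above map directly onto scikit-learn utilities. A hedged sketch with synthetic labels (these are placeholder values, not project outputs):

```python
from sklearn.metrics import (accuracy_score, f1_score,
                             confusion_matrix, classification_report)

# Placeholder true/predicted labels for three classes.
y_true = [0, 1, 2, 2, 1, 0, 2, 1]
y_pred = [0, 1, 2, 1, 1, 0, 2, 2]

acc = accuracy_score(y_true, y_pred)
macro_f1 = f1_score(y_true, y_pred, average="macro")
cm = confusion_matrix(y_true, y_pred)           # rows: true class, cols: predicted
report = classification_report(y_true, y_pred)  # per-class precision/recall/F1
print(f"accuracy={acc:.2f} macro-F1={macro_f1:.2f}")
```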
- **Visualization:** Plots graphs such as:
  - Accuracy vs. Sparsity
  - Confusion Matrix
  - Class-wise Accuracy Bar Chart
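An accuracy-vs-sparsity plot like the one listed above can be produced with matplotlib. The data points below are placeholders for illustration, not results from the project:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen (no display needed)
import matplotlib.pyplot as plt

sparsity = [0.0, 0.5, 0.8, 0.9, 0.95]      # fraction of weights removed (placeholder)
accuracy = [0.82, 0.81, 0.80, 0.77, 0.70]  # placeholder test accuracy

fig, ax = plt.subplots()
ax.plot(sparsity, accuracy, marker="o")
ax.set_xlabel("Sparsity (fraction of weights removed)")
ax.set_ylabel("Test accuracy")
ax.set_title("Accuracy vs. Sparsity on CIFAR-10")
fig.savefig("accuracy_vs_sparsity.png")
```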
- **Transfer Learning Friendly:** Starts training from a pre-trained base CNN model and applies sparsity on top of it.
## Use Cases

- Research on model compression and sparsity
- Deploying efficient models to resource-constrained environments
- Comparing full vs. sparse networks on standard datasets
## Dataset

CIFAR-10:

- 60,000 images in 10 classes
- 50,000 training and 10,000 testing images
- Used with standard normalization and batch loading