# Ashish2599/Neural_Network_Sparse


## Sparse Neural Network on CIFAR-10

"Efficient Training of Sparse Neural Networks" is a deep learning project that explores Sparse Evolutionary Training (SET) for convolutional neural networks (CNNs) on the CIFAR-10 dataset. The goal is to improve performance and reduce computation by keeping only the most important connections in the network.

## 🚀 Features

- 🌱 **Sparse Evolutionary Training (SET)**: applies SET to reduce the number of connections in the neural network while preserving or improving accuracy.

- ⚡ **Lightweight Architecture**: adaptive pruning and regrowth of weights maintains a sparse yet efficient network during training.
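The repository does not reproduce its exact SET schedule here, so as a hedged illustration only: one prune-and-regrow step (drop the smallest-magnitude active weights, then reactivate the same number of connections at random, so overall sparsity stays constant) might look like the following numpy sketch. All names and the `zeta` default are our assumptions, not the repo's API.

```python
import numpy as np

def set_prune_and_regrow(weights, mask, zeta=0.3, rng=None):
    """One illustrative SET step on a dense weight matrix with a 0/1 mask.

    Prunes the fraction `zeta` of active connections with the smallest
    magnitude, then regrows the same number at random inactive positions,
    so the sparsity level is preserved. Modifies the arrays in place.
    (Sketch only; names and defaults are assumptions.)
    """
    rng = rng or np.random.default_rng()
    active = np.flatnonzero(mask)
    n_prune = int(zeta * active.size)
    if n_prune == 0:
        return weights, mask
    # Prune: drop the smallest-magnitude active weights.
    mags = np.abs(weights.ravel()[active])
    drop = active[np.argsort(mags)[:n_prune]]
    mask.ravel()[drop] = 0
    weights.ravel()[drop] = 0.0
    # Regrow: reactivate an equal number of random dead connections
    # with small fresh weights.
    dead = np.flatnonzero(mask == 0)
    grow = rng.choice(dead, size=n_prune, replace=False)
    mask.ravel()[grow] = 1
    weights.ravel()[grow] = rng.normal(0.0, 0.01, size=n_prune)
    return weights, mask
```

Because the regrowth count equals the prune count, the number of active connections (and hence the sparsity budget) is invariant across steps.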

- 📉 **Accuracy & F1-Score Tracking**: tracks accuracy, loss, F1-score, and class-wise performance using confusion matrices and classification reports.
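The repo's metric code isn't shown here; a minimal sketch of the class-wise bookkeeping, assuming predictions and labels arrive as integer arrays (function names are ours):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes=10):
    """Count (true, predicted) pairs into an n_classes x n_classes grid."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def per_class_accuracy(cm):
    """Diagonal (correct counts) divided by row sums (support per class)."""
    support = cm.sum(axis=1)
    return np.divide(np.diag(cm).astype(float), support,
                     out=np.zeros(len(cm)), where=support > 0)

def macro_f1(cm):
    """Unweighted mean of per-class F1 scores, derived from the matrix."""
    tp = np.diag(cm).astype(float)
    col, row = cm.sum(axis=0), cm.sum(axis=1)
    precision = np.divide(tp, col, out=np.zeros_like(tp), where=col > 0)
    recall = np.divide(tp, row, out=np.zeros_like(tp), where=row > 0)
    denom = precision + recall
    f1 = np.divide(2 * precision * recall, denom,
                   out=np.zeros_like(tp), where=denom > 0)
    return f1.mean()
```

In practice `sklearn.metrics.classification_report` and `confusion_matrix` give the same numbers with less code; the sketch just makes the arithmetic explicit.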

- 📊 **Visualization**: plots graphs such as:
  - Accuracy vs. Sparsity
  - Confusion Matrix
  - Class-wise Accuracy Bar Chart

- 🧠 **Transfer-Learning Friendly**: starts training from a pre-trained base CNN model and applies sparsity on top of it.

๐Ÿ› ๏ธ Use Cases ๐Ÿš€ Research on model compression & sparsity

โš™๏ธ Deploying efficient models to resource-constrained environments

๐Ÿ“ˆ Comparing full vs sparse networks on standard datasets

## 🧾 Dataset

📚 **CIFAR-10**

- 60,000 images in 10 classes
- 50,000 training and 10,000 testing images
- Used with standard normalization and batch loading
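"Standard normalization" typically means scaling each colour channel to zero mean and unit variance; a sketch assuming images arrive as a float array of shape `(N, H, W, C)` in `[0, 1]` (the function name is ours, not the repo's):

```python
import numpy as np

def normalize_per_channel(images):
    """Scale images to zero mean and unit std per colour channel.

    `images`: float array of shape (N, H, W, C) with values in [0, 1].
    Returns the normalized array plus the (mean, std) used, so the
    same training-set statistics can be reused on the test split.
    """
    mean = images.mean(axis=(0, 1, 2))
    std = images.std(axis=(0, 1, 2))
    return (images - mean) / std, mean, std
```

Reusing the training-set mean and std on the test images (rather than recomputing them) keeps the two splits on the same scale and avoids leaking test statistics into preprocessing.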
