- 📒 Table of Contents
- 📍 Overview
- 📂 Project Structure
- 🔎 Details of Codes
- 🚀 Getting Started
- 🤝 Collaborators
This project was developed for the Computational Intelligence course in spring 2023. It includes code for training neural networks with gradient descent, training neural networks with neuroevolution, Neural Architecture Search (NAS), and Self-Organizing Maps (SOM). The core task is image classification on the CIFAR-10 dataset, and the goal is to explore and compare different techniques for improving the accuracy and efficiency of image classification models.
Gradient descent is an optimization algorithm commonly used for training neural networks. It iteratively adjusts the parameters of the neural network (such as weights and biases) to minimize a defined loss function. By computing gradients and updating the parameters in the direction of steepest descent, gradient descent helps the neural network gradually improve its performance over time.
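As a minimal sketch of this update rule, the loop below fits a toy linear model with plain gradient descent in NumPy; it is illustrative only and not the project's CIFAR-10 training code:

```python
import numpy as np

# Toy regression problem: recover known weights from noisy-free data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # toy inputs
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w                          # toy targets

w = np.zeros(3)                         # parameters to learn
lr = 0.1                                # learning rate
for _ in range(200):
    # Gradient of the mean squared error loss with respect to w
    grad = 2 * X.T @ (X @ w - y) / len(X)
    # Step in the direction of steepest descent
    w -= lr * grad

print(np.round(w, 2))  # close to [1.0, -2.0, 0.5]
```

The same idea scales to deep networks, where the gradients are computed by backpropagation and the update is applied to every weight and bias.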
Neuroevolution combines neural networks with evolutionary algorithms to optimize parameters such as weights and biases. Instead of computing gradients, it evolves a population of neural networks through mutation, crossover, and selection, much as genetic algorithms do in evolutionary computation, gradually improving performance on the classification task.
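The mutation/crossover/selection loop can be sketched on a toy problem where a weight vector evolves toward a known target; a real run would instead score each candidate network by its classification accuracy:

```python
import numpy as np

rng = np.random.default_rng(1)
target = np.array([0.5, -1.0, 2.0])     # "ideal" weights for the toy task

def fitness(w):
    # Higher is better: negative squared distance to the target weights.
    # In the real project this would be the network's validation accuracy.
    return -np.sum((w - target) ** 2)

pop = rng.normal(size=(20, 3))          # population of candidate weight vectors
for gen in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-5:]]            # selection: keep top 5
    # Crossover: each child mixes genes from two random parents
    idx_a = rng.integers(0, 5, size=15)
    idx_b = rng.integers(0, 5, size=15)
    mask = rng.random((15, 3)) < 0.5
    children = np.where(mask, parents[idx_a], parents[idx_b])
    # Mutation: small Gaussian perturbation of each gene
    children = children + rng.normal(scale=0.1, size=children.shape)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(w) for w in pop])]
```

Because the top parents survive unchanged (elitism), the best fitness never decreases from one generation to the next.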
Neural architecture search is a technique used to automatically discover the architecture of a neural network that performs well on a given task. It involves searching through a large space of possible network architectures with evolutionary algorithms to find the most suitable one.
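The idea can be illustrated with an evolutionary search over a tiny architecture space (hidden-layer widths); the `fitness` function here is a hypothetical stand-in, since a real NAS run would train each candidate and use its validation accuracy:

```python
import random

random.seed(0)
# Search space: 1-3 hidden layers, widths chosen from a small menu
CHOICES = [16, 32, 64, 128]

def random_arch():
    depth = random.randint(1, 3)
    return [random.choice(CHOICES) for _ in range(depth)]

def mutate(arch):
    arch = list(arch)
    if random.random() < 0.5 and len(arch) < 3:
        arch.append(random.choice(CHOICES))   # grow the network
    else:
        i = random.randrange(len(arch))
        arch[i] = random.choice(CHOICES)      # change one layer's width
    return arch

def fitness(arch):
    # Toy proxy for validation accuracy: favors two layers of width 64.
    return -abs(len(arch) - 2) - sum(abs(w - 64) for w in arch) / 64

pop = [random_arch() for _ in range(10)]
for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:5]                       # selection
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(5)]

best = max(pop, key=fitness)
```

The expensive part in practice is evaluating `fitness`, which is why real NAS systems use tricks such as weight sharing or early stopping to cut training cost per candidate.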
A Self-Organizing Map (SOM) is an unsupervised learning algorithm used for clustering and visualizing high-dimensional data. It maps the input data onto a lower-dimensional grid while preserving the topological relationships between the data points.
Before you begin, ensure that you have the packages listed in `requirements.txt` installed.
- Clone the CIFAR_10_Image_Classification repository:

```shell
git clone https://github.com/kianmajl/CIFAR_10_Image_Classification.git
```

- Change to the project directory:

```shell
cd CIFAR_10_Image_Classification
```

- Install the dependencies:

```shell
pip install -r ./Codes/requirements.txt
```
Now you can train a neural network with gradient descent or neuroevolution, search for a strong architecture with NAS, and classify images with whichever method you prefer.