My extensive work on multiclass image classification, based on the Intel Image Classification dataset from Kaggle and implemented using PyTorch 🔦
Updated Dec 12, 2021 · Jupyter Notebook
• Designed a deep residual learning model with the exponential linear unit (ELU) for image classification with higher accuracy.
• Decreased the error rate to 5.62% on CIFAR-10 and 26.55% on CIFAR-100, outpacing the most competitive previously published approaches.
• Published a research paper on this work on 21st Sept 2016 at ACM co…
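The core idea above — a residual block that uses ELU in place of ReLU — can be sketched in PyTorch. This is a minimal illustration, not the repository's actual architecture; the class name, channel count, and layer layout are assumptions.

```python
import torch
import torch.nn as nn

class ELUResidualBlock(nn.Module):
    """Hypothetical residual block using ELU instead of ReLU (illustrative sketch)."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.elu = nn.ELU(alpha=1.0)

    def forward(self, x):
        out = self.elu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        # Identity skip connection, then ELU on the sum
        return self.elu(out + x)

x = torch.randn(2, 16, 32, 32)          # batch of 2 feature maps, 16 channels
block = ELUResidualBlock(16)
print(block(x).shape)                   # torch.Size([2, 16, 32, 32])
```

Because ELU has negative outputs for negative inputs, it keeps mean activations closer to zero than ReLU, which is the motivation the ELU paper gives for faster learning.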
• Trained the network on the MNIST dataset.
• Implemented a neural network on MNIST using Sigmoid, ReLU, and ELU as the activation functions.
• Analyzed the network's running time, error rate, efficiency, and accuracy.
Train a car on a known track to generate a dataset that includes the steering angle and the view of the car from three different angles. Use this dataset to drive the car on an unknown track. Also learn to identify 43 different traffic signs using an existing dataset.
"The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks. "
Advanced deep learning techniques applied to the MNIST dataset, achieving over 98% accuracy on the validation set.
Practice with deep learning concepts using the CIFAR-10 dataset.