An advanced collection of Deep Learning implementations developed during my Master's degree program. This repository demonstrates proficiency in neural networks, computer vision, sequence modeling, and generative models using TensorFlow/Keras and PyTorch.
This repository contains comprehensive implementations of core deep learning algorithms, from fundamental neural networks to advanced architectures such as GANs and gated recurrent networks (LSTM/GRU). Each notebook includes theoretical concepts, practical implementations, and real-world applications.
- DeepLab1.ipynb - Introduction to Neural Networks
  - Perceptrons and activation functions
  - Backpropagation algorithm
  - Multi-layer perceptrons (MLP)
  - Forward and backward propagation
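A minimal sketch of the DeepLab1 material, assuming TensorFlow 2.x: a small multi-layer perceptron trained end to end, where `compile()`/`fit()` drive forward propagation, the loss, and backpropagation. The data here is synthetic and purely illustrative.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Synthetic binary-classification data (illustrative only).
X = np.random.rand(1000, 20).astype("float32")
y = (X.sum(axis=1) > 10).astype("float32")

# Multi-layer perceptron: dense layers with non-linear activations.
model = keras.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(64, activation="relu"),    # hidden layer 1
    layers.Dense(32, activation="relu"),    # hidden layer 2
    layers.Dense(1, activation="sigmoid"),  # output for binary classification
])

# compile() wires up the loss whose gradients backpropagation computes.
model.compile(optimizer="sgd", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```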
- DeepLab2.ipynb - Optimization & Training
  - Gradient descent variants (SGD, Adam, RMSprop)
  - Batch normalization
  - Dropout and early stopping
  - Learning rate scheduling
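The DeepLab2 techniques combine naturally in Keras; a hedged sketch (layer sizes, schedule values, and the synthetic data are illustrative, not the notebook's actual settings):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(20,)),
    layers.Dense(64),
    layers.BatchNormalization(),  # normalize activations per mini-batch
    layers.Activation("relu"),
    layers.Dropout(0.3),          # randomly silence 30% of units while training
    layers.Dense(1, activation="sigmoid"),
])

# Adam with an exponentially decaying learning rate schedule.
schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3, decay_steps=1000, decay_rate=0.9)
model.compile(optimizer=keras.optimizers.Adam(schedule),
              loss="binary_crossentropy", metrics=["accuracy"])

# Early stopping halts training once validation loss stops improving.
early_stop = keras.callbacks.EarlyStopping(
    monitor="val_loss", patience=3, restore_best_weights=True)

X = np.random.rand(512, 20).astype("float32")   # placeholder data
y = (X.mean(axis=1) > 0.5).astype("float32")
model.fit(X, y, validation_split=0.2, epochs=50,
          callbacks=[early_stop], verbose=0)
```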
- DeepLab3_CNN.ipynb - Computer Vision
  - Convolutional layers and pooling
  - Feature extraction and visualization
  - Image classification tasks
  - Architecture design principles
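A compact CNN in the spirit of DeepLab3, assuming MNIST-sized 28x28 grayscale inputs (an illustrative choice, not necessarily the notebook's dataset):

```python
from tensorflow import keras
from tensorflow.keras import layers

# Small CNN: alternating convolution (learn local filters) and
# max-pooling (downsample feature maps), then a dense classifier head.
model = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Conv2D(32, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),   # 10-class output
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```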
- DeepLab4_Regularization.ipynb - Preventing Overfitting
  - L1/L2 regularization
  - Dropout strategies
  - Data augmentation
  - Cross-validation techniques
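A sketch combining the DeepLab4 ideas, assuming TF 2.x Keras preprocessing layers for augmentation; penalty strengths and dropout rates are illustrative:

```python
from tensorflow import keras
from tensorflow.keras import layers, regularizers

# L2 weight penalty plus dropout, with on-the-fly data augmentation.
model = keras.Sequential([
    layers.Input(shape=(32, 32, 3)),
    # Augmentation layers only act during training, not at inference.
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    layers.Conv2D(32, 3, activation="relu",
                  kernel_regularizer=regularizers.l2(1e-4)),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dropout(0.5),
    layers.Dense(10, activation="softmax",
                 kernel_regularizer=regularizers.l2(1e-4)),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```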
- DeepLab5_TransferLearning.ipynb - Pre-trained Models
  - VGG, ResNet, Inception architectures
  - Fine-tuning strategies
  - Feature extraction
  - Domain adaptation
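A typical transfer-learning skeleton as covered in DeepLab5: freeze a pre-trained VGG16 backbone, attach a new head, and later unfreeze top layers for fine-tuning. The 5-class head is a hypothetical example:

```python
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.applications import VGG16

# Load ImageNet weights without the original classification head.
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze: use the network as a fixed feature extractor

model = keras.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dense(5, activation="softmax"),  # hypothetical 5-class dataset
])
model.compile(optimizer=keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# For fine-tuning, later unfreeze the top convolutional block and
# recompile with a lower learning rate.
```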
- DeepLab6_Encoders.ipynb - Dimensionality Reduction
  - Vanilla autoencoders
  - Denoising autoencoders
  - Sparse autoencoders
  - Latent space representation
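A minimal autoencoder sketch along the lines of DeepLab6; the 784-to-32 sizes are illustrative, and the noisy-input/clean-target pairing shows the denoising variant:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Vanilla autoencoder: compress 784-dim inputs to a 32-dim latent code.
inputs = keras.Input(shape=(784,))
encoded = layers.Dense(32, activation="relu")(inputs)       # encoder
decoded = layers.Dense(784, activation="sigmoid")(encoded)  # decoder
autoencoder = keras.Model(inputs, decoded)
autoencoder.compile(optimizer="adam", loss="mse")

# Denoising variant: train on corrupted inputs but clean targets.
X = np.random.rand(256, 784).astype("float32")   # placeholder data
X_noisy = np.clip(X + 0.2 * np.random.randn(*X.shape), 0.0, 1.0).astype("float32")
autoencoder.fit(X_noisy, X, epochs=5, batch_size=32, verbose=0)

# Standalone encoder model for inspecting the latent space.
encoder = keras.Model(inputs, encoded)
```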
- DL_7_StackedAE.ipynb - Deep Autoencoders
  - Stacked autoencoder architecture
  - Layer-wise pre-training
  - Deep feature learning
  - Reconstruction analysis
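A stacked (deep) autoencoder skeleton as in DL_7; the symmetric 128/64/32 layer sizes are hypothetical:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Stacked autoencoder: a symmetric stack of encoding and decoding layers.
inputs = keras.Input(shape=(784,))
x = layers.Dense(128, activation="relu")(inputs)
x = layers.Dense(64, activation="relu")(x)
latent = layers.Dense(32, activation="relu")(x)        # bottleneck
x = layers.Dense(64, activation="relu")(latent)
x = layers.Dense(128, activation="relu")(x)
outputs = layers.Dense(784, activation="sigmoid")(x)

stacked_ae = keras.Model(inputs, outputs)
stacked_ae.compile(optimizer="adam", loss="mse")
# Classic layer-wise pre-training would train each encoder/decoder pair
# separately before fine-tuning the full stack end to end.
```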
- DepLab8_GAN.ipynb - Generative Models
  - Generator and discriminator networks
  - Adversarial training
  - Mode collapse handling
  - Image synthesis
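A bare-bones adversarial training loop illustrating the DepLab8 setup: generator and discriminator with opposing binary cross-entropy objectives. The architectures are deliberately tiny and illustrative:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

LATENT_DIM = 64  # size of the noise vector (illustrative choice)

# Generator: noise -> fake 28x28 image.
generator = keras.Sequential([
    layers.Input(shape=(LATENT_DIM,)),
    layers.Dense(128, activation="relu"),
    layers.Dense(28 * 28, activation="sigmoid"),
    layers.Reshape((28, 28, 1)),
])

# Discriminator: image -> probability that it is real.
discriminator = keras.Sequential([
    layers.Input(shape=(28, 28, 1)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])

bce = keras.losses.BinaryCrossentropy()
g_opt = keras.optimizers.Adam(1e-4)
d_opt = keras.optimizers.Adam(1e-4)

@tf.function
def train_step(real_images):
    batch = tf.shape(real_images)[0]
    noise = tf.random.normal([batch, LATENT_DIM])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_images = generator(noise, training=True)
        real_pred = discriminator(real_images, training=True)
        fake_pred = discriminator(fake_images, training=True)
        # Discriminator: real -> 1, fake -> 0. Generator: fool it (fake -> 1).
        d_loss = (bce(tf.ones_like(real_pred), real_pred)
                  + bce(tf.zeros_like(fake_pred), fake_pred))
        g_loss = bce(tf.ones_like(fake_pred), fake_pred)
    d_opt.apply_gradients(zip(
        d_tape.gradient(d_loss, discriminator.trainable_variables),
        discriminator.trainable_variables))
    g_opt.apply_gradients(zip(
        g_tape.gradient(g_loss, generator.trainable_variables),
        generator.trainable_variables))
    return d_loss, g_loss
```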
- Deep9_LSTM_PartA.ipynb - LSTM Fundamentals
  - LSTM cell architecture
  - Forget, input, and output gates
  - Sequence prediction
  - Time series forecasting
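An LSTM next-step forecaster in the spirit of Deep9 Part A, trained here on a synthetic sine wave rather than real time-series data:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Toy forecasting task: predict the next value of a sine wave
# from a sliding window of the previous 20 steps.
series = np.sin(np.linspace(0, 100, 2000)).astype("float32")
window = 20
X = np.stack([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]
X = X[..., None]  # shape (samples, timesteps, features)

model = keras.Sequential([
    layers.Input(shape=(window, 1)),
    layers.LSTM(32),   # gated cell: forget, input, and output gates
    layers.Dense(1),   # next-step prediction
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
```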
- Deep9_LSTM_PartB.ipynb - Advanced LSTM
  - Bidirectional LSTM
  - Stacked LSTM layers
  - Attention mechanisms
  - Text generation
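A stacked bidirectional LSTM classifier sketching the Deep9 Part B ideas (vocabulary size and sequence length are placeholder values):

```python
from tensorflow import keras
from tensorflow.keras import layers

VOCAB_SIZE, MAX_LEN = 10_000, 100  # illustrative text settings

# Stacked bidirectional LSTMs for sequence classification (e.g. sentiment).
model = keras.Sequential([
    layers.Input(shape=(MAX_LEN,), dtype="int32"),
    layers.Embedding(VOCAB_SIZE, 64),
    # return_sequences=True feeds the full sequence to the next layer.
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),
    layers.Bidirectional(layers.LSTM(32)),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```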
- Deep10_GRU.ipynb - Simplified RNN Architecture
  - GRU cell structure
  - Update and reset gates
  - LSTM vs GRU comparison
  - Sequence modeling
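Since `layers.GRU` is a drop-in replacement for `layers.LSTM` with fewer parameters (update and reset gates instead of three gates plus a cell state), the comparison in Deep10 can be set up like this (a parameter-count sketch, not the notebook's benchmark):

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_rnn(cell):
    # Identical model except for the recurrent cell type.
    return keras.Sequential([
        layers.Input(shape=(20, 1)),
        cell(32),
        layers.Dense(1),
    ])

lstm_model = build_rnn(layers.LSTM)
gru_model = build_rnn(layers.GRU)
print(lstm_model.count_params(), "LSTM params vs",
      gru_model.count_params(), "GRU params")
```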
- TensorFlow 2.x - Deep learning framework
- Keras - High-level neural networks API
- PyTorch - Dynamic computational graphs
- NumPy - Numerical computing
- Pandas - Data manipulation
- Matplotlib - Plotting and visualization
- Seaborn - Statistical graphics
- TensorBoard - Training visualization
- Plotly - Interactive plots
- OpenCV - Computer vision operations
- PIL/Pillow - Image processing
- scikit-learn - Preprocessing and metrics
- Google Colab - Cloud computing platform
- Activation functions (ReLU, Sigmoid, Tanh, Softmax)
- Loss functions (Cross-entropy, MSE, MAE)
- Optimization algorithms (SGD, Adam, Adagrad)
- Backpropagation and gradient descent
- Convolutional operations
- Pooling strategies (Max, Average, Global)
- Object detection and classification
- Image segmentation basics
- Time series prediction
- Natural language processing
- Sentiment analysis
- Text generation and translation
- Latent space manipulation
- Adversarial training dynamics
- Image generation and enhancement
- Style transfer concepts
- Hyperparameter tuning
- Regularization techniques
- Transfer learning strategies
- Model compression
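As a worked illustration of the fundamentals listed above (forward pass, loss, backpropagated gradients, gradient descent), here is a from-scratch NumPy sketch of a single sigmoid neuron; the data and learning rate are arbitrary:

```python
import numpy as np

# One sigmoid neuron trained by hand: forward pass, MSE loss,
# backpropagated gradient, and a plain gradient-descent update.
rng = np.random.default_rng(0)
X = rng.random((100, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)
w, b, lr = np.zeros(3), 0.0, 0.5

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
for _ in range(200):
    p = sigmoid(X @ w + b)           # forward pass
    grad_z = (p - y) * p * (1 - p)   # dLoss/dz for MSE through the sigmoid
                                     # (constant factor folded into lr)
    w -= lr * X.T @ grad_z / len(X)  # chain rule back to the weights
    b -= lr * grad_z.mean()
print("final accuracy:", ((p > 0.5) == y).mean())
```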
Install the dependencies:

```bash
pip install tensorflow keras torch torchvision numpy pandas matplotlib seaborn opencv-python pillow scikit-learn
```

Clone the repository:

```bash
git clone https://github.com/Manya123-max/Deep-Learning-Algorithms-.git
cd Deep-Learning-Algorithms-
```

Launch Jupyter Notebook:

```bash
jupyter notebook
```

Or open directly in Google Colab:
- Each notebook is optimized for Colab
- GPU acceleration recommended for training
- Runtime → Change runtime type → GPU
```
DeepLab1 (Neural Networks Basics)
        ↓
DeepLab2 (Optimization)
        ↓
DeepLab3 (CNNs)
        ↓
DeepLab4 (Regularization)
        ↓
DeepLab5 (Transfer Learning)
        ↓
DeepLab6 & DL_7 (Autoencoders)
        ↓
DepLab8 (GANs)
        ↓
Deep9 (LSTM) → Deep10 (GRU)
```
- Implemented custom CNN architectures for image classification
- Achieved 90%+ accuracy on benchmark datasets
- Explored different filter sizes and pooling strategies
- Fine-tuned pre-trained models (VGG16, ResNet50)
- Demonstrated significant reduction in training time
- Applied to custom datasets with limited data
- Built generator and discriminator from scratch
- Explored training dynamics and stability
- Generated synthetic images
- Implemented bidirectional LSTM for sentiment analysis
- Compared LSTM vs GRU performance
- Applied to time series forecasting
- Created stacked autoencoders for feature learning
- Implemented denoising autoencoders
- Visualized latent space representations
- Computer Vision: Image classification, object detection
- Natural Language Processing: Sentiment analysis, text generation
- Time Series: Stock price prediction, weather forecasting
- Generative AI: Image synthesis, style transfer
- Anomaly Detection: Fraud detection using autoencoders
- Recommendation Systems: Collaborative filtering with neural networks
```
Deep-Learning-Algorithms-/
│
├── Fundamentals/
│   ├── DeepLab1.ipynb (Neural Networks)
│   └── DeepLab2.ipynb (Optimization)
│
├── Computer Vision/
│   ├── DeepLab3_CNN.ipynb
│   ├── DeepLab4_Regularization.ipynb
│   └── DeepLab5_TransferLearning.ipynb
│
├── Generative Models/
│   ├── DeepLab6_Encoders.ipynb
│   ├── DL_7_StackedAE.ipynb
│   └── DepLab8_GAN.ipynb
│
├── Sequence Models/
│   ├── Deep9_LSTM_PartA.ipynb
│   ├── Deep9_LSTM_PartB.ipynb
│   └── Deep10_GRU.ipynb
│
└── README.md
```
Each notebook includes:
- ✅ Theoretical background and mathematics
- ✅ Step-by-step implementation
- ✅ Visualization of results
- ✅ Performance metrics and evaluation
- ✅ Hyperparameter tuning experiments
- ✅ Comparative analysis
- Deep understanding of neural network architectures
- Proficiency in TensorFlow and Keras
- Ability to implement research papers
- Strong debugging and optimization skills
- Experience with GPU-accelerated training
- Knowledge of best practices in deep learning
- Capability to work with large-scale datasets