Core Deep Learning

A complete hands-on journey through the foundations, architectures, and systems that power modern AI and AGI research.

This repo contains both:

  • 📘 A comprehensive conceptual guide (math_and_explanations.pdf)
  • 💻 An interactive PyTorch notebook implementing every stage, from perceptrons to generative models.

Overview

This project is structured around the 7 Core Deep Learning Topics.

1️⃣ Foundations: Perceptron, MLP, activations, loss functions, optimization
2️⃣ Convolutional Neural Networks (CNNs): Convolutions, pooling, feature hierarchies
3️⃣ Sequence Models: RNNs, LSTMs, GRUs, embeddings (pre-Transformer)
4️⃣ Transformers: Self-attention, positional encoding, encoder vs decoder, pretraining
5️⃣ Embeddings & Representation Learning: Autoencoders, VAEs, contrastive learning (SimCLR, CLIP)
6️⃣ Generative Models: GANs, VAEs, Diffusion Models (Stable Diffusion, Imagen)
7️⃣ Reinforcement Learning: Q-learning, Policy Gradients, PPO, RLHF
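
To make Topic 1 concrete, here is a minimal sketch of the kind of MLP the foundations section builds, written against plain PyTorch (the class name, layer sizes, and hyperparameters are illustrative assumptions, not code copied from the notebook):

import torch
import torch.nn as nn

# A small MLP for 28x28 images (e.g. MNIST): flatten -> hidden layer -> class logits.
class MLP(nn.Module):
    def __init__(self, in_dim=28 * 28, hidden=128, num_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(in_dim, hidden),
            nn.ReLU(),                       # activation
            nn.Linear(hidden, num_classes),  # raw logits, paired with CrossEntropyLoss
        )

    def forward(self, x):
        return self.net(x)

model = MLP()
loss_fn = nn.CrossEntropyLoss()                          # loss function
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # optimization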

What’s Inside

/core_DL.ipynb

Hands-on Jupyter notebook covering:

  • MLP, CNN, RNN, Transformer demos
  • Autoencoders, VAEs, and GANs

The notebook also includes:

  • Math → Code mapping
  • Visualizations
  • Practical datasets (MNIST, CIFAR, IMDB, CartPole), loaded as sketched below
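
For instance, the MNIST examples can be loaded through torchvision (a minimal sketch assuming the standard torchvision.datasets API; the root path and batch size are illustrative, not taken from the notebook):

import torch
from torchvision import datasets, transforms

# Download MNIST and wrap it in a DataLoader.
transform = transforms.ToTensor()
train_set = datasets.MNIST(root="data", train=True, download=True, transform=transform)
train_loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)

images, labels = next(iter(train_loader))
print(images.shape)  # torch.Size([64, 1, 28, 28])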

/math_and_explanations.pdf

A 22-page conceptual guide with diagrams, equations, and intuition behind each topic.
Perfect as a quick reference or for revision.

Covers theory from scratch — activation functions, convolutions, backprop, self-attention, scaling laws, and beyond.
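
As a taste of that math-to-code mapping, here is a minimal scaled dot-product self-attention sketch (an illustrative reimplementation; the function and variable names are not taken from the notebook or the PDF):

import math
import torch

def self_attention(x, w_q, w_k, w_v):
    # x: (batch, seq_len, d_model); w_q/w_k/w_v: (d_model, d_k) projection matrices.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # scaled dot product
    weights = torch.softmax(scores, dim=-1)                   # rows sum to 1
    return weights @ v                                        # weighted sum of values

x = torch.randn(2, 5, 16)                       # batch of 2, 5 tokens, d_model = 16
w_q, w_k, w_v = (torch.randn(16, 16) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)          # shape (2, 5, 16)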


Great for:

  • Understanding how every deep learning model works (mathematically and intuitively)
  • Being able to implement and train them from scratch in PyTorch (a minimal training loop is sketched below)
  • Having a complete, practical foundation for deep learning
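
As a rough picture of what training from scratch looks like, here is a minimal supervised training loop, assuming a model, data loader, loss, and optimizer such as the ones sketched above (it is not an excerpt from the notebook):

import torch

# Assumes model, train_loader, loss_fn, and optimizer exist (see the sketches above).
device = 'cuda' if torch.cuda.is_available() else 'cpu'
model = model.to(device)

for epoch in range(5):
    for inputs, targets in train_loader:
        inputs, targets = inputs.to(device), targets.to(device)
        optimizer.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()   # backpropagation
        optimizer.step()  # parameter update
    print(f"epoch {epoch}: last batch loss {loss.item():.4f}")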

Setup

# Clone the repo
git clone https://github.com/scorchinghot/Core-Deep-Learning.git
cd Core-Deep-Learning

# Install dependencies
pip install -r requirements.txt

Common packages:

torch torchvision torchaudio
transformers datasets
matplotlib

Running the Notebooks

  1. Open the core_DL.ipynb notebook
  2. Run the cells step by step (recommended order: topics 1 → 7)
  3. View generated samples, loss curves, and embeddings

If you have a GPU:

export CUDA_VISIBLE_DEVICES=0

or, inside the notebook:

device = 'cuda' if torch.cuda.is_available() else 'cpu'
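
Models and batches can then be moved onto that device before training or inference (illustrative usage, not an excerpt from the notebook):

model = model.to(device)                                 # move parameters to the GPU if available
inputs, targets = inputs.to(device), targets.to(device)  # move each batch the same way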

References & Inspiration

  • CS231n: Convolutional Neural Networks for Visual Recognition
  • CS224n: Natural Language Processing with Deep Learning
  • Deep Learning — Goodfellow, Bengio, Courville
  • Spinning Up in Deep RL — OpenAI
  • Illustrated Transformer — Jay Alammar
  • Lil’Log — Lilian Weng (https://lilianweng.github.io)

License

MIT License © 2025 Scorchinghot. Use freely for learning and teaching; attribution appreciated.
