A collection of standalone TensorFlow and PyTorch models in Jupyter Notebooks.
- Perceptron [TensorFlow] [PyTorch 0.4]
- Logistic Regression [TensorFlow] [PyTorch 0.4]
- Softmax Regression (Multinomial Logistic Regression) [TensorFlow] [PyTorch 0.4]
- Multilayer Perceptron [TensorFlow] [PyTorch 0.4]
- Multilayer Perceptron with Dropout [TensorFlow] [PyTorch 0.4]
- Multilayer Perceptron with Batch Normalization [TensorFlow] [PyTorch 0.4]
- Multilayer Perceptron with Backpropagation from Scratch [TensorFlow]
- Convolutional Neural Network [TensorFlow] [PyTorch 0.4]
- Convolutional Neural Network with He Initialization [TensorFlow] [PyTorch 0.4]
- Convolutional Neural Network VGG-16 [TensorFlow] [PyTorch 0.4]
- Convolutional ResNet and Residual Blocks [PyTorch 0.4]
- Siamese Network with Multilayer Perceptrons [TensorFlow]
- Autoencoder [TensorFlow] [PyTorch 0.4]
- Convolutional Autoencoder with Deconvolutions [TensorFlow] [PyTorch 0.4]
- Convolutional Autoencoder with Deconvolutions (without pooling operations) [PyTorch 0.4]
- Convolutional Autoencoder with Nearest-neighbor Interpolation [TensorFlow] [PyTorch 0.4]
- Convolutional Autoencoder with Nearest-neighbor Interpolation -- Trained on CelebA [PyTorch 0.4]
- Variational Autoencoder [PyTorch 0.4]
- Generative Adversarial Networks [TensorFlow]
- Convolutional Generative Adversarial Networks [TensorFlow]
- Using PyTorch Dataset Loading Utilities for Custom Datasets -- CSV files converted to HDF5
- Using PyTorch Dataset Loading Utilities for Custom Datasets -- Face Images from CelebA
- Getting Gradients of an Intermediate Variable in PyTorch
- Saving and Loading Trained Models -- from TensorFlow Checkpoint Files and NumPy NPZ Archives
- Chunking an Image Dataset for Minibatch Training using NumPy NPZ Archives
- Storing an Image Dataset for Minibatch Training using HDF5
- Using Input Pipelines to Read Data from TFRecords Files
- Using Queue Runners to Feed Images Directly from Disk
- Using TensorFlow's Dataset API
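As a flavor of the simplest model in the list, here is a framework-agnostic NumPy sketch of the classic perceptron learning rule (cf. the Perceptron notebooks above); the toy data, cluster centers, and epoch count are illustrative assumptions, not taken from the notebooks:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two well-separated 2-D clusters with labels in {0, 1} (toy data)
X = np.vstack([rng.normal(2.0, 0.5, (50, 2)),
               rng.normal(-2.0, 0.5, (50, 2))])
y = np.concatenate([np.ones(50), np.zeros(50)])

w, b = np.zeros(2), 0.0

def predict(x):
    # Threshold (Heaviside) activation on the linear score w.x + b
    return (x @ w + b >= 0.0).astype(float)

for _ in range(5):  # a few passes over the training set
    for xi, yi in zip(X, y):
        error = yi - predict(xi)  # 0 if correct, +/-1 if misclassified
        w += error * xi           # perceptron update rule
        b += error

accuracy = (predict(X) == y).mean()
```

On linearly separable data like this, the update rule is guaranteed to converge to a separating hyperplane in a finite number of mistakes.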
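The NPZ-chunking workflow listed above can be sketched roughly as follows; the array shapes, chunk size, and file names here are hypothetical placeholders, not the notebook's actual values:

```python
import numpy as np

# Toy stand-in for an image dataset that is too large to load at once
n_images, height, width = 100, 28, 28
images = np.random.rand(n_images, height, width).astype(np.float32)
labels = np.random.randint(0, 10, size=n_images)

# Write the dataset to disk in minibatch-sized compressed NPZ chunks
chunk_size = 32
for i, start in enumerate(range(0, n_images, chunk_size)):
    np.savez_compressed(
        f"chunk_{i}.npz",
        images=images[start:start + chunk_size],
        labels=labels[start:start + chunk_size],
    )

# During training, stream one chunk at a time instead of the full dataset
with np.load("chunk_0.npz") as chunk:
    batch_images, batch_labels = chunk["images"], chunk["labels"]
```

Storing chunks at minibatch granularity keeps the per-step memory footprint constant regardless of the total dataset size.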