| # | File name | Description |
|---|-----------|-------------|
| 1 | 1D_tensors | Basic operations on 1D tensors (a minimal sketch appears after this table) |
| 2 | 2D_tensors | Basic operations on 2D tensors |
| 3 | Derivatives | Derivatives in PyTorch |
| 4 | Toy_dataset | Creating a toy dataset in PyTorch, composing transforms and applying them to it |
| 5 | Datasets_and_transforms | Building an image dataset object and applying pre-built transforms from torchvision.transforms to it |
| 6 | MNIST_data_&_transforms | How to use the pre-built MNIST dataset and apply transforms to it |
| 7 | Regression_prediction | Making predictions for multiple 1D inputs using the Linear class |
| 8 | 1D_Linear_regression_1_parameter | Creating a linear regression model with one parameter and an MSE cost/criterion function, and plotting the parameter and loss values |
| 9 | 1D_Linear_regression_2_parameters | 1D linear regression with two parameters (w and b); visualizing the data space and the parameter space during training via batch gradient descent |
| 10 | Stochastic_gradient_descent | 1D linear regression using stochastic gradient descent |
| 11 | Mini_batch_gradient_descent | 1D linear regression using mini-batch gradient descent, including a comparison of batch, stochastic, and mini-batch gradient descent with different batch sizes |
| 12 | Mini_batch_gradient_descent2 | 1D linear regression using PyTorch built-in functions (see the regression sketch after this table) |
| 13 | Models_with_different_LR | 1D linear regression with different learning rates, viewing results such as training and validation losses at each learning rate |
| 14 | Multiple_linear_regression_prediction | Multiple linear regression prediction (forward propagation with a 1×n tensor input) |
| 15 | Multiple_linear_regression_training | Multiple linear regression training with a 1×n tensor input |
| 16 | Multi_target_linear_regression | Multiple-target linear regression prediction (forward propagation) |
| 17 | training_multiple_output_linear_regression | PyTorch built-in functions to train multiple-target linear regression |
| 18 | logistic_regression_prediction | Prediction using the sigmoid/logistic function |
| 19 | logistic_regression_prediction | Illustration of the poor performance of logistic regression under bad parameter initialization |
| 20 | Softmax_in_1D | Building a Softmax classifier in 1D |
| 21 | predicting_MNIST_using_Softmax | Classifying handwritten digits from the MNIST database with a Softmax classifier, and visualizing the parameters learned for each class after training |
| 22 | simpleNN_1hiddenlayer | Simple neural network with one hidden layer |
| 23 | NN_more_hidden_neurons | Neural network with one hidden layer (more neurons) |
| 24 | Neural_networks | Building a neural network with one hidden layer to classify noisy XOR data |
| 25 | 1_layer_neural_network_MNIST | Neural network with one hidden layer to classify MNIST data |
| 26 | Activation_function | How to apply different activation functions in a neural network |
| 27 | Different_activations_on_neural_network | Applying different activation functions in a neural network on the MNIST dataset |
| 28 | Deep_Neural_Networks | Deep neural network with two hidden layers on the MNIST dataset |
| 29 | Deeper_Neural_Networks | Deep neural network with three hidden layers using nn.ModuleList() |
| 30 | Dropout_prediction | Deep neural network with dropout for classification |
| 31 | Dropout_regression | Using dropout in regression |
| 32 | Weight_initialization | Performance of neural networks with constant (w = 1) vs. default weight initialization |
| 33 | Xavier_initialization | Performance of neural networks with uniform, default, and Xavier initialization |
| 34 | He_initialization | Performance of neural networks with uniform, default, and He initialization |
| 35 | MomentumwithPolynomialFunctions | Use of momentum in model optimization |
| 36 | MomentumwithPolynomialFunctions | Neural network optimization with different momentum values |
| 37 | BachNorm | Comparison of neural networks with and without batch normalization |
| 38 | Convolution | Convolution on an image and estimating the output size for a kernel of size K |
| 39 | MaxPooling | Applying an activation function and max pooling |
| 40 | Multiple_Channel_Convolution | Convolutions with multiple input and output channels |
| 41 | ConvolutionalNeuralNetworkexample | Example of a convolutional neural network |
| 42 | CNN_Small_Image | Building a CNN on small MNIST images, visualizing the learned parameters, and plotting a test image's activations after each layer; the code also plots the misclassified samples |
| 43 | CNN_Small_Image_batch | Comparing a CNN with batch normalization against a regular CNN for classifying handwritten digits from the MNIST database (see the CNN sketch after this table) |
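
As a quick taste of the first few notebooks (1D tensors and derivatives), here is a minimal, self-contained sketch; the tensor values are illustrative and not taken from the notebooks themselves.

```python
import torch

# Basic 1D tensor operations (illustrative values, not from the notebooks)
u = torch.tensor([1.0, 2.0, 3.0])
v = torch.tensor([4.0, 5.0, 6.0])

print(u + v)            # elementwise addition
print(u * v)            # elementwise (Hadamard) product
print(torch.dot(u, v))  # dot product
print(u.mean(), u.max())

# Derivative with autograd: y = x^2, so dy/dx = 2x
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2
y.backward()            # populates x.grad with dy/dx
print(x.grad)           # tensor(6.)
```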
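The linear-regression notebooks (7–13) build up to training with PyTorch's built-in modules (nn.Linear, nn.MSELoss, SGD, and a DataLoader for mini-batches). Below is a minimal sketch of that pattern, assuming a synthetic y = 2x − 1 dataset rather than the notebooks' own data.

```python
import torch
from torch import nn, optim
from torch.utils.data import TensorDataset, DataLoader

# Synthetic 1D data: y = 2x - 1 plus noise (an assumption for illustration)
X = torch.arange(-3, 3, 0.1).view(-1, 1)
Y = 2 * X - 1 + 0.2 * torch.randn(X.size())

loader = DataLoader(TensorDataset(X, Y), batch_size=10, shuffle=True)

model = nn.Linear(in_features=1, out_features=1)   # one weight w and one bias b
criterion = nn.MSELoss()
optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(20):
    for x_batch, y_batch in loader:
        optimizer.zero_grad()
        loss = criterion(model(x_batch), y_batch)
        loss.backward()     # compute gradients of the loss w.r.t. w and b
        optimizer.step()    # update w and b

print(model.weight.item(), model.bias.item())  # should approach 2 and -1
```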
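The CNN notebooks (41–43) work with small MNIST-style images and compare models with and without batch normalization. Below is a rough sketch of such a model; the layer sizes and channel counts are assumptions for illustration, not the exact architecture used in the notebooks.

```python
import torch
from torch import nn

# A small CNN for 28x28 grayscale images (e.g. MNIST).
# Channel counts and kernel sizes here are assumed, not copied from the notebooks.
class SmallCNN(nn.Module):
    def __init__(self, out_1=16, out_2=32, num_classes=10):
        super().__init__()
        self.cnn1 = nn.Conv2d(1, out_1, kernel_size=5, padding=2)
        self.bn1 = nn.BatchNorm2d(out_1)
        self.cnn2 = nn.Conv2d(out_1, out_2, kernel_size=5, padding=2)
        self.bn2 = nn.BatchNorm2d(out_2)
        self.pool = nn.MaxPool2d(kernel_size=2)
        self.fc = nn.Linear(out_2 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.pool(torch.relu(self.bn1(self.cnn1(x))))   # 28x28 -> 14x14
        x = self.pool(torch.relu(self.bn2(self.cnn2(x))))   # 14x14 -> 7x7
        return self.fc(x.flatten(start_dim=1))              # logits per class

model = SmallCNN()
dummy = torch.randn(4, 1, 28, 28)   # batch of 4 fake MNIST-sized images
print(model(dummy).shape)           # torch.Size([4, 10])
```

Dropping the two BatchNorm2d layers from this sketch gives the "regular CNN" side of the comparison described in notebook 43.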