- Linear Algebra, part one: Using PyTorch for calculations in linear algebra; becoming familiar with the Tensor type and how to use it to define vectors and matrices (sketch below).
- Doing math: Plotting some mathematical functions using PyTorch and Matplotlib, and applying reduction operations such as mean and std (sketch below).
- Tensors in PyTorch, a closer view: We usually create tensors either with explicit initialization or randomly. Then, we can use them much as we use arrays in NumPy (sketch below).
- ToTensor(): Here, we become familiar with the ToTensor() transform. It converts images of type PIL or numpy.ndarray into tensors. Some examples, including CIFAR10, are provided (sketch below).
- Datasets in PyTorch: How to use the (x, y) pairs of dataset samples in PyTorch via the Dataset class. One way is to use TensorDataset; the second is to define our own custom Dataset. For this purpose, we employ the iris dataset (sketch below).
- DataLoaders in PyTorch: Once we have a Dataset, we can wrap it in a DataLoader so that we can iterate over the dataset with ease. A DataLoader is iterable, so it works with Python's iter() and next() (sketch below).
- One-hot encoding: We often need to convert labels, which are usually integer values, into one-hot vectors, which are more suitable for machine learning and deep learning models to work with (sketch below).
- Torchvision transforms: There are generally two kinds of transforms, random and functional. A random transform may produce a different result on each call, whereas a functional transform produces the same result every call. Torchvision transforms are mainly for image transformations. We can also define our own transforms using callable classes (sketch below).
- Torch.nn module for model definition: We use Sequential or a class with a forward method to define a neural network, which we usually call the model. In this post, we define a model for the MNIST dataset (sketch below).
- torch.autograd: This is PyTorch's differentiation engine for gradient computation. It is useful when we adjust the parameters of a neural network with the backpropagation algorithm. Here, we become familiar with requires_grad=True and torch.no_grad(), and we show a single step of gradient descent (sketch below).
- Training a neural network: In this post, we download the MNIST dataset and prepare it for training a neural network (model). A model is defined and trained with the SGD (Stochastic Gradient Descent) algorithm, using MSE (Mean Squared Error) as the loss function (sketch below).
- LeNet for the MNIST: This time, a LeNet is defined and trained on the MNIST dataset. Then, we show how to save the LeNet's parameters, or the whole LeNet, with torch.save(). To load the model or its parameters, we use torch.load() (sketch below).
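A minimal sketch for the linear-algebra topic, using torch.tensor to define a vector and a matrix and combining them with standard products (the exact operations covered in the post may differ):

```python
import torch

# A vector and a matrix defined as tensors
v = torch.tensor([1.0, 2.0, 3.0])
A = torch.tensor([[1.0, 0.0, 2.0],
                  [0.0, 3.0, 1.0]])

dot = torch.dot(v, v)   # inner product of v with itself
Av = A @ v              # matrix-vector product, shape (2,)
AAt = A @ A.T           # matrix-matrix product, shape (2, 2)
print(dot, Av, AAt, sep="\n")
```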
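A sketch of plotting a function and applying reductions; sin is an illustrative choice here, not necessarily the function used in the post:

```python
import math
import torch
import matplotlib.pyplot as plt

# Evaluate a function on a grid of points and plot it
x = torch.linspace(-2 * math.pi, 2 * math.pi, 200)
y = torch.sin(x)
plt.plot(x, y)          # Matplotlib accepts CPU tensors directly
plt.title("y = sin(x)")
plt.show()

# Reduction operations over the whole tensor
print(y.mean().item(), y.std().item())
```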
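A sketch of the usual ways to create tensors and then use them NumPy-style:

```python
import torch

# Created with explicit initialization ...
a = torch.zeros(2, 3)
b = torch.ones(2, 3)
c = torch.arange(6).reshape(2, 3)

# ... or randomly
r = torch.rand(2, 3)    # uniform on [0, 1)
n = torch.randn(2, 3)   # standard normal

# NumPy-like indexing and elementwise arithmetic
print(c[0, 1], c[:, -1])
print(a + 2 * r)
```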
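A sketch of ToTensor() applied to CIFAR10; the root="data" download location is an assumption, and the dataset is fetched on first use:

```python
from torchvision import datasets
from torchvision.transforms import ToTensor

# ToTensor() converts a PIL image (or a numpy.ndarray in HWC uint8 layout)
# into a float32 tensor of shape (C, H, W) with values scaled to [0, 1]
train_set = datasets.CIFAR10(root="data",  # "data" is an assumed path
                             train=True, download=True,
                             transform=ToTensor())
img, label = train_set[0]
print(img.shape, img.dtype, img.min(), img.max(), label)
# e.g. torch.Size([3, 32, 32]) torch.float32 ...
```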
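A sketch of the two ways to build a Dataset; loading iris through scikit-learn's load_iris is an assumption (the post may obtain the data differently), and IrisDataset is a hypothetical name:

```python
import torch
from torch.utils.data import Dataset, TensorDataset
from sklearn.datasets import load_iris  # assumed data source

iris = load_iris()
x = torch.tensor(iris.data, dtype=torch.float32)  # 150 samples, 4 features
y = torch.tensor(iris.target, dtype=torch.long)   # 3 classes

# Way 1: wrap existing tensors in a TensorDataset
ds1 = TensorDataset(x, y)

# Way 2: a custom Dataset implements __len__ and __getitem__
class IrisDataset(Dataset):  # hypothetical name
    def __init__(self, features, labels):
        self.features, self.labels = features, labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        return self.features[idx], self.labels[idx]

ds2 = IrisDataset(x, y)
print(ds1[0], ds2[0])  # both yield an (x, y) pair
```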
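A sketch of wrapping a Dataset in a DataLoader, with a toy random dataset standing in for real data:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# A toy dataset of 100 (x, y) pairs
ds = TensorDataset(torch.randn(100, 4), torch.randint(0, 3, (100,)))
loader = DataLoader(ds, batch_size=16, shuffle=True)

# A DataLoader is iterable: loop over mini-batches ...
for xb, yb in loader:
    print(xb.shape, yb.shape)  # torch.Size([16, 4]) torch.Size([16])
    break

# ... or pull a single batch via the iterator protocol
xb, yb = next(iter(loader))
```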
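A sketch of one-hot encoding integer labels with torch.nn.functional.one_hot:

```python
import torch
import torch.nn.functional as F

labels = torch.tensor([0, 2, 1, 2])          # integer class labels
one_hot = F.one_hot(labels, num_classes=3)   # shape (4, 3)
print(one_hot)
# tensor([[1, 0, 0],
#         [0, 0, 1],
#         [0, 1, 0],
#         [0, 0, 1]])
```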
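A sketch contrasting random and functional transforms, plus a custom callable transform; AddNoise is a hypothetical example class, not part of torchvision:

```python
import torch
from torchvision import transforms
import torchvision.transforms.functional as TF

img = torch.rand(3, 32, 32)  # a stand-in image tensor

# Random transform: may give a different result on each call
random_crop = transforms.RandomCrop(24)
out1, out2 = random_crop(img), random_crop(img)

# Functional transform: deterministic, since we pass the parameters ourselves
out3 = TF.crop(img, top=4, left=4, height=24, width=24)

# A custom transform is just a callable class (hypothetical example)
class AddNoise:
    def __init__(self, scale=0.1):
        self.scale = scale
    def __call__(self, x):
        return x + self.scale * torch.randn_like(x)

pipeline = transforms.Compose([transforms.RandomHorizontalFlip(), AddNoise()])
noisy = pipeline(img)
```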
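A sketch of both styles of model definition; the layer sizes (784 -> 128 -> 10 for MNIST) are assumptions, not necessarily those of the post:

```python
import torch
from torch import nn

# Way 1: nn.Sequential
model = nn.Sequential(
    nn.Flatten(),        # (N, 1, 28, 28) -> (N, 784)
    nn.Linear(784, 128),
    nn.ReLU(),
    nn.Linear(128, 10),  # 10 digit classes
)

# Way 2: a subclass of nn.Module with a forward method
class MNISTNet(nn.Module):  # hypothetical name
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        self.fc2 = nn.Linear(128, 10)
    def forward(self, x):
        x = torch.flatten(x, 1)
        return self.fc2(torch.relu(self.fc1(x)))

out = MNISTNet()(torch.randn(5, 1, 28, 28))  # logits, shape (5, 10)
```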
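A sketch of autograd and a single gradient-descent step on a toy scalar loss:

```python
import torch

# requires_grad=True asks autograd to track operations on w
w = torch.tensor([1.0, -2.0], requires_grad=True)
loss = (w ** 2).sum()   # toy loss: L(w) = w1^2 + w2^2
loss.backward()         # backpropagation fills w.grad
print(w.grad)           # tensor([ 2., -4.]), i.e. dL/dw = 2w

# One gradient-descent step; torch.no_grad() keeps the update untracked
lr = 0.1
with torch.no_grad():
    w -= lr * w.grad
    w.grad.zero_()      # reset the gradient for the next step
```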
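A sketch of the MNIST training loop described above; the network size, learning rate, batch size, epoch count, and the "data" root path are all assumptions:

```python
import torch
import torch.nn.functional as F
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets
from torchvision.transforms import ToTensor

train_set = datasets.MNIST(root="data",  # "data" is an assumed path
                           train=True, download=True, transform=ToTensor())
loader = DataLoader(train_set, batch_size=64, shuffle=True)

model = nn.Sequential(nn.Flatten(), nn.Linear(784, 128),
                      nn.ReLU(), nn.Linear(128, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()  # MSE compares outputs with one-hot targets

for epoch in range(2):
    for xb, yb in loader:
        target = F.one_hot(yb, num_classes=10).float()
        loss = loss_fn(model(xb), target)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch}: last-batch loss {loss.item():.4f}")
```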
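A sketch of a LeNet-style network with saving and loading; the exact architecture used in the post may differ:

```python
import torch
from torch import nn

# A LeNet-5-style network adapted to 1x28x28 MNIST images
class LeNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 6, 5, padding=2)
        self.conv2 = nn.Conv2d(6, 16, 5)
        self.pool = nn.MaxPool2d(2)
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)
    def forward(self, x):
        x = self.pool(torch.relu(self.conv1(x)))  # -> (N, 6, 14, 14)
        x = self.pool(torch.relu(self.conv2(x)))  # -> (N, 16, 5, 5)
        x = torch.flatten(x, 1)
        return self.fc3(torch.relu(self.fc2(torch.relu(self.fc1(x)))))

net = LeNet()

# Save/load only the parameters (the state dict) ...
torch.save(net.state_dict(), "lenet_params.pt")
net2 = LeNet()
net2.load_state_dict(torch.load("lenet_params.pt"))

# ... or save/load the whole model object
torch.save(net, "lenet_full.pt")
net3 = torch.load("lenet_full.pt")  # newer PyTorch may need weights_only=False here
```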
About: This repository provides topics in PyTorch, which is used for deep learning.