Projects and Assignments of Deep Learning Foundations Nanodegree on Udacity
In this project, I built my first neural network of the course and used it to predict daily bike rental ridership.
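A minimal sketch of the kind of network that project uses: a single sigmoid hidden layer and a linear output, trained with one gradient-descent step on squared error. All sizes, names, and values here are illustrative, not the project's actual hyperparameters.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative shapes: 3 input features, 2 hidden units, 1 output.
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(3, 2))   # input -> hidden weights
W_out = rng.normal(scale=0.1, size=(2, 1))  # hidden -> output weights

def forward(X):
    hidden = sigmoid(X @ W_in)   # sigmoid hidden layer
    output = hidden @ W_out      # linear output, suited to regression
    return hidden, output

# One gradient-descent step on a single toy example.
X = np.array([[0.5, -0.2, 0.1]])
y = np.array([[0.4]])
lr = 0.5

hidden, output = forward(X)
error = y - output                                         # output error
hidden_error = (error @ W_out.T) * hidden * (1 - hidden)   # backprop through sigmoid
W_out += lr * hidden.T @ error
W_in += lr * X.T @ hidden_error
```

After the update the prediction moves toward the target, which is the whole training loop in miniature.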
Building a sentiment network with NumPy.
Instead of a network written with NumPy, I used TFLearn, a high-level library built on top of TensorFlow. TFLearn makes it simpler to build networks just by defining the layers; it takes care of most of the details for you.
In this project, I built a neural network that recognizes handwritten digits 0-9.
This kind of neural network is used in a variety of real-world applications, including recognizing phone numbers and sorting postal mail by address. To build the network, we used the MNIST dataset, which consists of images of handwritten digits and their correct labels 0-9.
In this project, I continued Andrew Trask's work by building a network for sentiment analysis on the movie review data.
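A core step in that kind of sentiment network is turning each review into a fixed-length input vector of word counts. A minimal sketch with a toy vocabulary (the vocabulary and helper names are illustrative; in the project the vocabulary comes from the full review corpus):

```python
from collections import Counter

# Toy vocabulary; the real one is built from all reviews.
vocab = ['great', 'terrible', 'movie', 'plot']

def review_to_vector(review):
    """Count how often each vocabulary word appears in the review."""
    counts = Counter(review.lower().split())
    return [counts[word] for word in vocab]

vector = review_to_vector("great movie great plot")  # [2, 0, 1, 1]
```

These count vectors become the input layer of the network, one element per vocabulary word.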
In this lab, I used all the tools I have learned from Introduction to TensorFlow to label images of English letters! The data I was using, notMNIST, consists of images of a letter from A to J in different fonts.
In this project, I classified images from the CIFAR-10 dataset. The dataset consists of airplanes, dogs, cats, and other objects. I preprocessed the images, then trained a convolutional neural network on all the samples. The images were normalized and the labels were one-hot encoded. Applying what I learned, I built convolutional, max pooling, dropout, and fully connected layers.
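The two preprocessing steps mentioned above can be sketched in NumPy; the function names are illustrative, and the scaling assumes 8-bit pixel values in [0, 255].

```python
import numpy as np

def normalize(images):
    """Scale pixel values from [0, 255] down to [0, 1]."""
    return images / 255.0

def one_hot_encode(labels, n_classes=10):
    """Turn integer class labels into one-hot rows (CIFAR-10 has 10 classes)."""
    encoded = np.zeros((len(labels), n_classes))
    encoded[np.arange(len(labels)), labels] = 1
    return encoded

x = normalize(np.array([[0, 128, 255]]))
y = one_hot_encode([0, 3])   # rows with a single 1 at indices 0 and 3
```

Normalizing keeps the inputs in a small, consistent range, and one-hot labels match the softmax output layer of the classifier.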
In this notebook, I built a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. It can generate new text based on the text from the book.
This network is based on Andrej Karpathy's post on RNNs and his implementation in Torch, and also draws on material from r2rt and from Sherjil Ozair on GitHub. Below is the general architecture of the character-wise RNN.
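Before the text can be fed to a character-wise RNN, every character is mapped to an integer id. A minimal sketch of that encoding and its round trip, using the novel's famous opening words as a stand-in for the full text (variable names are illustrative):

```python
# Build character <-> integer lookups from the text; the RNN consumes
# integer ids (later one-hot or embedded) rather than raw characters.
text = "Happy families are all alike"   # stand-in for the novel's text
chars = sorted(set(text))
char2int = {ch: i for i, ch in enumerate(chars)}
int2char = {i: ch for ch, i in char2int.items()}

encoded = [char2int[ch] for ch in text]            # text -> ids
decoded = ''.join(int2char[i] for i in encoded)    # ids -> text (round trip)
```

Generation runs the decoding direction: the network predicts the next id, and `int2char` turns the sampled ids back into readable text.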