Lab Assignment 3
We have learned about different deep learning algorithms and how to use TensorBoard, for example to visualize computation graphs.
The main objective of this assignment is to implement logistic regression and word embeddings and to display the resulting graphs in TensorBoard.
Logistic regression
For this task we chose the MNIST dataset. We varied the hyperparameters and observed the resulting accuracy. The hyperparameters include the optimizer, learning rate, number of steps, and batch size.
For each of the four optimizers we changed the number of steps, batch size, and learning rate and recorded the accuracy values.
Across the optimizers tested, the gradient descent optimizer achieved the highest accuracy, 92%, at a learning rate of 0.9, but this dropped to 84% at a learning rate of 0.01. The Adagrad optimizer gave a consistent accuracy of 91% across different learning rates.
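To make the comparison above concrete, here is a minimal logistic-regression sketch in plain NumPy (not the original TensorFlow code). The two-blob dataset, dimensions, and step count are illustrative assumptions standing in for MNIST; the `train` helper is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two Gaussian blobs as a toy stand-in for MNIST features.
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)),
               rng.normal(+1.0, 1.0, (100, 2))])
y = np.hstack([np.zeros(100), np.ones(100)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(lr, steps=200):
    """Plain gradient descent on the cross-entropy loss; returns accuracy."""
    w = np.zeros(2)
    b = 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)            # predicted probabilities
        grad_w = X.T @ (p - y) / len(y)   # gradient of the loss w.r.t. w
        grad_b = np.mean(p - y)
        w -= lr * grad_w                  # gradient-descent update
        b -= lr * grad_b
    return np.mean((sigmoid(X @ w + b) > 0.5) == y)

# Try the two learning rates discussed above (on this toy problem the
# learning rate mainly changes how fast training converges).
print(train(lr=0.9), train(lr=0.01))
```

Swapping the update rule for an Adagrad- or Adam-style update is what the assignment's optimizer comparison amounts to.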
Parameters:
Accuracy Output:
Parameters:
Accuracy Output:
TensorBoard:
Parameters:
Accuracy Output:
Parameters:
Accuracy Output:
TensorBoard:
Accuracy Output:
Parameters:
Accuracy Output:
TensorBoard:
Parameters:
Accuracy Output:
Parameters:
Accuracy Output:
TensorBoard:
Word Embeddings
For this task we chose the enwik8 dataset from 'http://mattmahoney.net/dc/enwik8.zip'. We varied hyperparameters such as the optimizer, learning rate, number of steps, and batch size, and observed the loss.
We observed that the Adam optimizer had the lowest loss (3) at a low learning rate such as 0.01, but the loss went up as the learning rate was increased. The Adagrad optimizer also achieved a low loss of 2, very close to that of the Adam optimizer.
Parameters:
Output:
Plot:

Parameters:
Output:
Plot:

Parameters:
Output:
Plot:

Parameters:
Output:
Plot:

TensorBoard:
Parameters:
Output:
Plot:

TensorBoard:
Parameters:
Output:
Plot:

TensorBoard:
Parameters:
Output:
Plot:

TensorBoard:
All tasks were implemented successfully.