CoSIP Intense Course on Deep Learning - ICDL'17
Introduction To Tensorflow I & II
Please find the scripts for installing the required packages at https://github.com/VigneshSrinivasan10/Cosip/tree/master/installation_scripts
The hands-on material contains:
- Partially filled code for the hands-on sessions on both days
- Model weights for VGG16 and MNIST
- Images for style transfer experiments
Please download this repository and complete the required installations before you arrive. The idea behind the installation is to create a virtual environment using miniconda and to have all the packages (tensorflow, jupyter, ..) installed inside this virtual environment. The estimated time for the download and installation is about 30 minutes.
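As a rough sketch of what the installation scripts do (the environment name and package versions below are illustrative assumptions; the scripts in the repository are authoritative):

```shell
# Create an isolated miniconda environment (the name "cosip" is an assumption)
conda create -n cosip python=3.5

# Activate the environment
source activate cosip

# Install the required packages inside the environment
pip install tensorflow jupyter
```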
Interest in deep learning has grown by leaps and bounds in the last few years, and so has the number of platforms that support writing quick and effective code for neural networks. Tensorflow, developed by Google Brain, is an open-source library for neural network programming. Since its release in November 2015, it has become one of the most widely used platforms for executing neural network programs, both in scientific research and in production environments.
In the first part of the talk, we look at the dataflow graphs in Tensorflow that enable its fast computation and parallel processing. Participants will learn how to explore Tensorflow using its Python API and build simple neural network models. It will also involve a hands-on session where participants will write their own Tensorflow code to solve two exciting tasks using neural networks:
- Image classification on MNIST
- Style transfer - adding art to an image
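To give a flavor of the dataflow-graph idea before the session: a graph is first declared symbolically, and values are only computed when the graph is executed. The tiny pure-Python sketch below is not Tensorflow itself, just an illustration of this two-phase build-then-run style:

```python
class Node:
    """A node in a toy dataflow graph: an operation plus its input nodes."""
    def __init__(self, op, *inputs):
        self.op = op
        self.inputs = inputs

    def run(self, feed):
        """Execute the graph lazily: placeholders read from `feed`,
        other nodes recursively evaluate their inputs first."""
        if self.op == "placeholder":
            return feed[self]
        vals = [n.run(feed) for n in self.inputs]
        if self.op == "add":
            return vals[0] + vals[1]
        if self.op == "mul":
            return vals[0] * vals[1]
        raise ValueError("unknown op: " + self.op)

# Phase 1: build the graph (no computation happens here)
x = Node("placeholder")
y = Node("placeholder")
z = Node("mul", Node("add", x, y), y)   # z = (x + y) * y

# Phase 2: execute the graph with concrete values fed in
result = z.run({x: 2, y: 3})   # (2 + 3) * 3 = 15
print(result)
```

In Tensorflow the same separation appears as graph construction followed by running the graph in a session, which is what allows the runtime to optimize and parallelize the computation.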
The second part of the talk will delve into InterpretTensor, a high-level wrapper for Tensorflow that helps in computing Layer-wise Relevance Propagation (LRP) for neural networks. In the hands-on session, participants will use InterpretTensor in addition to core Tensorflow functions for the image classification task on MNIST and learn how to compute and visualize the relevance of the inputs to the predictions of the neural network.
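As background for the LRP session: the basic LRP rule redistributes the relevance R_k of each output neuron back to the inputs in proportion to their contributions z_jk = a_j * w_jk. The NumPy snippet below illustrates this rule for a single dense layer; it is a conceptual sketch of the math, not the InterpretTensor API, whose actual usage will be covered in the session:

```python
import numpy as np

def lrp_dense(a, w, relevance, eps=1e-6):
    """Redistribute output relevance to the inputs of one dense layer
    using the basic (epsilon-stabilized) LRP rule.
    a:         input activations, shape (n_in,)
    w:         weights, shape (n_in, n_out)
    relevance: relevance of the outputs, shape (n_out,)
    """
    z = a[:, None] * w                    # contributions z_jk = a_j * w_jk
    denom = z.sum(axis=0)                 # total contribution per output neuron
    denom = denom + eps * np.where(denom >= 0, 1.0, -1.0)  # stabilizer
    return (z / denom * relevance).sum(axis=1)

# Toy example with made-up numbers: relevance is (approximately)
# conserved from layer output to layer input
a = np.array([1.0, 2.0])
w = np.array([[0.5, -0.25],
              [1.0,  0.5]])
R_out = np.array([1.0, 0.5])
R_in = lrp_dense(a, w, R_out)
print(R_in, R_in.sum())   # the input relevances sum to R_out.sum()
```

This conservation property, relevance summing to the same total as it flows backwards through the layers, is what makes the resulting input heatmaps interpretable as a decomposition of the prediction.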