What is Transfer Learning?
The ability of a system to recognize and apply knowledge and skills learned in previous tasks to novel tasks (or new domains).
Keras Pre-trained Models
Keras - A high-level neural networks API, written in Python. Modular, minimalistic, and easy to use. Runs on top of TensorFlow, Theano, or CNTK.
Keras Applications (the Keras model zoo) contains the following pre-trained models:
Xception, VGG16, VGG19, ResNet50, InceptionV3
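As a minimal sketch, any of the models listed above can be instantiated in one line through `tensorflow.keras.applications`. Note the `weights=None` here is only to keep the sketch runnable offline; in practice you would pass `weights="imagenet"`, which downloads the pre-trained ImageNet weights on first use.

```python
from tensorflow.keras.applications import VGG16

# Full VGG16 with its classification head (1000 ImageNet classes).
# weights=None -> random initialization; use weights="imagenet" in practice.
model = VGG16(weights=None, include_top=True, input_shape=(224, 224, 3))

model.summary()  # VGG16 has roughly 138M parameters
```

The other models (Xception, ResNet50, InceptionV3, ...) follow the same constructor pattern, so swapping architectures is a one-line change.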
Why use pre-trained models?
It is relatively rare to have a dataset large enough to train a deep network from scratch. Instead, it is common to take a ConvNet pre-trained on a very large dataset (e.g. ImageNet, which contains 1.2 million images in 1000 categories) and use it either as an initialization or as a fixed feature extractor for the task of interest. Training a model from scratch also takes far more time than training only the dense layers on top of a pre-trained model.
Steps for using pre-trained models:
1. Feature extraction:
Remove the fully connected layers (the bottleneck head) from the pre-trained VGG16 model. Run the images from the dataset through this truncated network to produce image feature vectors. Use these vectors to train a second classifier to predict the labels in the training set. At prediction time, an image is run through the truncated network and its vector is fed to the second classifier.
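The feature-extraction step above can be sketched as follows. `include_top=False` drops VGG16's fully connected head, and `pooling="avg"` collapses each image to a 512-dimensional vector. The random batch stands in for real dataset images, and `weights=None` is an assumption to keep the sketch light; in practice you would pass `weights="imagenet"`.

```python
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.applications.vgg16 import preprocess_input

# Truncated VGG16: no fully connected head, global-average-pooled output.
base = VGG16(weights=None, include_top=False, pooling="avg",
             input_shape=(224, 224, 3))

# Placeholder batch of 4 images standing in for the real dataset.
images = np.random.rand(4, 224, 224, 3).astype("float32") * 255.0

# Each image becomes a 512-d feature vector.
features = base.predict(preprocess_input(images))
```

These vectors can then be used to train any second-stage classifier, e.g. a small Dense network or scikit-learn's LogisticRegression.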
2. Fine-tuning:
Here we retrain the model partially. Remove the fully connected layers (the bottleneck head) from the pre-trained VGG16 model. Freeze the weights of all convolutional blocks (make them non-trainable) except the last few convolutional layers. Attach our own classifier on top. Train the resulting network with a very low learning rate. This is computationally more expensive than feature extraction, but still much cheaper than training the network from scratch, and it typically yields a more robust model.
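The fine-tuning recipe above can be sketched as follows. Freezing everything except VGG16's last block (layers named `block5_*`) and using a very low learning rate keeps the pre-trained features from being destroyed. `NUM_CLASSES` is a hypothetical target-class count, and `weights=None` is again only to avoid the ImageNet weight download; use `weights="imagenet"` in practice.

```python
from tensorflow.keras import Model, layers, optimizers
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 5  # hypothetical number of target classes

# Truncated VGG16 (no fully connected head).
base = VGG16(weights=None, include_top=False, input_shape=(224, 224, 3))

# Freeze all convolutional blocks except the last one (block5_*).
for layer in base.layers:
    layer.trainable = layer.name.startswith("block5_")

# Attach our own classifier on top of the truncated network.
x = layers.GlobalAveragePooling2D()(base.output)
x = layers.Dense(256, activation="relu")(x)
out = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(base.input, out)

# Very low learning rate so the unfrozen block is only nudged.
model.compile(optimizer=optimizers.Adam(learning_rate=1e-5),
              loss="categorical_crossentropy", metrics=["accuracy"])
```

After compiling, `model.fit(...)` on the target dataset trains only the new head and the last convolutional block.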