Deep learning: an historical perspective

Description

This repository contains the slides of the seminar “Deep learning: an historical perspective”, given on 31 October 2018, 2pm-3pm, at SJTU, co-organized by the SJTU School of Mathematical Sciences and the SJTU-ParisTech Elite Institute of Technology (Intro_Deep_SJTU_2018.pdf, 21.5 MB).

It also contains the slides of the three-hour introduction to deep learning given at the ECAS-ENBIS 1-Day Summer School on 6 September 2018 (00_Main_Deep_2018.pdf).

It comes together with practical exercises on deep learning, with solutions in Python based on Keras, and with some Jupyter notebooks in the jupyter_notebooks directory.

Deep learning practical session

Requirements

Keras must be available in your Python environment.
You can install this library locally using pip (pip3 install keras).

TP_Deep_2_webcam.py requires opencv-python.
You can also install this library locally using pip (pip install opencv-python).
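Before starting a practical session, you can verify that both libraries are importable. This is a small convenience sketch using only the standard library; it is not part of the original scripts (the function name `missing_packages` is our own):

```python
# Check that the libraries used by the practical sessions are installed
# (a convenience sketch, not part of the original scripts).
import importlib.util

def missing_packages(names):
    """Return the subset of package names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    # "cv2" is the import name of the opencv-python package
    print(missing_packages(["keras", "cv2"]))
```

An empty list means everything needed by the scripts below is available.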

TP_Deep_1_MNIST.py (based on MNIST)

The purpose of this code is to help you take your first steps in deep learning by reproducing the results given on the MNIST site. In less than 3 minutes, you will build and train a fully connected neural network (NN) achieving less than 1.5% error on the MNIST database, and then, in less than 15 minutes, a convolutional neural network achieving less than 1% error.
It comes together with the Jupyter notebook TP_Deep_1_MNIST.ipynb. The other Jupyter notebook, TP_Deep_1_MNIST_Optim.ipynb, proposes a comparison of different optimizers on the Fashion-MNIST dataset.
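To give a flavour of what the script builds, here is a minimal sketch of a fully connected Keras network for MNIST; the layer sizes, optimizer, and function name `build_mlp` are illustrative assumptions, not the exact values used in TP_Deep_1_MNIST.py:

```python
# Minimal fully connected network for 28x28 MNIST digits
# (an illustrative sketch, not the exact model of TP_Deep_1_MNIST.py).
from tensorflow import keras

def build_mlp(num_classes=10):
    """Build and compile a small fully connected classifier."""
    model = keras.Sequential([
        keras.Input(shape=(28, 28)),
        keras.layers.Flatten(),                       # 784-dim input vector
        keras.layers.Dense(512, activation="relu"),   # hidden layer (assumed size)
        keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    build_mlp().summary()
```

Training such a model on `keras.datasets.mnist` for a few epochs is what brings the error below the thresholds quoted above.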

TP_Deep_2_webcam.py

This script requires a webcam and opencv-python (pip install opencv-python).

TP_Deep_3_fine_tuning.py

It works with the directories contained in these two zip files:

  • train_cheese.zip
  • test_cheese.zip

To make it run, you may:

  • download the TP_Deep_3_fine_tuning.py file to some directory and start Python from that directory (e.g. cd ../Deep_learning_lecture)
  • define the class MonArg as follows

```python
class MonArg(object):
    def __init__(self, train, val):
        self.train_dir = train
        self.val_dir = val
        self.nb_epoch = NB_EPOCHS
        self.batch_size = BAT_SIZE
        self.output_model_file = "inceptionv3-ft.model"
        self.plot = "store_true"
```
  • define some constants

```python
IM_WIDTH, IM_HEIGHT = 299, 299
NB_EPOCHS = 25
BAT_SIZE = 32
FC_SIZE = 1024
NB_IV3_LAYERS_TO_FREEZE = 172
```
  • import the TP_Deep_3_fine_tuning.py file and run it

```python
import TP_Deep_3_fine_tuning
TP_Deep_3_fine_tuning.train(MonArg("train_cheese", "test_cheese"))
```
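The fine-tuning idea behind NB_IV3_LAYERS_TO_FREEZE can be sketched as follows. This is a hypothetical reconstruction, not the code of TP_Deep_3_fine_tuning.py: the class count NUM_CLASSES and the classification head are assumptions, and weights=None keeps the sketch offline where a real run would load the "imagenet" weights:

```python
# Sketch of the layer-freezing step behind NB_IV3_LAYERS_TO_FREEZE
# (a hypothetical reconstruction, not the script's actual code).
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model

NB_IV3_LAYERS_TO_FREEZE = 172
FC_SIZE = 1024
NUM_CLASSES = 5  # assumption: number of cheese classes in train_cheese

# weights=None avoids the ImageNet download; a real run would use "imagenet"
base = InceptionV3(weights=None, include_top=False)

# New classification head on top of the convolutional base
x = GlobalAveragePooling2D()(base.output)
x = Dense(FC_SIZE, activation="relu")(x)
out = Dense(NUM_CLASSES, activation="softmax")(x)
model = Model(inputs=base.input, outputs=out)

# Freeze the first layers so that only the top of the network is fine-tuned
for layer in model.layers[:NB_IV3_LAYERS_TO_FREEZE]:
    layer.trainable = False
```

Training then updates only the unfrozen layers, which is what lets the network adapt ImageNet features to the cheese images with a small dataset.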

To go further
