initial release tested with Keras 2.1.2
roebius committed Jan 12, 2018
1 parent e6edd0c commit b091041
Showing 29 changed files with 17,304 additions and 8,353 deletions.
README.md: 28 changes (16 additions & 12 deletions)
@@ -1,33 +1,37 @@
-# Modified notebooks and Python files for Keras 2 and Python 3 from the fast.ai Deep Learning course
+# Modified notebooks and Python files for Keras 2 and Python 3 from the fast.ai Deep Learning course v.1
The repository includes modified copies of the original Jupyter notebooks and Python files from the excellent
-(and really unique) deep learning course "Practical Deep Learning For Coders" Part 1 and Part 2,
+(and really unique) deep learning course "Practical Deep Learning For Coders" Part 1 and Part 2, v.1,
created by [fast.ai](http://fast.ai). The [original files](https://github.com/fastai/courses)
require Keras 1.

+The current version of the repository has been tested with **_Keras 2.1.2_**.
+The previous version, tested with _Keras 2.0.6_, is available [here](https://github.com/roebius/deeplearning_keras2/releases).
### Part 1
-Located in the _nbs_ folder. Tested with Keras 2.0.6 on both Ubuntu 16.04 and Python 3.5 (installed through apt-get) and
-MacOS 10.12.4 with Python 3.6 (installed with Homebrew). In Part 1 the Theano backend for Keras has been used.
+Located in the _nbs_ folder. Tested on _Ubuntu 16.04_ and _Python 3.5_, installed through [Anaconda](https://www.anaconda.com), using the [Theano](http://deeplearning.net/software/theano/) 1.0.1 backend.

### Part 2
-Located in the _nbs2_ folder. Tested with Keras 2.0.6 on Ubuntu 16.04 with Python 3.5 (installed through apt-get). In Part 2 the TensorFlow backend for Keras has been used.
+Located in the _nbs2_ folder. Tested on _Ubuntu 16.04_ and _Python 3.5_, installed through [Anaconda](https://www.anaconda.com), using the [TensorFlow](https://www.tensorflow.org/) 1.3.0 backend.
+A few modules requiring PyTorch were also tested, using [PyTorch](http://pytorch.org/) 0.3.0.

-The files keras.json.for\_TensorFlow and keras.json.for\_Theano provide a template for the appropriate keras.json file, based on which one of the two backends needs to be used.
+The files _keras.json.for\_TensorFlow_ and _keras.json.for\_Theano_ provide a template for the appropriate _keras.json_ file, based on which of the two backends needs to be used by Keras.

-A Python 3 virtualenv has been used for both parts. In order to facilitate the installation of the required Python packages, this repository includes
-also the requirement files that can be used with the pip command. These files include additional packages that might be useful for further exploration.
+An _environment.yml_ file for creating a suitable [conda environment](https://conda.io/docs/user-guide/tasks/manage-environments.html) is provided.

-The comments that I inserted in the modules generally start with *"# -"* when they are not just *"# Keras 2"*.
+My goal has been to modify the original files to the minimum extent possible. The comments that I inserted in the modules generally start with *"# -"* when they are not just *"# Keras 2"*.

-### Notes about Part 2
-#### Issues
+### Notes and issues about Part 2
+*neural-style.ipynb*: due to a function parameter change in _Keras 2.1_, the _VGG16_ provided by _Keras 2.1_ has been used instead of the original custom module _vgg16\_avg.py_

*rossman.ipynb*: section "Using 3rd place data" has been left out for lack of the required data

-*spelling_bee_RNN.ipynb*: after the main part of the notebook, in the final "Test code ..." section I was not able to solve an issue with the K.conv1d cell not working
+*spelling_bee_RNN.ipynb* and *attention_wrapper.py*: due to the changed implementation of the recurrent.py module in Keras 2.1, the attention part of the notebook no longer works

*taxi_data_prep_and_mlp.ipynb*: section "Uh oh ..." has been left out. Caveat: running the whole notebook at once exhausted 128 GB of RAM; I was able to run each section individually only after resetting the notebook kernel each time

*tiramisu-keras.ipynb*: in order to run the larger-size model I had to reset the notebook kernel to free up enough GPU memory (almost 12 GB) and jump directly to the model


#### Left-out modules
*neural-style-pytorch.ipynb* (found no way to load the VGG weights; it looks like a version compatibility issue)
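As a side note on the backend templates mentioned above: after copying the chosen template to _~/.keras/keras.json_, the active backend can be verified from Python. This is only an illustrative check, not part of the repository, and assumes Keras is already installed:

```python
# Illustrative check only (not part of the repository):
# confirm which backend and image data format keras.json has selected.
import keras                      # prints e.g. "Using TensorFlow backend." on import
from keras import backend as K

print("Keras version:", keras.__version__)          # e.g. 2.1.2
print("Active backend:", K.backend())               # "tensorflow" or "theano"
print("Image data format:", K.image_data_format())  # "channels_last" or "channels_first"
```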

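For reference on the *neural-style.ipynb* note above: the stock _VGG16_ that ships with Keras 2.1 can be loaded directly from `keras.applications`. The call below is only an illustration of that API; the arguments actually used in the notebook may differ:

```python
# Illustration only: loading the stock VGG16 bundled with Keras 2.1
# (weights, include_top and input_shape are example values, not the notebook's exact ones)
from keras.applications.vgg16 import VGG16

model = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
model.summary()
```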
environment.yml: 185 changes (185 additions & 0 deletions)
@@ -0,0 +1,185 @@
name: p3
channels:
- pytorch
- conda-forge
- defaults
dependencies:
- backports.weakref=1.0rc1=py35_0
- bleach=1.5.0=py35_0
- distributed=1.20.2=py35_0
- html5lib=0.9999999=py35_0
- jupyter_contrib_core=0.3.3=py35_1
- jupyter_nbextensions_configurator=0.3.0=py35_0
- markdown=2.6.9=py35_0
- asn1crypto=0.23.0=py35h4ab26a5_0
- backports=1.0=py35hd471ac7_1
- bcolz=1.1.2=py35hcb27967_0
- binutils_impl_linux-64=2.28.1=h04c84fa_2
- binutils_linux-64=7.2.0=25
- bokeh=0.12.13=py35h2f9c1c0_0
- boto=2.48.0=py35h2cfd601_1
- bz2file=0.98=py35_0
- bzip2=1.0.6=h6d464ef_2
- ca-certificates=2017.08.26=h1d4fec5_0
- certifi=2017.11.5=py35h9749603_0
- cffi=1.11.2=py35hc7b2db7_0
- chardet=3.0.4=py35hb6e9ddf_1
- click=6.7=py35h353a69f_0
- cloudpickle=0.5.2=py35hbe86bc5_0
- cryptography=2.1.4=py35hbeb2da1_0
- cudatoolkit=8.0=3
- cudnn=6.0.21=cuda8.0_0
- cycler=0.10.0=py35hc4d5149_0
- cython=0.27.3=py35h6cdc64b_0
- dask=0.16.0=py35hcb8ecc8_0
- dask-core=0.16.0=py35hfc66869_0
- dbus=1.10.22=h3b5a359_0
- decorator=4.1.2=py35h3a268aa_0
- entrypoints=0.2.3=py35h48174a2_2
- expat=2.2.5=he0dffb1_0
- fastcache=1.0.2=py35hec2bbaa_0
- fontconfig=2.12.4=h88586e7_1
- freetype=2.8=hab7d2ae_1
- gcc_impl_linux-64=7.2.0=hc5ce805_2
- gcc_linux-64=7.2.0=25
- gensim=3.1.0=py35h7300b16_0
- glib=2.53.6=h5d9569c_2
- gmp=6.1.2=h6c8ec71_1
- gmpy2=2.0.8=py35hd0a1c9a_2
- gst-plugins-base=1.12.2=he3457e5_0
- gstreamer=1.12.2=h4f93127_0
- gxx_impl_linux-64=7.2.0=hd3faf3d_2
- gxx_linux-64=7.2.0=25
- h5py=2.7.1=py35h8d53cdc_0
- hdf5=1.10.1=h9caa474_1
- heapdict=1.0.0=py35h51e6c10_0
- icu=58.2=h9c2bf20_1
- idna=2.6=py35h8605a33_1
- imageio=2.2.0=py35hd0a6de2_0
- intel-openmp=2018.0.0=hc7b2577_8
- ipykernel=4.7.0=py35h2f9c1c0_0
- ipython=6.2.1=py35hd850d2a_1
- ipython_genutils=0.2.0=py35hc9e07d0_0
- ipywidgets=7.0.5=py35h8147dc1_0
- jedi=0.11.0=py35_2
- jinja2=2.10=py35h480ab6d_0
- jpeg=9b=h024ee3a_2
- jsonschema=2.6.0=py35h4395190_0
- jupyter=1.0.0=py35hd38625c_0
- jupyter_client=5.1.0=py35h2bff583_0
- jupyter_console=5.2.0=py35h4044a63_1
- jupyter_core=4.4.0=py35ha89e94b_0
- keras=2.1.2=py35_0
- libedit=3.1=heed3624_0
- libffi=3.2.1=hd88cf55_4
- libgcc=7.2.0=h69d50b8_2
- libgcc-ng=7.2.0=h7cc24e2_2
- libgfortran-ng=7.2.0=h9f7466a_2
- libgpuarray=0.7.5=h14c3975_0
- libpng=1.6.32=hbd3595f_4
- libprotobuf=3.4.1=h5b8497f_0
- libsodium=1.0.15=hf101ebd_0
- libstdcxx-ng=7.2.0=h7a57d05_2
- libtiff=4.0.9=h28f6b97_0
- libxcb=1.12=hcd93eb1_4
- libxml2=2.9.4=h2e8b1d7_6
- locket=0.2.0=py35h170bc82_1
- lzo=2.10=h49e0be7_2
- mako=1.0.7=py35h69899ea_0
- markupsafe=1.0=py35h4f4fcf6_1
- matplotlib=2.1.1=py35ha26af80_0
- mistune=0.8.1=py35h9251d8c_0
- mkl=2018.0.1=h19d6760_4
- mkl-service=1.1.2=py35h0fc7090_4
- mpc=1.0.3=hec55b23_5
- mpfr=3.1.5=h11a74b3_2
- mpmath=1.0.0=py35h7ce6e34_2
- msgpack-python=0.4.8=py35h783f4c8_0
- nbconvert=5.3.1=py35hc5194e3_0
- nbformat=4.4.0=py35h12e6e07_0
- ncurses=6.0=h9df7e31_2
- networkx=2.0=py35hc690e10_0
- nltk=3.2.5=py35h09ad193_0
- notebook=5.2.2=py35he644770_0
- numexpr=2.6.4=py35h119f745_0
- numpy=1.13.3=py35hd829ed6_0
- olefile=0.44=py35h2c86149_0
- openssl=1.0.2n=hb7f436b_0
- pandas=0.22.0=py35hf484d3e_0
- pandoc=1.19.2.1=hea2e7c5_1
- pandocfilters=1.4.2=py35h1565a15_1
- parso=0.1.1=py35h1b200a3_0
- partd=0.3.8=py35h68187f2_0
- pcre=8.41=hc27e229_1
- pexpect=4.3.0=py35hf410859_0
- pickleshare=0.7.4=py35hd57304d_0
- pillow=5.0.0=py35h3deb7b8_0
- pip=9.0.1=py35h7e7da9d_4
- prompt_toolkit=1.0.15=py35hc09de7a_0
- protobuf=3.4.1=py35he6b9134_0
- psutil=5.4.1=py35h2e39a06_0
- ptyprocess=0.5.2=py35h38ce0a3_0
- pycparser=2.18=py35h61b3040_1
- pygments=2.2.0=py35h0f41973_0
- pygpu=0.7.5=py35h14c3975_0
- pyopenssl=17.5.0=py35h4f8b8c8_0
- pyparsing=2.2.0=py35h041ed72_1
- pyqt=5.6.0=py35h0e41ada_5
- pysocks=1.6.7=py35h6aefbb0_1
- pytables=3.4.2=py35hfa98db7_2
- python=3.5.4=h417fded_24
- python-dateutil=2.6.1=py35h90d5b31_1
- pytz=2017.3=py35hb13c558_0
- pywavelets=0.5.2=py35h53ec731_0
- pyyaml=3.12=py35h46ef4ae_1
- pyzmq=16.0.3=py35ha889422_0
- qt=5.6.2=h974d657_12
- qtconsole=4.3.1=py35h4626a06_0
- readline=7.0=ha6073c6_4
- requests=2.18.4=py35hb9e6ad1_1
- scikit-image=0.13.1=py35h14c3975_1
- scikit-learn=0.19.1=py35hbf1f462_0
- scipy=1.0.0=py35hcbbe4a2_0
- setuptools=36.5.0=py35ha8c1747_0
- simplegeneric=0.8.1=py35h2ec4104_0
- sip=4.18.1=py35h9eaea60_2
- six=1.11.0=py35h423b573_1
- smart_open=1.5.3=py35_0
- sortedcontainers=1.5.7=py35h683703c_0
- sqlite=3.20.1=hb898158_2
- sympy=1.1.1=py35h919b29a_0
- tblib=1.3.2=py35hf1eb0b4_0
- tensorflow=1.3.0=0
- tensorflow-base=1.3.0=py35h79a3156_1
- tensorflow-gpu=1.3.0=0
- tensorflow-gpu-base=1.3.0=py35cuda8.0cudnn6.0_1
- tensorflow-tensorboard=0.1.5=py35_0
- terminado=0.6=py35hce234ed_0
- testpath=0.3.1=py35had42eaf_0
- theano=1.0.1=py35h6bb024c_0
- tk=8.6.7=hc745277_3
- toolz=0.8.2=py35h90f1797_0
- tornado=4.5.2=py35hf879e1d_0
- tqdm=4.19.4=py35h68e51d2_0
- traitlets=4.3.2=py35ha522a97_0
- ujson=1.35=py35_0
- urllib3=1.22=py35h2ab6e29_0
- wcwidth=0.1.7=py35hcd08066_0
- webencodings=0.5.1=py35hb6cf162_1
- werkzeug=0.12.2=py35hbfc1ea6_0
- wheel=0.30.0=py35hd3883cf_1
- widgetsnbextension=3.0.8=py35h84cb72a_0
- xz=5.2.3=h55aa19d_2
- yaml=0.1.7=had09818_2
- zeromq=4.2.2=hbedb6e5_2
- zict=0.1.3=py35h29275ca_0
- zlib=1.2.11=ha838bed_2
- pytorch=0.3.0=py35_cuda8.0.61_cudnn7.0.3hb362f6e_4
- torchvision=0.2.0=py35heaa392f_1
- pip:
- keras-tqdm==2.0.1
- tables==3.4.2
- torch==0.3.0.post4
- xgboost==0.7.post3
prefix: /home/roebius/anaconda/envs/f1
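To recreate this environment with conda, something like `conda env create -f environment.yml` followed by `source activate p3` (the environment name defined above) should work; the _prefix_ line at the end is specific to the original machine and may need to be removed or adjusted for a different Anaconda installation path.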
