A Keras implementation of ENet (abandoned for the foreseeable future)
This is an implementation of ENet: A Deep Neural Network Architecture for Real-Time Semantic Segmentation, ported from ENet-training (lua-torch) to Keras.


Setup environment


With Anaconda/Miniconda: conda env create -f environment.yml

With pip: pip install -r requirements.txt

One-time dependencies

pip install Cython, required to build pycocotools.

pip install torchfile, required to convert the pretrained torch model to a Keras one.
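The central step in such a conversion is reordering the weight tensors, since torch and Keras lay out convolution kernels differently. A minimal sketch of that step, with a random tensor standing in for weights that would really come from torchfile.load (shapes are illustrative, not the repository's actual values):

```python
import numpy as np

# Torch stores 2-D convolution weights as (out_channels, in_channels, h, w);
# Keras with a TensorFlow backend expects (h, w, in_channels, out_channels).
# In the real conversion the source array would come from
# torchfile.load("model.t7") -- here a random tensor stands in for it.
torch_kernel = np.random.randn(64, 3, 3, 3)

keras_kernel = np.transpose(torch_kernel, (2, 3, 1, 0))
print(keras_kernel.shape)  # (3, 3, 3, 64)
```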

Build pycocotools

cd src/data/pycocotools/
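The build command itself is not shown above; for the upstream cocoapi the usual in-place build (assuming Cython is already installed) is:

```shell
cd src/data/pycocotools/
# Standard pycocotools build step, as in the upstream cocoapi Makefile:
python setup.py build_ext --inplace
```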

Get code

git clone https://github.com/PavlosMelissinos/enet-keras.git

Set up data/model

cd enet-keras

The setup script only creates the required directories and converts the pretrained torch model to Keras format.

MSCOCO is only downloaded on demand.
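Download-on-demand boils down to checking for a file before fetching it. A minimal sketch of that pattern using only the standard library (the URL and paths are illustrative; the repository's actual data code handles this):

```python
import os
import urllib.request

def ensure_downloaded(url, target_path):
    """Download url to target_path only if it is not already on disk."""
    if os.path.exists(target_path):
        return False  # already present, nothing to do
    os.makedirs(os.path.dirname(target_path) or ".", exist_ok=True)
    urllib.request.urlretrieve(url, target_path)
    return True

# Hypothetical usage:
# ensure_downloaded("http://images.cocodataset.org/zips/val2014.zip",
#                   "data/mscoco/val2014.zip")
```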


Train on MS-COCO
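This section is empty in the source. In broad strokes, training fits the network on (image, label-map) pairs, with the integer label maps one-hot encoded per pixel for a softmax segmentation head. A numpy sketch of that encoding step (class count and shapes are illustrative, not the repository's actual values):

```python
import numpy as np

def one_hot_masks(label_map, num_classes):
    """Convert an (H, W) integer label map into an (H, W, num_classes)
    one-hot tensor, the usual target format for per-pixel softmax."""
    h, w = label_map.shape
    one_hot = np.zeros((h, w, num_classes), dtype=np.float32)
    # Advanced indexing: set the channel matching each pixel's class to 1.
    one_hot[np.arange(h)[:, None], np.arange(w)[None, :], label_map] = 1.0
    return one_hot

labels = np.array([[0, 1], [2, 1]])
target = one_hot_masks(labels, num_classes=3)
print(target.shape)  # (2, 2, 3)
```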


Remaining tasks

  • Clean up code
    • Remove hardcoded paths
    • Add documentation everywhere
  • Test code
    • Add tests
  • Fix performance (mostly preprocessing bottleneck)
    • Remove unnecessary computations in data preprocessing
    • Index dataset category internals. Dataset category fields (id, category_id, palette, categories) have a one-to-one correspondence, which suggests a table-like structure, though that may be overkill.
    • (Optionally) make the data loader multithreaded (unclear how to approach this; Keras already handles multithreading on its side)
  • Enhance reproducibility/usability
    • Upload pretrained model
    • Finalize predict.py
      • Test whether it works after latest changes
      • Modify predict.py to load a single image or a list of images from a file; there's no point in loading images from the validation set.
  • Fix bugs
    • Investigate reason for bad results, see #11
    • Fix MSCOCOReduced, also see #9
    • ?????
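The category-index task above can be sketched as a small lookup table: one row per category, indexed by whichever field a lookup starts from. Field names follow the bullet; the values here are made up, not the dataset's actual ids or palette:

```python
from collections import namedtuple

# One row per dataset category; the fields mirror the one-to-one
# correspondence noted above (values are illustrative only).
Category = namedtuple("Category", ["id", "category_id", "palette", "name"])

rows = [
    Category(0, 1, (220, 20, 60), "person"),
    Category(1, 2, (119, 11, 32), "bicycle"),
]

# Index both ways so lookups never need a linear scan.
by_category_id = {row.category_id: row for row in rows}
by_id = {row.id: row for row in rows}

print(by_category_id[2].name)  # bicycle
```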