# ECCV18_Lifelong_Learning

This repository is for the paper "Lifelong Learning via Progressive Distillation and Retrospection".

## Instructions

  1. Install the dependencies for Caffe according to the official instructions, then modify `./lll-caffe/Makefile.config`.
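The exact `Makefile.config` edits depend on your machine; a minimal sketch for a Linux CUDA build follows. The example file name and the two settings shown are standard Caffe conventions, not requirements stated by this repository, so adjust them to your setup.

```shell
# Hedged sketch: common Makefile.config edits before building Caffe.
# Caffe ships Makefile.config.example; fall back to a tiny stand-in so the
# commands below are runnable even outside the repo checkout.
cfg=Makefile.config
cp Makefile.config.example "$cfg" 2>/dev/null || \
  printf '# USE_CUDNN := 1\nCUDA_DIR := /usr/local/cuda-8.0\n' > "$cfg"

# Enable cuDNN (uncomment the flag) and point CUDA_DIR at the local install:
sed -i 's/^# USE_CUDNN := 1/USE_CUDNN := 1/' "$cfg"
sed -i 's|^CUDA_DIR :=.*|CUDA_DIR := /usr/local/cuda|' "$cfg"

grep -E 'USE_CUDNN|CUDA_DIR' "$cfg"
```

For a CPU-only machine, uncommenting `CPU_ONLY := 1` instead is the usual alternative.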
  2. Compile `./lll-caffe` and make a soft link to `./lll-data`:

```shell
cd lll-caffe
make -j 12
ln -s ../lll-data data
```
  3. Download the pretrained models from the following link and put them in `./lll-models`.

Link1 (Password: wcdfwu)

  4. Generate the LMDB for each dataset in `./lll-data` according to the provided image list:

```shell
# Take Flowers as an example
./lll-data/data/flowers_train_lmdb                 # train set
./lll-data/data/flowers_test_lmdb                  # test set
./lll-data/data/flowers_seed1_uniform5_train_lmdb  # subset of the train set for Retrospection
```
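Caffe's stock `convert_imageset` tool is the usual way to build such LMDBs from an image list. A sketch for the Flowers train set follows; the image-root directory and list-file names are assumptions (only the output DB path above comes from this repository), so substitute the provided image list's actual location.

```shell
# Hedged sketch: building the Flowers train LMDB with Caffe's convert_imageset.
TOOL=./lll-caffe/build/tools/convert_imageset
IMG_ROOT=./lll-data/flowers/images/         # directory holding the JPEGs (assumed)
TRAIN_LIST=./lll-data/flowers/train.txt     # lines of "relative/path.jpg label" (assumed)
OUT_DB=./lll-data/data/flowers_train_lmdb   # output path listed above

CMD="$TOOL --resize_height=256 --resize_width=256 --shuffle --backend=lmdb $IMG_ROOT $TRAIN_LIST $OUT_DB"
echo "$CMD"
# Run only once Caffe has been compiled (step 2):
if [ -x "$TOOL" ]; then $CMD; fi
```

Repeat with the corresponding list files for the test set and the Retrospection subset.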
  5. ImageNet2Flowers is taken as an example to illustrate the usage of the code. Please refer to `imagenet2flowers.sh` for the details, including training and evaluation.
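For orientation before opening the script, the shape of a Caffe fine-tune-then-evaluate run is sketched below. The solver, model, and weight filenames here are placeholders, not the names actually used by `imagenet2flowers.sh`; the `caffe train`/`caffe test` flags are the standard Caffe CLI.

```shell
# Hedged sketch of the train/evaluate pattern that imagenet2flowers.sh wraps.
CAFFE=./lll-caffe/build/tools/caffe
SOLVER=lll-models/flowers_solver.prototxt        # placeholder solver name
INIT=lll-models/imagenet_pretrained.caffemodel   # placeholder pretrained weights

# Fine-tune on the new task (Flowers), initialized from the old-task model:
TRAIN_CMD="$CAFFE train -solver $SOLVER -weights $INIT -gpu 0"
# Evaluate a resulting snapshot on the test LMDB:
TEST_CMD="$CAFFE test -model lll-models/flowers_test.prototxt -weights snapshots/flowers_iter_10000.caffemodel -iterations 100 -gpu 0"

echo "$TRAIN_CMD"
echo "$TEST_CMD"
# Execute only once Caffe is compiled and the real filenames are substituted:
if [ -x "$CAFFE" ]; then $TRAIN_CMD && $TEST_CMD; fi
```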