Docker images for using torch-rnn (https://github.com/jcjohnson/torch-rnn).

Images:

- crisbal/torch-rnn:base
  - Based on ubuntu:14.04
  - Allows usage of torch-rnn in CPU mode
- crisbal/torch-rnn:cuda6.5
  - Based on nvidia/cuda:6.5
  - Allows usage of torch-rnn in GPU mode (CUDA 6.5 support)
  - Must be run with nvidia-docker (https://github.com/NVIDIA/nvidia-docker)
- crisbal/torch-rnn:cuda7.5
  - Based on nvidia/cuda:7.5
  - Allows usage of torch-rnn in GPU mode (CUDA 7.5 support)
  - Must be run with nvidia-docker (https://github.com/NVIDIA/nvidia-docker)

More details on torch-rnn itself: https://github.com/jcjohnson/torch-rnn#usage
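Docker pulls an image automatically on the first `docker run`, but you can also fetch the tags up front with the standard `docker pull` command:

    # CPU-only image
    docker pull crisbal/torch-rnn:base

    # GPU images (pick the one matching your CUDA version)
    docker pull crisbal/torch-rnn:cuda6.5
    docker pull crisbal/torch-rnn:cuda7.5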
How to use (CPU mode, base image):

- Start bash in the container:

      docker run --rm -ti crisbal/torch-rnn:base bash
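The image only bundles the tiny-shakespeare sample. To work on your own text, you can mount a host directory into the container with Docker's -v flag; a minimal sketch, assuming torch-rnn lives under /root/torch-rnn inside the image (check with pwd inside the container) and that you have a local my-data/ directory:

    docker run --rm -ti \
      -v "$(pwd)/my-data:/root/torch-rnn/data/my-data" \
      crisbal/torch-rnn:base bash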
- Preprocess the sample data:

      python scripts/preprocess.py \
        --input_txt data/tiny-shakespeare.txt \
        --output_h5 data/tiny-shakespeare.h5 \
        --output_json data/tiny-shakespeare.json
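The same script works on any plain-text file; a sketch with a hypothetical my_corpus.txt (for example one you mounted in the previous step):

    python scripts/preprocess.py \
      --input_txt data/my-data/my_corpus.txt \
      --output_h5 data/my-data/my_corpus.h5 \
      --output_json data/my-data/my_corpus.json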
- Train:

      th train.lua \
        -input_h5 data/tiny-shakespeare.h5 \
        -input_json data/tiny-shakespeare.json \
        -gpu -1
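train.lua also accepts the hyperparameter flags documented in the torch-rnn README; a sketch of a slightly larger CPU run, assuming the upstream -rnn_size, -num_layers and -checkpoint_every flags:

    th train.lua \
      -input_h5 data/tiny-shakespeare.h5 \
      -input_json data/tiny-shakespeare.json \
      -rnn_size 256 \
      -num_layers 2 \
      -checkpoint_every 1000 \
      -gpu -1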
- Sample:

      th sample.lua -checkpoint cv/checkpoint_10000.t7 -length 2000 -gpu -1
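sample.lua exposes further options such as -temperature and -start_text (again, see the torch-rnn README); a sketch:

    th sample.lua \
      -checkpoint cv/checkpoint_10000.t7 \
      -length 2000 \
      -temperature 0.7 \
      -start_text "ROMEO:" \
      -gpu -1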
How to use (GPU mode, cuda images):

- Install nvidia-docker (https://github.com/NVIDIA/nvidia-docker)
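Before moving on, it is worth checking that nvidia-docker can see your GPU; a quick sanity check, assuming nvidia-smi is exposed inside the container as described in the nvidia-docker documentation:

    nvidia-docker run --rm crisbal/torch-rnn:cuda7.5 nvidia-smi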
- Start bash in the container:

      nvidia-docker run --rm -ti crisbal/torch-rnn:cuda7.5 bash
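On a multi-GPU host you can restrict which devices are exposed via the NV_GPU environment variable, per the nvidia-docker documentation; a sketch:

    NV_GPU=0 nvidia-docker run --rm -ti crisbal/torch-rnn:cuda7.5 bash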
- Preprocess the sample data:

      python scripts/preprocess.py \
        --input_txt data/tiny-shakespeare.txt \
        --output_h5 data/tiny-shakespeare.h5 \
        --output_json data/tiny-shakespeare.json
- Train (runs on the GPU by default, since -gpu -1 is omitted):

      th train.lua \
        -input_h5 data/tiny-shakespeare.h5 \
        -input_json data/tiny-shakespeare.json
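A long GPU run can be resumed from a saved checkpoint; a sketch using torch-rnn's -init_from flag (see its README), with the checkpoint path from the sampling step below:

    th train.lua \
      -input_h5 data/tiny-shakespeare.h5 \
      -input_json data/tiny-shakespeare.json \
      -init_from cv/checkpoint_10000.t7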
- Sample:

      th sample.lua -checkpoint cv/checkpoint_10000.t7 -length 2000
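Because the containers above are started with --rm, checkpoints written to cv/ disappear when the shell exits. One way to keep them is to mount a host directory over the checkpoint directory; a sketch, assuming torch-rnn lives under /root/torch-rnn inside the image:

    mkdir -p cv
    nvidia-docker run --rm -ti \
      -v "$(pwd)/cv:/root/torch-rnn/cv" \
      crisbal/torch-rnn:cuda7.5 bash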