
Deep learning datasets (DLDS)

The purpose of DLDS is to make fetching and preparing datasets an automatic and painless process.

  • Necessary resources are automatically downloaded and checked for integrity.
  • Datasets are processed into HDF5 files, which can be read using a variety of languages including Lua, Python, and Matlab.
  • Class labels all use 1-based indexing.

Building

  1. Copy config.example.json to config.json and customise it to your liking (a hypothetical sketch of this step follows the list below)
  2. Install Docker
  3. Build the DLDS Docker image with docker build -t dlds $PWD
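As a rough illustration of step 1, the sketch below copies the example config and edits one setting before writing it back. The key name data_dir is an assumption made for illustration only; the real schema is whatever config.example.json defines, so consult that file rather than this sketch.

```python
# Hypothetical sketch of the "copy and customise" step.
# The key name "data_dir" is an assumption for illustration; the real
# schema is whatever config.example.json actually contains.
import json
import shutil

shutil.copy("config.example.json", "config.json")

with open("config.json") as f:
    config = json.load(f)

# Point DLDS at the directory you plan to mount into the container
# (this matches the /data volume used in the Usage example below).
config["data_dir"] = "/data"  # assumed key name

with open("config.json", "w") as f:
    json.dump(config, f, indent=2)
```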

Usage

Example: Installing the MNIST dataset.

docker run --rm -it --volume=/data:/data dlds install mnist

Ensure that you set the volume(s) to match your particular config.json. The command shown here works with the example config file.
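Once the install finishes, the dataset can be read from the generated HDF5 file in any language with HDF5 bindings. The minimal Python sketch below simply opens the file and lists its contents; the output path /data/mnist.h5 is an assumption for illustration, so adjust it to wherever DLDS writes the file under your config.

```python
# Minimal sketch of inspecting a DLDS-generated HDF5 file with h5py.
# The path below is an assumption for illustration; use the location
# that your config.json actually points at.
import h5py

with h5py.File("/data/mnist.h5", "r") as f:  # assumed output path
    # Print the name of every group and dataset in the file.
    f.visit(print)
```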

Supported datasets

Conventions

  • Labels are stored in an n x 1 tensor.
  • Images are stored in an n x channels x height x width tensor (see the sketch after this list).
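The sketch below shows how these conventions look when the tensors are read from Python. The file path and the key names train/images and train/labels are assumptions for illustration; substitute whatever names the generated file actually contains.

```python
# Sketch of consuming the stored tensors under the conventions above.
# "train/images" and "train/labels" are assumed key names for
# illustration; substitute the paths the file actually contains.
import h5py
import numpy as np

with h5py.File("/data/mnist.h5", "r") as f:    # assumed output path
    images = np.asarray(f["train/images"])     # n x channels x height x width
    labels = np.asarray(f["train/labels"])     # n x 1, 1-based class indices

print(images.shape, labels.shape)

# Most Python frameworks expect 0-based class indices, so subtract 1
# and flatten the n x 1 column into a length-n vector.
labels_zero_based = labels.reshape(-1) - 1
```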
