Commit

Merge 8e52b22 into 7964266
rened committed Jan 13, 2015
2 parents 7964266 + 8e52b22 commit f473322
Showing 1 changed file with 1 addition and 1 deletion.
README.md: 2 changes (1 addition & 1 deletion)
@@ -17,7 +17,7 @@ Mocha is a Deep Learning framework for [Julia](http://julialang.org/), inspired
- The *pure Julia backend* is portable -- it runs on any platform that supports Julia. This is reasonably fast on small models thanks to Julia's LLVM-based just-in-time (JIT) compiler and [Performance Annotations](http://julia.readthedocs.org/en/latest/manual/performance-tips/#performance-annotations), and can be very useful for prototyping.
- The *native extension backend* can be turned on when a C++ compiler is available. It runs 2~3 times faster than the pure Julia backend.
- The *GPU backend* uses NVidia® [cuDNN](https://developer.nvidia.com/cuDNN), cuBLAS and customized CUDA kernels to provide highly efficient computation. A speedup of 20~30 times or more can be observed on a modern GPU device, especially on larger models.
-- **Compatibility**: Mocha uses the widely adopted HDF5 format to store both datasets and model snapshots, making it easy to inter-operate with Matlab, Python (numpy) and other existing computational tools. Mocha also provides tools to import trained model snapshot from Caffe.
+- **Compatibility**: Mocha uses the widely adopted HDF5 format to store both datasets and model snapshots, making it easy to inter-operate with Matlab, Python (numpy) and other existing computational tools. Mocha also provides tools to import trained model snapshots from Caffe.
- **Correctness**: the computational components of Mocha in all backends are extensively covered by unit tests.
- **Open Source**: Mocha is licensed under [the MIT "Expat" License](LICENSE.md).
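
The Compatibility bullet above notes that Mocha stores both datasets and model snapshots in HDF5. As a rough illustration of what that interoperability could look like in practice (not something this commit adds), the sketch below writes a small training set with Julia's HDF5.jl package; the dataset names `data` and `label` and the width x height x channels x count layout are assumptions chosen for the example, not requirements stated in this README excerpt.

```julia
# Minimal sketch: write an HDF5 file that an HDF5-based data layer could read.
# The dataset names and array layout here are illustrative assumptions.
using HDF5

X = rand(Float32, 28, 28, 1, 100)             # 100 random 28x28 single-channel "images" (assumed layout)
y = reshape(Float32.(rand(0:9, 100)), 1, 100) # integer class labels 0..9 as a 1 x N array (assumed layout)

h5open("train.hdf5", "w") do file
    write(file, "data", X)    # feature blob; the name "data" is a placeholder
    write(file, "label", y)   # label blob; the name "label" is a placeholder
end
```

Because the file is plain HDF5, the same arrays can be read back from Python (h5py/numpy) or Matlab, which is the interoperability point the bullet is making; how Mocha's own data layers consume such files is described in Mocha's documentation rather than in this snippet.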
