MXNet for Deep Learning


MXNet is a deep learning framework designed for both efficiency and flexibility. It allows you to mix the flavours of deep learning programming, symbolic and imperative, to maximize efficiency and productivity.
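As a minimal sketch of what that mixing looks like, assuming the classic mxnet Python package (layer sizes, names, and random initialization below are arbitrary choices for illustration): a small network is declared symbolically, then its output is handled imperatively as an NDArray.

    import mxnet as mx

    # Declarative flavour: describe the computation as a symbolic graph.
    data = mx.sym.Variable('data')
    fc   = mx.sym.FullyConnected(data=data, num_hidden=10, name='fc')
    net  = mx.sym.SoftmaxOutput(data=fc, name='softmax')

    # Bind the graph to concrete shapes and a device (CPU here for
    # portability), then fill the arguments with small random values.
    exe = net.simple_bind(ctx=mx.cpu(), data=(4, 20))
    for name, arr in exe.arg_dict.items():
        arr[:] = mx.random.uniform(-0.1, 0.1, arr.shape)
    exe.forward(is_train=False)

    # Imperative flavour: the symbolic output is an ordinary NDArray,
    # so it can be post-processed with numpy-style operations.
    probs = exe.outputs[0]
    top1  = mx.nd.argmax(probs, axis=1)
    print(top1.asnumpy())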

What's New

Contents

Features

  • To Mix and Maximize
    • Mix all flavours of programming models to maximize flexibility and efficiency.
  • Lightweight, scalable and memory efficient
    • Minimal build dependencies; scales to multiple GPUs with very low memory usage.
  • Auto parallelization
    • Write numpy-style NDArray GPU programs, which are automatically parallelized (see the sketch after this list).
  • Language agnostic
    • Support for Python and C++, with more to come.
  • Cloud friendly
    • Directly load and save data from S3, HDFS, and Azure.
  • Easy extensibility
    • Extending MXNet requires no GPU programming.
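The NDArray and cloud-storage features above can be sketched as follows. This assumes the classic mxnet Python package, a CUDA build with at least one GPU (mx.gpu(0)), and, for the s3:// path, a build with S3 support enabled; the bucket and file names are placeholders.

    import mxnet as mx

    # numpy-style GPU arrays: operations are queued with MXNet's dependency
    # engine and run asynchronously, so independent ones may run in parallel.
    a = mx.nd.ones((1000, 1000), ctx=mx.gpu(0))
    b = mx.nd.ones((1000, 1000), ctx=mx.gpu(0))
    c = a * 2                 # independent of d
    d = b + 1                 # independent of c
    e = mx.nd.dot(c, d)       # depends on both c and d

    # asnumpy() blocks until e is computed, then copies it to host memory.
    print(e.asnumpy().sum())

    # Cloud friendly: with an S3-enabled build, NDArrays can be saved to and
    # loaded from object storage directly.
    mx.nd.save('s3://my-bucket/e.nd', {'e': e})
    restored = mx.nd.load('s3://my-bucket/e.nd')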

Bug Reporting

Contributing to MXNet

MXNet has been developed and used by a group of active community members. Everyone is more than welcome to contribute; contributing makes the project better and more accessible to more users.

  • Please add your name to CONTRIBUTORS.md after your patch has been merged.

License

© Contributors, 2015. Licensed under the Apache-2.0 license.

History

MXNet was initiated and designed in collaboration by authors from cxxnet, minerva, and purine2. The project reflects what we have learned from those projects: it combines their important flavours, aiming to be efficient, flexible, and memory efficient.
