

Project Status: Active – The project has reached a stable, usable state and is being actively developed.

NVIDIA Neural Modules: NeMo

NeMo (Neural Modules) is a toolkit for creating AI applications built around neural modules, conceptual blocks of neural networks that take typed inputs and produce typed outputs. Such modules typically represent data layers, encoders, decoders, language models, loss functions, or methods of combining activations.

NeMo makes it easy to combine and re-use these building blocks while providing a level of semantic correctness checking via its neural type system. As long as two modules have compatible inputs and outputs, it is legal to chain them together. An application built with NeMo is a Directed Acyclic Graph (DAG) of connected modules.

NeMo's API is designed to be framework-agnostic, but currently only PyTorch is supported.

NeMo follows a lazy execution model: no computation is done until an action (such as optimizer.optimize(...)) is called.
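
As a concrete illustration, the sketch below builds and trains the kind of toy model found in examples/start_here: modules are instantiated, chained by calling them on each other's outputs (which only describes the DAG), and nothing is computed until the train action runs. The module names (RealFunctionDataLayer, TaylorNet, MSELoss) and the exact train(...) signature are drawn from that toy example and should be treated as illustrative; consult examples/start_here for the authoritative version.

import nemo

# Factory that manages backend, placement and training (arguments omitted in this sketch)
nf = nemo.core.NeuralModuleFactory()

# Instantiate neural modules: a data layer, a trainable model and a loss
dl = nemo.tutorials.RealFunctionDataLayer(n=10000, batch_size=128)
fx = nemo.tutorials.TaylorNet(dim=4)
loss = nemo.tutorials.MSELoss()

# Chaining modules only wires up the DAG; no computation happens here
x, y = dl()
p = fx(x=x)
lss = loss(predictions=p, target=y)

# The "train" action triggers actual execution of the graph
nf.train([lss], optimizer="sgd", optimization_params={"num_epochs": 3, "lr": 0.0003})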

The toolkit comes with extendable collections of pre-built modules for automatic speech recognition (ASR) and natural language processing (NLP). Furthermore, NeMo provides built-in support for distributed training and mixed precision on the latest NVIDIA GPUs.
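
Mixed precision and multi-GPU training are configured at the factory level rather than inside individual modules. The snippet below is illustrative only: the local_rank and optimization_level arguments and the Optimization enum are assumptions about this release's API, so consult the NeMo documentation for the exact names.

import nemo

nf = nemo.core.NeuralModuleFactory(
    local_rank=None,                                   # set per process by torch.distributed.launch on multi-GPU runs
    optimization_level=nemo.core.Optimization.mxprO1,  # assumed enum selecting APEX AMP "O1" mixed precision
)

Multi-GPU jobs are then typically launched with PyTorch's standard python -m torch.distributed.launch --nproc_per_node=<num_gpus> <your_script.py> helper.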

NeMo consists of:

  • NeMo Core: fundamental building blocks for all neural models and the neural type system.
  • NeMo collections: pre-built neural modules for particular domains such as automatic speech recognition (nemo_asr) and natural language processing (nemo_nlp).


See this video for a quick walk-through.


Requirements

  1. Python 3.6 or 3.7
  2. PyTorch 1.2 or 1.3 with GPU support
  3. NVIDIA APEX. Install it from https://github.com/NVIDIA/apex


NeMo documentation

See examples/start_here to get started with the simplest example. The examples folder contains several more examples to get you started with various NLP and ASR tasks.

Getting started

You can use the NVIDIA NGC PyTorch container, which already includes all of the requirements above.

  • Pull the docker: docker pull
  • Run: nvidia-docker run -it --rm -v <nemo_github_folder>:/NeMo --shm-size=1g -p 8888:8888 -p 6006:6006 --ulimit memlock=-1 --ulimit stack=67108864
  • (If your docker version is >=19.03) Run: docker run --runtime=nvidia -it --rm -v <nemo_github_folder>:/NeMo --shm-size=1g -p 8888:8888 -p 6006:6006 --ulimit memlock=-1 --ulimit stack=67108864
  • cd /NeMo

and then continue with the following steps.

If you have all requirements installed (or are using the NGC PyTorch container), then you can simply use pip to install the latest released version (currently 0.8.2) of NeMo and its collections:

pip install nemo-toolkit  # installs NeMo Core
pip install nemo-asr # installs NeMo ASR collection
pip install nemo-nlp # installs NeMo NLP collection
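
A quick way to verify the installation is to import the packages from a Python shell; only the collections you actually installed will be importable.

import nemo      # NeMo Core (installed as nemo-toolkit)
import nemo_asr  # ASR collection (installed as nemo-asr)
import nemo_nlp  # NLP collection (installed as nemo-nlp)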


Installing from GitHub

If you prefer to use NeMo's latest development version (from GitHub), follow the steps below:

Note: For steps 2 and 3, if you want to use NeMo in development mode, use pip install -e . instead of pip install .

  1. Clone the repository: git clone https://github.com/NVIDIA/NeMo.git
  2. Go to the NeMo folder and install the toolkit:
cd NeMo/nemo
pip install .
  3. Install the collection(s) you want:
# Install the ASR collection from collections/nemo_asr
apt-get install libsndfile1
cd NeMo/collections/nemo_asr
pip install .

# Install the NLP collection from collections/nemo_nlp
cd NeMo/collections/nemo_nlp
pip install .


This command runs the unit tests:

python -m unittest tests/*.py


If you are using NeMo, please cite the following publication:

@misc{kuchaiev2019nemo,
    title={NeMo: a toolkit for building AI applications using Neural Modules},
    author={Oleksii Kuchaiev and Jason Li and Huyen Nguyen and Oleksii Hrinchuk and Ryan Leary and Boris Ginsburg and Samuel Kriman and Stanislav Beliaev and Vitaly Lavrukhin and Jack Cook and Patrice Castonguay and Mariya Popova and Jocelyn Huang and Jonathan M. Cohen},
    year={2019},
    eprint={1909.09577},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

