Commit aaab317
Merge branch 'r1.0.0rc1' into main
okuchaiev committed Mar 22, 2021
2 parents ddd7e13 + 532c71b
Showing 2 changed files with 8 additions and 8 deletions.
14 changes: 7 additions & 7 deletions tutorials/NeMo_Getting_Started.ipynb
@@ -195,7 +195,7 @@
"source": [
"## Generate English audio from text\n",
"Speech generation from text typically has two steps:\n",
"* Generate spectrogram from the the text. In this example we will use Tachotron 2 model for this.\n",
"* Generate spectrogram from the text. In this example we will use Tacotron 2 model for this.\n",
"* Generate actual audio from the spectrogram. In this example we will use WaveGlow model for this.\n"
]
},
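The two-step pipeline described above (text to spectrogram, then spectrogram to audio) can be sketched schematically. The toy functions below only mirror the interface of the real models, Tacotron 2 and WaveGlow; the frame and sample values are made up for illustration and have no acoustic meaning.

```python
import math

def text_to_spectrogram(text, n_mels=4):
    """Stage 1 (Tacotron 2's role): map text to spectrogram frames.

    Here each character becomes one dummy frame of n_mels values.
    """
    frames = []
    for ch in text:
        base = ord(ch) / 255.0
        frames.append([base * (b + 1) / n_mels for b in range(n_mels)])
    return frames

def spectrogram_to_audio(frames, samples_per_frame=8):
    """Stage 2 (WaveGlow's role): expand frames into audio samples.

    Each frame yields a short sine burst scaled by the frame's mean energy.
    """
    audio = []
    for frame in frames:
        energy = sum(frame) / len(frame)
        for i in range(samples_per_frame):
            audio.append(energy * math.sin(2 * math.pi * i / samples_per_frame))
    return audio

spec = text_to_spectrogram("hello")
wav = spectrogram_to_audio(spec)
print(len(spec), len(wav))  # 5 frames -> 40 samples
```

The split matters in practice: the spectrogram stage handles linguistic alignment while the vocoder stage handles waveform synthesis, so either model can be swapped independently.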
@@ -239,14 +239,14 @@
"\n",
"**NeMo is built for training.** You can fine-tune, or train from scratch on your data all models used in this example. We recommend you checkout the following, more in-depth, tutorials next:\n",
"\n",
"* [NeMo fundamentals](https://colab.research.google.com/github/NVIDIA/NeMo/blob/$BRANCH/tutorials/00_NeMo_Primer.ipynb)\n",
"* [NeMo models](https://colab.research.google.com/github/NVIDIA/NeMo/blob/$BRANCH/tutorials/01_NeMo_Models.ipynb)\n",
"* [Speech Recognition](https://colab.research.google.com/github/NVIDIA/NeMo/blob/$BRANCH/tutorials/asr/01_ASR_with_NeMo.ipynb)\n",
"* [Punctuation and Capitalization](https://colab.research.google.com/github/NVIDIA/NeMo/blob/$BRANCH/tutorials/nlp/Punctuation_and_Capitalization.ipynb)\n",
"* [Speech Synthesis](https://colab.research.google.com/github/NVIDIA/NeMo/blob/$BRANCH/tutorials/tts/1_TTS_inference.ipynb)\n",
"* [NeMo fundamentals](https://colab.research.google.com/github/NVIDIA/NeMo/blob/r1.0.0rc1/tutorials/00_NeMo_Primer.ipynb)\n",
"* [NeMo models](https://colab.research.google.com/github/NVIDIA/NeMo/blob/r1.0.0rc1/tutorials/01_NeMo_Models.ipynb)\n",
"* [Speech Recognition](https://colab.research.google.com/github/NVIDIA/NeMo/blob/r1.0.0rc1/tutorials/asr/01_ASR_with_NeMo.ipynb)\n",
"* [Punctuation and Capitalization](https://colab.research.google.com/github/NVIDIA/NeMo/blob/r1.0.0rc1/tutorials/nlp/Punctuation_and_Capitalization.ipynb)\n",
"* [Speech Synthesis](https://colab.research.google.com/github/NVIDIA/NeMo/blob/r1.0.0rc1/tutorials/tts/1_TTS_inference.ipynb)\n",
"\n",
"\n",
"You can find scripts for training and fine-tuning ASR, NLP and TTS models [here](https://github.com/NVIDIA/NeMo/tree/$BRANCH/examples). "
"You can find scripts for training and fine-tuning ASR, NLP and TTS models [here](https://github.com/NVIDIA/NeMo/tree/r1.0.0rc1/examples). "
]
}
],
2 changes: 1 addition & 1 deletion tutorials/asr/08_ASR_with_Subword_Tokenization.ipynb
@@ -612,7 +612,7 @@
"\r\n",
"We will use a Citrinet model to demonstrate the usage of subword tokenization models for training and inference. Citrinet is a [QuartzNet-like architecture](https://arxiv.org/abs/1910.10261), but it uses subword-tokenization along with 8x subsampling and [Squeeze-and-Excitation](https://arxiv.org/abs/1709.01507) to achieve strong accuracy in transcriptions while still using non-autoregressive decoding for efficient inference.\r\n",
"\r\n",
"We'll be using the **Neural Modules (NeMo) toolkit** for this part, so if you haven't already, you should download and install NeMo and its dependencies. To do so, just follow the directions on the [GitHub page](https://github.com/NVIDIA/NeMo), or in the [documentation](https://docs.nvidia.com/deeplearning/nemo/user-guide/docs/en/v1.0.0b1/).\r\n",
"We'll be using the **Neural Modules (NeMo) toolkit** for this part, so if you haven't already, you should download and install NeMo and its dependencies. To do so, just follow the directions on the [GitHub page](https://github.com/NVIDIA/NeMo), or in the [documentation](https://docs.nvidia.com/deeplearning/nemo/user-guide/docs/en/stable/index.html/).\r\n",
"\r\n",
"NeMo let us easily hook together the components (modules) of our model, such as the data layer, intermediate layers, and various losses, without worrying too much about implementation details of individual parts or connections between modules. NeMo also comes with complete models which only require your data and hyperparameters for training."
]
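The subword tokenization that Citrinet relies on can be illustrated with a minimal greedy longest-match tokenizer. This is a hand-rolled sketch of the idea only; the notebook builds a real learned tokenizer, and the vocabulary below is invented for the example.

```python
# Invented toy vocabulary of subword pieces, purely for illustration.
VOCAB = {"tran", "scrip", "tion", "s", "t", "r", "i", "o", "n"}

def subword_tokenize(word, vocab=VOCAB):
    """Split a word into the longest matching vocabulary pieces, left to right."""
    pieces, i = [], 0
    while i < len(word):
        # Try the longest remaining substring first, then shrink until a match.
        for j in range(len(word), i, -1):
            if word[i:j] in vocab:
                pieces.append(word[i:j])
                i = j
                break
        else:
            # No piece matched: emit the single character as its own token.
            pieces.append(word[i])
            i += 1
    return pieces

print(subword_tokenize("transcriptions"))  # ['tran', 'scrip', 'tion', 's']
```

Emitting multi-character pieces instead of single characters shortens the output sequence the acoustic model must predict, which is what makes the aggressive 8x subsampling in Citrinet workable.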

0 comments on commit aaab317
