
Recurrent Neural Networks for Language Modeling


Lab

  • Hidden State Activation

    Take another look at the hidden state activation function. It can be written in two different ways. I'll show you, step by step, how to implement each and then how to verify that the results produced by each are the same.
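The two forms are a sum of separate weight-matrix products versus a single product over concatenated weights and inputs. A minimal NumPy sketch of the equivalence (the shapes and random values are illustrative, not taken from the lab):

```python
import numpy as np

# Illustrative sizes: hidden state of size 3, input of size 2.
rng = np.random.default_rng(0)
w_hh = rng.standard_normal((3, 3))      # hidden-to-hidden weights
w_hx = rng.standard_normal((3, 2))      # input-to-hidden weights
b_h = rng.standard_normal((3, 1))       # bias
h_prev = rng.standard_normal((3, 1))    # previous hidden state h_{t-1}
x_t = rng.standard_normal((2, 1))       # current input x_t

# Form 1: separate terms, h_t = tanh(W_hh h_{t-1} + W_hx x_t + b_h)
h_t_a = np.tanh(w_hh @ h_prev + w_hx @ x_t + b_h)

# Form 2: concatenated, h_t = tanh([W_hh | W_hx][h_{t-1}; x_t] + b_h)
w_h = np.hstack((w_hh, w_hx))           # stack weights side by side
hx = np.vstack((h_prev, x_t))           # stack vectors on top of each other
h_t_b = np.tanh(w_h @ hx + b_h)

print(np.allclose(h_t_a, h_t_b))  # → True
```

Both forms compute the same activation; the concatenated form simply groups the two matrix products into one.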

  • Vanilla RNNs, GRUs and the scan function

    Learn how to define the forward method for vanilla RNNs and GRUs. Additionally, you will see how to define and use the `scan` function to compute forward propagation for RNNs.

  • Working with JAX NumPy and Calculating Perplexity

    Learn about data generators, which will be part of every assignment in C3.
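On the perplexity side of this lab: perplexity is the exponential of the average negative log-probability the model assigns to each token. A toy calculation in plain NumPy (the per-token probabilities here are made up for illustration):

```python
import numpy as np

# Hypothetical probabilities a model assigned to 4 consecutive tokens.
token_probs = np.array([0.2, 0.5, 0.1, 0.4])
log_probs = np.log(token_probs)

# Perplexity = exp(-(1/N) * sum_i log p_i)
perplexity = np.exp(-np.mean(log_probs))
print(perplexity)  # ≈ 3.98 (inverse of the geometric mean of the probabilities)
```

A lower perplexity means the model assigns higher probability to the observed tokens; a uniform model over a vocabulary of size V has perplexity V.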

  • Creating a GRU model using Trax

    Trax lets you define neural network architectures by stacking layers (similar to other libraries such as Keras). In this ungraded lab you will see how a GRU model can be created using Trax.

Assignment

  • Deep N-grams