  • 5.1. LSTM to the Rescue
  • 5.2. Understanding the LSTM cell
  • 5.3. Forward propagation in LSTM
  • 5.4. Backpropagation in LSTM
  • 5.5. Deriving backpropagation of LSTM Step by step
    • 5.5.1. Gradients with respect to Gates
    • 5.5.2. Gradients with respect to Weights
      • 5.5.2.1. Gradients with respect to V
      • 5.5.2.2. Gradients with respect to W
      • 5.5.2.3. Gradients with respect to U
  • 5.6. Predicting Bitcoin's price using LSTM
  • 5.7. Gated Recurrent Units
  • 5.8. Understanding GRU cell
    • 5.8.1. Update Gate
    • 5.8.2. Reset Gate
    • 5.8.3. Updating the hidden state
  • 5.9. Forward propagation in GRU cell
  • 5.10. Deriving backpropagation in GRU cell
    • 5.10.1. Gradients with respect to Gates
    • 5.10.2. Gradients with respect to Weights
      • 5.10.2.1. Gradients with respect to V
      • 5.10.2.2. Gradients with respect to W
      • 5.10.2.3. Gradients with respect to U
  • 5.11. Implementing GRU cell in Tensorflow
  • 5.12. Bidirectional RNN
  • 5.13. Going Deep with Deep RNN
  • 5.14. Language Translation with Seq2seq models
    • 5.14.1. Encoder
    • 5.14.2. Decoder
    • 5.14.3. Attention Mechanism