
Recurrent Neural Networks


Introduction

  • Sequence labeling: the process of transcribing a sequence of input data into a sequence of discrete labels
  • Applications
    • Speech recognition
    • Handwriting recognition
    • Protein secondary structure prediction
  • Sequence labeling vs. pattern classification
    • Correlations in the input data and in the output data make context important

Problem types

  • Pattern classification

    *(figure: pattern classification)*

  • Sequence classification

    *(figure: sequence classification)*

  • Segment classification

    *(figure: segment classification)*

    • Frame-wise labels
    • Context
    • Time windows
  • Temporal classification

    *(figure: temporal classification)*

    • Unsegmented labels
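The four problem types differ mainly in the shape of the target data. A toy illustration (all values hypothetical, chosen only to show the formats):

```python
# Toy illustration (hypothetical data) of how the target format differs
# across the four problem types, for a 3-frame input sequence.
x = [[0.1], [0.4], [0.3]]      # input: 3 frames, one feature each

pattern_target = 1             # pattern classification: one label for a single, fixed-size input
sequence_target = 1            # sequence classification: one label for the whole sequence
segment_targets = [0, 1, 1]    # segment (frame-wise) classification: one label per frame
temporal_target = [0, 1]       # temporal classification: unsegmented label sequence,
                               # typically shorter than the input, with no frame alignment
print(len(x), len(segment_targets), len(temporal_target))
```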

Unfolding RNN

  • Unfolding the network along the input sequence turns it into a feedforward network

  • The unfolded network has no recurrent connections: each time step becomes a layer, with the same weights shared across layers

    *(figure: unfolding)*
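The unfolding idea can be made concrete: an RNN run for T steps computes exactly what a T-layer feedforward network with tied weights computes. A NumPy sketch (shapes and the tanh activation are arbitrary choices):

```python
import numpy as np

# Unfolding sketch: an RNN over T time steps behaves like a T-layer
# feedforward network in which every "layer" shares the same weights.
rng = np.random.default_rng(0)
T, I, H = 3, 2, 4
x = rng.normal(size=(T, I))
W_in = rng.normal(size=(I, H))     # input -> hidden weights
W_rec = rng.normal(size=(H, H))    # hidden -> hidden (recurrent) weights

# Recurrent view: one layer applied repeatedly to its own output.
h = np.zeros(H)
for t in range(T):
    h = np.tanh(x[t] @ W_in + h @ W_rec)

# Unfolded view: T distinct layers, all holding copies of the same weights.
layers = [(W_in.copy(), W_rec.copy()) for _ in range(T)]  # weight sharing
h_unfolded = np.zeros(H)
for t, (Wi, Wr) in enumerate(layers):
    h_unfolded = np.tanh(x[t] @ Wi + h_unfolded @ Wr)

assert np.allclose(h, h_unfolded)  # the two views compute the same thing
```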


Recurrent Neural Network (RNN)

  • An MLP maps an input vector to an output vector
  • Recurrent connections give the network a ‘memory’ of previous inputs
  • An RNN therefore maps the entire history of previous inputs to each output vector

    *(figure: rnn)*

Forward pass

  • Almost the same as for an MLP, except that the hidden layer also receives its own activations from the previous time step

    • input to hidden unit h at time t (I input units, H hidden units):

      $a_h^t = \sum_{i=1}^{I} w_{ih}\, x_i^t + \sum_{h'=1}^{H} w_{h'h}\, b_{h'}^{t-1}$

    • output of hidden unit h at time t, for activation function $\theta_h$:

      $b_h^t = \theta_h(a_h^t)$

    *(figure: forward pass)*
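The forward pass can be sketched in NumPy. The function name, shapes, and the tanh activation here are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def rnn_forward(x, W_in, W_rec, theta=np.tanh):
    """Forward pass of a simple RNN (sketch).
    x: (T, I) input sequence; W_in: (I, H); W_rec: (H, H)."""
    T, I = x.shape
    H = W_in.shape[1]
    b = np.zeros((T + 1, H))   # b[t] holds the hidden activations b^{t-1}; b[0] is the initial state
    a = np.zeros((T, H))
    for t in range(T):
        # a_h^t = sum_i w_ih x_i^t + sum_h' w_h'h b_h'^{t-1}
        a[t] = x[t] @ W_in + b[t] @ W_rec
        # b_h^t = theta(a_h^t)
        b[t + 1] = theta(a[t])
    return a, b[1:]
```

Note the one-step offset in `b`: at step `t`, `b[t]` is the previous activation, so the initial state is simply `b[0] = 0`.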


Backward pass

  • Backpropagation through time (BPTT): unfold the network along the sequence and apply standard backpropagation, summing, at each step, the error arriving from the output layer at time t and from the hidden layer at time t+1

    • error signal of hidden unit h at time t (K output units):

      $\delta_h^t = \theta'(a_h^t) \left( \sum_{k=1}^{K} \delta_k^t w_{hk} + \sum_{h'=1}^{H} \delta_{h'}^{t+1} w_{hh'} \right)$

    • the same weights are reused at every time step, so the gradient sums over the sequence:

      $\frac{\partial L}{\partial w_{ij}} = \sum_{t=1}^{T} \delta_j^t b_i^t$

    *(figure: backward pass)*
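A minimal BPTT sketch for a tanh RNN with a linear output layer and squared-error loss. The function name, the loss choice, and all shapes are assumptions made for the example, not from the slides:

```python
import numpy as np

def bptt(x, y, W_in, W_rec, W_out):
    """BPTT sketch for a tanh RNN with linear outputs and
    loss L = 0.5 * sum((out - y)**2). Returns weight gradients."""
    T, I = x.shape
    H = W_in.shape[1]
    # forward pass (b[t] holds b^{t-1}; b[0] is the zero initial state)
    b = np.zeros((T + 1, H))
    a = np.zeros((T, H))
    for t in range(T):
        a[t] = x[t] @ W_in + b[t] @ W_rec
        b[t + 1] = np.tanh(a[t])
    out = b[1:] @ W_out
    # backward pass:
    # delta_h^t = theta'(a_h^t) * (sum_k delta_k^t w_hk + sum_h' delta_h'^{t+1} w_hh')
    delta_out = out - y                 # output error for the squared loss
    delta = np.zeros((T + 1, H))        # delta[T] = 0: no future term at the last step
    for t in reversed(range(T)):
        delta[t] = (1 - np.tanh(a[t]) ** 2) * (
            delta_out[t] @ W_out.T + delta[t + 1] @ W_rec.T
        )
    # gradients: dL/dw_ij = sum_t delta_j^t b_i^t (weights shared across time)
    gW_in = x.T @ delta[:T]
    gW_rec = b[:T].T @ delta[:T]
    gW_out = b[1:].T @ delta_out
    return gW_in, gW_rec, gW_out
```

A quick sanity check is to compare one entry of `gW_rec` against a central finite difference of the loss, which is a standard way to verify a hand-written backward pass.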