RAM

My implementation of "Recurrent Models of Visual Attention"

Interpretation of the Gradient Flow Network in RAM

In the figure below, arrowed lines indicate forward flow through the network. Colored ovals mark the sources of loss, and colored lines show how each loss's gradient flows back. Lines marked with a block sign indicate paths along which the gradient does not flow (i.e., the gradient is stopped there).

(Figure: Gradient Flow Network)
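To make the blocked paths concrete, the sketch below shows one way the hybrid loss could be assembled so that gradients behave as in the figure: the classification loss backpropagates through the action network, core RNN, and glimpse network; the location network is updated only by the REINFORCE term; and the baseline is updated only by its regression loss. This is a minimal PyTorch-style sketch, not the code in this repository, and names such as ram_losses, loc_mean, loc_sample, and baseline are illustrative assumptions.

```python
import torch.nn.functional as F
from torch.distributions import Normal

def ram_losses(logits, labels, loc_mean, loc_sample, baseline, sigma=0.1):
    """Sketch of the three RAM loss terms and where gradients are blocked.

    Assumes loc_sample was drawn as Normal(loc_mean, sigma).sample(),
    so the sampling step itself is non-differentiable and no gradient
    flows from the glimpse/classification path back into the location
    network.
    """
    # 1) Classification loss: gradient flows through the action network,
    #    core RNN, and glimpse network, but NOT into loc_mean, because
    #    loc_sample came from .sample() (no gradient path).
    loss_action = F.cross_entropy(logits, labels)

    # 2) REINFORCE term for the location network: reward is 1 for a
    #    correct prediction, 0 otherwise. The advantage is detached,
    #    so only the log-probability carries gradient to loc_mean.
    reward = (logits.argmax(dim=1) == labels).float()
    advantage = (reward - baseline).detach()
    log_prob = Normal(loc_mean, sigma).log_prob(loc_sample).sum(dim=1)
    loss_reinforce = -(log_prob * advantage).mean()

    # 3) Baseline regression: the reward comes from a non-differentiable
    #    argmax, so this term only updates the baseline network.
    loss_baseline = F.mse_loss(baseline, reward)

    return loss_action + loss_reinforce + loss_baseline
```

Under these assumptions, calling .backward() on the returned sum gives each subnetwork gradients only from the loss terms intended for it, matching the blocked lines in the figure.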

References:

[1] https://github.com/zhongwen/RAM
