
better weights management for memory layers #84

Open
justheuristic opened this issue Nov 7, 2016 · 0 comments

@justheuristic (Collaborator)

Right now, recurrent memory cells have clumsy weight management.

Some ideas on how to improve it:

  • reimplement all of rnn.py as whole layers like GRUMemoryLayer (or otherwise deprecate it)
    • extra work for further development
    • may introduce bugs
  • add some MacroLayer that wraps any lasagne network (first sketch below)
    • simple to implement in current lasagne (see f0k's comment here)
    • yet another abstraction
  • add some dict notation for weight initialization to make it more human-readable (second sketch below)
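
A minimal sketch of the MacroLayer idea, assuming plain lasagne; the class name, constructor arguments, and the parameter re-registration trick are my assumptions, not an existing API:

```python
import lasagne
from lasagne.layers import MergeLayer, get_output, get_all_params

class MacroLayer(MergeLayer):
    """Wraps an arbitrary lasagne sub-network as a single layer.

    incomings    -- outer layers feeding this macro layer
    inner_inputs -- InputLayers of the wrapped sub-network, one per incoming
    inner_output -- the output layer of the wrapped sub-network
    """
    def __init__(self, incomings, inner_inputs, inner_output, **kwargs):
        super(MacroLayer, self).__init__(incomings, **kwargs)
        self.inner_inputs = inner_inputs
        self.inner_output = inner_output
        # re-register the sub-network's parameters on this layer so that
        # get_all_params on the outer network also collects them
        for param in get_all_params(inner_output):
            self.add_param(param, param.get_value().shape, name=param.name)

    def get_output_shape_for(self, input_shapes):
        # assumes the wrapped sub-network has a static output shape
        return self.inner_output.output_shape

    def get_output_for(self, inputs, **kwargs):
        # substitute the actual input expressions for the sub-network's
        # InputLayers and build the wrapped graph in place
        return get_output(self.inner_output,
                          dict(zip(self.inner_inputs, inputs)),
                          **kwargs)
```

This way a one-step memory cell could be built as an ordinary lasagne graph and dropped into the recurrence as a single layer, with all its weights owned by that layer.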
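And a rough sketch of the dict notation; the parameter names and the resolve_weights helper are hypothetical, just to show the intended reading:

```python
import lasagne.init

def resolve_weights(weights, defaults):
    """Merge a user-supplied {param_name: initializer} dict with the
    layer's defaults, complaining loudly about unknown names."""
    unknown = set(weights) - set(defaults)
    if unknown:
        raise ValueError("unknown parameter names: %s" % sorted(unknown))
    resolved = dict(defaults)
    resolved.update(weights)
    return resolved

# hypothetical defaults for one GRU gate
defaults = {
    'W_in_to_updategate':  lasagne.init.GlorotUniform(),
    'W_hid_to_updategate': lasagne.init.GlorotUniform(),
    'b_updategate':        lasagne.init.Constant(0.),
}

# the user overrides only what they care about, by name
spec = resolve_weights({'W_hid_to_updategate': lasagne.init.Orthogonal()},
                       defaults)
```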

In case someone actually reads this, please share your ideas. I'd be astonished to learn that you exist.
