
WIP: BaseRecurrent and recurrent_apply_method #10

Merged 10 commits on Oct 22, 2014
Conversation

@rizar (Contributor) commented Oct 22, 2014

The scan-related code is typical boilerplate of which we had
so much in Groundhog. This commit introduces a way to get rid
of it: a decorator that adds iteration code to a single-step
transition function.

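The decorator idea can be sketched in plain Python. This is a hypothetical stand-in, not the PR's implementation: the actual code builds the iteration with theano.scan, and the names `recurrent` and `plain_rnn_step` below are illustrative only.

```python
def recurrent(inputs=(), states=(), contexts=()):
    """Wrap a one-step transition function with an iteration loop.

    Hypothetical sketch: the wrapped function receives whole
    sequences for `inputs`, initial values for `states`, and fixed
    `contexts`, and iterates the one-step transition over time.
    """
    def decorator(step):
        def apply(**kwargs):
            sequences = {name: kwargs[name] for name in inputs}
            state = {name: kwargs[name] for name in states}
            fixed = {name: kwargs[name] for name in contexts}
            length = len(next(iter(sequences.values())))
            outputs = []
            for t in range(length):
                step_inputs = {name: seq[t]
                               for name, seq in sequences.items()}
                result = step(**step_inputs, **state, **fixed)
                # The one-step function returns the new state(s).
                if not isinstance(result, tuple):
                    result = (result,)
                state = dict(zip(states, result))
                outputs.append(state[states[0]])
            return outputs
        return apply
    return decorator


@recurrent(inputs=('x',), states=('h',))
def plain_rnn_step(x, h):
    # Toy additive "RNN": the new hidden state is the old state plus
    # the current input.
    return h + x
```

Calling `plain_rnn_step(x=[1, 2, 3], h=0)` then iterates the step over the whole sequence, which is exactly the boilerplate the decorator is meant to absorb.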
@bartvm (Member) commented Oct 22, 2014

I like the idea!

inputs : list of strs
    Names of transition function arguments that
    play the input role.
states : list of str or (str, function) tuples
@bartvm (Member) commented on the diff:
For carrying over the hidden state I guess it's easiest if we make the initial state a shared variable which can be passed to the updates argument of function. Would we do this by passing a particular state initialization function then?
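A rough pure-Python sketch of this idea (hypothetical stand-ins for the Theano machinery; `Shared` and `make_function` here are illustrative, not the library's API): a shared state container is written back on every call, so the hidden state carries over between calls, much like passing an update pair to the `updates` argument of `theano.function`.

```python
class Shared:
    """Hypothetical stand-in for a Theano shared variable."""
    def __init__(self, value):
        self.value = value


def make_function(step, state):
    # Loosely mimics theano.function(..., updates=[(state, new_state)]):
    # each call reads the shared state and writes the new state back,
    # so the hidden state is carried over between calls.
    def f(x):
        output, new_state = step(x, state.value)
        state.value = new_state
        return output
    return f


# Toy additive "RNN" step: output and new state are both h + x.
h = Shared(0)
rnn = make_function(lambda x, h: (h + x, h + x), h)
```

Successive calls such as `rnn(1)` then `rnn(2)` see the updated state, which is the carrying-over behaviour being discussed.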

@rizar (Contributor, Author) replied:

Carrying over the hidden state is not the only advanced use case that we
should support. In machine translation the initial hidden state of the
decoder is a function of a hidden state of the bidirectional RNN.
That's why it should be possible to pass the initial hidden state as an
argument to apply.

On 22.10.2014 19:32, Bart van Merriënboer wrote:

In blocks/bricks.py:

    can be described as follows: depending on the context variables
    and driven by input sequences the RNN updates its states and
    produces output sequences. Thus the input variables of
    your transition function play one of three roles: an input,
    a context or a state. These roles should be specified in the
    decorator call to make iteration possible.

    Parameters

    contexts : list of strs
        Names of transition function arguments that
        play the context role.
    inputs : list of strs
        Names of transition function arguments that
        play the input role.
    states : list of str or (str, function) tuples

For carrying over the hidden state I guess it's easiest if we make the
initial state a shared variable which can be passed to the `updates`
argument of `function`. Would we do this by passing a particular state
initialization function then?


Reply to this email directly or view it on GitHub
https://github.com/bartvm/blocks/pull/10/files#r19229784.
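The argument above, that the initial hidden state should be passable as an ordinary argument to apply, can be sketched as follows. All names here are illustrative; the encoder summary and the initialization function stand in for learned components.

```python
def encoder_summary(source_states):
    # Hypothetical: summarize a (bidirectional) encoder by its last
    # hidden state.
    return source_states[-1]


def init_decoder_state(summary, weight=0.5):
    # Hypothetical learned transformation producing the decoder's
    # initial hidden state from the encoder summary.
    return weight * summary


# Because the initial state is an ordinary argument of the decoder's
# apply call (not a shared variable updated behind the scenes), it can
# be an arbitrary function of the encoder's states:
h0 = init_decoder_state(encoder_summary([0.2, 0.4, 0.8]))
```

With a shared-variable-plus-updates design, by contrast, the initial state would be fixed storage rather than a computed input, which is what makes the machine-translation use case awkward there.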

@bartvm (Member) commented Oct 22, 2014

By the way, I made a PR to your branch with all the PEP8 fixes; you might want to merge it, otherwise Travis will never pass.

…tvm-rnn2-pep8

Conflicts:
	blocks/tests/test_rnn.py

I removed a function before merging.
@rizar (Contributor, Author) commented Oct 22, 2014

Thank you for your fixes; there is a version passing Travis now.

By the way, how can I build the documentation locally, rather than with readthedocs?

@bartvm (Member) commented Oct 22, 2014

Running `sphinx-build -b html . _build/html` in `blocks/docs` should work.

@rizar (Contributor, Author) commented Oct 22, 2014

Ok, I will try, thanks.

What about merging this pull request?

bartvm added a commit that referenced this pull request Oct 22, 2014
WIP: BaseRecurrent and recurrent_apply_method
bartvm merged commit 57ff431 into mila-iqia:master on Oct 22, 2014