
Can anybody write test case for this Bayesian LSTM? #382

Merged: 7 commits into Element-Research:master on Jan 26, 2017

Conversation

@jnhwkim (Contributor) commented Jan 26, 2017

Though I haven't written a test case, this code is used in my recent work.

I implemented a Bayesian LSTM to which TrimZero can be applied; this resolves #335 (Implementation Request: dropout between gates in LSTM).
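
For illustration, a minimal usage sketch. The constructor arguments follow the calls shown later in this thread, and the TrimZero decorator usage mirrors MaskZero's; both are assumptions rather than part of this PR's text:

require 'rnn'

local p = 0.5  -- dropout probability (assumed 5th positional argument)
-- Bayesian LSTM with dropout between gates
local lstm = nn.LSTM(10, 20, nil, nil, p)
-- wrap with TrimZero to skip zero-masked time-steps in variable-length
-- sequences (assumed decorator signature: module, nInputDim)
local trimmed = nn.TrimZero(lstm, 1)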

@jnhwkim changed the title from "Can anybody write test case for this?" to "Can anybody write test case for this Bayesian LSTM?" on Jan 26, 2017
@nicholas-leonard merged commit e0efb51 into Element-Research:master on Jan 26, 2017
@nicholas-leonard (Member)

Don't have to write it. The current tests pass, so it's fine for now.

@hashbangCoder

Hi,

I have been testing the dropout, specifically for FastLSTM, and there seem to be some implementation errors:

p = 0.5  -- dropout probability
lstm = nn.LSTM(10, 20, nil, nil, p)
print(lstm.p)  -- prints 0.5, as expected

-- But:
f_lstm = nn.FastLSTM(10, 20, nil, nil, nil, nil, p)
print(f_lstm.p)  -- prints 0!

I think the error is in FastLSTM's call to the parent class's __init() here, which overwrites the child class's self.p.
If you comment out that line, for example (or, preferably, pass the p argument through to __init), it works fine.
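
For what it's worth, here is a self-contained plain-Lua sketch of the pitfall (hypothetical Parent/Child classes, not the actual rnn code):

-- Parent defaults p to 0 when the argument is not forwarded.
local Parent = {}
Parent.__index = Parent
function Parent.__init(self, p)
   self.p = p or 0
end

-- Child sets self.p, then calls the parent initializer without
-- forwarding p, so the parent resets it to 0.
local Child = setmetatable({}, {__index = Parent})
Child.__index = Child
function Child.__init(self, p)
   self.p = p
   Parent.__init(self)  -- fix: Parent.__init(self, p)
end

local c = setmetatable({}, Child)
Child.__init(c, 0.5)
print(c.p)  -- prints 0, not 0.5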

Just to be sure, I updated the module before checking, and I checked multiple times for positional-argument errors. I hope I'm not wrong on the OOP part.

@nicholas-leonard @jnhwkim

@jnhwkim (Contributor, Author) commented Feb 7, 2017

@hashbangCoder I'll check that shortly. Thanks!
