ipynb is down #11
Comments
Thanks! Good catch. Just fixed a typo in the ipynb.
Hi Jonathan,

… using your package, since I have written most of the RNN myself. Do you …

Best,
Cosmo
Hey Cosmo,

To get dropout to work I've done the following.

Get a random number generator:

    import theano, theano.tensor as T

    # shared random stream, seeded for reproducibility
    srng = theano.tensor.shared_randomstreams.RandomStreams(1234)

Then create the mask you want to use:

    drop_prob = T.scalar()
    shape = (300, 300)
    # binary mask with keep probability 1 - drop_prob, cast to the configured float type
    mask = T.cast(
        srng.binomial(n=1, p=1 - drop_prob, size=shape),
        theano.config.floatX
    )

And then in your scan:

Your Theano function can now take as inputs:

As long as the random variables aren't generated within the scan op you should be fine. I've had problems whenever I tried creating a new set of binomials on every time step, or something else that's fancy, but the frozen dropout values shown above worked for me.
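The snippets that originally followed "And then in your scan:" and "can now take as inputs:" are missing above. Here is a minimal sketch of the idea, with the step function, weight matrix `W_h`, inputs `x`/`h0`, and all shapes being illustrative assumptions rather than anything from the thread:

```python
import numpy as np
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams

srng = RandomStreams(1234)
floatX = theano.config.floatX

n_hidden = 300
batch_size = 300   # chosen so the mask matches the (300, 300) shape above

drop_prob = T.scalar('drop_prob')
# One frozen mask per (example, hidden unit), sampled outside of scan,
# so every time step sees the same mask.
mask = T.cast(
    srng.binomial(n=1, p=1 - drop_prob, size=(batch_size, n_hidden)),
    floatX
)

# Illustrative recurrent weight and inputs (not from the original comment).
W_h = theano.shared(
    (0.01 * np.random.randn(n_hidden, n_hidden)).astype(floatX), name='W_h'
)
x = T.tensor3('x')   # (timesteps, batch_size, n_hidden), already projected to hidden size
h0 = T.zeros((batch_size, n_hidden), dtype=floatX)

def step(x_t, h_tm1, mask, W_h):
    # Dropout on the recurrent hidden state, using the frozen mask.
    return T.tanh(x_t + T.dot(h_tm1 * mask, W_h))

result, updates = theano.scan(
    fn=step,
    sequences=x,
    outputs_info=h0,
    non_sequences=[mask, W_h],   # the mask enters scan as a fixed non-sequence
)

# drop_prob is now just another input to the compiled function; the mask is
# resampled once per call, not once per time step.
f = theano.function([x, drop_prob], result[-1], updates=updates)
```

Note that this sketch does not rescale activations, so at test time you would either rescale by `1 - drop_prob` or switch to the inverted-dropout variant (divide the mask by `1 - drop_prob` during training).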
Thank you very much, I solved it last night in another way:

    import theano, theano.tensor as T

    drop_prob = T.scalar()

and then

    result, updates = theano.scan(fn=step, …

Do you think it is also plausible? And in your setting, how did you write the step function? Thank you very much!
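Cosmo's alternative is only partially preserved above. One reading consistent with the fragments (a scalar `drop_prob` and a direct call to `theano.scan`) is that `drop_prob` is passed into the scan and the mask is sampled inside `step`; this is purely a guess at the missing code, and Jonathan's previous comment reports problems precisely when random variables are generated inside the scan op. A hypothetical sketch, with every name my own:

```python
import theano
import theano.tensor as T
from theano.tensor.shared_randomstreams import RandomStreams

srng = RandomStreams(1234)
floatX = theano.config.floatX

drop_prob = T.scalar('drop_prob')
x = T.matrix('x')                       # (timesteps, 300), purely illustrative
h0 = T.zeros((300,), dtype=floatX)

def step(x_t, h_tm1, drop_prob):
    # Sampling inside scan: a *fresh* mask at every time step.
    mask = T.cast(
        srng.binomial(n=1, p=1 - drop_prob, size=h_tm1.shape),
        floatX
    )
    return T.tanh(x_t + h_tm1 * mask)

result, updates = theano.scan(
    fn=step,
    sequences=x,
    outputs_info=h0,
    non_sequences=[drop_prob],
)

# The RNG state updates collected by scan must be passed to theano.function.
f = theano.function([x, drop_prob], result[-1], updates=updates)
```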
The ipynb was not fixed.