
dropping output units rather than connections #9

Closed
lzamparo opened this issue Jun 5, 2014 · 1 comment

Comments


lzamparo commented Jun 5, 2014

Hi,

I think you've got a bug in your implementation: you're applying the dropout mask to the output units rather than to the elements of your weight matrices, which is what the original version of dropout is intended to do. This means that you're randomly dropping out bias contributions as well, which might disrupt the model-averaging interpretation of dropout.

I'm not sure if you intended to do this, but if not, you should consider having _dropout_from_layer apply the mask directly to the Ws and then compute the layer output (see eqns. 2.3–2.6 of Nitish Srivastava's thesis).
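To make the distinction concrete, here is a minimal numpy sketch of the two behaviors being contrasted (the layer sizes, variable names, and retention probability are illustrative assumptions, not code from the repository): masking the layer's output zeroes a dropped unit's bias along with it, whereas masking individual weight entries leaves the biases intact.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer: 4 inputs, 3 output units.
W = rng.standard_normal((4, 3))
b = rng.standard_normal(3)
x = rng.standard_normal(4)
p = 0.5  # retention probability

# Unit dropout (what the issue describes): mask the layer output,
# so a dropped unit's bias contribution is zeroed as well.
unit_mask = rng.binomial(1, p, size=3)
h_units = unit_mask * (x @ W + b)

# Connection dropout: mask individual weight entries; biases survive.
conn_mask = rng.binomial(1, p, size=W.shape)
h_conns = x @ (conn_mask * W) + b
```

Note that dropping a unit is equivalent to zeroing an entire column of W plus the matching entry of b, while connection dropout zeroes weight entries independently.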


lzamparo commented Jun 6, 2014

Ah, never mind. I didn't read the model specification closely enough myself.

@lzamparo lzamparo closed this as completed Jun 6, 2014