
Deterministic dropout layer #31

Merged: 21 commits merged into master from deterministic_dropout on Apr 27, 2020

Conversation

mryab (Member) commented on Apr 5, 2020:

Implements a custom dropout layer that accepts a precomputed dropout mask as an input, and tests its determinism (although the test can hardly fail in any conceivable situation). Also fixes some bugs/inconsistencies in backend/utils/test_utils that I ran into while writing the test.

Hopefully solves #21.
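To illustrate the idea, here is a minimal sketch of how such a layer could look in PyTorch. The class and function names below are hypothetical and not necessarily the ones used in this PR; treat this as an assumption-laden sketch, not the actual implementation.

```python
import torch
import torch.nn as nn


class DeterministicDropoutFunction(torch.autograd.Function):
    """Applies a caller-supplied binary mask instead of sampling one internally."""

    @staticmethod
    def forward(ctx, x, mask, keep_prob):
        # Inverted dropout: scale kept activations by 1 / keep_prob
        ctx.save_for_backward(mask)
        ctx.keep_prob = keep_prob
        return x * mask / keep_prob

    @staticmethod
    def backward(ctx, grad_output):
        (mask,) = ctx.saved_tensors
        # Reusing the same mask makes the backward pass deterministic as well
        return grad_output * mask / ctx.keep_prob, None, None


class DeterministicDropout(nn.Module):
    """Dropout whose mask is an explicit input, so repeated calls are reproducible."""

    def __init__(self, drop_prob: float):
        super().__init__()
        self.keep_prob = 1.0 - drop_prob

    def forward(self, x, mask):
        # The mask is precomputed by the caller (e.g. sampled once and reused),
        # so there is no hidden RNG state inside the layer
        return DeterministicDropoutFunction.apply(x, mask, self.keep_prob)


# Usage: sample the mask explicitly, then reuse it for reproducible runs
layer = DeterministicDropout(drop_prob=0.2)
x = torch.randn(4, 8, requires_grad=True)
mask = (torch.rand_like(x) > 0.2).float()
out = layer(x, mask)
out.sum().backward()  # the backward pass uses the same mask
```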

TODO:

  • Custom DeterministicDropout function and layer
  • Tests for forward/backward determinism (a test sketch follows after this list)
  • Add docstrings for DeterministicDropout
  • Rewrite all docstrings that hint at non-determinism (e.g. here and here). From now on, users will have to implement custom layers with no state other than parameters, and explicitly track all buffers/RNG outputs if they want complete determinism.
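A forward/backward determinism test could look roughly like this, assuming the hypothetical DeterministicDropout sketch above; this is illustrative and not the PR's actual test code.

```python
import torch

# Assumes the DeterministicDropout class from the sketch above


def test_deterministic_dropout():
    layer = DeterministicDropout(drop_prob=0.2)
    x = torch.randn(4, 8)
    mask = (torch.rand_like(x) > 0.2).float()

    # Forward determinism: same input and mask yield identical outputs
    x1 = x.clone().requires_grad_(True)
    x2 = x.clone().requires_grad_(True)
    out1 = layer(x1, mask)
    out2 = layer(x2, mask)
    assert torch.allclose(out1, out2)

    # Backward determinism: identical upstream grads yield identical input grads
    grad = torch.randn_like(out1)
    out1.backward(grad)
    out2.backward(grad)
    assert torch.allclose(x1.grad, x2.grad)
```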

@mryab changed the title from [WIP] Deterministic dropout layer to Deterministic dropout layer on Apr 19, 2020
@justheuristic merged commit 9573455 into master on Apr 27, 2020
@mryab deleted the deterministic_dropout branch on Apr 28, 2020
@mryab mentioned this pull request on May 1, 2020