fix doc for functional.dropout* #10417
Conversation
weiyangfb commented on Aug 10, 2018 (edited)
- Fixes #4177 (Better functional.dropout{,2d,3d} docs)
@@ -591,7 +591,19 @@ def adaptive_avg_pool3d(input, output_size):
 # Activation functions
-def dropout(input, p=0.5, training=False, inplace=False):
+def dropout(input, p=0.5, training=True, inplace=False):
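To illustrate what the corrected default means in practice, here is a minimal sketch (assumes a working `torch` install): with `training=True`, `F.dropout` zeroes each element with probability `p` and scales survivors by `1/(1-p)`; with `training=False` it is the identity.

```python
import torch
import torch.nn.functional as F

x = torch.ones(5)

# training=True (the corrected default): each element is zeroed with
# probability p=0.5; kept elements are scaled by 1/(1-p) = 2.0.
y = F.dropout(x, p=0.5, training=True)

# training=False: dropout is a no-op and the input passes through unchanged.
z = F.dropout(x, p=0.5, training=False)
assert torch.equal(z, x)
```

This is why the signature in the docs had to change: the functional form really does apply dropout by default, matching the module's `train()` behavior.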
Force-pushed b685fc2 to b7181fa
torch/nn/modules/dropout.py (outdated)
@@ -54,7 +54,7 @@ def forward(self, input):
 class Dropout2d(_DropoutNd):
-    r"""Randomly zeroes whole channels of the input tensor.
+    r"""Randomly zeroes whole channels (a channel is a (N, C) pair) of the input tensor.
looks good, couple of nits. only commented on dropout3d but applies to all of them
torch/nn/functional.py (outdated)
     Args:
         p: probability of an element to be zeroed. Default: 0.5
         training: apply dropout if True. Default: True
torch/nn/functional.py (outdated)
-def dropout3d(input, p=0.5, training=False, inplace=False):
+def dropout3d(input, p=0.5, training=True, inplace=False):
+    r"""
+    Randomly zeroes whole channels (a channel is a 3D slice of dimensions D, H, W) of the input tensor.
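The channel-wise semantics described here can be checked directly; a minimal sketch (assumes a working `torch` install): for input of shape `(N, C, D, H, W)`, `F.dropout3d` drops each `(D, H, W)` slice as a unit.

```python
import torch
import torch.nn.functional as F

# dropout3d zeroes whole channels: for input of shape (N, C, D, H, W),
# each (D, H, W) slice is either zeroed entirely or kept and scaled by 1/(1-p).
x = torch.ones(1, 4, 2, 3, 3)
y = F.dropout3d(x, p=0.5, training=True)

for c in range(4):
    vals = y[0, c].unique()
    # every element of a channel shares one fate: all 0.0 or all 2.0
    assert vals.numel() == 1 and vals.item() in (0.0, 2.0)
```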
torch/nn/modules/dropout.py (outdated)
@@ -54,8 +55,10 @@ def forward(self, input):
 class Dropout2d(_DropoutNd):
-    r"""Randomly zeroes whole channels of the input tensor.
-    The channels to zero-out are randomized on every forward call.
+    r"""Randomly zero-out entire channels (a channel is a 2D feature map of
torch/nn/modules/dropout.py (outdated)
@@ -20,7 +20,8 @@ def extra_repr(self):
 class Dropout(_DropoutNd):
     r"""During training, randomly zeroes some of the elements of the input
     tensor with probability :attr:`p` using samples from a Bernoulli
-    distribution. The elements to zero are randomized on every forward call.
+    distribution. Each channel will be zeroed out independently on every forward
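The Bernoulli wording in this docstring can be unpacked with a short sketch (not PyTorch's internal implementation, just the formula the docs describe): dropout samples an element-wise keep mask from `Bernoulli(1 - p)` and rescales by `1/(1-p)` so the expected value is unchanged.

```python
import torch

# Inverted dropout as the docstring describes it:
# out = x * mask / (1 - p), where each mask element ~ Bernoulli(1 - p).
p = 0.5
x = torch.ones(8)
mask = torch.bernoulli(torch.full_like(x, 1 - p))
out = x * mask / (1 - p)

# each surviving element is scaled to 2.0; dropped elements are 0.0
assert all(v.item() in (0.0, 2.0) for v in out)
```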
weiyangfb has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@ssnl is this good to go?
torch/nn/modules/dropout.py (outdated)
-    r"""Randomly zeroes whole channels of the input tensor.
-    The channels to zero are randomized on every forward call.
+    r"""Randomly zero-out entire channels (a channel is a 3D feature map of
+    dimensions D, H, W) of the input tensor. Each channel will be zeroed out
torch/nn/modules/dropout.py (outdated)
-    The channels to zero-out are randomized on every forward call.
+    r"""Randomly zero-out entire channels (a channel is a 2D feature map of
+    dimensions H, W) of the input tensor. Each channel will be zeroed out
+    independently on every forward call with probability :attr:`p` using
Force-pushed 7b56fbb to aa649a3
Force-pushed aa649a3 to 534085f
LGTM! Thanks!
torch/nn/modules/dropout.py (outdated)
-    The channels to zero-out are randomized on every forward call.
+    r"""Randomly zero out entire channels (a channel is a 2D feature map,
+    e.g., the :math:`j`-th channel of the :math:`i`-th sample in the
+    batched input is a 2D tensor :math:`input[i, j]`) of the input tensor.
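The final docstring's definition of a channel as `input[i, j]` can be demonstrated with the module API; a minimal sketch (assumes a working `torch` install):

```python
import torch
import torch.nn as nn

m = nn.Dropout2d(p=0.5)
m.train()
x = torch.ones(2, 3, 4, 4)
out = m(x)

# The j-th channel of the i-th sample is the 2D map out[i, j];
# it is dropped or kept as a single unit, so it holds one value throughout.
for i in range(2):
    for j in range(3):
        assert out[i, j].unique().numel() == 1

# In eval mode the module is the identity.
m.eval()
assert torch.equal(m(x), x)
```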
Force-pushed 837f517 to cc98606
* upstream/master: (26 commits)
  - cudnn 7 upgrade with spatialBN fix (pytorch#11291)
  - Ignore FuseGraph Call on Windows (pytorch#11015)
  - defer resolution of mkl to a cmake wrapper library (pytorch#11298)
  - Cleanup dependency of distributed flags (pytorch#11221)
  - Move minimal wrapdim functionality to core, remove THTensor include i… (pytorch#11283)
  - Change includes from ATen/Storage.h to ATen/core/Storage.h (pytorch#11217)
  - Fix scalar tensor assert in fusion compiler (pytorch#10952)
  - Add dead code elimination pass (pytorch#10101)
  - Distributed Data Parallel CPU module for C10D (pytorch#11168)
  - Back out "[pt1][tensor] Add strides to caffe2::Tensor"
  - Fix conv gradient conversion (pytorch#11312)
  - Bag of clang tidy fixes for torch/csrc/ and torch/csrc/autograd (pytorch#11050)
  - Sparse tensor printing; add NotImplemented autograd fn (pytorch#10181)
  - Add convertToCaffe2Proto to python API
  - fix doc for functional.dropout* (pytorch#10417)
  - typo fix Tranpose2D -> Transpose2D (pytorch#11281)
  - Remove THFinalizer
  - Forward declarations of needed curand functions (pytorch#10911)
  - nomnigraph - simplify core graph API and test (pytorch#11256)
  - Small fixes to cppdocs for sync script (pytorch#11300)
  - ...
Summary: fixes pytorch#4177
Pull Request resolved: pytorch#10417
Differential Revision: D9542876
Pulled By: weiyangfb
fbshipit-source-id: 480ed973d1fe0364f4acb5cd596c2031895b82df