Improve docs of softmax #2362
Conversation
The Travis CI tests will not pass because autopep8 generates diffs in files unrelated to this PR. Do I need to do some setup?
The cause may be the version of autopep8.
Removed WIP. The latest commit fails on Travis because of an autopep8 problem that will be fixed in #2363. I will kick Travis again after it is merged to master.
Thank you for sending this PR! I added some comments, so please check them :)
x (:class:`~chainer.Variable` or :class:`numpy.ndarray` or \
:class:`cupy.ndarray`):
    Input variable.
    A 2d (N, D) :math:`((s_1^1, ..., s_D^1), ..., (s_1^N, ..., s_D^N))`
softmax can also be applied to a tensor with more than two dimensions, e.g., a minibatch of images shaped like (N, C, H, W). In such a case, the output will be an (N, C, H, W)-shaped tensor, and the summation over the channel axis will always be 1. In short, the output is not always "2d (N, D)", so I'd like you to fix this part ;)
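The behavior described above can be sketched in plain NumPy (Chainer's actual `F.softmax` is not used here; the standalone `softmax` helper and the input shape are illustrative assumptions):

```python
import numpy as np

def softmax(x, axis=1):
    """Numerically stable softmax along the given axis."""
    # Subtract the per-axis max so exp() does not overflow.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# A minibatch of 2 "images" with 3 channels and 4x5 spatial size.
x = np.random.randn(2, 3, 4, 5).astype(np.float32)
y = softmax(x, axis=1)

print(y.shape)                           # (2, 3, 4, 5): same shape as the input
print(np.allclose(y.sum(axis=1), 1.0))   # True: the channel axis sums to 1 everywhere
```

That is, softmax normalizes along one chosen axis and leaves the shape unchanged, which is why the "2d (N, D)" wording in the docstring is too narrow.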
       [ 0., 2., 4.]], dtype=float32)
>>> F.softmax(x).data
array([[ 0.09003057,  0.24472848,  0.66524094],
       [ 0.01587624,  0.11731043,  0.86681336]], dtype=float32)
How about showing that the summation over the second axis is always 1? For example,
>>> y = F.softmax(x)
>>> y.data
array([[ 0.09003057,  0.24472848,  0.66524094],
       [ 0.01587624,  0.11731043,  0.86681336]], dtype=float32)
>>> F.sum(y, axis=1).data
array([ 1.,  1.], dtype=float32)
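The printed values can be checked with plain NumPy. The first row of `x` is truncated in the excerpt above; `[0, 1, 2]` is an assumption that reproduces the printed output:

```python
import numpy as np

# Assumed input: the second row [0, 2, 4] appears in the excerpt;
# the first row [0, 1, 2] is inferred from the printed softmax values.
x = np.array([[0., 1., 2.],
              [0., 2., 4.]], dtype=np.float32)

# Numerically stable softmax over the second axis.
e = np.exp(x - x.max(axis=1, keepdims=True))
y = e / e.sum(axis=1, keepdims=True)

print(y)               # matches the doctest rows above
print(y.sum(axis=1))   # each row sums to 1
```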
put doctest result
(force-pushed from 27826c7 to 94000cd)
@mitmul
@keisuke-umezawa Could you resolve the conflicts? It is now pointed at the _v2 branch. Sorry for the late reaction.
@mitmul
Jenkins, test this please!
LGTM!
This is a PR for improving the docs of functions/links. Related issue: #2182
Modified functions/links: