
Improve docs of softmax #2362

Merged
merged 4 commits into master from improve-docs-examples on Jul 6, 2017

Conversation

keisuke-umezawa
Member

This is a PR for improving the docs of functions/links. Related issue: #2182

Modified functions/links:

  • functions/activation/softmax
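For reference, the function being documented is the standard softmax, which rescales the input along the normalized axis into a probability distribution:

.. math::
   y_i = \frac{\exp(x_i)}{\sum_j \exp(x_j)}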

@keisuke-umezawa keisuke-umezawa changed the title [WIP] improve docs of softmax [WIP] Improve docs of softmax Mar 5, 2017
@keisuke-umezawa keisuke-umezawa changed the title [WIP] Improve docs of softmax Improve docs of softmax Mar 5, 2017
@keisuke-umezawa keisuke-umezawa changed the title Improve docs of softmax [WIP] Improve docs of softmax Mar 5, 2017
@keisuke-umezawa
Member Author

The Travis CI tests will not pass, because autopep8 generates diffs in files unrelated to this PR.
This also happens on my local master branch and in other PRs' builds.

Do I need some additional setup?

[~/dev/chainer]
$ autopep8 -r . --global-config .pep8 --diff | tee check_autopep8
--- original/./chainer/__init__.py
+++ fixed/./chainer/__init__.py
@@ -78,6 +78,7 @@
         thread_local.function_hooks = collections.OrderedDict()
     return thread_local.function_hooks
 
+
 _debug = False
 
 
@@ -127,6 +128,7 @@
     def __exit__(self, *_):
         set_debug(self._old)
 
+
 basic_math.install_variable_arithmetics()
 array.get_item.install_variable_get_item()
 
--- original/./chainer/utils/conv_nd_kernel.py
+++ fixed/./chainer/utils/conv_nd_kernel.py
@@ -164,6 +164,7 @@
     @chainer.cuda.memoize()
     def generate(ndim):
         return _im2col_nd_kernel._generate(ndim)
+
 
 _im2col_nd_kernel = Im2colNDKernel()
 
@@ -289,4 +290,5 @@
     def generate(ndim):
         return _col2im_nd_kernel._generate(ndim)
 
+
 _col2im_nd_kernel = Col2imNDKernel()

... cont'd

@keisuke-umezawa
Member Author

The cause may be the autopep8 version.
My build and this PR's build use autopep8 1.3.0, but the earlier builds used autopep8 1.2.4.
There are some changes between 1.2.4 and 1.3.0, and they make autopep8 generate diffs for other files.
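One way to keep local checks consistent with CI would be to pin the version (assuming 1.2.4 is the version the earlier, passing builds used):

$ pip install autopep8==1.2.4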

@mitmul mitmul added cat:document Documentation such as function documentations, comments and tutorials. reviewer-team labels Mar 6, 2017
@delta2323 delta2323 assigned delta2323 and unassigned delta2323 Mar 6, 2017
@keisuke-umezawa keisuke-umezawa changed the title [WIP] Improve docs of softmax Improve docs of softmax Mar 7, 2017
@keisuke-umezawa
Member Author

Removed WIP. The latest commit fails on Travis because of an autopep8 problem that will be fixed in #2363. I will kick Travis again after that PR is merged into master.


@mitmul mitmul left a comment


Thank you for sending this PR! I added some comments, so please check them :)

X (:class:`~chainer.Variable` or :class:`numpy.ndarray` or \
:class:`cupy.ndarray`):
Input variable.
A 2d (N, D) :math:`((s_1^1, ..., s_D^1), ..., (s_1^N, ..., s_D^N))`

softmax can also be applied to tensors with more than two dimensions, e.g., a minibatch of images shaped like (N, C, H, W). In such a case, the output will be an (N, C, H, W)-shaped tensor, and the summation over the channel axis will always be 1. In short, the output is not always "2d (N, D)", so I'd like you to fix this part ;)
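A minimal sketch of the higher-dimensional case (assuming Chainer's F.softmax normalizes over axis 1, the channel axis, by default):

import numpy as np
import chainer.functions as F

# A minibatch of 2 "images" with 3 channels: shape (N, C, H, W).
x = np.random.randn(2, 3, 4, 4).astype(np.float32)

y = F.softmax(x)                # normalizes over axis 1 (channels)
assert y.shape == (2, 3, 4, 4)  # output keeps the input shape

# The values along the channel axis form a distribution: they sum to 1.
sums = F.sum(y, axis=1).data
assert np.allclose(sums, np.ones((2, 4, 4)))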

[ 0., 2., 4.]], dtype=float32)
>>> F.softmax(x).data
array([[ 0.09003057, 0.24472848, 0.66524094],
[ 0.01587624, 0.11731043, 0.86681336]], dtype=float32)

How about showing that the summation over the second axis is always 1? For example,

>>> y = F.softmax(x)
>>> y.data
array([[ 0.09003057,  0.24472848,  0.66524094],
       [ 0.01587624,  0.11731043,  0.86681336]], dtype=float32)
>>> F.sum(y, axis=1).data
array([ 1.,  1.], dtype=float32)

@keisuke-umezawa
Member Author

Here are the doctest results:

...

Document: reference/functions
-----------------------------
1 items passed all tests:
  89 tests in default
89 tests in 1 items.
89 passed and 0 failed.
Test passed.

...

Doctest summary
===============
  308 tests
    8 failures in tests
    0 failures in setup code
    0 failures in cleanup code
build finished with problems.
make: *** [doctest] Error 1
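For anyone reproducing this: the output above looks like Sphinx's doctest builder, so it was presumably produced by something like the following (assuming the standard Sphinx Makefile in the docs directory):

$ cd docs
$ make doctest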

@mitmul mitmul self-assigned this Mar 21, 2017
@keisuke-umezawa
Member Author

@mitmul
I rebased this branch against master. Please rerun the Jenkins test.

@keisuke-umezawa
Member Author

(screenshot attached: 2017-04-19 8:54:50)

@keisuke-umezawa
Member Author

@mitmul
There were some comments on log-softmax (#2411), and I have reflected them in this PR.
Please review this PR.

@mitmul mitmul changed the base branch from master to _v2 May 25, 2017 13:53
@mitmul
Member

mitmul commented May 25, 2017

@keisuke-umezawa Could you resolve the conflicts? The PR now points at the _v2 branch. Sorry for the late reaction.

@mitmul mitmul changed the base branch from _v2 to master June 22, 2017 06:01
@keisuke-umezawa
Member Author

@mitmul
I resolved the conflicts. Please merge this.

@mitmul
Member

mitmul commented Jul 6, 2017

Jenkins, test this please!

@mitmul
Member

mitmul commented Jul 6, 2017

LGTM

@mitmul mitmul merged commit aebe1f9 into chainer:master Jul 6, 2017
@keisuke-umezawa keisuke-umezawa deleted the improve-docs-examples branch July 9, 2017 04:45
@beam2d beam2d added this to the v3.0.0b1 milestone Jul 31, 2017