
Improve docs of softmax_cross_entropy #3105

Merged

Conversation

keisuke-umezawa (Member)

This is a PR for improving the docs of functions/links. Related issue: #2182

Modified functions/links:

  • functions/loss/softmax_cross_entropy.py

rezoo self-assigned this on Aug 21, 2017
mitmul self-assigned this on Aug 21, 2017
mitmul (Member) commented on Aug 22, 2017

Jenkins, test this please

mitmul (Member) commented on Aug 23, 2017

@keisuke-umezawa The Jenkins tests failed. Could you rebase it onto the current master?

keisuke-umezawa force-pushed the improve-docs-of-softmax_cross_entropy branch from 9a53cb9 to c0f376a on August 24, 2017
keisuke-umezawa (Member, Author)

@mitmul I rebased this branch.

keisuke-umezawa added the cat:document label (Documentation such as function documentation, comments, and tutorials) on Aug 26, 2017
rezoo (Member) left a comment


Sorry for the late review. LGTM except for the following points.

@@ -223,17 +223,21 @@ def backward_gpu(self, inputs, grad_outputs):
 def softmax_cross_entropy(
         x, t, normalize=True, cache_score=True, class_weight=None,
         ignore_label=-1, reduce='mean'):
-    """Computes cross entropy loss for pre-softmax activations.
+    """Computes cross entropy loss after softmax activations.
rezoo (Member):
It seems that the previous sentence is better; your version is a bit confusing because it reads as if we should apply softmax ourselves before calling softmax_cross_entropy.

keisuke-umezawa (Member, Author):
Reverted it.
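
For context on that wording: the function takes pre-softmax activations (raw logits) and applies softmax internally, so the caller should not apply F.softmax first. A minimal usage sketch, with illustrative values that are not taken from this PR:

>>> import numpy as np
>>> import chainer.functions as F
>>> x = np.array([[-1, 0, 1, 2], [2, 0, 1, -1]], dtype=np.float32)  # pre-softmax logits
>>> t = np.array([3, 0], dtype=np.int32)  # ground-truth class indices
>>> y = F.softmax_cross_entropy(x, t)  # softmax is applied internally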

>>> log_softmax = -F.log_softmax(x)
>>> expected_loss = np.mean([log_softmax[row, column].data \
... for row, column in enumerate(t)])
>>> y.data == expected_loss
rezoo (Member):

How about using y.array instead of y.data?

keisuke-umezawa (Member, Author):

OK, I used y.array.
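
For reference, Variable.array returns the same underlying array as Variable.data and was introduced to avoid confusion with numpy.ndarray.data (the raw memory buffer). With that change, the comparison line in the example presumably becomes:

>>> y.array == expected_loss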

keisuke-umezawa (Member, Author)

@rezoo I fixed them!

rezoo (Member) commented on Sep 6, 2017

Thank you. LGTM.

rezoo merged commit 181186b into chainer:master on Sep 6, 2017
beam2d added this to the v3.0.0rc1 milestone on Sep 12, 2017