This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

[BUGFIX] fix log_sigmoid bugs #20372

Merged
merged 3 commits into from Jun 30, 2021

Conversation

@Adnios (Contributor) commented Jun 22, 2021

Description

Solves #20371.

Checklist

Essentials

  • PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc)
  • Changes are complete (i.e. I finished coding on this PR)
  • All changes have test coverage
  • Code is well-documented

Comments

@mxnet-bot

Hey @Adnios, thanks for submitting the PR.
All tests are already queued to run once. If tests fail, you can trigger one or more tests again with the following commands:

  • To trigger all jobs: @mxnet-bot run ci [all]
  • To trigger specific jobs: @mxnet-bot run ci [job1, job2]

CI supported jobs: [edge, centos-gpu, sanity, unix-gpu, windows-gpu, miscellaneous, unix-cpu, centos-cpu, clang, windows-cpu, website]


Note:
Only the following three categories can trigger CI: PR Author, MXNet Committer, Jenkins Admin.
All CI tests must pass before the PR can be merged.

@mseth10 toggled the pr-awaiting-testing and pr-work-in-progress labels as CI ran (Jun 22–23, 2021)
@Adnios (Contributor, Author) commented Jun 23, 2021

@mxnet-bot run ci [unix-gpu]

@mxnet-bot

Jenkins CI successfully triggered: [unix-gpu]

@mseth10 toggled the pr-awaiting-testing and pr-work-in-progress labels again (Jun 23, 2021)
@Adnios (Contributor, Author) commented Jun 23, 2021

@mxnet-bot run ci [unix-gpu]

@mxnet-bot

Jenkins CI successfully triggered: [unix-gpu]

@mseth10 removed the pr-work-in-progress label and added pr-awaiting-review (Jun 23, 2021)
@Adnios (Contributor, Author) commented Jun 28, 2021

@szha @bartekkuncer @bgawrych could you please help review?

@bartekkuncer (Contributor) commented:

@Adnios the change looks alright, but as it mostly involves GPU I believe @szha should take a look.

Review thread on src/operator/tensor/elemwise_unary_op_basic.cc:

@@ -161,10 +161,34 @@ The storage type of ``log_sigmoid`` output is always dense

)code" ADD_FILELINE)
.set_attr<FCompute>("FCompute<cpu>", UnaryOp::Compute<cpu, mshadow_op::log_sigmoid>)
.set_attr<nnvm::FGradient>("FGradient", ElemwiseGradUseIn{"_backward_log_sigmoid"});
@szha (Member):

The previous version looks correct with ElemwiseGradUseIn, which makes the input of the elementwise function the input to the gradient function. Could you elaborate on the cases in which this would fail, and why you need to change it to ElemwiseGradUseOut and change the definition?

@szha (Member):

I'm not sure how a scalar array would trigger the problem yet.

@Adnios (Contributor, Author):

Hi @szha.
The reason a scalar array triggers the problem is here:
https://github.com/apache/incubator-mxnet/blob/835e25031f847b80277b6d11db0519723d26a80a/src/operator/nn/activation.cc#L126-L140

There are two solutions to make scalar-array input work.

  1. The input of log_sigmoid_grad should be y, so we can modify the following code, which currently takes x as its input. This is what I do in this PR.
    https://github.com/apache/incubator-mxnet/blob/835e25031f847b80277b6d11db0519723d26a80a/src/operator/mshadow_op.h#L416
    MXNET_UNARY_MATH_OP(log_sigmoid_grad, 1.0f - math::exp(a));
  2. Alternatively, since log_sigmoid_grad takes x as input, we could change the following code so that it passes x to log_sigmoid_grad:
    https://github.com/apache/incubator-mxnet/blob/835e25031f847b80277b6d11db0519723d26a80a/src/operator/nn/activation-inl.h#L207-L210
        case activation::kLogSigmoid:
          ActivationBackward<xpu, mshadow_op::log_sigmoid, mshadow_op::log_sigmoid_grad>(
              ctx, inputs[0], inputs[2], req[0], outputs[0]);
          break;

I think solution 1 is better. For y = log_sigmoid(x), it calculates dx from (dy, y) instead of (dy, x), which enables in-place operation during y = log_sigmoid(x) (i.e. y and x share the same memory).
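
For reference, the new y-based definition checks out with a short derivation, writing \(\sigma\) for the logistic function and using \(\sigma'(x) = \sigma(x)(1 - \sigma(x))\):

\[
\frac{d}{dx}\log\sigma(x) = \frac{\sigma'(x)}{\sigma(x)} = 1 - \sigma(x) = 1 - e^{y}, \qquad \text{since } e^{y} = \sigma(x),
\]

which is exactly 1.0f - math::exp(a) with a = y.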

Another problem arose when I adopted solution 1: the gradient of sym.log_sigmoid() became wrong. The reason is that the input of _backward_log_sigmoid was x, while with solution 1 its input should be y. The source code of sym.log_sigmoid() is the following:
https://github.com/apache/incubator-mxnet/blob/835e25031f847b80277b6d11db0519723d26a80a/src/operator/tensor/elemwise_unary_op_basic.cc#L152-L167
So I changed it to ElemwiseGradUseOut, following the source code of sym.sigmoid().
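
Concretely, the key part of the hunk above is presumably swapping the gradient helper (a sketch; the full hunk also adds more lines, likely for the backward-op registration). ElemwiseGradUseOut passes the forward output y, rather than the input x, to the backward node:

    -.set_attr<nnvm::FGradient>("FGradient", ElemwiseGradUseIn{"_backward_log_sigmoid"});
    +.set_attr<nnvm::FGradient>("FGradient", ElemwiseGradUseOut{"_backward_log_sigmoid"});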

@szha (Member):

Thanks for the detailed analysis. The proposed change looks good and I have no further concerns.

@szha szha merged commit cb5bd4e into apache:master Jun 30, 2021