Conversation

tafsiri (Contributor) commented Jul 14, 2020

modularize round, rsqrt, sigmoid, isNan, isInf, isFinite, softplus, sqrt, step

To see the logs from the Cloud Build CI, please join either our discussion or announcement mailing list.


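[Editor's note: each op listed above gets its own gradient config file under tfjs-core/src/gradients/. As a rough sketch of the pattern (paraphrased from the shape of these files, not copied from the diff; the GradConfig type and chained tensor methods are assumed from tfjs-core's kernel registry), a modular gradient config for Sqrt looks roughly like this:]

import {Sqrt} from '../kernel_names';
import {GradConfig} from '../kernel_registry';
import {Tensor} from '../tensor';

// d/dx sqrt(x) = 1 / (2 * sqrt(x)), so the incoming gradient dy is
// divided by 2 * sqrt(x) of the saved input.
export const sqrtGradConfig: GradConfig = {
  kernelName: Sqrt,
  inputsToSave: ['x'],
  gradFunc: (dy: Tensor, saved: Tensor[]) => {
    const [x] = saved;
    return {x: () => dy.div(x.toFloat().sqrt().mul(2))};
  }
};

[Each config is then registered by kernel name (via registerGradient), so the engine looks gradients up at runtime rather than bundling every op's gradient by default; that registration step is the point of the modularization.]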

lina128 (Collaborator) left a comment


Thank you Yannick!

Reviewable status: :shipit: complete! 1 of 1 approvals obtained (waiting on @annxingyuan, @lina128, and @tafsiri)


tfjs-core/src/gradients/IsInf_grad.ts, line 27 at r1 (raw file):

  kernelName: IsInf,
  gradFunc: (dy: Tensor) => {
    // TODO(nsthorat): Let gradients be null for cases where we want to stop
    // backpropagating.
    return {x: () => zerosLike(dy)};
  }

Curious, do we still want to do this?

tafsiri (Contributor, Author) commented Jul 15, 2020

@lina128 good question. I don't really know; cc @nsthorat about the TODOs in these gradients.

nsthorat (Contributor) left a comment


Reviewable status: :shipit: complete! 1 of 1 approvals obtained (waiting on @annxingyuan and @lina128)


tfjs-core/src/gradients/IsInf_grad.ts, line 27 at r1 (raw file):

Previously, lina128 (Na Li) wrote…

Curious, do we still want to do this?

Yes, I think we should do this; it's a perf optimization. Right now we pass zeros backwards and do more work than we need to (whereas null could signal to stop propagating the gradient backwards altogether).
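[Editor's note: to make the trade-off concrete, here is a small sketch of the two behaviors being discussed. The null-returning variant is hypothetical; it is what the TODO proposes and is not supported in tfjs-core as of this PR:]

import * as tf from '@tensorflow/tfjs-core';

// Today: non-differentiable kernels like IsInf return zeros, so the engine
// allocates a zero tensor and keeps multiplying it through every upstream op.
const gradFuncToday = (dy: tf.Tensor) => ({x: () => tf.zerosLike(dy)});

// Hypothetical (the TODO): a null gradient could tell the engine to prune
// this branch from backprop entirely instead of propagating zero-valued work.
// NOTE: null gradients are not supported in tfjs-core; illustration only.
const gradFuncProposed = (dy: tf.Tensor): {x: () => tf.Tensor | null} =>
    ({x: () => null});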

tafsiri merged commit 3d09ac6 into master on Jul 15, 2020
tafsiri deleted the mod-unary-4 branch on Jul 15, 2020 at 19:43