
Conversation

@Krovatkin (Contributor) commented Oct 29, 2019

prim::AutogradAnyNonZero is optimized away under normal circumstances (the graph executor specializes tensor arguments and runs specializeAutogradZero), so the change should be backward compatible as long as we are running the original executor.
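For context, prim::AutogradAnyNonZero is the guard that autograd places in front of generated backward subgraphs: it reports whether any of the incoming gradients is actually defined. Below is a minimal Python sketch of those semantics (an illustration only, not PyTorch's implementation; representing an undefined/AutogradZero gradient as `None` is an assumption made for the sketch):

```python
from typing import Optional

import torch

def autograd_any_nonzero(*grads: Optional[torch.Tensor]) -> bool:
    # Rough analogue of prim::AutogradAnyNonZero: true iff at least one
    # incoming gradient is defined. Undefined (AutogradZero) gradients
    # are modeled here as None.
    return any(g is not None for g in grads)

# Once the executor has specialized its tensor inputs, it knows statically
# which gradients are defined; the check then folds to a constant, and
# specializeAutogradZero can remove the guard from the graph entirely.
assert autograd_any_nonzero(torch.ones(2), None)
assert not autograd_any_nonzero(None, None)
```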

@Krovatkin requested a review from suo October 29, 2019 21:21
@facebook-github-bot (Contributor) left a comment

@Krovatkin has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot (Contributor) commented

@Krovatkin merged this pull request in 4e56455.
