
【Prim】Support Relu Custom VJP #51742

Merged
merged 2 commits into PaddlePaddle:develop on Mar 20, 2023

Conversation

@JiabinYang (Contributor) commented Mar 16, 2023

PR types

Others

PR changes

Others

Describe

Pcard-66975
This PR supports relu custom vjp.
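
For context, the custom VJP (vector-Jacobian product) rule for relu is the standard one: the incoming gradient passes through wherever the input is positive and is zeroed elsewhere. Below is a minimal NumPy sketch of that rule for illustration only; it is not Paddle's custom-VJP registration API.

```python
import numpy as np

def relu(x):
    # Forward: y = max(x, 0)
    return np.maximum(x, 0.0)

def relu_vjp(x, grad_out):
    # Backward (VJP): grad_x = grad_out * 1[x > 0];
    # the subgradient at x == 0 is conventionally taken as 0.
    return grad_out * (x > 0).astype(grad_out.dtype)
```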



class TestCompositeSoftmaxPrimBackward(unittest.TestCase):
    """test composite softmax and prim backward"""
Reviewer (Contributor): change softmax to relu

JiabinYang (Author): Done in PR 51838

@@ -0,0 +1,122 @@
# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
Reviewer (Contributor): It's better to delete this if the op test with prim can cover this test.

JiabinYang (Author): Since the activation op test only covers simple cases, adding this test case makes it safer to use.
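
For illustration, a minimal standalone check of the relu gradient could look like the sketch below (hypothetical test name; it verifies the analytic gradient directly rather than exercising the prim/composite backward path as the PR's test does):

```python
import unittest
import numpy as np
import paddle
import paddle.nn.functional as F

class TestReluBackwardSketch(unittest.TestCase):
    """Hypothetical sketch: compare relu's backward against the analytic VJP."""

    def test_relu_grad(self):
        x_np = np.random.randn(4, 8).astype("float32")
        x = paddle.to_tensor(x_np, stop_gradient=False)
        y = F.relu(x)
        # d(relu)/dx is 1 where x > 0, else 0; seed the VJP with ones.
        (dx,) = paddle.grad(y, x, grad_outputs=paddle.ones_like(y))
        expected = (x_np > 0).astype("float32")
        np.testing.assert_allclose(dx.numpy(), expected, rtol=1e-6)
```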

@@ -97,6 +97,7 @@ def composite_batchnorm(
    batch_mean = zeros(run_mean.shape, run_mean.dtype)
    batch_var = zeros(run_var.shape, run_var.dtype)
    if not use_run_stat:

Reviewer (Contributor): unnecessary change

JiabinYang (Author): The pre-commit hook did this.

@cyber-pioneer (Contributor) left a comment:

LGTM

@JiabinYang merged commit 604b7a5 into PaddlePaddle:develop on Mar 20, 2023