
add cumsum prim backward #50565

Merged
28 commits merged into PaddlePaddle:develop on Feb 28, 2023

Conversation

GGBond8488
Contributor

PR types

Others

PR changes

OPs

Describe

add cumsum prim backward

@paddle-bot

paddle-bot bot commented Feb 16, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@GGBond8488
Contributor Author

GGBond8488 commented Feb 17, 2023

A bug introduced by #49518:
with axis=None, cumsum_grad now behaves differently from how it did before that PR was merged, and the new behavior also differs from both torch and autograd, so it is presumably incorrect. For that reason, the axis=None case is not tested here.
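
For reference, a minimal numpy sketch of the axis=None behavior the comment treats as correct (flatten, accumulate, and a reversed cumsum of the upstream gradient reshaped back to the input shape); the helper names are illustrative, not Paddle internals:

import numpy as np

def cumsum_flatten(x):
    # axis=None: flatten first, then accumulate, as numpy does
    return np.cumsum(x.reshape(-1))

def cumsum_flatten_grad(x, out_grad):
    # the gradient of a cumsum is a reversed cumsum of the upstream gradient,
    # reshaped back to the original input shape
    g = np.cumsum(out_grad[::-1])[::-1]
    return g.reshape(x.shape)

x = np.random.random((5, 6)).astype("float64")
out_grad = np.ones(x.size, dtype="float64")
print(cumsum_flatten(x).shape)             # (30,)
print(cumsum_flatten_grad(x, out_grad).shape)  # (5, 6)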

Contributor

@JiabinYang JiabinYang left a comment

add cinn test maybe

@GGBond8488
Contributor Author

GGBond8488 commented Feb 22, 2023

add cinn test maybe

see https://github.com/PaddlePaddle/CINN/blob/develop/cinn/frontend/op_mappers/paddle/cumsum.cc

Compared with the Paddle cumsum operator, CINN's cumsum operator lacks some parameters, so the two cannot be aligned when hooking this op into CINN. Consider adding an operator mapping or enhancing the CINN cumsum operator in the future.
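
For context, the attributes Paddle's cumsum carries beyond a plain axis (flatten, exclusive, reverse, as read in the composite-grad snippet later in this thread) can be sketched roughly in numpy as follows; this is an illustration of the semantics only, not CINN or Paddle code, and which of these CINN actually supports is not spelled out here:

import numpy as np

def cumsum_ref(x, axis=-1, flatten=False, exclusive=False, reverse=False):
    # flatten: accumulate over the flattened input instead of a single axis
    if flatten:
        x, axis = x.reshape(-1), 0
    # reverse: accumulate from the end of the axis towards the start
    if reverse:
        x = np.flip(x, axis)
    out = np.cumsum(x, axis)
    # exclusive: shift the result so each position excludes its own element
    if exclusive:
        zeros = np.zeros_like(np.take(out, [0], axis))
        out = np.concatenate([zeros, np.delete(out, -1, axis)], axis)
    if reverse:
        out = np.flip(out, axis)
    return out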

Contributor

@JiabinYang JiabinYang left a comment

one comment


return res

# def test_cinn(self):
Contributor

Leave a TODO here and remove this test; no commented-out code should be pushed.

@GGBond8488 GGBond8488 closed this Feb 24, 2023
@GGBond8488 GGBond8488 reopened this Feb 24, 2023
self.attrs = {'axis': 2}
self.inputs = {'X': np.random.random((5, 6, 10)).astype("float64")}
self.outputs = {'Out': self.inputs['X'].cumsum(axis=2)}

def test_check_output(self):
self.check_output()
self.check_output(check_prim=True)
Contributor

Base operators have no forward decomposition rule, so there is no need to add a composite-rule check to the forward test; this parameter can be removed.

Contributor Author

Removed; the same applies below.
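
For reference, a minimal sketch of the test pattern after this change: check_prim stays only on the backward check, since the forward of a base op has no decomposition rule to exercise. The import path, class name, and the prim_op_type/python_api attributes below follow the usual pattern of test_cumsum_op.py and are assumptions, not the exact PR contents:

import numpy as np
import paddle
from op_test import OpTest  # assumed local test-utility import

class TestCumsumPrimSketch(OpTest):
    def setUp(self):
        self.op_type = "cumsum"
        self.prim_op_type = "prim"          # enable the composite backward path
        self.python_api = paddle.cumsum
        self.attrs = {'axis': 2}
        self.inputs = {'X': np.random.random((5, 6, 10)).astype("float64")}
        self.outputs = {'Out': self.inputs['X'].cumsum(axis=2)}

    def test_check_output(self):
        # no forward decomposition rule for a base op, so no check_prim here
        self.check_output()

    def test_check_grad(self):
        # the composite backward added by this PR is exercised here
        self.check_grad(['X'], 'Out', check_prim=True)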

@@ -138,35 +144,41 @@ def setUp(self):
}

def test_check_output(self):
self.check_output()
self.check_output(check_prim=True)
Contributor

Same as above.

self.attrs = {'axis': 1}
self.inputs = {'X': np.random.random((5, 6, 10)).astype("float64")}
self.outputs = {'Out': self.inputs['X'].cumsum(axis=1)}

def test_check_output(self):
self.check_output()
self.check_output(check_prim=True)
Contributor

Same as above.

self.attrs = {'axis': 0}
self.inputs = {'X': np.random.random((5, 6, 10)).astype("float64")}
self.outputs = {'Out': self.inputs['X'].cumsum(axis=0)}

def test_check_output(self):
self.check_output()
self.check_output(check_prim=True)
Contributor

Same as above.

self.inputs = {'X': np.random.random((5, 20)).astype("float64")}
self.outputs = {'Out': self.inputs['X'].cumsum(axis=1)}

def test_check_output(self):
self.check_output()
self.check_output(check_prim=True)
Contributor

Same as above.

Comment on lines 348 to 349
attrs :
axis : axis
Contributor

Attributes whose names are the same on both sides do not need to be configured in this mapping.

Contributor Author

Fixed.

Contributor

@zyfncg zyfncg left a comment

LGTM for op_compat

Contributor

@JiabinYang JiabinYang left a comment

one comment

bool flatten = static_cast<bool>(this->Attr<bool>("flatten"));
bool exclusive = static_cast<bool>(this->Attr<bool>("exclusive"));
bool reverse = static_cast<bool>(this->Attr<bool>("reverse"));
VLOG(6) << "Runing add_grad composite func";
Contributor

cumsum?
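
For reference, the composite backward being wired up here is, as I understand it, just another cumsum of out_grad with the reverse flag flipped; a small numpy sanity check of that identity for the plain axis-only case (illustrative only, not the PR's code):

import numpy as np

x_shape, axis = (5, 6, 10), 2
out_grad = np.random.random(x_shape).astype("float64")

# analytic gradient of y = cumsum(x, axis): a reversed cumsum of out_grad
x_grad = np.flip(np.cumsum(np.flip(out_grad, axis), axis), axis)

# compare one slice against the definition: dL/dx_i = sum over k >= i of dL/dy_k
i = 3
expected = out_grad.take(np.arange(i, x_shape[axis]), axis=axis).sum(axis=axis)
np.testing.assert_allclose(x_grad.take(i, axis=axis), expected)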

Contributor

@JiabinYang JiabinYang left a comment

LGTM, fix the problem in the next PR.

@JiabinYang JiabinYang merged commit ca2b609 into PaddlePaddle:develop Feb 28, 2023