
[AMP OP&Test] add bf16 fp16 type support for interpolate #51153

Merged: 2 commits into PaddlePaddle:develop on Mar 6, 2023

Conversation

Courtesy-Xs (Contributor):

PR types: Others

PR changes: OPs

Describe: Add fp16 and bf16 support for interpolate in phi.
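For reference, phi kernels declare their supported dtypes at registration time, so a change like this mostly amounts to appending phi::dtype::float16 and phi::dtype::bfloat16 to the relevant PD_REGISTER_KERNEL lists. A minimal sketch of the pattern (the kernel name, backend, and surrounding dtype list are illustrative, not copied from this PR):

    // Illustrative sketch only; the real registrations live in the
    // interpolate kernel files under paddle/phi/kernels/.
    PD_REGISTER_KERNEL(bilinear_interp,       // kernel name (assumed)
                       GPU,                   // backend (assumed)
                       ALL_LAYOUT,
                       phi::BilinearInterpKernel,
                       float,
                       double,
                       phi::dtype::float16,   // added: fp16 support
                       phi::dtype::bfloat16,  // added: bf16 support
                       int,
                       int64_t) {}

Registering the dtype is usually paired with making the kernel body numerically safe for low-precision types, which is what the grad-kernel change discussed below touches.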

paddle-bot (bot) commented Mar 3, 2023:

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

zhhsplendid (Member) commented:

After discussing with the developer: this op previously registered uint8 and int64 for the forward kernel but not for the backward one, which makes CI report missing data types. This PR only changes fp16 and bf16, so it is unrelated to those registrations; uint8 and int64 were probably added for inference.

Therefore, I approve the data-type-registration CI checks for this PR.

@ZzSean ZzSean changed the title add bf16 fp16 type support for interpolate [AMP OP&Test] add bf16 fp16 type support for interpolate Mar 6, 2023
@@ -105,13 +110,14 @@ static void BilinearInterpolationGrad(const DenseTensor& output_grad,
       for (int j = 0; j < c; j++) {  // loop for channels
         // bilinear interpolation grad
         if (data_layout == DataLayout::kNCHW) {
-          const T grad = output_grad_t(i, j, k, l);
+          // const T grad = output_grad_t(i, j, k, l);
Contributor (review comment on the commented-out line):

Remove the commented-out code; fix it in the next PR.

Courtesy-Xs (Contributor, Author):

done; see the corresponding PR #51660.
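For context on the line under discussion: the common AMP pattern in these grad kernels is to cast fp16/bf16 values to a wider compute type before accumulating, which is presumably why the original const T grad line was replaced. A hedged sketch of that pattern, assuming phi::dtype::MPTypeTrait (which maps float16/bfloat16 to float); the weight and index names are illustrative, not taken from the PR:

    // MT is the high-precision compute type for T (float for fp16/bf16).
    using MT = typename phi::dtype::MPTypeTrait<T>::Type;
    // Read the upstream gradient in compute precision...
    const MT grad = static_cast<MT>(output_grad_t(i, j, k, l));
    // ...accumulate the weighted contribution, then cast back to T.
    // (w: MT-typed bilinear weight; y/x: neighbor indices -- illustrative.)
    input_grad_t(i, j, y, x) += static_cast<T>(w * grad);

Accumulating in float and casting back once per write avoids the precision loss that repeated low-precision additions would cause.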

@ZzSean ZzSean merged commit 2f2bf4e into PaddlePaddle:develop Mar 6, 2023
@Courtesy-Xs Courtesy-Xs deleted the phi_interp branch July 7, 2023 03:15