
[Hackathon No.60] Improve FP16/BF16 unit tests for the prelu, clip_by_norm, and multi_dot operators #52666

Merged
merged 3 commits on Apr 18, 2023

Conversation

co63oc
Contributor

@co63oc co63oc commented Apr 8, 2023

PR types

Others

PR changes

Others

Describe

Improve the FP16/BF16 unit tests for the prelu, clip_by_norm, and multi_dot operators.
[image]

Documentation PR:
PaddlePaddle/docs#5789

The multi_dot gradient test error is around 0.2.
[image]

@paddle-bot
paddle-bot bot commented Apr 8, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@@ -63,7 +63,9 @@ def clip_by_norm(x, max_norm, name=None):
         return _legacy_C_ops.clip_by_norm(x, 'max_norm', max_norm)

     helper = LayerHelper("clip_by_norm", **locals())
-    check_variable_and_dtype(x, 'X', ['float32', 'float16'], 'clip_by_norm')
+    check_variable_and_dtype(
+        x, 'X', ['float16', 'float32', 'float64', 'uint16'], 'clip_by_norm'
+    )
Contributor (review comment):

This op does not support float64.

Contributor Author:

CI reported that there is no double type; fixed by removing float64.
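The dtype whitelist being discussed can be illustrated with a minimal sketch. This is a hypothetical stand-in for Paddle's `check_variable_and_dtype`, not its actual implementation; note that BF16 tensors appear in the whitelist as `uint16`:

```python
def check_dtype(dtype, allowed, op_name):
    # Minimal stand-in for a dtype whitelist check: reject any dtype
    # the op does not register a kernel for (e.g. float64 here).
    if dtype not in allowed:
        raise TypeError(
            f"{op_name} received unsupported dtype {dtype!r}; "
            f"allowed dtypes are {allowed}"
        )

# After the review fix, float64 is no longer in the allowed list.
allowed = ['float16', 'float32', 'uint16']
check_dtype('float16', allowed, 'clip_by_norm')  # passes silently
```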


def get_dtype(self):
self.np_dtype = np.uint16
return "float32"
Contributor (review comment):

This sets self.dtype to float32; it should be uint16.

Contributor Author:

Fixed.
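For context on why the test dtype is `uint16`: BF16 keeps only the upper 16 bits of a float32, and NumPy has no native bfloat16 type, so BF16 test data is commonly carried around as uint16 bit patterns. A minimal NumPy sketch of the round trip (truncation only; Paddle's own conversion helpers may round to nearest instead):

```python
import numpy as np

def float32_to_bf16_bits(x):
    # Keep the upper 16 bits of each float32: sign, 8 exponent bits,
    # and the top 7 mantissa bits, stored as uint16.
    bits = np.asarray(x, dtype=np.float32).view(np.uint32)
    return (bits >> 16).astype(np.uint16)

def bf16_bits_to_float32(b):
    # Re-expand: shift the bf16 bits back up, zero-filling the low mantissa.
    return (np.asarray(b, dtype=np.uint16).astype(np.uint32) << 16).view(np.float32)

x = np.array([1.0, 3.14159, -0.5], dtype=np.float32)
y = bf16_bits_to_float32(float32_to_bf16_bits(x))
# Exactly representable values survive; 3.14159 loses the low mantissa bits.
```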

place,
['X', 'Alpha'],
'Out',
max_relative_error=max_relative_error,
Contributor (review comment):

For the backward check, try the default value first and adjust only if it fails to pass; the same applies to float16.

Contributor Author:

Fixed.

place,
['X', 'Alpha'],
'Out',
max_relative_error=max_relative_error,
Contributor (review comment):

Writing it this way is not ideal; just remove the max_relative_error setting and use the default value.

Contributor Author:

Fixed; the default value is now used.
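For reference, the quantity being configured here is a maximum relative error between the analytic and numeric gradients. A minimal sketch of such a comparison (an illustrative formula, not Paddle's actual gradient-check implementation):

```python
import numpy as np

def max_relative_error(analytic, numeric, eps=1e-8):
    # Largest elementwise |analytic - numeric| / max(|analytic|, eps);
    # eps guards against division by near-zero gradients.
    analytic = np.asarray(analytic, dtype=np.float64)
    numeric = np.asarray(numeric, dtype=np.float64)
    return float(np.max(np.abs(analytic - numeric)
                        / np.maximum(np.abs(analytic), eps)))

err = max_relative_error([1.0, 2.0], [1.0, 2.02])  # ~ 0.01
```

Tightening this threshold (using the framework default instead of a hand-tuned value) makes the test more sensitive to genuine kernel bugs.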

self.check_grad_with_place(self.place, ['x0'], 'Out')
self.check_grad_with_place(self.place, ['x1'], 'Out')
except:
self.check_grad_with_place(self.place, ['x0'], 'Out', atol=0.2)
Contributor (review comment):

Judging from the screenshots in the PR, the difference is quite large. Suggest either increasing numeric_grad_delta or writing a user_defined_grad.

Contributor Author:

Fixed by adding numeric_grad_delta, set to 0.01 in the test; values of 0.005 and 0.008 failed the test.
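Why a larger numeric_grad_delta can help: a central finite difference divides a rounding error of the order of the arithmetic precision by 2*delta, so in low-precision arithmetic a tiny delta amplifies noise, while a larger delta trades that noise for truncation error. A toy illustration, using rounding to two decimals as a stand-in for low-precision evaluation (hypothetical values, not the PR's actual tests):

```python
def numeric_grad(f, x, delta):
    # Central difference: (f(x+delta) - f(x-delta)) / (2*delta).
    return (f(x + delta) - f(x - delta)) / (2.0 * delta)

def noisy_square(x):
    # x**2 rounded to 2 decimals, simulating low-precision evaluation.
    return round(x * x, 2)

# True gradient of x**2 at x=3.0 is 6.0.
g_small = numeric_grad(noisy_square, 3.0, 1e-4)  # rounding swamps the signal
g_large = numeric_grad(noisy_square, 3.0, 0.01)  # close to the true gradient
```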

@co63oc
Contributor Author

co63oc commented Apr 15, 2023

@luotao1 @ZzSean CI has finished.
The PR-CI-Inference failure is a generated-file-size issue. [image]

PR-CI-Static-Check reports that the double type is not supported.
[image]

Contributor

@ZzSean ZzSean left a comment

LGTM

Contributor

@Aurelius84 Aurelius84 left a comment

LGTM for dtype registration

@luotao1 luotao1 merged commit c3055d2 into PaddlePaddle:develop Apr 18, 2023
24 checks passed
@co63oc co63oc deleted the prelu branch April 18, 2023 15:43
jjyaoao pushed a commit to jjyaoao/Paddle that referenced this pull request Apr 19, 2023
lijialin03 pushed a commit to lijialin03/Paddle that referenced this pull request Apr 25, 2023
ZzSean pushed a commit to ZzSean/Paddle that referenced this pull request May 5, 2023