[PIR API adaptor No.35, 40] Migrate paddle.nn.ChannelShuffle/ClipGradByNorm into pir #60192
Conversation
Your PR was submitted successfully. Thank you for contributing to the open source project!
The issue has been received and is being investigated.
TestPirGradientClipByNorm has been added and the tests pass.
nice work~
However, the pir unit test for ClipGradByNorm still needs to cover test_none_grad; otherwise CI coverage will not pass.
@0x45f please take a look at this issue: pir.Value does not have the original "need_clip" attribute, so the pir unit test cannot be aligned with the old-IR unit test and CI coverage falls short. Should we remove the check on the need_clip attribute in the _pir_clip function?
Here you can try adding the following two lines at the end of the create_parameter function in python/paddle/pir/core.py, just before the return:

    need_clip = kwargs.get('need_clip', True)
    setattr(param, 'need_clip', need_clip)
After adding that code to python/paddle/pir/core.py, do we still need to add the test_none_grad unit test to TestPirGradientClipByNorm?
After modifying core.py, you still need to add the test_none_grad unit test to TestPirGradientClipByNorm. This:

    x = (
        base.default_main_program()
        .global_block()
        .create_parameter(
            name="x", shape=[2, 3], dtype="float32", need_clip=False
        )
    )

can be changed to:

    with paddle.pir_utils.IrGuard():
        main = paddle.static.Program()
        startup = paddle.static.Program()
        with paddle.static.program_guard(main, startup):
            x = paddle.pir.core.create_parameter(
                dtype="float32",
                shape=[2, 3],
                name="x",
                initializer=paddle.nn.initializer.Constant(value=0.5),
                need_clip=False,
            )
Commented out the core.py code and added test_none_grad.
Please merge the latest branch first. Then, in the create_parameter function of python/paddle/pir/core.py, following the way the regularizer attribute is added in #60345, add the following two lines:

    need_clip = kwargs.get('need_clip', True)
    param.need_clip = need_clip
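The suggested pattern can be sketched outside of Paddle. The Param class and create_parameter function below are hypothetical stand-ins for the pir parameter machinery, used only to show how the kwargs-based attribute attachment behaves:

```python
# Minimal sketch of the suggested pattern; Param is a hypothetical
# stand-in, not Paddle's actual pir parameter type.
class Param:
    def __init__(self, name):
        self.name = name


def create_parameter(name, **kwargs):
    param = Param(name)
    # The two suggested lines: default need_clip to True unless the
    # caller explicitly passes need_clip=False.
    need_clip = kwargs.get('need_clip', True)
    param.need_clip = need_clip
    return param
```

With this in place, a parameter created with need_clip=False carries the attribute, and callers that never pass need_clip get the old default of True, which is what keeps the pir test aligned with the old-IR behavior.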
python/paddle/nn/clip.py
Outdated
    for p, g in params_grads:
        if g is None:
            continue
        if getattr(p, 'stop_gradient', True) is True:
Suggested change:

    - if getattr(p, 'stop_gradient', True) is True:
    + if getattr(p, 'need_clip', True) is False:
This line should stay consistent with the old-IR static-graph behavior, which checks the need_clip attribute.
PR types
Others
PR changes
APIs
Description
PIR API full-coverage migration:
paddle.nn.ClipGradByNorm
References clip_by_norm; clip_by_norm is migrated to pir and its unit tests are updated. test_clip_by_norm_op coverage: 4/4. test_gradient_clip coverage: 2/2.
paddle.nn.ChannelShuffle
References paddle.nn.functional.channel_shuffle; channel_shuffle is migrated to pir and its unit tests are updated. Coverage: 6/6 (TestChannelShuffleError passes).
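For intuition about what the migrated ChannelShuffle op computes, here is a NumPy reference sketch of channel shuffle on NCHW input. This is only an illustration of the operation's semantics, not Paddle's kernel:

```python
import numpy as np


def channel_shuffle(x, groups):
    """Shuffle channels of an NCHW array across `groups` groups."""
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    # Split channels into groups, swap the group and per-group axes,
    # then flatten back: this interleaves channels across groups.
    x = x.reshape(n, groups, c // groups, h, w)
    x = x.transpose(0, 2, 1, 3, 4)
    return x.reshape(n, c, h, w)


x = np.arange(4).reshape(1, 4, 1, 1)
out = channel_shuffle(x, groups=2)
# channels [0, 1, 2, 3] become [0, 2, 1, 3]
```

With 4 channels and 2 groups, channel order [0, 1, 2, 3] becomes [0, 2, 1, 3], which is the interleaving that paddle.nn.ChannelShuffle performs.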