```python
import paddle
from paddle import nn
import numpy as np


class Model(nn.Layer):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(2, 1)

    def forward(self, x, y):
        z = paddle.concat([x, y], axis=-1)
        z = self.fc1(z)
        z = nn.functional.silu(z)
        gx = paddle.grad(z, x, create_graph=True)[0]
        return paddle.abs(gx).pow(2).sum()


model = Model()

# data
N = 2
x = paddle.randn([N, 1])
x.stop_gradient = False
y = paddle.randn([N, 1])
y.stop_gradient = False

# dynamic
loss_dynamic = model(x, y)
loss_dynamic.backward()
grads1 = [param.grad.clone().detach() for param in model.parameters()]
model.clear_gradients()

# static
st_model = paddle.jit.to_static(model)
loss_static = st_model(x, y)
loss_static.backward()
grads2 = [param.grad.clone().detach() for param in model.parameters()]

# check
for p1, p2 in zip(grads1, grads2):
    np.testing.assert_allclose(p1.numpy(), p2.numpy(), 1e-3, 1e-3)
```
Run command:

```shell
FLAGS_prim_all=False FLAGS_enable_pir_in_executor=true FLAGS_enable_pir_api=True python test_run.py
```
Result:

```
AssertionError:
Not equal to tolerance rtol=0.001, atol=0.001

Mismatched elements: 8 / 8 (100%)
Max absolute difference: 0.08932383
Max relative difference: 3.0927124
 x: array([[ 0.050079,  0.003131,  0.12114 , -0.012295],
       [-0.106463,  0.056415, -0.26742 ,  0.029972]], dtype=float32)
 y: array([[ 0.012236, -0.017195,  0.031816, -0.003915],
       [-0.068708,  0.077533, -0.178814,  0.021687]], dtype=float32)
```
Since silu has no second-order composite operator, the expected behavior is a runtime error rather than silently continuing the computation and producing incorrect gradients.
If the activation function is switched from silu to tanh, which does have a second-order operator, the test passes.
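For reference, silu's second derivative does exist in closed form (silu(x) = x·sigmoid(x)), so the failure above reflects a missing composite operator in the static-graph decomposition, not a mathematical limitation. A minimal NumPy sketch (helper names are illustrative, not Paddle APIs) verifying the closed-form second derivative against a central finite difference:

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def silu_d1(x):
    # first derivative of silu(x) = x * sigmoid(x)
    s = sigmoid(x)
    return s * (1.0 + x * (1.0 - s))


def silu_d2(x):
    # closed-form second derivative:
    # sigmoid(x) * (1 - sigmoid(x)) * (2 + x * (1 - 2 * sigmoid(x)))
    s = sigmoid(x)
    return s * (1.0 - s) * (2.0 + x * (1.0 - 2.0 * s))


x = np.linspace(-4.0, 4.0, 9)
eps = 1e-5
# central finite difference of the first derivative
fd = (silu_d1(x + eps) - silu_d1(x - eps)) / (2.0 * eps)
print(np.max(np.abs(fd - silu_d2(x))))  # should be tiny
```

This is only to illustrate what a correct second-order silu kernel would need to compute; the actual fix belongs in Paddle's operator decomposition.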
This will be resolved in #63914.