
【PIR API adaptor No.1、20、103、104、120】 Migrate L1Loss/BCELoss/HSigmoidLoss/SmoothL1Loss/KLDivLoss into pir #58708

Merged: 19 commits merged into PaddlePaddle:develop on Dec 26, 2023

Conversation

DrRyanHuang (Member) commented on Nov 6, 2023

PR types

Others

PR changes

APIs

Description

  • paddle.nn.L1Loss
  • paddle.nn.BCELoss
  • paddle.nn.HSigmoidLoss
  • paddle.nn.SmoothL1Loss
  • paddle.nn.KLDivLoss
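
    A minimal sketch (not taken from the PR) of exercising one of these migrated losses in a static-graph program. With PIR enabled, for example via the test suite's paddle.pir_utils.IrGuard (an assumption on my part), the same user code should build a PIR program instead of a legacy one:

    ```python
    import numpy as np
    import paddle

    # Build a static-graph program that uses one of the migrated losses.
    paddle.enable_static()
    main = paddle.static.Program()
    with paddle.static.program_guard(main):
        x = paddle.static.data('x', shape=[4, 8], dtype='float32')
        y = paddle.static.data('y', shape=[4, 8], dtype='float32')
        loss = paddle.nn.L1Loss(reduction='mean')(x, y)

    # Run it once on CPU with random inputs.
    exe = paddle.static.Executor(paddle.CPUPlace())
    (out,) = exe.run(
        main,
        feed={
            'x': np.random.rand(4, 8).astype('float32'),
            'y': np.random.rand(4, 8).astype('float32'),
        },
        fetch_list=[loss],
    )
    print(out)  # scalar mean absolute error
    ```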

  • In test/legacy_test/test_l1_loss.py, self.assertTrue('aaa' in result3.name) is not supported in PIR mode, so TestClassL1Loss.run_static and TestFunctionalL1Loss.run_static were disabled (see the sketch after this list)

  • test/legacy_test/test_hsigmoid_op.py does not enable test_hs_grad_with_sparse, because the paddle.static.nn.embedding and base.DataFeeder it calls have presumably not been adapted to PIR yet
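
For reference, a hedged sketch of the disabled pattern: the 'aaa' name check comes from the test, while the in_pir_mode guard is just one way the check could be skipped, not necessarily what the test actually does:

```python
import paddle
from paddle.base import framework  # assumed home of in_pir_mode in this era

paddle.enable_static()
with paddle.static.program_guard(paddle.static.Program()):
    x = paddle.static.data('x', shape=[10, 10], dtype='float32')
    y = paddle.static.data('y', shape=[10, 10], dtype='float32')
    result3 = paddle.nn.functional.l1_loss(x, y, name='aaa')
    # Legacy-IR Variables expose the user-supplied name; PIR Values do
    # not carry names the same way, hence the assertion fails under PIR.
    if not framework.in_pir_mode():
        assert 'aaa' in result3.name
```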

DrRyanHuang (Member, Author) commented:

For now this PR needs to wait for #58876 to be merged first.

MarioLulab (Contributor) commented on Dec 19, 2023

nn.initializer.Bilinear/Assign has been migrated to PIR: #60114
This PR can move forward now~


0x45f (Contributor) commented on Dec 22, 2023

HsigmoidLossOp::Vjp hits an error; the stack trace is:

2023-12-20 03:24:01 I1219 19:24:00.857872 24887 pass.cc:38] --- detected [0] subgraphs!
2023-12-20 03:24:01 test_hsigmoid_op failed
2023-12-20 03:24:01  .............E..
2023-12-20 03:24:01 ======================================================================
2023-12-20 03:24:01 ERROR: test_check_grad (test_hsigmoid_op.TestHSigmoidOpWithCostumTreeWithoutBias)
2023-12-20 03:24:01 ----------------------------------------------------------------------
2023-12-20 03:24:01 Traceback (most recent call last):
2023-12-20 03:24:01   File "/workspace/Paddle/build/test/legacy_test/test_hsigmoid_op.py", line 490, in test_check_grad
2023-12-20 03:24:01     self.check_grad(
2023-12-20 03:24:01   File "/workspace/Paddle/build/python/op_test.py", line 2969, in check_grad
2023-12-20 03:24:01     self.check_grad_with_place(
2023-12-20 03:24:01   File "/workspace/Paddle/build/python/op_test.py", line 3288, in check_grad_with_place
2023-12-20 03:24:01     pir_grad = self._get_ir_gradient(
2023-12-20 03:24:01   File "/workspace/Paddle/build/python/op_test.py", line 3814, in _get_ir_gradient
2023-12-20 03:24:01     grad_inputs = ir_grad(
2023-12-20 03:24:01   File "/workspace/Paddle/build/python/paddle/autograd/ir_backward.py", line 1077, in grad
2023-12-20 03:24:01     input_grad = calc_gradient(outputs, inputs, grad_outputs, no_grad_set)
2023-12-20 03:24:01   File "/workspace/Paddle/build/python/paddle/autograd/ir_backward.py", line 962, in calc_gradient
2023-12-20 03:24:01     input_to_inputgrad_map = calc_gradient_helper(
2023-12-20 03:24:01   File "/workspace/Paddle/build/python/paddle/autograd/ir_backward.py", line 901, in calc_gradient_helper
2023-12-20 03:24:01     append_backward_ops(
2023-12-20 03:24:01   File "/workspace/Paddle/build/python/paddle/autograd/ir_backward.py", line 759, in append_backward_ops
2023-12-20 03:24:01     input_grads = paddle.framework.core.call_vjp(
2023-12-20 03:24:01 ValueError: 
2023-12-20 03:24:01 
2023-12-20 03:24:01 0   paddle::dialect::VjpInterface::Vjp(pir::Operation*, std::vector<std::vector<pir::Value, std::allocator<pir::Value> >, std::allocator<std::vector<pir::Value, std::allocator<pir::Value> > > > const&, std::vector<std::vector<pir::OpResult, std::allocator<pir::OpResult> >, std::allocator<std::vector<pir::OpResult, std::allocator<pir::OpResult> > > > const&, std::vector<std::vector<pir::OpResult, std::allocator<pir::OpResult> >, std::allocator<std::vector<pir::OpResult, std::allocator<pir::OpResult> > > > const&, std::vector<std::vector<bool, std::allocator<bool> >, std::allocator<std::vector<bool, std::allocator<bool> > > > const&)
2023-12-20 03:24:01 1   paddle::dialect::VjpInterface::Model<paddle::dialect::HsigmoidLossOp>::Vjp(pir::Operation*, std::vector<std::vector<pir::Value, std::allocator<pir::Value> >, std::allocator<std::vector<pir::Value, std::allocator<pir::Value> > > > const&, std::vector<std::vector<pir::OpResult, std::allocator<pir::OpResult> >, std::allocator<std::vector<pir::OpResult, std::allocator<pir::OpResult> > > > const&, std::vector<std::vector<pir::Value, std::allocator<pir::Value> >, std::allocator<std::vector<pir::Value, std::allocator<pir::Value> > > > const&, std::vector<std::vector<bool, std::allocator<bool> >, std::allocator<std::vector<bool, std::allocator<bool> > > > const&)
2023-12-20 03:24:01 2   paddle::dialect::HsigmoidLossOp::Vjp(pir::Operation*, std::vector<std::vector<pir::Value, std::allocator<pir::Value> >, std::allocator<std::vector<pir::Value, std::allocator<pir::Value> > > > const&, std::vector<std::vector<pir::OpResult, std::allocator<pir::OpResult> >, std::allocator<std::vector<pir::OpResult, std::allocator<pir::OpResult> > > > const&, std::vector<std::vector<pir::Value, std::allocator<pir::Value> >, std::allocator<std::vector<pir::Value, std::allocator<pir::Value> > > > const&, std::vector<std::vector<bool, std::allocator<bool> >, std::allocator<std::vector<bool, std::allocator<bool> > > > const&)
2023-12-20 03:24:01 3   paddle::primitive::hsigmoid_loss_vjp(paddle::Tensor const&, paddle::Tensor const&, paddle::Tensor const&, paddle::optional<paddle::Tensor> const&, paddle::optional<paddle::Tensor> const&, paddle::optional<paddle::Tensor> const&, paddle::Tensor const&, paddle::Tensor const&, int, bool, std::vector<std::vector<bool, std::allocator<bool> >, std::allocator<std::vector<bool, std::allocator<bool> > > > const&)
2023-12-20 03:24:01 4   std::tuple<paddle::Tensor, paddle::Tensor, paddle::Tensor> paddle::primitive::backend::hsigmoid_loss_grad<paddle::primitive::LazyTensor>(paddle::Tensor const&, paddle::Tensor const&, paddle::Tensor const&, paddle::optional<paddle::Tensor> const&, paddle::optional<paddle::Tensor> const&, paddle::optional<paddle::Tensor> const&, paddle::Tensor const&, paddle::Tensor const&, int, bool)
2023-12-20 03:24:01 5   paddle::dialect::hsigmoid_loss_grad(pir::Value const&, pir::Value const&, pir::Value const&, paddle::optional<pir::Value> const&, paddle::optional<pir::Value> const&, paddle::optional<pir::Value> const&, pir::Value const&, pir::Value const&, int, bool)
2023-12-20 03:24:01 6   paddle::dialect::HsigmoidLossGradOp::Build(pir::Builder&, pir::OperationArgument&, pir::Value, pir::Value, pir::Value, pir::Value, pir::Value, pir::Value, pir::Value, pir::Value, int, bool)
2023-12-20 03:24:01 7   paddle::dialect::IrMetaTensor::share_meta(phi::MetaTensor const&)
2023-12-20 03:24:01 8   phi::enforce::EnforceNotMet::EnforceNotMet(common::ErrorSummary const&, char const*, int)
2023-12-20 03:24:01 9   phi::enforce::GetCurrentTraceBackString[abi:cxx11](bool)
2023-12-20 03:24:01 
2023-12-20 03:24:01 ----------------------
2023-12-20 03:24:01 Error Message Summary:
2023-12-20 03:24:01 ----------------------
2023-12-20 03:24:01 InvalidArgumentError: The current MetaTensor is not initialized.
2023-12-20 03:24:01   [Hint: Expected meta_tensor.initialized() == true, but received meta_tensor.initialized():0 != true:1.] (at ../paddle/fluid/pir/dialect/operator/ir/ir_meta_tensor.cc:22)

Colleagues on the relevant internal team are already working on this issue~
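
For context, a hedged sketch of the op configuration that trips the check: hsigmoid_loss with a custom tree (path_table/path_code) and no bias, so the backward op is built with several undefined optional inputs. The tree tensors below are illustrative stand-ins, not the test's actual values; the failing test replays this configuration through op_test in PIR mode, which is where HsigmoidLossOp::Vjp is invoked:

```python
import paddle

paddle.disable_static()

# Illustrative custom tree; values are made up and only meant to be
# shape-consistent, not a real Huffman tree.
x = paddle.randn([2, 3])
x.stop_gradient = False
label = paddle.to_tensor([0, 1], dtype='int64')
weight = paddle.randn([4, 3])
path_table = paddle.to_tensor([[0, 2, -1], [0, 1, -1]], dtype='int64')
path_code = paddle.to_tensor([[0, 0, -1], [0, 1, -1]], dtype='int64')

out = paddle.nn.functional.hsigmoid_loss(
    x, label, num_classes=4, weight=weight, bias=None,
    path_table=path_table, path_code=path_code,
)
# In the failing test, building the gradient for this op in PIR
# (HsigmoidLossOp::Vjp -> hsigmoid_loss_grad) hit the uninitialized
# MetaTensor check reported above, since bias is left undefined.
out.sum().backward()
```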

changeyoung98 (Contributor) commented on Dec 25, 2023

> HsigmoidLossOp::Vjp hits an error; the stack trace is: [full trace quoted above]

This issue has been fixed by #60264; please pull the latest code and test again.


MarioLulab (Contributor) left a comment:

LGTM

0x45f (Contributor) left a comment:

LGTM. The function-name changes can be made in a separate PR~

0x45f merged commit a3a3466 into PaddlePaddle:develop on Dec 26, 2023, with 29 checks passed.

DrRyanHuang deleted the loss branch on December 26, 2023 at 07:46.

Wanglongzhi2001 pushed a commit to Wanglongzhi2001/Paddle that referenced this pull request on Jan 7, 2024.