
【PIR Dist Op Reg No.24】 reg distributed_lookup_table #60911

Merged

Conversation

xiaoyewww
Contributor

PR types

Others

PR changes

Others

Description

#60436
Register the operator distributed_lookup_table.

Copy link

paddle-bot bot commented Jan 17, 2024

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

@paddle-bot paddle-bot bot added the contributor External developers label Jan 17, 2024
@luotao1 luotao1 added the HappyOpenSource 快乐开源活动issue与PR label Jan 18, 2024

import unittest

import test_op_transcriber
Contributor

Suggested change
import test_op_transcriber
import test_op_translator

Contributor Author

Thanks! Fixed; please review again.

func : DistributeLookupTableInferMeta
kernel :
func : distributed_lookup_table
data_type : ids
Contributor

Suggested change
data_type : ids
data_type : dtype

This needs to stay consistent with DistributedLookupTableOp::GetExpectedKernelType.

Contributor Author

Thanks, fixed!

@@ -374,6 +374,15 @@
data_type : fpn_rois
optional : rois_num, multi_level_rois_num

- op : distributed_lookup_table
args : (Tensor[] ids, Tensor w, int table_id = 0, bool is_distributed = false, str lookup_table_version = "lookup_table", int64_t padding_idx = -1, int dtype = 5, bool is_test = false)
Contributor

Suggested change
args : (Tensor[] ids, Tensor w, int table_id = 0, bool is_distributed = false, str lookup_table_version = "lookup_table", int64_t padding_idx = -1, int dtype = 5, bool is_test = false)
args : (Tensor[] ids, Tensor w, int table_id = 0, bool is_distributed = false, str lookup_table_version = "lookup_table", int64_t padding_idx = -1, DataType dtype = DataType::FLOAT32, bool is_test = false)
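For context on why the literal `5` corresponds to `DataType::FLOAT32` here: in the legacy attribute encoding, `dtype` is an integer code from Paddle's VarType enum. A small illustrative sketch of that mapping follows; the numbering below is an assumption based on the conventional framework.proto VarType values, not something quoted from this PR:

```python
# Hedged sketch: legacy integer dtype codes as used in old-style op attributes.
# The numbering below follows the conventional framework.proto VarType enum
# (an assumption for illustration, not quoted from the PR diff).
LEGACY_DTYPE_CODES = {
    0: "bool",
    1: "int16",
    2: "int32",
    3: "int64",
    4: "float16",
    5: "float32",
    6: "float64",
}


def legacy_code_to_name(code: int) -> str:
    """Translate a legacy integer dtype attribute into a readable type name."""
    return LEGACY_DTYPE_CODES[code]


print(legacy_code_to_name(5))  # float32
```

Under this mapping, replacing `int dtype = 5` with `DataType dtype = DataType::FLOAT32` preserves the default while making it self-describing.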

bool is_distributed,
const std::string& lookup_table_version,
int64_t padding_idx,
int dtype,
Contributor

Suggested change
int dtype,
DataType dtype,
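Assembling the fragments quoted in this thread with the reviewer's suggested changes applied, the registered op entry would presumably look something like the sketch below. This is reconstructed from the diff excerpts, not quoted from the merged file; the output field and any other fields not shown in the excerpts are omitted:

```yaml
- op : distributed_lookup_table
  args : (Tensor[] ids, Tensor w, int table_id = 0, bool is_distributed = false, str lookup_table_version = "lookup_table", int64_t padding_idx = -1, DataType dtype = DataType::FLOAT32, bool is_test = false)
  infer_meta :
    func : DistributeLookupTableInferMeta
  kernel :
    func : distributed_lookup_table
    data_type : dtype
```

Note that `data_type : dtype` keeps the kernel selection consistent with the legacy DistributedLookupTableOp::GetExpectedKernelType, as requested in review.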

@xingmingyyj
Contributor

You can rerun the CI; this should be ready to merge now.

@xiaoyewww
Contributor Author

You can rerun the CI; this should be ready to merge now.

All CI checks passed after the rerun. Please approve, thanks!

@@ -0,0 +1,56 @@
# Copyright (c) 2023 PaddlePaddle Authors. All Rights Reserved.
Contributor

2023 -> 2024

@kangguangli kangguangli merged commit 4cca092 into PaddlePaddle:develop Jan 25, 2024
29 checks passed
eee4017 pushed a commit to eee4017/Paddle that referenced this pull request Jan 30, 2024

* feat(pir): support distributed_lookup_table

* fix(pir): support distributed_lookup_table

* fix(pir): support distributed_lookup_table
@xiaoyewww xiaoyewww deleted the pir/distributed_lookup_table branch May 10, 2024 15:11