[PHI]Standardise some C++ API (Part5) #47860
Conversation
Your PR was submitted successfully. Thank you for your contribution to the open-source project!
```diff
   return KernelSignature("tril_triu", {"X"}, {"diagonal", "lower"}, {"Out"});
 }

 KernelSignature TrilTriuGradOpArgumentMapping(
     const ArgumentMappingContext& ctx) {
   return KernelSignature(
-      "tril_grad", {"Out@GRAD"}, {"diagonal", "lower"}, {"X@GRAD"});
+      "tril_triu_grad", {"Out@GRAD"}, {"diagonal", "lower"}, {"X@GRAD"});
 }
```
Could routing dynamic and static graph modes through two separate kernels plant a risk of inconsistency between the two modes?
I don't think so. It's similar to how the static graph goes through addRawKernel while the dynamic graph goes through addKernel.
The impact seems minor for now, but in principle we may later need to try consolidating the Raw and non-Raw kernels.
Let's merge this PR first; we can revisit whether adjustments are needed later, depending on how things go.
LGTM
PR types
Others
PR changes
Others
Describe
Standardise the hsigmoid_loss/layer_norm/matrix_rank/tril/triu C++ APIs. This is the last PR in the series.
Related changes for heterogeneous devices: PaddlePaddle/PaddleCustomDevice#204