
Flash att infer #59083

Merged · 13 commits merged into PaddlePaddle:develop on Nov 23, 2023
Conversation

@liuzhenhai93 (Contributor) commented on Nov 17, 2023

PR types

Others

PR changes

Others

Description

card-77754: sharding propagation inference for the flash attention forward and backward passes
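
For readers unfamiliar with the term, "切分推导" (sharding propagation) here means inferring the distributed sharding specs of an operator's outputs (and, for the backward pass, its gradients) from the specs of its inputs, so the auto-parallel engine can place flash attention without manual annotation. The sketch below illustrates that idea in plain Python; it is not Paddle's actual SPMD rule implementation, and the function name, the [batch, seqlen, num_heads, head_dim] layout, and the "-1 means replicated" convention are all assumptions made for illustration.

```python
# A minimal, self-contained sketch of sharding propagation for a flash
# attention forward pass. NOT Paddle's actual SpmdRule API: the function
# name, the [batch, seqlen, num_heads, head_dim] layout, and the
# convention "-1 = replicated, otherwise a mesh axis index" are assumed
# here purely for illustration.

def infer_flash_attention_forward_spmd(q_map, k_map, v_map):
    """Infer the dims mapping of the attention output from q, k, v.

    Each *_map assigns every tensor axis either a mesh axis index
    (sharded) or -1 (replicated). The output starts from q's mapping,
    since its batch and seqlen axes follow q; its head_dim axis follows
    v. If the inputs disagree on how batch or num_heads are sharded,
    this sketch falls back to replicated rather than modeling the
    communication a real rule would insert.
    """
    BATCH, SEQLEN, HEADS, HEAD_DIM = 0, 1, 2, 3
    out_map = list(q_map)
    for axis in (BATCH, HEADS):
        if not (q_map[axis] == k_map[axis] == v_map[axis]):
            out_map[axis] = -1  # conflicting shardings: replicate
    out_map[HEAD_DIM] = v_map[HEAD_DIM]  # output head_dim comes from v
    return out_map

# Example: all inputs sharded on batch (mesh axis 0) and num_heads (mesh axis 1).
q_map = [0, -1, 1, -1]
k_map = [0, -1, 1, -1]
v_map = [0, -1, 1, -1]
print(infer_flash_attention_forward_spmd(q_map, k_map, v_map))
# [0, -1, 1, -1] -> the output keeps the batch and num_heads sharding.
```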

paddle-bot commented on Nov 17, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

paddle-bot commented on Nov 17, 2023

❌ This PR was not created using the PR template. You can refer to this Demo.
Please use the PR template; it saves our maintainers' time so that more developers can get help.

chenwhql previously approved these changes on Nov 23, 2023
@GhostScreaming (Contributor) left a comment:

LGTM

@liuzhenhai93 reopened this on Nov 23, 2023
@GhostScreaming merged commit dee507b into PaddlePaddle:develop on Nov 23, 2023
28 checks passed
SecretXV pushed a commit to SecretXV/Paddle that referenced this pull request Nov 28, 2023
* polish

* polish

* polish

* polish

* polish

* polish

* polish

* polish

* polish
@liuzhenhai93 deleted the flash_att_infer branch on November 28, 2023 at 09:48
3 participants