
Add bfp16/int8 support into XDL GEMM operator #50

Merged
asroy merged 49 commits into develop from add_bfp16_int8_gemm on Nov 15, 2021

Conversation

@zjing14
Contributor

@zjing14 zjing14 commented Nov 11, 2021

@zjing14 zjing14 changed the base branch from master to develop on November 11, 2021, 18:08
@zjing14 zjing14 requested review from asroy and j4yan on November 11, 2021, 18:28
@zjing14 zjing14 changed the title from "Add bfp16/int8 support into xdlops gemm" to "Add bfp16/int8 support into Gemm" on Nov 11, 2021
@asroy
Contributor

asroy commented Nov 14, 2021

@zjing14 is #37 still needed after this one is merged?

@zjing14
Contributor Author

zjing14 commented Nov 15, 2021

@asroy No. PR #37 is fully included in this PR.

@asroy asroy changed the title from "Add bfp16/int8 support into Gemm" to "Add bfp16/int8 support into XDL GEMM operator" on Nov 15, 2021
@asroy asroy merged commit 3737bb0 into develop on Nov 15, 2021
@junliume junliume deleted the add_bfp16_int8_gemm branch on October 21, 2023, 06:09
carlushuang pushed a commit that referenced this pull request on Apr 26, 2024
* Load 1d gamma and beta for each thread

* Adjust loop order; avoid casting gamma & beta repeatedly

* Refine naming
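The commit notes above describe hoisting the gamma/beta conversion out of the inner loop so each thread casts its 1d gamma and beta slice only once. A minimal host-side sketch of that pattern is below; the names `bf16_t`, `bf16_to_float`, and `affine_rows` are hypothetical illustrations, not identifiers from the repository, and the kernel's actual per-thread tiling is not reproduced here.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical bf16 storage type: the upper 16 bits of an IEEE-754 float.
using bf16_t = std::uint16_t;

// Widen a bf16 value to float by shifting it back into the high half word.
inline float bf16_to_float(bf16_t x)
{
    std::uint32_t bits = static_cast<std::uint32_t>(x) << 16;
    float f;
    std::memcpy(&f, &bits, sizeof(f));
    return f;
}

// Apply out[i][j] = in[i][j] * gamma[j] + beta[j] for every row.
// gamma/beta are 1d (one value per column), so they are converted to
// float ONCE before the row loop instead of once per element -- the
// "avoid casting gamma & beta repeatedly" change from the commit notes.
void affine_rows(const std::vector<std::vector<float>>& in,
                 const std::vector<bf16_t>& gamma,
                 const std::vector<bf16_t>& beta,
                 std::vector<std::vector<float>>& out)
{
    const std::size_t n = gamma.size();

    // Hoisted cast: done one time, outside the loop over rows.
    std::vector<float> g(n), b(n);
    for(std::size_t j = 0; j < n; ++j)
    {
        g[j] = bf16_to_float(gamma[j]);
        b[j] = bf16_to_float(beta[j]);
    }

    for(std::size_t i = 0; i < in.size(); ++i)
        for(std::size_t j = 0; j < n; ++j)
            out[i][j] = in[i][j] * g[j] + b[j];
}
```

In the GPU kernel the same idea applies per thread: each thread loads and converts its slice of gamma/beta into registers once, then reuses the converted values across the rows it processes.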


2 participants