Mixtral MoE improvements: transposed w2 to have reduction dim be innermost dim #128

Merged
5 commits merged into pytorch-labs:main from the mixtral_improvements branch on Mar 10, 2024

Conversation

yanboliang (Contributor)

No description provided.
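Since the PR carries no description, here is a rough sketch of the change the title describes: storing the expert down-projection w2 so that its contraction (reduction) dimension is the innermost, contiguous one, which lets the per-token expert gather plus matmul read w2 along contiguous memory. The shapes, names, and einsum labels below are assumed for illustration and are not taken from the actual diff in mixtral-moe/model.py.

```python
# Hypothetical before/after layout sketch; not the PR's code.
import torch

E, D, I = 8, 16, 32   # experts, model dim, intermediate dim (illustrative sizes)
T, A = 4, 2           # tokens, active experts per token

# w2 is the expert down-projection (intermediate -> dim); the contraction runs over I.
w2_before = torch.randn(E, I, D)                    # reduction dim I is an outer, strided dim
w2_after = w2_before.transpose(1, 2).contiguous()   # [E, D, I]: reduction dim I is innermost

h = torch.randn(T, A, I)             # gated hidden states per token / active expert
idx = torch.randint(0, E, (T, A))    # routed expert ids per token

out_before = torch.einsum('tai,taio->tao', h, w2_before[idx])  # contract over I with a stride
out_after = torch.einsum('tai,taoi->tao', h, w2_after[idx])    # contract over I along contiguous memory

torch.testing.assert_close(out_before, out_after)   # same result, different memory layout
```

For reference, the "after" layout matches the usual nn.Linear convention, where the weight is stored as [out_features, in_features] with the reduced in_features dim innermost.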

@facebook-github-bot added the CLA Signed label (managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed) on Mar 10, 2024
@yanboliang yanboliang requested a review from Chillee March 10, 2024 00:07
Review comment on mixtral-moe/model.py (outdated, resolved)
@yanboliang yanboliang merged commit 873723b into pytorch-labs:main Mar 10, 2024
1 check passed
@yanboliang yanboliang deleted the mixtral_improvements branch March 10, 2024 23:35
Labels
CLA Signed
Projects
None yet

Development
Successfully merging this pull request may close these issues: none yet.

3 participants