
llama v2 with sharded weights #82

Merged
awni merged 3 commits into main from llamav2 on Dec 13, 2023

Conversation

awni (Member) commented Dec 12, 2023

Enable support for llama v2 and larger llama models with sharded weights to address #56
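
For context, here is a minimal sketch of what merging model-parallel weight shards can look like (assuming standard llama `consolidated.NN.pth` shards; the merge axes and the `merge_shards` helper are illustrative assumptions, not the PR's actual code):

```python
# Hedged sketch: recombine llama model-parallel shards into single tensors.
import glob
import torch

def merge_shards(shard_dir):
    shard_paths = sorted(glob.glob(f"{shard_dir}/consolidated.*.pth"))
    shards = [torch.load(p, map_location="cpu") for p in shard_paths]
    merged = {}
    for name in shards[0]:
        tensors = [s[name] for s in shards]
        if tensors[0].ndim == 1 or "norm" in name:
            # Norms and other 1-D parameters are replicated across
            # shards, so any single copy suffices.
            merged[name] = tensors[0]
        else:
            # Assumed split axis; real code must pick dim 0 or 1 per
            # layer depending on column- vs row-parallel sharding.
            merged[name] = torch.cat(tensors, dim=0)
    return merged
```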

I am moving this example to be more like the mistral / mixtral example just for the sake of consistency. Still keeping the nicely commented generation loop since that's pretty useful IMO.
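
For reference, a hedged sketch of the kind of token-by-token generation loop the comment refers to; the `model` callable and its KV-cache interface here are assumptions for illustration, not the example's actual API:

```python
import numpy as np

def generate(model, prompt, max_tokens, temp=1.0, seed=0):
    # Hypothetical interface: model(tokens, cache) -> (logits, cache),
    # where logits scores the next token and cache holds per-layer
    # keys/values so each step only processes the newest token.
    rng = np.random.default_rng(seed)
    tokens = list(prompt)
    x, cache = prompt, None
    for _ in range(max_tokens):
        logits, cache = model(x, cache)
        if temp == 0:
            nxt = int(np.argmax(logits))           # greedy decoding
        else:
            scaled = logits / temp
            probs = np.exp(scaled - scaled.max())  # stable softmax
            probs /= probs.sum()
            nxt = int(rng.choice(len(probs), p=probs))
        tokens.append(nxt)
        x = [nxt]                                  # feed only the new token
    return tokens
```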

angeloskath (Member) left a comment

🚀

awni merged commit a614e95 into main on Dec 13, 2023
awni deleted the llamav2 branch on Dec 14, 2023
Blaizzy pushed a commit to Blaizzy/mlx-examples that referenced this pull request on Mar 13, 2024