Is there any plan to support MoE models like Mixtral8×7B? #548

Closed
arronKler opened this issue Dec 18, 2023 · 5 comments

Comments

@arronKler

Like the title says: is there any plan (or the ability) to support MoE models like Mixtral 8×7B?

@mryab
Member

mryab commented Dec 19, 2023

Hi! We definitely have the ability to support Mixtral and other MoE models (Hivemind, the library for decentralized DL used by Petals, was initially designed for mixtures-of-experts), but the team currently does not have enough bandwidth to implement them in Petals right away. I might have some time over the holidays to work on it, but if you (or someone else from the community) are willing to contribute that, it will probably happen much faster.

@fakerybakery

+1

@gaborkukucska

+1

@frburrue

Hello,

I'm trying to implement Mixtral 8x7B following this guide: https://github.com/bigscience-workshop/petals/wiki/Run-a-custom-model-with-Petals

I have some questions about implementing the block.py and model.py files. Could you give me some guidance?

I would be very interested in contributing to the project.

Thank you.

@artek0chumak
Collaborator

Hello!

We added and merged support for Mixtral models in #553.

Just update your servers to the new version of Petals.
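
For reference, here is a minimal client-side sketch of what querying a Mixtral model through Petals could look like once servers hosting such a checkpoint are available in the swarm. This is not from the thread: the specific model id is an illustrative assumption, and it presumes the chosen checkpoint is actually being served.

```python
# Minimal sketch (assumption: a Petals swarm is serving this Mixtral checkpoint).
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

# Hypothetical choice of checkpoint, used here only for illustration.
model_name = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_name)
# Embeddings and the LM head run locally; transformer blocks run on remote Petals servers.
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("A mixture-of-experts model is", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0]))
```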
