
Mixture of Experts #31

Closed
StellaAthena opened this issue Sep 8, 2020 · 4 comments

@StellaAthena
Member

Is the Mixture of Experts model still a work-in-progress as indicated in the README? Which MoE model is being implemented? Is there a reason we are not using @lucidrains's package https://github.com/lucidrains/mixture-of-experts?

Tagging @Mistobaan, who asked about this in Discord.
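For reference, an MoE layer replaces a single dense feed-forward block with several expert feed-forward networks plus a learned gate that routes each token to one (or a few) of them. Below is a minimal, illustrative top-1-routing sketch in plain PyTorch; it is neither this repo's Mesh-TensorFlow implementation nor the internals of the lucidrains package, and the names (`TinyMoE`, `num_experts`, `hidden_dim`) are made up for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMoE(nn.Module):
    """Illustrative top-1 routed mixture-of-experts feed-forward layer (sketch only)."""
    def __init__(self, dim, num_experts=4, hidden_dim=None):
        super().__init__()
        hidden_dim = hidden_dim or 4 * dim
        self.gate = nn.Linear(dim, num_experts)  # learned router over experts
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, hidden_dim), nn.GELU(), nn.Linear(hidden_dim, dim))
            for _ in range(num_experts)
        ])

    def forward(self, x):
        # x: (batch, seq, dim) -> flatten tokens so each token is routed independently
        tokens = x.reshape(-1, x.shape[-1])
        gate_probs = F.softmax(self.gate(tokens), dim=-1)   # (num_tokens, num_experts)
        weight, expert_idx = gate_probs.max(dim=-1)         # top-1 routing decision
        out = torch.zeros_like(tokens)
        for i, expert in enumerate(self.experts):
            mask = expert_idx == i
            if mask.any():
                # scale each expert's output by its gate probability
                out[mask] = weight[mask, None] * expert(tokens[mask])
        return out.reshape(x.shape)

# usage: output has the same shape as the input
moe = TinyMoE(dim=64, num_experts=4)
y = moe(torch.randn(2, 16, 64))
```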

@StellaAthena StellaAthena added the documentation Improvements or additions to documentation. label Sep 8, 2020
@lucidrains
Collaborator

@StellaAthena ohh, that package is for PyTorch

@lucidrains
Collaborator

@StellaAthena it's already done; we just need to test it in a run

@StellaAthena
Member Author

Thanks! Tomorrow I’ll update the documentation, add testing MoE to the Kanban, and close this issue.

@StellaAthena StellaAthena self-assigned this Sep 8, 2020
@StellaAthena
Member Author

Documentation has been updated. Moved the question of testing the code to #37.
