
How to use the m2m_100_12B model? #15

Closed
im-yangp opened this issue Feb 24, 2021 · 4 comments

Comments

@im-yangp

https://github.com/pytorch/fairseq/tree/master/examples/m2m_100

@nreimers
Member

The 12B model is currently not supported, as running it is quite complex (you need to split the model across multiple GPUs).
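The point about splitting the model across multiple GPUs can be illustrated with a minimal PyTorch sketch of manual model parallelism. This is a toy module, not the actual M2M-100 12B architecture; the layer sizes, the two-way split, and the `SplitModel` name are all illustrative assumptions:

```python
# Hypothetical sketch of manual model parallelism: place half of a
# (toy) layer stack on each of two devices so that no single device
# has to hold all of the weights. This is NOT M2M-100 itself.
import torch
import torch.nn as nn

class SplitModel(nn.Module):
    def __init__(self, dim=16, n_layers=4):
        super().__init__()
        # Use two GPUs if available; otherwise fall back to CPU for both halves.
        if torch.cuda.device_count() >= 2:
            self.dev0, self.dev1 = "cuda:0", "cuda:1"
        else:
            self.dev0 = self.dev1 = "cpu"
        half = n_layers // 2
        # First half of the layers lives on device 0 ...
        self.part0 = nn.Sequential(
            *[nn.Linear(dim, dim) for _ in range(half)]
        ).to(self.dev0)
        # ... and the second half on device 1.
        self.part1 = nn.Sequential(
            *[nn.Linear(dim, dim) for _ in range(n_layers - half)]
        ).to(self.dev1)

    def forward(self, x):
        x = self.part0(x.to(self.dev0))
        # Activations cross between devices at the split point,
        # which is part of what makes running such models complex.
        return self.part1(x.to(self.dev1))

model = SplitModel()
out = model(torch.randn(2, 16))
print(out.shape)  # torch.Size([2, 16])
```

For a 12B-parameter model the same idea applies at much larger scale (many more layers, careful memory accounting, and cross-device transfer costs), which is why it is not handled automatically here.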

@im-yangp im-yangp closed this as completed Mar 5, 2021
@im-yangp im-yangp reopened this Mar 5, 2021
@im-yangp
Author

im-yangp commented Mar 5, 2021

Do you know what configuration is required to run this model?

@nreimers
Member

nreimers commented Mar 5, 2021

Sadly not.

@im-yangp
Author

im-yangp commented Mar 5, 2021

Thank you.

@im-yangp im-yangp closed this as completed Mar 5, 2021