
Support for Inf2 optimum class [WIP] #1364

Merged: 6 commits into EleutherAI:main on Feb 5, 2024

Conversation

@michaelfeil (Contributor) commented on Jan 28, 2024

Adding support for Inferentia (Inf2).
Closes: #1343
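
For context, here is a minimal, hedged sketch of what running a causal LM on Inferentia 2 via optimum-neuron looks like; the model id, prompt, and compile settings below are illustrative assumptions, not values from this PR.

```python
# Illustrative sketch (not the PR's code): compile and run a causal LM on
# AWS Inferentia 2 with optimum-neuron, which this kind of wrapper builds on.
from optimum.neuron import NeuronModelForCausalLM
from transformers import AutoTokenizer

model_id = "gpt2"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)

# export=True compiles the checkpoint for the Neuron cores; the static
# shapes and core count here are example values.
model = NeuronModelForCausalLM.from_pretrained(
    model_id,
    export=True,
    batch_size=1,
    sequence_length=512,
    num_cores=2,
    auto_cast_type="fp16",
)

inputs = tokenizer("Hello, Inferentia!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```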

@CLAassistant commented on Jan 28, 2024

CLA assistant check: all committers have signed the CLA.

@michaelfeil michaelfeil marked this pull request as draft January 28, 2024 06:37
@michaelfeil michaelfeil marked this pull request as ready for review January 29, 2024 18:05
@michaelfeil (Contributor, Author) commented on Jan 31, 2024

Could you give this a first round of review, @haileyschoelkopf?

@haileyschoelkopf (Collaborator) left a review comment

Thanks for creating this! The big thing I see with the current implementation is that we can cut down on duplicated code quite a bit here.

Review threads on lm_eval/models/neuron_optimum.py (resolved; one outdated).
@haileyschoelkopf added the "feature request" label (a feature that isn't implemented yet) on Jan 31, 2024
@michaelfeil (Contributor, Author) commented
Are there any additional comments, or can this be merged?

@haileyschoelkopf (Collaborator) commented
Didn't realize this was done! Yes, as long as the unit tests pass, this can be merged.

@michaelfeil (Contributor, Author) commented on Feb 2, 2024

Sounds good; the unit tests have been passing.

@haileyschoelkopf (Collaborator) commented
One last thing: could you add this model to the table of LM types and their names in the README, and also mention that it requires the optimum extra in the extras table?

@michaelfeil (Contributor, Author) commented
Okay @haileyschoelkopf, done; I've also improved the dependency message shown on model loading.
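
A hedged sketch of that kind of dependency hint follows; the exact wording and extra name in the PR may differ.

```python
# Illustrative import guard with an actionable install hint; not the PR's
# exact code, and the extra name is an assumption.
try:
    from optimum.neuron import NeuronModelForCausalLM  # noqa: F401
except ImportError as err:
    raise ImportError(
        "Running on AWS Inferentia requires the optimum-neuron dependencies. "
        "Install them with: pip install 'lm_eval[neuronx]'"
    ) from err
```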

@haileyschoelkopf (Collaborator) commented
Thank you, LGTM!

(It's still missing an entry in our table of LM integrations, https://github.com/EleutherAI/lm-evaluation-harness?tab=readme-ov-file#model-apis-and-inference-servers, but I can add that in a later commit if desired, since I can't edit your PR.)

@michaelfeil (Contributor, Author) commented on Feb 5, 2024

Okay, I updated the model name to neuronx in the table and renamed the extras and the hf_model entry to neuronx. With this, from my side, it is ready to be merged.
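
For illustration, a usage sketch under these final names; the checkpoint and task below are placeholders, not values from the PR.

```python
# Hedged sketch: evaluating through the harness's Python API with the new
# "neuronx" model type. Checkpoint and task are placeholders.
import lm_eval

results = lm_eval.simple_evaluate(
    model="neuronx",
    model_args="pretrained=gpt2",
    tasks=["lambada_openai"],
)
print(results["results"])
```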

@haileyschoelkopf (Collaborator) left a review comment

Thanks very much for the responsiveness and the contribution!

@haileyschoelkopf haileyschoelkopf merged commit d17dcea into EleutherAI:main Feb 5, 2024
2 checks passed
wx-zhang pushed a commit to wx-zhang/lm-evaluation-harness that referenced this pull request Mar 13, 2024
* initial commit

* remove overwrite bs

* adding neuronx dependencies

* Update README.md

* update neuronx
djstrong pushed a commit to speakleash/lm-evaluation-harness that referenced this pull request Aug 2, 2024
(same commit list as above)
Labels: feature request (A feature that isn't implemented yet.)
Closes: Contributing AWS Inferentia Code
3 participants