
HPU requirements config file #3242

Closed
RafLit wants to merge 3 commits into from

Conversation

@RafLit (Contributor) commented Jul 15, 2024

Description

In #3230 vllm was added to requirements/common.txt.
Installing vllm automatically pulls in an exact pinned version of pytorch, which overwrites our version.
We propose adding a parameter that installs dependencies from a separate configuration file for HPU.
Alternatively, moving vllm out of the common requirements file should also be considered.
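
A minimal way to observe the overwrite described above (a hypothetical sketch; the version string below is a placeholder, not the actual pin):

```python
# Hypothetical check: after `pip install vllm`, see whether the torch build
# that was installed beforehand has been replaced by vllm's pinned version.
from importlib.metadata import version

expected = "2.2.0"  # placeholder for the HPU-compatible torch version installed earlier
installed = version("torch")
if installed != expected:
    print(f"torch was overwritten: expected {expected}, found {installed}")
```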

Type of change

  • requirements/hpu.txt with Gaudi dependencies was added.

  • --hpu flag was added to ts_scripts/install_dependencies.py instead of --skip_torch_install, since torch was eventually installed by vllm anyway (a sketch of the flag handling follows this list).

  • Bug fix (non-breaking change which fixes an issue)
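
A minimal sketch of how the --hpu flag could select the requirements file (assuming install_dependencies.py uses argparse and installs via a pip subprocess; the structure below is simplified and illustrative, not the actual patch):

```python
# Sketch of the proposed --hpu flag handling (illustrative, not the actual diff).
import argparse
import subprocess
import sys

parser = argparse.ArgumentParser(description="Install TorchServe dependencies")
parser.add_argument(
    "--hpu",
    action="store_true",
    help="install Gaudi (HPU) dependencies from requirements/hpu.txt "
    "instead of the common requirements file",
)
args = parser.parse_args()

# Select the requirements file by target hardware so the HPU-compatible
# torch build is not overwritten by the common dependencies (e.g. vllm's pin).
requirements_file = "requirements/hpu.txt" if args.hpu else "requirements/common.txt"
subprocess.run(
    [sys.executable, "-m", "pip", "install", "-r", requirements_file],
    check=True,
)
```

With this in place, HPU users would run something like `python ts_scripts/install_dependencies.py --hpu`.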

Checklist:

  • Did you have fun?
  • Have you added tests that prove your fix is effective or that this feature works?
  • Has code been commented, particularly in hard-to-understand areas?
  • Have you made corresponding changes to the documentation?

@RafLit (Contributor, Author) commented Jul 15, 2024

@agunapal could you please review this?

@agunapal (Collaborator) commented

> @agunapal could you please review this?

Hi @RafLit, we'll have to discuss internally where to place vllm. We'll have a similar challenge with TensorRT (trt-llm).

But I like the idea of a separate requirements file, at least for the different GPU hardware targets. The question that would still remain is: who will update this file? We get alerts when new versions of these libraries are released. Would you be updating them?

@agunapal (Collaborator) commented

Hi @RafLit, closing this PR since we removed the dependency on vllm. Please check and confirm.

@RafLit (Contributor, Author) commented Jul 29, 2024

Hi @agunapal, we might be interested in the future in having a separate dependency file for our hardware, and we could keep it updated. For now this can remain closed. Thank you!

RafLit closed this on Jul 29, 2024