Let's open an issue over there. I think BentoML's default resources and strategies will need some more internal discussion from the team, as will how we could support AMD GPUs on K8s.
Feature request
It would be nice to have the option to use AMD GPUs that support ROCm .
PyTorch supports AMD GPUs via ROCm on Linux. The following was tested on Ubuntu 22.04.2 LTS with an AMD Ryzen 5825U (Radeon Vega Barcelo, 8-core, shared memory), ROCm 5.5.0, and PyTorch for ROCm 5.4.2 installed.
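As a quick sanity check (a minimal sketch, not from the original report): on a ROCm build of PyTorch, the CUDA API is transparently mapped to HIP, so the usual `torch.cuda` calls report the AMD GPU, and `torch.version.hip` is non-`None`.

```python
import torch

# torch.version.hip is a version string on ROCm builds and None on
# CUDA/CPU-only builds, so it distinguishes the two backends.
print("HIP version:", torch.version.hip)

# On ROCm, torch.cuda.is_available() returns True when an AMD GPU
# is usable, because the CUDA API is routed through HIP.
print("GPU available:", torch.cuda.is_available())

if torch.cuda.is_available():
    # Reports the AMD device name, e.g. a Radeon/Vega part.
    print("Device:", torch.cuda.get_device_name(0))
```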
A cursory look suggests that BentoML currently ships only CPU and NVIDIA GPU resource implementations:
`BentoML/src/bentoml/_internal/resource.py`, line 217 at commit c346890
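For illustration only, here is one way an AMD-aware resource could detect available GPUs without NVIDIA tooling: count DRM devices whose PCI vendor ID is AMD's (`0x1002`) under sysfs. This is a hypothetical helper sketch, not part of BentoML's `resource.py`, and a real implementation would more likely shell out to `rocm-smi` or use ROCm's SMI library.

```python
import glob

AMD_VENDOR_ID = "0x1002"  # PCI vendor ID for AMD/ATI

def count_amd_gpus() -> int:
    """Count AMD GPUs by scanning DRM card devices in sysfs.

    Hypothetical sketch: returns 0 on machines with no AMD GPU
    (or on non-Linux systems, where the glob matches nothing).
    """
    count = 0
    for vendor_path in glob.glob("/sys/class/drm/card*/device/vendor"):
        try:
            with open(vendor_path) as f:
                if f.read().strip() == AMD_VENDOR_ID:
                    count += 1
        except OSError:
            # Device node disappeared or is unreadable; skip it.
            continue
    return count

print("AMD GPUs detected:", count_amd_gpus())
```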
Motivation
This feature would enable running models with OpenLLM on AMD GPUs with ROCm support.