Use of multiple GPUs for inference with the XTTS model #3658
Unanswered
CRochaVox
asked this question in General Q&A
Replies: 1 comment
-
Complementing the previous question.
-
Hello everyone. I'm currently running the XTTS model on a single GPU. I'm looking to scale up inference with the model and would like to know whether multi-GPU inference is supported.
Could anyone tell me if it's possible?
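In the meantime, one common workaround (not an official XTTS feature) is plain data parallelism: load an independent copy of the model on each GPU and distribute incoming requests across the replicas round-robin. The sketch below illustrates the routing logic only; `load_model` and `synthesize` are hypothetical placeholders standing in for the real model-loading and inference calls (e.g. Coqui TTS's `TTS(...).to(device)` and `tts_to_file(...)`).

```python
# Sketch of data-parallel inference: one model replica per GPU,
# requests routed round-robin across replicas. The model calls are
# placeholders; swap in the real XTTS loading/inference code.
from concurrent.futures import ThreadPoolExecutor
from itertools import cycle

def load_model(device: str):
    # Placeholder: in practice, something like
    #   TTS("tts_models/multilingual/multi-dataset/xtts_v2").to(device)
    return {"device": device}

def synthesize(model, text: str) -> str:
    # Placeholder for the actual inference call on that replica's GPU.
    return f"{model['device']}:{text}"

devices = ["cuda:0", "cuda:1"]          # one replica per visible GPU
replicas = [load_model(d) for d in devices]
assign = cycle(replicas)                # round-robin request routing

texts = ["hello", "world", "foo", "bar"]
with ThreadPoolExecutor(max_workers=len(replicas)) as pool:
    results = list(pool.map(lambda args: synthesize(*args),
                            zip(assign, texts)))
# Each request lands on alternating devices: cuda:0, cuda:1, cuda:0, ...
```

Because each replica is a full, independent copy of the model, this scales throughput (requests per second) but not the latency of a single request; splitting one inference across GPUs would require model-parallel support inside the library itself.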