I wanted to share a quick demonstration of a home use case for this that I made - though it is a bit silly: https://youtu.be/Q6PgKZfQeK8
I am running that on a 3090 Ti and Ubuntu 22.04. As a side question, would there be a way to get this to utilize my second 3090 Ti as well? Great job on this project and thanks for sharing it openly!
Currently we don't support running inference for a single case on multiple GPUs, but you can set CUDA_VISIBLE_DEVICES to run several different inferences simultaneously. Try:
CUDA_VISIBLE_DEVICES=1 python -m scripts.inference .....
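For example, assuming a bash shell (and with the actual inference arguments filled in where the dots are), two cases could be pinned to separate GPUs and run in parallel, roughly like this:
# Sketch: one inference process per GPU, launched concurrently
CUDA_VISIBLE_DEVICES=0 python -m scripts.inference ... &
CUDA_VISIBLE_DEVICES=1 python -m scripts.inference ... &
wait  # block until both runs finish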
Thanks very much for the tip! That would be awesome to utilize both cards for different generations!