I am trying to run `bash ./launch/infer_vos.sh ytvos`, but I keep getting "GPU out of memory" errors. I tried reducing `batch_size` to 8, 4, 2, and 1, but I still hit the error. I have an NVIDIA K2000 with only 4 GB of GPU memory. Any suggestions/advice on how to get around the issue? Thanks.
Hi, this implementation requires at least 12 GB of GPU memory. The memory bottleneck is the context frames: depending on the test set, up to 20 previous predictions may accumulate in the context, and all of them have to stay in GPU memory. You could move some of the computation to the CPU, but that would dramatically slow down inference (and it's already quite slow on GPU).

Nikita
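If the repository's inference code keeps past predictions in a growing list, one possible (lossy) workaround is to cap how many context frames are retained at once, trading some accuracy for a bounded memory footprint. A minimal sketch of the idea, assuming hypothetical names (`MAX_CONTEXT`, `add_context_frame` are illustrative, not from the actual codebase):

```python
from collections import deque

# Hypothetical cap on retained context frames; the real repo may store
# context differently. Lowering this bounds memory held by past predictions.
MAX_CONTEXT = 5  # instead of letting up to 20 predictions accumulate

# A deque with maxlen automatically evicts the oldest entry on append.
context = deque(maxlen=MAX_CONTEXT)

def add_context_frame(frame):
    # Appending beyond maxlen drops the oldest frame, so memory usage
    # stays proportional to MAX_CONTEXT rather than sequence length.
    context.append(frame)

# Simulate 20 accumulated predictions; only the last MAX_CONTEXT survive.
for t in range(20):
    add_context_frame(f"pred_{t}")

print(list(context))
```

Whether this is acceptable depends on how much the model relies on long-range context; dropping older frames will likely degrade results on long videos.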