Hi, your model is great work and very valuable for my projects.
I am running long-video inference with 64 input frames. I followed the multi-GPU inference steps in the README and kept the parameters unchanged (input_size=448, num_segments=64, max_num=1).
I have 8× H800 80 GB GPUs, but a "CUDA out of memory" error still occurs.
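For context, here is a back-of-the-envelope estimate of the visual sequence length these settings produce. The 256-tokens-per-tile figure is my assumption (typical for InternVL-style 448×448 tiles with pixel shuffle), not something I verified in this repo:

```python
# Rough estimate of the visual-token load for the settings above.
# ASSUMPTION: ~256 visual tokens per 448x448 tile (InternVL-style
# pixel-shuffle encoder); the exact number may differ in this model.
TOKENS_PER_TILE = 256  # assumed, not confirmed from the repo

def visual_tokens(num_segments: int, max_num: int,
                  tokens_per_tile: int = TOKENS_PER_TILE) -> int:
    """Total visual tokens fed to the LLM:
    frames (num_segments) x tiles per frame (max_num) x tokens per tile."""
    return num_segments * max_num * tokens_per_tile

# My settings: 64 frames, 1 tile each.
print(visual_tokens(num_segments=64, max_num=1))  # 16384 visual tokens before any text
```

So even with max_num=1, the prompt already carries on the order of 16k visual tokens, which may explain why activations/KV cache blow past 80 GB per device depending on how the model is sharded.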
Could you give me some suggestions for resolving this issue?
Thanks a lot!!!