Cuda out of memory #1

@zeddo123

Description

Hi,

I tried running the model on my Quadro M1200 (4 GB), and I've run into a memory issue. Does the model need a lot of VRAM, or can I somehow tweak it?

Here's the output after running the main script:

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 34.00 MiB (GPU 0; 3.95 GiB total capacity; 3.29 GiB already allocated; 11.38 MiB free; 3.37 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation.  See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

I've tried changing max_split_size_mb, but it didn't help. Thank you in advance for any feedback you might give ;)
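For reference, one way to apply the allocator setting the error message suggests is through the PYTORCH_CUDA_ALLOC_CONF environment variable, set before PyTorch initializes CUDA. This is a minimal sketch; the value 128 is an assumption to illustrate the format, and the right number (if any helps) depends on the workload:

```python
import os

# Must be set before the first CUDA allocation (ideally before importing
# torch). "max_split_size_mb" caps the size of cached blocks the allocator
# will split, which can reduce fragmentation on small-VRAM GPUs.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

# Equivalent from a shell, before launching the script:
#   export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128

print(os.environ["PYTORCH_CUDA_ALLOC_CONF"])
```

Note that this only mitigates fragmentation; if the model's weights and activations genuinely exceed ~4 GB, the usual levers are a smaller batch size, half precision, or wrapping inference in torch.no_grad().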
