I need help training a Flux LoRA on multiple GPUs. A single GPU does not have enough memory, so I want to split training across multiple GPUs. However, setting `device: cuda:0,1` in the config file doesn't seem to work.
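For context, here is roughly the change I made. This is a minimal sketch: only the `device` line is the actual setting in question, and the surrounding keys are placeholders standing in for my real training config.

```yaml
# Minimal sketch of the attempted change; everything except the
# `device` line is a placeholder for the rest of my training config.
config:
  process:
    - type: trainer        # placeholder process entry
      device: cuda:0,1     # a single value like "cuda:0" works, but this
                           # comma-separated form is not picked up
```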
Could you please provide guidance on how to properly set up and run Flux LoRA training across multiple GPUs? The single-GPU memory limitation is preventing me from training effectively.
Any assistance or example multi-GPU configurations for Flux LoRA would be greatly appreciated. Thank you!