Replies: 3 comments
-
Nope, you need a GPU unfortunately.
-
There is a possibility to check whether MPS (Apple Metal, i.e. M1 GPU support) is available via torch.backends.mps.is_available() and to convert .cuda() calls into .to(torch.device('mps')), however... Generally speaking it should be doable (but I am not a Python dev or an experienced ML dev at all, so it would take me too much time) by replacing the CUDA-specific calls with their MPS equivalents. Unfortunately the MPS backend is not always as friendly as CUDA; with CUDA, PyTorch's simple .cuda() calls just work. You also need to make sure to update numba==0.55.1 to 0.55.2 in requirements.txt, as Apple M1 is only supported from 0.55.2 (without that the compile fails, see numba/numba#7951). If someone experienced could pick this up: M1 machines/GPUs are very strong and can yield better results than a Tesla T4, etc.
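For reference, a minimal sketch of the device check and the .cuda() replacement described above. It assumes PyTorch >= 1.12 (where torch.backends.mps.is_available() exists); the Linear layer is only a placeholder for illustration, not part of any TTS model:
```python
import torch

# Pick the Apple Metal (MPS) device when available, otherwise fall back to CPU.
# Assumes PyTorch >= 1.12, which ships the MPS backend.
device = torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")

# Instead of model.cuda() / tensor.cuda(), move everything with .to(device).
# The Linear layer below is just a placeholder model for demonstration.
model = torch.nn.Linear(80, 80).to(device)
batch = torch.randn(16, 80).to(device)

out = model(batch)
print(out.device)  # 'mps:0' on an M1 machine, 'cpu' otherwise
```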
-
Hello!
I would like to train a glow_tts model in Hungarian on a MacBook Pro M1. A CUDA GPU is not available, so training is very slow.
Is there any solution to make the training faster?