CUFFT_INTERNAL_ERROR on RTX4090 #96
Comments
But it turns out to run successfully, with some warnings, on torch 2.0.1...

Should work with CUDA 12.2, but not CUDA 12.3.

Here is my method (Ubuntu 22.04):

You may need to modify your code because of warnings or errors.

Same as #80, for visibility.
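The version constraints scattered across the issue and comments can be collected into one check. The sketch below is a hypothetical helper (not part of MeloTTS) that encodes the reported compatibility rules: the code needs torch >= 1.13, the cuFFT bug is fixed in CUDA 11.8, and commenters report CUDA 12.3 failing while 12.2 works.

```python
# Hypothetical helper summarizing the torch/CUDA constraints reported in
# this issue for running MeloTTS training on an RTX 4090 (sm_89).

def parse_version(v):
    """Turn a dotted version string like '11.8' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def rtx4090_ok(torch_version, cuda_version):
    """Return True if the combination is expected to work, per this issue."""
    torch_v = parse_version(torch_version)
    cuda_v = parse_version(cuda_version)
    if torch_v < (1, 13):   # the MeloTTS code requires torch >= 1.13.0
        return False
    if cuda_v < (11, 8):    # the cuFFT bug is only fixed in CUDA 11.8
        return False
    if cuda_v >= (12, 3):   # commenters report CUDA 12.3 failing again
        return False
    return True

print(rtx4090_ok("1.13.1", "11.7"))  # False: 1.13.x wheels use CUDA 11.6/11.7
print(rtx4090_ok("2.0.1", "11.8"))   # True: the combination reported working
print(rtx4090_ok("2.0.1", "12.3"))   # False: CUDA 12.3 reported failing
```

This only captures what was reported in the thread; it does not query the installed environment.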
requirements.txt in MeloTTS pins torch<2.0,
but the code requires torch 1.13.0 or higher, so my only choices are torch 1.13.0/1.13.1.
All torch 1.13.0/1.13.1 packages are built against CUDA 11.6/11.7.
Now that I have an RTX 4090, I can't train MeloTTS on it with torch 1.13.1 because of a CUDA bug that is only fixed in CUDA 11.8:
pytorch/pytorch#88038
So I hope the developers of MeloTTS could raise the torch requirement to 2.0 or higher.
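One way to get a torch >= 2.0 build linked against CUDA 11.8 (the combination a commenter reports working) is PyTorch's official cu118 wheel index. The exact pins below are an illustrative sketch, not a MeloTTS-endorsed command; adjust them to the project's requirements.

```shell
# Install torch 2.0.1 built against CUDA 11.8, which supports sm_89 (RTX 4090).
# Package set and version pins are illustrative.
pip install torch==2.0.1 torchaudio==2.0.2 --index-url https://download.pytorch.org/whl/cu118
```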