
CUDA fixes #484

Merged
merged 5 commits into main from cuda-fixes
Jan 8, 2023
Conversation

NullSenseStudio
Collaborator

Automatically converts dependency-specific exceptions, such as torch.cuda.OutOfMemoryError, to RuntimeError so the exception info isn't lost on the frontend, where unpickling them would otherwise raise an error like ModuleNotFoundError: No module named 'torch'.

Also forces full precision for GTX 16xx cards (untested, as I don't have one).
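The exception-conversion idea above can be sketched as follows. This is a minimal illustration, not the PR's actual implementation: the function name `sanitize_exception` is hypothetical, and the check simply wraps any exception whose class lives outside `builtins`, since only builtin exceptions can be safely unpickled on a frontend that may not have the dependency (e.g. torch) installed.

```python
def sanitize_exception(exc: BaseException) -> BaseException:
    """Return an exception that is safe to pickle across processes.

    Exceptions defined outside builtins (e.g. torch.cuda.OutOfMemoryError)
    cannot be unpickled where that module is missing, so wrap them in a
    RuntimeError that preserves the original type name and message.
    """
    if type(exc).__module__ != "builtins":
        return RuntimeError(f"{type(exc).__module__}.{type(exc).__name__}: {exc}")
    return exc
```

A builtin exception like `ValueError` passes through unchanged, while a dependency exception is replaced by a `RuntimeError` carrying the original type and message, so the frontend can still display useful error info.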

@NullSenseStudio NullSenseStudio added the bug Something isn't working label Dec 27, 2022
@carson-katri carson-katri merged commit 034540b into main Jan 8, 2023
@NullSenseStudio NullSenseStudio deleted the cuda-fixes branch January 8, 2023 19:32
@carson-katri carson-katri added this to the v0.0.10 milestone Jan 15, 2023