[question] how to train on GPU with the Hausdorff Losses? #1

Is it possible to train on GPU with `HausdorffDTLoss` or `HausdorffERLoss`? How did you train for https://arxiv.org/pdf/1904.10030.pdf?

Comments
The loss itself is always on the CPU in my implementation, since I use numpy. But training a net on the GPU is no problem, since the output is only moved onto the CPU once the loss is being calculated. I guess if you really want to calculate the loss on the GPU, you could look into it.

I haven't run any of the experiments in the paper; I've only done an implementation based on the provided description of the methods. I'm not an author of the paper, so it's not an official implementation. AFAIK there's no official implementation anywhere.
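For what it's worth, the CPU/GPU split described above can be sketched in a few lines. This is a rough, unofficial sketch following the distance-transform formulation of the loss, not the repo's actual code; `distance_field` and `hausdorff_dt_loss` are illustrative names. Only the non-differentiable distance transform makes the round trip through numpy/scipy on the CPU; the squared-error term stays on the GPU, so gradients still flow through the network output:

```python
import numpy as np
import torch
from scipy.ndimage import distance_transform_edt


def distance_field(mask: np.ndarray) -> np.ndarray:
    """CPU-side unsigned distance transform of a batch of binary masks."""
    field = np.zeros_like(mask, dtype=np.float32)
    for b in range(mask.shape[0]):
        fg = mask[b] > 0.5
        if fg.any() and not fg.all():  # skip degenerate masks with no boundary
            field[b] = distance_transform_edt(fg) + distance_transform_edt(~fg)
    return field


def hausdorff_dt_loss(pred: torch.Tensor, target: torch.Tensor,
                      alpha: float = 2.0) -> torch.Tensor:
    """Distance-transform Hausdorff loss; pred/target are (B, 1, ...) in [0, 1]."""
    # Only the non-differentiable distance transforms leave the GPU ...
    pred_dt = torch.from_numpy(distance_field(pred.detach().cpu().numpy())).to(pred.device)
    target_dt = torch.from_numpy(distance_field(target.detach().cpu().numpy())).to(pred.device)
    # ... while the error term stays on the device, so gradients flow through `pred`.
    return ((pred - target) ** 2 * (pred_dt ** alpha + target_dt ** alpha)).mean()
```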
I can recommend you some tricks for working on the GPU in either TF or Torch, @neuronflow, if you are still interested.
Hello, I'm very interested in implementing HausdorffDTLoss for training 3D medical image segmentation. Could you please share some of those tricks with me?
@umutdundar99, a GPU implementation of the Hausdorff loss would definitely be interesting! Apparently not only for me :)
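One trick in that spirit (an assumption about what was meant, not the commenter's actual recipe): for the erosion-based variant, scipy's binary erosion can be approximated by a fixed convolution plus a shifted ReLU, so every step of the loop stays on the GPU in pure torch. A minimal 3D sketch, assuming `(B, 1, D, H, W)` tensors and a cross-shaped structuring element; `hausdorff_er_loss_gpu` is a made-up name:

```python
import torch
import torch.nn.functional as F


def cross_kernel_3d(device: torch.device) -> torch.Tensor:
    """Normalized 3x3x3 cross-shaped structuring element as a conv weight."""
    k = torch.zeros(1, 1, 3, 3, 3, device=device)
    k[0, 0, 1, 1, :] = 1.0
    k[0, 0, 1, :, 1] = 1.0
    k[0, 0, :, 1, 1] = 1.0
    return k / k.sum()


def hausdorff_er_loss_gpu(pred: torch.Tensor, target: torch.Tensor,
                          erosions: int = 10, alpha: float = 2.0) -> torch.Tensor:
    """Erosion-style Hausdorff loss where every step stays on pred.device."""
    kernel = cross_kernel_3d(pred.device)
    eroded = (pred - target) ** 2          # initial (differentiable) error map
    loss = torch.zeros((), device=pred.device)
    for k in range(erosions):
        # one convolution + shifted ReLU approximates one binary erosion step
        eroded = torch.relu(F.conv3d(eroded, kernel, padding=1) - 0.5)
        # renormalize so the map doesn't decay to zero across iterations
        eroded = eroded / eroded.amax().clamp_min(1e-8)
        # deeper erosions (larger k) correspond to larger segmentation errors
        loss = loss + eroded.mean() * (k + 1) ** alpha
    return loss / erosions
```

Whether the shifted-ReLU threshold exactly matches the binary erosion of the CPU implementation would need checking against it; treat this as a starting point rather than a drop-in replacement.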