Still there: Incompatibility with bfloat16 #203

Open
borisfom opened this issue Jan 25, 2024 · 4 comments

Comments

@borisfom

There was an issue filed previously about the lack of support for bfloat16, and it looks like it is still there.
I am getting the same error:
RuntimeError: "_" not implemented for 'BFloat16'
when running the code below:

from torch_cluster import radius, radius_graph
import torch
from torch import tensor
x = tensor([[  4.0625, -27.3750,  -4.3438],
            [  3.0312, -27.6250,  -3.4844],
            [  5.0312, -28.5000,  -4.2812],
            [ -8.1875, -17.7500,  -2.9062],
            [ -8.1875, -19.0000,  -2.2812],
            [ -8.0625, -20.2500,  -2.9688]], device='cuda:0', dtype=torch.bfloat16)

radius_inp = (
    x,
    5.0,
    tensor([0, 0, 0, 0, 0, 0], device='cuda:0'),
    10
)

radius_edges = radius_graph(*radius_inp)
@borisfom
Author

borisfom commented Jan 25, 2024

The same issue occurs with the radius() call.
This is with the package built from trunk, on an A6000 box.
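
For reference, a minimal sketch of a radius() call that hits the same error (assuming the same bfloat16 x tensor and radius as in the snippet above, with x used as both the source and the target point set):

from torch_cluster import radius

# x queried against itself; bfloat16 inputs on CUDA raise the same RuntimeError as radius_graph
assign_index = radius(x, x, 5.0, max_num_neighbors=10)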

@rusty1s
Owner

rusty1s commented Jan 28, 2024

Currently, bfloat16 support only exists on CPU :(
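
Until the GPU kernels support bfloat16, one possible interim workaround (a sketch, assuming it is acceptable to run the neighbor search in float32) is to cast the positions just for the call; the returned edge_index is integer-typed, so the output itself loses no precision:

import torch
from torch import tensor
from torch_cluster import radius_graph

# bfloat16 positions kept for the rest of the model
x = tensor([[ 4.0625, -27.3750, -4.3438],
            [ 3.0312, -27.6250, -3.4844],
            [ 5.0312, -28.5000, -4.2812]], device='cuda:0', dtype=torch.bfloat16)
batch = tensor([0, 0, 0], device='cuda:0')

# Cast to float32 only for the CUDA neighbor search
edge_index = radius_graph(x.float(), r=5.0, batch=batch, max_num_neighbors=10)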

@borisfom
Author

Any plans to include bfloat16 support on GPU soon?

@rusty1s
Owner

rusty1s commented Jan 31, 2024

Currently no, since this repo is no longer in active development.
