
problem about freedom.py #22

Closed · wenyuzzz opened this issue Nov 18, 2023 · 1 comment

@wenyuzzz

sim = torch.mm(context_norm, context_norm.transpose(1, 0))  # item-item similarity, shape (N, N)
_, knn_ind = torch.topk(sim, self.knn_k, dim=-1)            # indices of the k nearest neighbours per row
adj_size = sim.size()

When we create this graph on the GPU, it can produce a problem like this:

sim size: torch.Size([63001, 63001])
sim dtype: torch.float32
knn_ind: tensor([[7585306885142836339, 3991370661542720611, 3486688227326894121, ..., 140594981698912,
140594981698912, 4294967295],
[354038449242118, 1271310336001, 8387195064482211950, ..., 7957688336701796217,
8319958742476218724, 7795558767364238639],
[ 3304159867, 0, 1, ..., 140588979432688,
140594973553136, 140588979527472],
...,
[ 117, 117, 4473802540135219199, ..., 140594981698960,
140594981698960, 4294967295],
[ 0, 0, 0, ..., 0,
0, 0],
[ 0, 0, 0, ..., 0,
0, 0]], device='cuda:2')
knn_ind size: torch.Size([63001, 10])

But if we build this graph on the CPU, the bug disappears.
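The huge index values above (e.g. `7585306885142836339`, `4294967295`) look like uninitialized memory rather than valid row indices, which often happens when a very large intermediate tensor does not fit on the device. One hypothetical workaround, assuming the names from the snippet above (`context_norm` is the L2-normalized embedding matrix and `self.knn_k` the neighbour count), is to compute top-k in row chunks so the full N×N `sim` matrix is never materialized at once. This is a sketch, not the repository's actual code:

```python
import torch


def knn_indices_chunked(context_norm: torch.Tensor, k: int, chunk: int = 4096) -> torch.Tensor:
    """Row-chunked top-k over the similarity matrix.

    Only a (chunk, N) slice of the N x N similarity matrix exists
    in memory at any time, instead of the full 63001 x 63001 tensor.
    """
    parts = []
    n = context_norm.size(0)
    for start in range(0, n, chunk):
        block = context_norm[start:start + chunk]   # (chunk, d)
        sim_block = block @ context_norm.t()        # (chunk, N) similarities
        _, idx = torch.topk(sim_block, k, dim=-1)   # (chunk, k) neighbour indices
        parts.append(idx)
    return torch.cat(parts, dim=0)                  # (N, k)
```

On identical, well-formed input this should agree with the single-shot `torch.topk` call, while keeping peak memory proportional to `chunk * N` instead of `N * N`.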

@enoche (Owner) commented Dec 5, 2023

By default, the graph is created on the GPU and there is no problem on my side; you may want to check your prepared data:

[screenshot]
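Since the suggestion is to check the prepared data, a few hypothetical sanity checks on the embedding matrix before calling `torch.topk` could rule out bad input (the name `context_norm` is taken from the snippet in the issue; the thresholds are assumptions):

```python
import torch


def check_embeddings(context_norm: torch.Tensor) -> None:
    """Hypothetical pre-flight checks before building the kNN graph."""
    # Garbage indices can stem from bad dtype or non-finite values upstream.
    assert context_norm.dtype == torch.float32, f"unexpected dtype {context_norm.dtype}"
    assert torch.isfinite(context_norm).all(), "NaN/Inf found in embeddings"
    # Rows are expected to be L2-normalized if `sim` is meant to be cosine similarity.
    norms = context_norm.norm(dim=-1)
    assert torch.allclose(norms, torch.ones_like(norms), atol=1e-4), "rows not L2-normalized"
```

If these checks pass on the GPU tensor and `topk` still returns out-of-range indices, the problem is more likely on the device/memory side than in the data.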

enoche closed this as completed on Feb 21, 2024.