Complex dtype #959

Closed
daniellga opened this issue Jan 15, 2023 · 1 comment · Fixed by #1091
Labels
bug Something isn't working

Comments

@daniellga

Hi! I noticed in dtype.R that cdouble is the same as cfloat64:

# cfloat / cfloat32 both map to complex64 (two 32-bit components)
torch_cfloat <- function() torch_dtype$new(cpp_torch_cfloat())
torch_cfloat32 <- function() torch_dtype$new(cpp_torch_cfloat())
# cdouble / cfloat64 both map to complex128 (two 64-bit components)
torch_cdouble <- function() torch_dtype$new(cpp_torch_cdouble())
torch_cfloat64 <- function() torch_dtype$new(cpp_torch_cdouble())

In PyTorch, cdouble is an alias for torch.complex128 (two torch.float64 values). Are the names really supposed to differ from PyTorch? It confused me a little, because I couldn't tell whether cfloat64 meant two 32-bit components or 64 bits per component. I ended up checking PyTorch's documentation, which made it clear (second screenshot).
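A minimal sketch of how to see this from an R session (assuming dtype objects can be compared with ==, as is done for tensor$dtype checks):

library(torch)
# Both comparisons should print TRUE, i.e. torch_cfloat64() is actually complex128:
torch_cfloat32() == torch_cfloat()    # complex64, two 32-bit components
torch_cfloat64() == torch_cdouble()   # complex128, two 64-bit components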
Also, not sure whether it's intentional, but this torch package seems to be missing an equivalent of torch.complex32 (chalf).
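If chalf were added, a definition following the same pattern as dtype.R could look like the line below; this is purely illustrative, and the cpp_torch_chalf() binding is hypothetical (it would need to be added on the C++ side first):

# hypothetical: assumes a cpp_torch_chalf() C++ binding that does not exist yet
torch_chalf <- function() torch_dtype$new(cpp_torch_chalf())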

[Screenshots: the dtype.R definitions above and PyTorch's complex dtype documentation]

@skeydan
Collaborator

skeydan commented Jan 17, 2023

Hi, thanks for reporting this! You're right, that's actually a bug that should be fixed.

@skeydan added the bug label on Jan 17, 2023