add cupy.setxor1d
api
#6582
Conversation
We align to the latest supported NumPy version in the master branch. If NumPy errors, CuPy should error too unless there is a good reason that prevents this (device synchronization, checks using divergent threads, etc.). In such a case it should be clearly stated in the docstring.
Alright, thanks. Will make the required changes.
An algorithm using
Hey @asi1024! I wrote the function using
However, this seems to run a little slower than the existing code. (As of now I am comparing the time each implementation takes to run the tests; please let me know if there is a more reliable way to compare timings.)
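For context, NumPy's `setxor1d` is built from a sort plus a neighbor comparison on the concatenated inputs. A minimal NumPy sketch of that approach (an illustration of the general technique, not necessarily the code in this PR) looks like:

```python
import numpy as np

def setxor1d_sketch(ar1, ar2, assume_unique=False):
    """Set exclusive-or: values present in exactly one of the inputs."""
    if not assume_unique:
        ar1 = np.unique(ar1)  # also flattens and sorts
        ar2 = np.unique(ar2)
    aux = np.concatenate((ar1, ar2))
    if aux.size == 0:
        return aux
    aux.sort()
    # A value unique to one input occurs exactly once in aux, so it
    # differs from both neighbors; a shared value occurs twice.
    flag = np.concatenate(([True], aux[1:] != aux[:-1], [True]))
    return aux[flag[1:] & flag[:-1]]

print(setxor1d_sketch([1, 2, 3, 4], [2, 3, 5]))  # [1 4 5]
```

For timing, calling the two implementations directly under `timeit` (or, on GPU, a synchronizing benchmark helper) gives a more controlled comparison than test-suite wall time, which includes test overhead.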
@pri1311 Could you try
Aah yes, on it.
@pri1311 I think that merging cupy.concatenate and bitwise-and into one ElementwiseKernel will reduce additional memory allocation!
Yes, working on that.
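A hedged sketch of the fusion idea: rather than materializing a separate `flag` array and then slicing and AND-ing it (several intermediate allocations), the per-element condition can be computed directly from each element's neighbors in the sorted buffer, which is the kind of computation a single `cupy.ElementwiseKernel` could perform. Illustrated here in NumPy as a host-side analogue (not the PR's actual kernel):

```python
import numpy as np

def xor_mask_fused(aux):
    """aux: sorted concatenation of both (deduplicated) inputs.
    An element belongs to the result iff it differs from BOTH
    neighbors, i.e. it occurs exactly once in aux."""
    n = aux.size
    i = np.arange(n)
    # Each output element depends only on aux[i-1], aux[i], aux[i+1]
    # and the index i, so this maps to one elementwise GPU pass.
    left_differs = (i == 0) | (aux != np.roll(aux, 1))
    right_differs = (i == n - 1) | (aux != np.roll(aux, -1))
    return left_differs & right_differs

aux = np.array([1, 2, 2, 3, 3, 4, 5])
print(aux[xor_mask_fused(aux)])  # [1 4 5]
```

(The NumPy version above still allocates temporaries; the point is only that the mask is a purely local function of each position, which is what makes kernel fusion possible.)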
For array sizes
@pri1311 Great! That implementation looks mostly OK except for a few minor comments. Could you apply the fix to this PR?
Hey, a few tests are failing (and I understand why); let me make that correction and commit the changes. For some reason, the bug didn't show up when I was running the function manually from a separate file.
Hey @asi1024, apologies for the delay; my final exams were going on. I have added the final changes from my side. Could you review them?
Co-authored-by: Akifumi Imanishi <akifumi.imanishi@gmail.com>
/test mini |
LGTM! Thank you for your PR! |
Linked Issue: #6078
Hey! I have implemented numpy.setxor1d, but it seems that when you provide a multi-dim array with assume_unique as True, NumPy raises an error, since their code concatenates the arrays as np.concatenate((ar1, ar2)) and not np.concatenate((ar1, ar2), axis=None).
However, when assume_unique is False, the arrays are flattened because the numpy.unique function returns a flattened array. Do we replicate the same behaviour in CuPy, or allow multi-dim arrays even when assume_unique is set to True?
Attaching a screenshot below to explain the above:
Open discussion in numpy - numpy/numpy#14670
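The divergence can be reproduced directly in NumPy (behavior as of the versions current at the time of this PR; assuming the discussion in numpy/numpy#14670 has not changed it):

```python
import numpy as np

a = np.array([[1, 2], [3, 4]])
b = np.array([[2, 3], [5, 6]])

# assume_unique=False (default): np.unique flattens each input first,
# so multi-dimensional arrays are accepted.
print(np.setxor1d(a, b))  # [1 4 5 6]

# assume_unique=True skips np.unique, so the 2-D inputs reach
# np.concatenate un-flattened and the call fails.
try:
    np.setxor1d(a, b, assume_unique=True)
except ValueError as e:
    print("raised ValueError:", e)
```

So the question above is whether CuPy should reproduce this error or flatten unconditionally.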
Currently, the code I have written does not throw errors for multi-dim arrays, regardless of whether assume_unique is True or False. However, the tests currently don't cover that, since I can't compare against NumPy's results. Should I add some manual-comparison tests? Please let me know your opinion.