__cuda_array_interface__ #2860

Merged

merged 13 commits into numba:master from sklam:enh/cudaaryface on May 22, 2018

Conversation

@sklam (Member) commented Mar 28, 2018

For interop with 3rd-party CUDA arrays.
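
For context, a minimal sketch of the producer side of the protocol, assuming the dict keys shape, typestr, data and version from the interface spec; the ForeignGPUArray class is hypothetical, a Numba device array stands in for the third-party GPU buffer, and cuda.as_cuda_array is used here as the consuming view API (see the documentation changes reviewed below), not as code taken verbatim from this PR.

import numpy as np
from numba import cuda

class ForeignGPUArray:
    """Stand-in for a third-party library's GPU array type."""

    def __init__(self, shape, dtype=np.float32):
        # A Numba device array provides the backing GPU allocation here;
        # a real library would expose its own buffer the same way.
        self._buf = cuda.device_array(shape, dtype=dtype)

    @property
    def __cuda_array_interface__(self):
        return {
            'shape': self._buf.shape,                                # tuple of ints
            'typestr': self._buf.dtype.str,                          # e.g. '<f4'
            'data': (self._buf.device_ctypes_pointer.value, False),  # (device pointer, read-only flag)
            'version': 0,                                            # interface version at the time of this PR
        }

arr = ForeignGPUArray((16,))
d_arr = cuda.as_cuda_array(arr)   # zero-copy Numba device-array view of the foreign buffer
print(d_arr.shape, d_arr.dtype)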

@codecov-io commented Apr 3, 2018

Codecov Report

Merging #2860 into master will decrease coverage by 0.05%.
The diff coverage is 4.76%.

@@            Coverage Diff             @@
##           master    #2860      +/-   ##
==========================================
- Coverage   85.84%   85.79%   -0.06%     
==========================================
  Files         326      327       +1     
  Lines       68311    68422     +111     
  Branches     7721     7729       +8     
==========================================
+ Hits        58643    58703      +60     
- Misses       8428     8481      +53     
+ Partials     1240     1238       -2

@stuartarchibald stuartarchibald added this to the Numba 0.39 RC milestone Apr 9, 2018
@seibert (Contributor) commented Apr 13, 2018

I can confirm this works in combination with cupy/cupy#1144.

import cupy

from numba import cuda

# Kernel that adds two arrays elementwise using a grid-stride loop.
@cuda.jit
def add(x, y, out):
    start = cuda.grid(1)
    stride = cuda.gridsize(1)
    for i in range(start, x.shape[0], stride):
        out[i] = x[i] + y[i]

# The cupy arrays are handed straight to the kernel; Numba consumes them
# through __cuda_array_interface__ without an explicit copy.
a = cupy.arange(10)
b = a * 2
out = cupy.empty_like(a)

print('out before:', out)

add[1, 32](a, b, out)   # launch one block of 32 threads

print('out after:', out)
print('array types:', type(a), type(b), type(out))

Output:

(cupy_dev) seibert@nvidia1:~/repos/numba_cupy$ python demo.py
out before: [0 0 0 0 0 0 0 0 0 0]
out after: [ 0  3  6  9 12 15 18 21 24 27]
array types: <class 'cupy.core.core.ndarray'> <class 'cupy.core.core.ndarray'> <class 'cupy.core.core.ndarray'>

@seibert (Contributor) commented Apr 13, 2018

It looks like support for @vectorize/@guvectorize still needs to be done.

@sklam sklam changed the title [WIP] __cuda_array_interface__ __cuda_array_interface__ May 3, 2018
@seibert (Contributor) commented May 8, 2018

I think you forgot to check in cuda_array_interface.rst :)

@sklam (Member, Author) commented May 8, 2018

Oops, added missing file now

@stuartarchibald stuartarchibald requested a review from seibert May 10, 2018 16:39
@@ -20,6 +20,17 @@ transfer:
.. autofunction:: numba.cuda.to_device
:noindex:

In addition to the device arrays, numba can consume any object that implements
:ref:`cuda array interface <cuda-array-interface>`. These objects can be
converted into a device array by creating a view of the GPU buffer using the
Review comment (Contributor):

I think we should phrase this as "These objects also can be manually converted into a Numba device array by creating a view of the GPU buffer using the following APIs:"
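
A short sketch of this manual conversion, assuming the APIs the sentence goes on to list are numba.cuda.is_cuda_array and numba.cuda.as_cuda_array, and assuming a CuPy build that already exposes the interface (cupy/cupy#1144):

import cupy
from numba import cuda

c = cupy.arange(10, dtype=cupy.float32)

print(cuda.is_cuda_array(c))   # True: the cupy array exposes __cuda_array_interface__
d = cuda.as_cuda_array(c)      # zero-copy Numba device-array view of cupy's buffer
print(d.copy_to_host())        # the same ten values, copied back through the view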

@@ -20,6 +20,17 @@ transfer:
.. autofunction:: numba.cuda.to_device
:noindex:

In addition to the device arrays, numba can consume any object that implements
Review comment (Contributor):

s/numba/Numba/

@seibert (Contributor) commented May 17, 2018

Two minor documentation nits, and then this looks good to merge.

A follow-up commit ("as per comment") addressed the review nits.
@seibert seibert merged commit f0418c7 into numba:master May 22, 2018
@sklam sklam deleted the enh/cudaaryface branch May 22, 2018 15:04