Pinned memory allocation returns odd size #3625
I think this was the reason: cupy/cupy/cuda/pinned_memory.pyx, lines 296 to 298 at commit 725c350.
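For context, pinned-memory pools commonly round each request up to a fixed allocation unit, which would explain the extra bytes. A minimal sketch of that rounding logic, assuming a hypothetical 512-byte unit (the actual unit CuPy uses may differ):

```python
def round_up(size, unit=512):
    """Round a byte count up to the next multiple of the allocation unit."""
    return ((size + unit - 1) // unit) * unit

# A request that is not a multiple of the unit gets padded:
print(round_up(1000))       # -> 1024
# A request that already is a multiple stays unchanged:
print(round_up(8_000_000))  # -> 8000000
```

If the allocator rounds like this, the returned size will only match the request when the request is already a multiple of the unit.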
So what's the best way to use this? Just trim the array viewing the allocation afterwards? Or is there a better way to allocate pinned memory?
Hi John, sorry I dropped the ball. This is usually what I do:

>>> mem = cupy.cuda.alloc_pinned_memory(a.nbytes)
>>> b = numpy.frombuffer(mem, a.dtype, a.size)
>>> b.nbytes
8000000

Perhaps the core devs can recommend better approaches?
Another way:

>>> mem = cupy.cuda.alloc_pinned_memory(a.nbytes)
>>> c = numpy.ndarray((a.size,), dtype=a.dtype, buffer=mem)
>>> c.nbytes
8000000
We were not aware of this use case of the pinned memory allocator. When do you need this?
@kmaehashi It is very useful to back NumPy arrays with pinned memory. I use this often when I know in advance that frequent device-host transfers will follow.
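The pattern described here, a NumPy array viewing a pinned buffer so that host-device copies go through page-locked memory, relies only on numpy.frombuffer accepting any buffer object. A sketch using a plain bytearray as a stand-in for the pinned allocation (the viewing step is identical; only the buffer source changes):

```python
import numpy as np

# Stand-in for cupy.cuda.alloc_pinned_memory(a.nbytes); any object
# supporting the buffer protocol works with numpy.frombuffer.
nbytes = 1_000_000 * 8            # one million float64 elements
mem = bytearray(nbytes)

b = np.frombuffer(mem, dtype=np.float64, count=1_000_000)
print(b.nbytes)  # -> 8000000, exactly the requested size

# b shares memory with the buffer: writes through the view land in `mem`.
b[0] = 1.0
```

Passing an explicit `count` is what trims the view to the requested number of elements even when the underlying allocation is larger.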
Thanks Leo! That makes sense 🙂 Yeah this is the same thing we were looking at. Having a NumPy array is a bit easier to work with. |
Related to #4080, I'm wondering if it's better to provide this (equivalent to Leo's snippet) as an API under cupyx. PyCUDA has an API for this feature.
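Such a cupyx helper could be little more than the two-line snippet wrapped in a function. A hypothetical sketch, parameterized over the allocator so it stays runnable without a GPU (the name `empty_pinned` and its signature are illustrative here, not CuPy's actual API):

```python
import numpy as np

def empty_pinned(shape, dtype, alloc=bytearray):
    """Hypothetical helper: allocate via `alloc` and return an ndarray view.

    With CuPy one would pass alloc=cupy.cuda.alloc_pinned_memory; the
    default bytearray keeps this sketch runnable on the host only.
    """
    dtype = np.dtype(dtype)
    nbytes = int(np.prod(shape)) * dtype.itemsize
    mem = alloc(nbytes)
    # Viewing with an explicit shape trims off any padding the allocator adds.
    return np.ndarray(shape, dtype=dtype, buffer=mem)

a = empty_pinned((1000, 1000), np.float64)
print(a.shape, a.nbytes)  # -> (1000, 1000) 8000000
```

The explicit shape in the ndarray constructor is what hides any over-allocation from the caller, which addresses the original "odd size" surprise.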
That would be very helpful 🙂 |
I sent #4870 (still WIP) to address this need. |
It looks like CuPy allocates more bytes than expected when calling cupy.cuda.alloc_pinned_memory. Any ideas why that might be?

(Environment details: python -c 'import cupy; cupy.show_config()')