ctypes arrays >=2GB in length cause an exception #61069
Comments
The environment is Windows 8 Pro 64-bit running 64-bit Python from the WinPython distribution; Python is v2.7.3, built on Apr 10 2012. I first found this with `create_string_buffer`, but it reproduces with an even simpler example. The following code throws an `AttributeError: class must define a _length_ attribute, which must be a positive integer`:

```python
import ctypes
ctypes.c_char * int(2*1024*1024*1024)  # 2GB; also fails with long() instead of int()
```

However, the following works:

```python
import ctypes
ctypes.c_char * int(2*1024*1024*1024 - 1)  # 1 byte less than 2GB
```

The same happens with the other `c_` types, so it is not a limit on memory size: `ctypes.c_int * int(2*1024*1024*1024-1)` works, and that array would be nearly four times the size of the failing `c_char` one.
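For convenience, a consolidated repro sketch of the boundary (hedged: on this Python 2.7 build the failure surfaces as an `AttributeError`; on Python 3 the same overflow raises `OverflowError`, per the `_ctypes.c` snippet quoted later in the thread):

```python
import ctypes

GiB = 1024 * 1024 * 1024

# One byte under 2 GiB: creating the array type works on all platforms.
ok_type = ctypes.c_char * (2 * GiB - 1)

try:
    bad_type = ctypes.c_char * (2 * GiB)  # fails on 64-bit Windows builds
except (AttributeError, OverflowError) as exc:
    # Python 2.7: AttributeError ("class must define a _length_ attribute,
    # which must be a positive integer"); Python 3: OverflowError
    # ("The '_length_' attribute is too large").
    print(exc)
```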
Would you like to investigate a patch?
Note that adding support for >2GB arrays is a new feature and therefore can't go in 2.7 (but it would be OK for 3.4+). The error message could be improved, though.
I have no idea where I would start and don't have much time... I am not so sure it is a new feature. It seems that the ctypes system is internally using unsigned integers for the length, but it should be using `size_t` (or at least `ssize_t`). Seems like a bug.

I mean it is using signed integers currently.
If it works elsewhere and/or it's documented to work, then it might indeed be considered a bug. Maybe someone more familiar with ctypes can comment.
Okay, so I tested on Linux (CentOS 6.3) with Python 2.6.6 64-bit, and it works: I was able to run `c_char * long(32*1024*1024*1024)` [the highest value I tried] without error. The Linux machine I tested on was limited in RAM, so I ran into memory issues, but I was still able to allocate a 2GB buffer with `create_string_buffer()`. So all in all, it is a bug: it works with Linux Python v2.6.6 64-bit but not Windows Python v2.7.3 64-bit. The ctypes documentation does not mention an upper limit, so I would assume the limit should be the maximum memory allocation of the underlying system (e.g. 32-bit Windows can't allocate more than 2GB, but on 64-bit Windows it should be very, very large).
This case works fine on 64-bit Linux (Ubuntu) and OS X 10.7.5. I suspect this is because 64-bit Windows uses the LLP64 data model and we are using `long`s somewhere. I am investigating further now.
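A quick, illustrative way to see the LLP64 difference from Python itself (not part of the original report):

```python
import ctypes

# LLP64 (64-bit Windows): long stays 4 bytes while pointers/size_t are 8.
# LP64 (64-bit Linux/OS X): long and size_t are both 8 bytes.
print(ctypes.sizeof(ctypes.c_long))    # 4 on 64-bit Windows, 8 on LP64 systems
print(ctypes.sizeof(ctypes.c_size_t))  # 8 on any 64-bit system
print(ctypes.sizeof(ctypes.c_void_p))  # 8 on any 64-bit system
```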
In `_ctypes.c` there are (only!) two occurrences of the `long` type... both are related to ctypes arrays and look suspect.
I just ran into this issue. I'm trying to write code like this:

```python
(ctypes.c_char * bufferLen).from_buffer(buffer)
```

where `buffer` is a bytearray. When `bufferLen` is greater than 2GB I fall foul of this code in `_ctypes.c`:

```c
long length;
...
length = PyLong_AsLongAndOverflow(length_attr, &overflow);
if (overflow) {
    PyErr_SetString(PyExc_OverflowError,
                    "The '_length_' attribute is too large");
    Py_DECREF(length_attr);
    goto error;
}
```

Surely this should not be forcing `long` on us. Can't it use `PyLong_AsSsize_t`, or perhaps `PyLong_AsLongLongAndOverflow`?
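If the length were parsed as `Py_ssize_t`, any positive length up to `sys.maxsize` should presumably be accepted. A hedged sketch of the expected behaviour (this already works on 64-bit Linux per the earlier comment; note that creating the array *type* does not itself allocate the storage):

```python
import ctypes

# Creating the array type only records the length; the 32 GiB of storage
# is not allocated until the type is instantiated.
Big = ctypes.c_char * (32 * 1024 * 1024 * 1024)
print(ctypes.sizeof(Big))  # 34359738368
```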
In older versions of ctypes, before it was added to the standard library, the underlying length field was a C `int`, and `CArrayType_new` used the `PyInt_AS_LONG` macro. ctypes was added to the standard library in 2.5, by which time the length field was `Py_ssize_t`, but `CArrayType_new` still used the `PyInt_AS_LONG` macro. That's still the case in 2.7. Python 3 changed this to call `PyLong_AsLongAndOverflow`, but apparently Christian Heimes didn't consider fixing this properly to use `Py_ssize_t`: https://hg.python.org/cpython/rev/612d8dea7f6c

David, maybe there's a workaround for your use case, if you can provide some more details.
Erik, as you can no doubt guess, this is related to the questions I have been asking on SO that you have so expertly been answering. Thank you! When I solved the latest problem, getting at the internal buffer of a bytearray, I used the code in my previous comment, which came from a (now deleted) comment of yours on the question. Looking at this more closely I realise that, as you said in your answer, the array length is not important for my specific needs, so it is fine to use zero. That sidesteps this issue completely. Thanks again.
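For anyone with the same need, a minimal sketch of that zero-length workaround (the names `buffer` and `view` are illustrative; the small buffer here just stands in for the real >2GB bytearray):

```python
import ctypes

buffer = bytearray(64)  # stands in for the real >2GB bytearray

# Declaring the array type with length 0 sidesteps the _length_ parsing
# entirely; from_buffer() still binds the object to the bytearray's memory.
view = (ctypes.c_char * 0).from_buffer(buffer)

# The real size travels separately, e.g. when handing the pointer to C code.
address = ctypes.addressof(view)
size = len(buffer)
```

Since `ctypes.sizeof(view)` is 0, bounds checking becomes entirely the caller's responsibility.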
This issue is specific to systems with a 32-bit `long` but a 64-bit `void*`, I guess? So only Windows is impacted?
This issue is specific to systems with a 32-bit `long` but a 64-bit `size_t`. Yes, it seems the only supported system impacted is Windows.