Closed
In the context of a Java interface (`jchar` is `uint16_t`):

```cython
cdef const jchar *jchar_str = ...
unicode_str = (<char*>jchar_str)[:str_len * 2].decode("UTF-16")
```
Cython-generated code:

```c
__pyx_t_3 = __Pyx_decode_c_string(((char *)__pyx_v_jchar_str), 0, __pyx_t_5, NULL, NULL, PyUnicode_DecodeUTF16); if (unlikely(!__pyx_t_3)) __PYX_ERR(5, 99, __pyx_L1_error)
```
But this is not correct: the last argument of `__Pyx_decode_c_string` is declared with the signature `PyObject* (*decode_func)(const char *s, Py_ssize_t size, const char *errors)` (the three parameters of `PyUnicode_DecodeUTF8`), whereas `PyUnicode_DecodeUTF16` takes an extra fourth parameter, `int *byteorder`.
This results in the compiler warning `passing argument 6 of '__Pyx_decode_c_string' from incompatible pointer type`, and I don't expect the code to work correctly.
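As a point of comparison, the semantics the Cython line is after can be sketched in plain Python. This is a hypothetical illustration (the `jchars` buffer and its byte order are assumptions, simulating what a JNI string might hold): decoding with an explicitly suffixed codec such as `utf-16-le` removes the need for the `byteorder` out-parameter that makes `PyUnicode_DecodeUTF16`'s signature incompatible here.

```python
import struct

# Hypothetical jchar (uint16_t) buffer holding "Hi", packed little-endian
# to simulate the raw bytes behind a `const jchar *` on an LE platform.
jchars = [0x0048, 0x0069]  # 'H', 'i'
raw = struct.pack("<%dH" % len(jchars), *jchars)

# An explicit byte order in the codec name ("utf-16-le") means no
# byteorder parameter is needed, unlike PyUnicode_DecodeUTF16.
text = raw.decode("utf-16-le")
print(text)  # prints "Hi"
```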