Decoding of UTF-16 C strings is broken #1696

Closed
mhsmith opened this issue May 3, 2017 · 0 comments

mhsmith commented May 3, 2017

In the context of a Java interface (jchar is uint16_t):

cdef const jchar *jchar_str = ...
unicode_str = (<char*>jchar_str)[:str_len * 2].decode("UTF-16")

Cython-generated code:

 __pyx_t_3 = __Pyx_decode_c_string(((char *)__pyx_v_jchar_str), 0, __pyx_t_5, NULL, NULL, PyUnicode_DecodeUTF16); if (unlikely(!__pyx_t_3)) __PYX_ERR(5, 99, __pyx_L1_error)

But this is not correct: the last parameter of __Pyx_decode_c_string has the signature PyObject* (*decode_func)(const char *s, Py_ssize_t size, const char *errors), matching the three parameters of PyUnicode_DecodeUTF8, whereas PyUnicode_DecodeUTF16 takes an extra parameter, int *byteorder.
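
For concreteness, the two C-level shapes side by side (the callback type is copied from the parameter description above; the PyUnicode_DecodeUTF16 prototype is CPython's):

    /* Callback type expected by __Pyx_decode_c_string: three parameters,
       matching PyUnicode_DecodeUTF8. */
    typedef PyObject *(*decode_func)(const char *s, Py_ssize_t size,
                                     const char *errors);

    /* Actual CPython prototype of PyUnicode_DecodeUTF16: four parameters. */
    PyObject *PyUnicode_DecodeUTF16(const char *s, Py_ssize_t size,
                                    const char *errors, int *byteorder);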

This results in the compiler warning "passing argument 6 of '__Pyx_decode_c_string' from incompatible pointer type", and I don't expect the code to work correctly.
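
One type-correct shape, sketched here as a hypothetical adapter rather than as the actual fix, would be a three-parameter wrapper that supplies the byteorder argument itself, so that the wrapper, not PyUnicode_DecodeUTF16, is what gets passed to __Pyx_decode_c_string:

    #include "Python.h"

    /* Hypothetical adapter (the name and the native-byte-order default are
       assumptions for illustration): gives PyUnicode_DecodeUTF16 the
       three-parameter shape that __Pyx_decode_c_string expects. */
    static PyObject *decode_utf16_adapter(const char *s, Py_ssize_t size,
                                          const char *errors) {
        int byteorder = 0;  /* 0 = native order; switches if a BOM is seen */
        return PyUnicode_DecodeUTF16(s, size, errors, &byteorder);
    }

The generated call would then pass decode_utf16_adapter as argument 6, which both silences the warning and makes the call type-correct.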

scoder closed this as completed in 5c9b32c on May 5, 2017