Decoding of UTF-16 C strings is broken #1696

@mhsmith

Description

In the context of a Java interface (jchar is uint16_t):

cdef const jchar *jchar_str = ...
unicode_str = (<char*>jchar_str)[:str_len * 2].decode("UTF-16")

Cython-generated code:

 __pyx_t_3 = __Pyx_decode_c_string(((char *)__pyx_v_jchar_str), 0, __pyx_t_5, NULL, NULL, PyUnicode_DecodeUTF16); if (unlikely(!__pyx_t_3)) __PYX_ERR(5, 99, __pyx_L1_error)

But this is not correct: the last argument of __Pyx_decode_c_string has the signature PyObject* (*decode_func)(const char *s, Py_ssize_t size, const char *errors), which matches the three parameters of PyUnicode_DecodeUTF8, whereas PyUnicode_DecodeUTF16 takes an extra fourth parameter, int *byteorder.

This results in the compiler warning "passing argument 6 of '__Pyx_decode_c_string' from incompatible pointer type", and I don't expect the code to work correctly.
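For context, the behaviour that the extra byteorder argument controls can be sketched at the Python level (the buffer contents below are illustrative, not taken from the generated code):

```python
# A jchar buffer is a sequence of 16-bit code units, so decoding it as
# UTF-16 must know the byte order of those units.
raw_le = "abc".encode("utf-16-le")       # 2 bytes per code unit, little-endian
assert len(raw_le) == 6
assert raw_le.decode("utf-16-le") == "abc"

# The generic "utf-16" codec consumes a BOM if present, otherwise assumes
# native byte order. At the C level that choice is communicated through the
# extra int *byteorder parameter of PyUnicode_DecodeUTF16 -- which the
# three-argument decode_func pointer type cannot carry.
raw_bom = "abc".encode("utf-16")         # encoder prepends a BOM
assert raw_bom.decode("utf-16") == "abc"
```

So even if the incompatible pointer were silently accepted by the compiler, the call would pass garbage for the byteorder argument rather than the intended default.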
