PanicError in read_ndjson and polars.str.json_decode with empty struct #13433

Open · deep8324 opened this issue Jan 4, 2024 · 6 comments
Labels: bug (Something isn't working) · needs triage (Awaiting prioritization by a maintainer) · python (Related to Python Polars)

Comments


deep8324 commented Jan 4, 2024

Checks

  • I have checked that this issue has not already been reported.

  • I have confirmed this bug exists on the latest version of Polars.

Reproducible example

import io
import os

import polars as pl

os.environ["RUST_BACKTRACE"] = "1"

def test_json_format():
    buffer = io.StringIO("""{"offer":{"my_value":0,"condition":[{"applicationId":0,"conditionRequestReason":[{}]}]}}""")
    error_string = f"ListArray's child's DataType must match. However, the expected DataType is Struct"
    try:
        df = pl.read_ndjson(buffer)
        print(df)
        assert df.shape[0] == 1
    except pl.exceptions.PolarsPanicError as e:
        print(pl.__version__)
        full_string = f"Caught PanicException: {e}"
        assert error_string in full_string


def test_json_decode_error():
    json_str = '{"offer": {"my_value": 0, "condition": [{"applicationId": 0, "conditionRequestReason": [{}]}]}}'
    json_str1 = '{"offer": {"my_value": 0, "condition": [{"applicationId": 0, "conditionRequestReason": [{"a":0}]}]}}'
    json_list = [json_str, json_str1]
    error_string = 'ComputeError(ErrString("a StructArray must contain at least one field"))'
    for j in json_list:
        df = pl.DataFrame(
                {"__input__": j})
        try:
            cols = df.select(pl.col("__input__").str.json_decode()).unnest("__input__").columns
            print(f"Test passed as struct is not empty {j} - No Panic")
            assert cols == ["offer"]
        except pl.exceptions.PolarsPanicError as e:
            print(f"Test Passed due to panic exception in polars {j} - We don't want this")
            full_string = f"Caught PanicError: {e}"
            assert error_string in full_string

Log output

thread '<unnamed>' panicked at crates/polars-arrow/src/array/list/mod.rs:82:61:
called `Result::unwrap()` on an `Err` value: ComputeError(ErrString("ListArray's child's DataType must match. However, the expected DataType is Struct([Field { name: \"applicationId\", data_type: Int64, is_nullable: true, metadata: {} }, Field { name: \"conditionRequestReason\", data_type: LargeList(Field { name: \"item\", data_type: Struct([]), is_nullable: true, metadata: {} }), is_nullable: true, metadata: {} }]) while it got Struct([Field { name: \"applicationId\", data_type: Int64, is_nullable: true, metadata: {} }, Field { name: \"conditionRequestReason\", data_type: LargeList(Field { name: \"item\", data_type: Struct([Field { name: \"\", data_type: Null, is_nullable: true, metadata: {} }]), is_nullable: true, metadata: {} }), is_nullable: true, metadata: {} }])."))
stack backtrace:
   0: _rust_begin_unwind
   1: core::panicking::panic_fmt
   2: core::result::unwrap_failed
   3: polars_arrow::legacy::array::list::AnonymousBuilder::finish
   4: <polars_core::chunked_array::builder::list::anonymous::AnonymousOwnedListBuilder as polars_core::chunked_array::builder::list::ListBuilderTrait>::finish
   5: polars_core::series::any_value::<impl polars_core::series::Series>::from_any_values_and_dtype
   6: polars_core::series::any_value::<impl polars_core::series::Series>::from_any_values_and_dtype
   7: polars_core::frame::row::av_buffer::AnyValueBuffer::reset
   8: <core::iter::adapters::map::Map<I,F> as core::iter::traits::iterator::Iterator>::next
   9: polars_io::ndjson::core::CoreJsonReader::parse_json::{{closure}}::{{closure}}
  10: rayon::iter::plumbing::bridge_producer_consumer::helper
  11: rayon_core::thread_pool::ThreadPool::install::{{closure}}
  12: <rayon_core::job::StackJob<L,F,R> as rayon_core::job::Job>::execute
  13: rayon_core::registry::WorkerThread::wait_until_cold
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.



thread '<unnamed>' panicked at crates/polars-arrow/src/array/struct_/mod.rs:117:52:
called `Result::unwrap()` on an `Err` value: ComputeError(ErrString("a StructArray must contain at least one field"))
stack backtrace:
   0: _rust_begin_unwind
   1: core::panicking::panic_fmt
   2: core::result::unwrap_failed
   3: polars_json::json::deserialize::_deserialize
   4: polars_json::json::deserialize::_deserialize
   5: polars_json::json::deserialize::_deserialize
   6: polars_json::json::deserialize::_deserialize
   7: polars_json::json::deserialize::_deserialize
   8: polars_json::json::deserialize::_deserialize
   9: polars_json::ndjson::deserialize::deserialize_iter::_deserializer
  10: <F as polars_plan::dsl::expr_dyn_fn::SeriesUdf>::call_udf
  11: polars_lazy::physical_plan::expressions::apply::ApplyExpr::eval_and_flatten
  12: <polars_lazy::physical_plan::expressions::apply::ApplyExpr as polars_lazy::physical_plan::expressions::PhysicalExpr>::evaluate
  13: polars_lazy::physical_plan::executors::projection_utils::run_exprs_seq
  14: polars_lazy::physical_plan::executors::projection_utils::evaluate_physical_expressions
  15: polars_lazy::physical_plan::executors::projection::ProjectionExec::execute_impl
  16: <polars_lazy::physical_plan::executors::projection::ProjectionExec as polars_lazy::physical_plan::executors::executor::Executor>::execute
  17: polars_lazy::frame::LazyFrame::collect
  18: polars::lazyframe::_::<impl polars::lazyframe::PyLazyFrame>::__pymethod_collect__
  19: pyo3::impl_::trampoline::trampoline
  20: _method_vectorcall_NOARGS
  21: __PyEval_EvalFrameDefault
  22: __PyFunction_Vectorcall
  23: __PyEval_EvalFrameDefault
  24: __PyFunction_Vectorcall
  25: __PyEval_EvalFrameDefault
  26: __PyFunction_Vectorcall
  27: __PyObject_Call_Prepend
  28: _slot_tp_call
  29: __PyEval_EvalFrameDefault
  30: __PyFunction_Vectorcall
  31: __PyEval_EvalFrameDefault
  32: __PyFunction_Vectorcall
  33: __PyObject_Call_Prepend
  34: _slot_tp_call
  35: __PyEval_EvalFrameDefault
  36: __PyFunction_Vectorcall
  37: __PyEval_EvalFrameDefault
  38: __PyFunction_Vectorcall
  39: __PyEval_EvalFrameDefault
  40: __PyFunction_Vectorcall
  41: __PyObject_Call_Prepend
  42: _slot_tp_call
  43: __PyEval_EvalFrameDefault
  44: __PyFunction_Vectorcall
  45: __PyEval_EvalFrameDefault
  46: __PyFunction_Vectorcall
  47: __PyObject_Call_Prepend
  48: _slot_tp_call
  49: __PyEval_EvalFrameDefault
  50: __PyFunction_Vectorcall
  51: __PyEval_EvalFrameDefault
  52: __PyFunction_Vectorcall
  53: __PyObject_Call_Prepend
  54: _slot_tp_call
  55: __PyEval_EvalFrameDefault
  56: __PyEval_Vector
  57: _PyEval_EvalCode
  58: _run_mod
  59: _pyrun_file
  60: __PyRun_SimpleFileObject
  61: __PyRun_AnyFileObject
  62: _pymain_run_file_obj
  63: _pymain_run_file
  64: _Py_RunMain
  65: _main
  66: <unknown>
note: Some details are omitted, run with `RUST_BACKTRACE=full` for a verbose backtrace.

Issue description

The issue happens when a valid NDJSON file containing an empty struct, such as

{"offer":{"my_value":0,"condition":[{"applicationId":0,"conditionRequestReason":[{}]}]}}

is read with polars.read_ndjson: instead of raising a Python exception, it ends in a panic.

The same panic appears if such a value is stored as a string and polars.str.json_decode() is used.
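
For reference, a stripped-down version of the two code paths above (no schema supplied); on the affected versions both calls end in a PanicException rather than a catchable Polars error:

import io

import polars as pl

payload = '{"offer":{"my_value":0,"condition":[{"applicationId":0,"conditionRequestReason":[{}]}]}}'

# read_ndjson path: panics while building the nested list/struct arrays.
pl.read_ndjson(io.StringIO(payload))

# str.json_decode path: panics while deserializing the empty struct `[{}]`.
pl.Series([payload]).str.json_decode()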

Expected behavior

The JSON file with the value

{"offer":{"my_value":0,"condition":[{"applicationId":0,"conditionRequestReason":[{}]}]}}

should be read by read_ndjson and produce a Polars DataFrame:

┌────────────────────┐
│ offer              │
│ ---                │
│ struct[2]          │
╞════════════════════╡
│ {0,[{0,[{null}]}]} │
└────────────────────┘

Installed versions

--------Version info---------
Polars:               0.20.2
Index type:           UInt32
Platform:             macOS-14.2.1-x86_64-i386-64bit
Python:               3.11.6 | packaged by conda-forge | (main, Oct  3 2023, 10:40:37) [Clang 15.0.7 ]

----Optional dependencies----
adbc_driver_manager:  <not installed>
cloudpickle:          <not installed>
connectorx:           <not installed>
deltalake:            <not installed>
fsspec:               <not installed>
gevent:               <not installed>
matplotlib:           <not installed>
numpy:                1.26.0
openpyxl:             <not installed>
pandas:               2.1.3
pyarrow:              14.0.1
pydantic:             <not installed>
pyiceberg:            <not installed>
pyxlsb:               <not installed>
sqlalchemy:           <not installed>
xlsx2csv:             <not installed>
xlsxwriter:           <not installed>

deep8324 added the bug (Something isn't working) and python (Related to Python Polars) labels on Jan 4, 2024

deep8324 commented Jan 4, 2024

The issue exists even with polars v 0.20.3


jcmuel commented Jan 4, 2024

Hi @deep8324,

the data schema of the DataFrame in your example is not well-defined, so it is not surprising that you get an error. The schema that the example apparently intends to represent is the following:

OrderedDict([('offer', Struct({'my_value': Int64, 'condition': List(Struct({'applicationId': Int64, 'conditionRequestReason': List(Struct({'a': Int64}))}))}))])

The data for the list of structs with the field a contains a list with an empty struct, [{}]. It is not clear what this is supposed to represent. When you explicitly specify the schema given above, it represents a list with a single struct whose value for a is null. This is also the behavior you can observe when you run the following example. Note that in the example below, well_defined_data contains an empty list [] instead of [{}], while incomplete_schema_data is exactly your example, read with the additional schema information so that it can be parsed anyway.

# pylint: disable=missing-class-docstring, missing-function-docstring
""" Test for reading NDJSON files with nested fields. """
import io
import json
import unittest
from typing import Sequence, Any, cast

import polars as pl
import polars.testing


class TestNdJson(unittest.TestCase):
    def test_json_format(self):
        well_defined_data = io.StringIO(
            """{"offer": {"my_value": 0, "condition": [{"applicationId": 0, "conditionRequestReason": []}]}}
            {"offer": {"my_value": 0, "condition": [{"applicationId": 0, "conditionRequestReason": [{"a":0}]}]}}""")

        # Initialize the frame from a sequence of dictionaries.
        dicts = cast(Sequence[dict[str, Any]],
                     (json.loads(line) for line in well_defined_data))
        frame_from_dicts = pl.from_dicts(dicts)
        print(frame_from_dicts.schema)

        # Initialize the frame from the NDJSON file.
        frame_from_ndjson = pl.read_ndjson(well_defined_data)
        print(frame_from_ndjson.schema)

        # Compare the content of the frames.
        polars.testing.assert_frame_equal(frame_from_dicts, frame_from_ndjson)

        # Initialize the frame from a malformed NDJSON
        incomplete_schema_data = io.StringIO(
            """{"offer": {"my_value": 0, "condition": [{"applicationId": 0, "conditionRequestReason": [{}]}]}}""")
        frame_with_schema = pl.read_ndjson(incomplete_schema_data, schema=frame_from_ndjson.schema)
        frame_with_schema_export = frame_with_schema.write_ndjson()

        self.assertEqual(
            '{"offer":{"my_value":0,"condition":[{"applicationId":0,"conditionRequestReason":[{"a":null}]}]}}\n',
            frame_with_schema_export)


if __name__ == '__main__':
    unittest.main()

However, I can observe that without schema information, the behavior of from_dicts and read_ndjson differs when the data contains the awkward list with the empty struct [{}].

    def test_read_ndjson_list_of_awkward_struct(self):
        input_data = io.StringIO(
            """{"offer": {"my_value": 0, "condition": [{"applicationId": 0, "conditionRequestReason": [{}]}]}}
            {"offer": {"my_value": 0, "condition": [{"applicationId": 0, "conditionRequestReason": [{"a":0}]}]}}""")

        # Initialize the frame from a sequence of dictionaries.
        dicts = cast(Sequence[dict[str, Any]],
                     (json.loads(line) for line in input_data))
        frame_from_dicts = pl.from_dicts(dicts)
        print(frame_from_dicts.schema)

        # Initialize the frame from the NDJSON file.
        frame_from_ndjson = pl.read_ndjson(input_data)
        print(frame_from_ndjson.schema)

        # Compare the content of the frames.
        polars.testing.assert_frame_equal(frame_from_dicts, frame_from_ndjson)

The behavior of from_dicts seems to be a bit unexpected:

>>> frame_from_dicts
shape: (2, 1)
┌─────────────────────────┐
│ offer                   │
│ ---                     │
│ struct[2]               │
╞═════════════════════════╡
│ {0,[{0,[{null,null}]}]} │
│ {0,[{0,[{null,0}]}]}    │
└─────────────────────────┘
>>> frame_from_dicts.schema
OrderedDict([('offer', Struct({'my_value': Int64, 'condition': List(Struct({'applicationId': Int64, 'conditionRequestReason': List(Struct({'': Null, 'a': Int64}))}))}))])

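A possible workaround on the json_decode side, analogous to passing schema= to read_ndjson above, would be to hand the target dtype to str.json_decode so that nothing has to be inferred from the awkward [{}]. A sketch, untested against this exact input, with the dtype spelled out from the schema above:

import polars as pl

# Target dtype, copied from the schema printed above; adjust to the real data.
offer_dtype = pl.Struct({
    "my_value": pl.Int64,
    "condition": pl.List(pl.Struct({
        "applicationId": pl.Int64,
        "conditionRequestReason": pl.List(pl.Struct({"a": pl.Int64})),
    })),
})

s = pl.Series(['{"offer": {"my_value": 0, "condition": [{"applicationId": 0, "conditionRequestReason": [{}]}]}}'])
decoded = s.str.json_decode(pl.Struct({"offer": offer_dtype}))
print(decoded)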

jcmuel commented Jan 5, 2024

Below is a small example where JSON decode does not determine the correct schema for a list of integers: it works for a list of strings, but not for a list of integers.

# pylint: disable=missing-class-docstring, missing-function-docstring, too-few-public-methods
""" Unit tests for the polars JSON decode functionality. """
import io
import pytest
import polars as pl
import polars.testing


class TestPolarsJsonDecode:
    @pytest.mark.parametrize('test_name, input_ndjson', [
        ("str", """{"list_field": ["a", "b"]}
                   {"list_field": []}
                   {"list_field": ["c", "d", "e"]}"""),
        ("int", """{"list_field": [1, 2]}
                   {"list_field": []}
                   {"list_field": [4, 5, 6]}""")
    ])
    def test_json_decode_list_of_basic_type(self, test_name, input_ndjson):
        print(f"Reading list of {test_name}...")
        input_buf = io.StringIO(input_ndjson)
        frame_from_ndjson = pl.read_ndjson(input_buf)

        # pylint: disable-next=assignment-from-no-return
        series_with_json = pl.Series(values=input_buf).str.strip_chars_start()
        series_decoded_unnested = series_with_json.str.json_decode().struct.unnest()

        assert frame_from_ndjson.schema == series_decoded_unnested.schema
        polars.testing.assert_frame_equal(frame_from_ndjson, series_decoded_unnested)


if __name__ == '__main__':
    pytest.main()

stinodego added the needs triage (Awaiting prioritization by a maintainer) label on Jan 13, 2024

jcmuel commented Feb 2, 2024

I just did some tests with polars 0.20.6 (and pyarrow 13.0.0)

  • The issue test_json_decode_list_of_basic_type seems to be resolved and no longer reproduces (fixed since polars 0.20.4; it was still present in polars 0.20.3).
  • The issue test_read_ndjson_list_of_awkward_struct still reproduces.
  • The panic in test_json_format still reproduces.
  • The panic in test_json_decode_error still reproduces.

Is polars using arrow for decoding JSON input?


jcmuel commented Feb 2, 2024

The following minimal test reproduces the PanicException that @deep8324 reported above.

import io
import json
import pytest
import polars as pl


@pytest.mark.parametrize('input_ndjson', [
    '{"bar": [{}]}',  # nested_null
    '{"foo":[{"bar":[{}]}]}'  # nested_nested_null
])
def test_ndjson_nested_nested_null(input_ndjson):
    """ Test whether read_ndjson and from_dicts imports nested null structs in the same way. """
    buffer = io.StringIO(input_ndjson)
    json_obj = json.loads(next(buffer))
    df_from_dicts = pl.from_dicts([json_obj])
    df_from_ndjson = pl.read_ndjson(buffer)
    assert df_from_dicts.schema == df_from_ndjson.schema

First we have the nested_null subtest, where the assertion for the expected schema fails:

Expected :OrderedDict([('bar', List(Struct({})))])
Actual   :OrderedDict([('bar', List(Struct({'': Null})))])

Then we have the nested_nested_null subtest, which causes the PanicException and reports exactly the same unexpected type mismatch:

ListArray's child's DataType must match. However, the expected DataType is 
Struct([Field { name: \"bar\", data_type: LargeList(Field { name: \"item\", data_type: 
>>>>>>>>Struct([]),
is_nullable: true, metadata: {} }), is_nullable: true, metadata: {} }]) 
while it got 
Struct([Field { name: \"bar\", data_type: LargeList(Field { name: \"item\", data_type: 
>>>>>>>>Struct([Field { name: \"\", data_type: Null, is_nullable: true, metadata: {} }]), 
is_nullable: true, metadata: {} }), is_nullable: true, metadata: {} }])."))

The unexpected difference is in the line that I highlighted with the arrows. Instead of an empty struct, the code receives a struct with a field that has the name "" and a null value.

The unit-tested behavior in polars is inconsistent between importing empty structs from dictionaries and importing them from NDJSON. Importing {} as Struct([]), as is done for NDJSON, seems to be the right approach. Importing {} as Struct({'': Null}) looks like a bug in from_dicts, even though this behavior is covered by unit tests (I had a look at the polars unit tests).
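
To make the two competing representations concrete, a tiny sketch using only the dtype constructors (names taken from the outputs above):

import polars as pl

# What read_ndjson infers for a nested empty object `{}`:
empty_struct = pl.Struct([])

# What from_dicts currently produces for the same input:
null_field_struct = pl.Struct({"": pl.Null})

# The two schemas really are different dtypes.
assert empty_struct != null_field_struct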

Feel free to add this test to the polars unit tests.


jcmuel commented Feb 20, 2024

It still reproduces in polars 0.20.10. I think it really comes down to the fact that polars treats the empty object "{}" differently depending on where it shows up in the schema.

def test_ndjson_empty_object() -> None:
    """
    The actual type of `empty_object_column` (tested in `test_ndjson_null_buffer()`) is inconsistent with
    the actual type of `nested_empty_object_column` (tested in `test_ndjson_nested_null()`).
    """
    data = io.BytesIO(
        b"""\
    {"id": 1, "empty_object_column": {}, "nested_empty_object_column": [{}]}
    {"id": 2, "empty_object_column": {}, "nested_empty_object_column": [{}]}
    {"id": 3, "empty_object_column": {}, "nested_empty_object_column": [{}]}
    """
    )

    assert pl.read_ndjson(data).schema == {
        "id": pl.Int64,
        "empty_object_column": pl.Struct([]),  # or pl.Struct([pl.Field("", pl.Null)]) ?
        "nested_empty_object_column": pl.List(pl.Struct([]))
    }

The test above fails with the output:

E         -     'empty_object_column': Struct({}),
E         +     'empty_object_column': Struct({'': Null}),
E         ?                                    ++++++++

@ritchie46 : It seems like you wrote the test_ndjson_null_buffer test.
@alexander-beedie : It seems like you wrote the test_ndjson_nested_null test.

I'm mentioning the two of you, since I would be interested to know which of the two behaviors is the expected one.
