What happened?
Here are my steps. The SQL generated by ibis is:
SELECT t2.unnest.f1 AS f1, t2.unnest.f2 AS f2 FROM (SELECT t1.zipped, EXPLODE(t1.zipped) AS unnest FROM (SELECT ARRAYS_ZIP(t0.array1, t0.array2) AS zipped FROM source AS t0) AS t1) AS t2
Then I tried this SQL, with f1/f2 replaced by the original column names:
SELECT t2.unnest.**array1** AS f1, t2.unnest.**array2** AS f2 FROM (SELECT t1.zipped, EXPLODE(t1.zipped) AS unnest FROM (SELECT ARRAYS_ZIP(t0.array1, t0.array2) AS zipped FROM source AS t0) AS t1) AS t2
It worked.
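For context, here is a minimal reproduction sketch of the failing step. The full test.py from the report is not included, so the connection setup, table name, and column names below are assumptions; it only selects f1 where the generated query above selects both fields.

```python
# Hypothetical reproduction; assumes a table `source` with two array columns,
# `array1` and `array2`, already registered in the active SparkSession.
from pyspark.sql import SparkSession

import ibis

ibis.options.interactive = True  # printing the expression triggers execution

spark = SparkSession.builder.getOrCreate()
con = ibis.pyspark.connect(spark)
t = con.table("source")

# ibis types the zipped result as array<struct<f1, f2>>, so the compiled query
# selects `unnest`.`f1`, while Spark's ARRAYS_ZIP names the struct fields after
# the input columns (`array1`, `array2`).
unnested = t.array1.zip(t.array2).unnest()
df = unnested.f1.name("f1").as_table()
print(df)  # AnalysisException: [FIELD_NOT_FOUND] No such struct field `f1` ...
```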
What version of ibis are you using?
main
What backend(s) are you using, if any?
pyspark
Relevant log output
/Users/ning.ln/anaconda3/envs/ibis-dev-arm64/bin/python /Users/ning.ln/Java/ibis_demo/test.py
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
24/04/25 17:11:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
/Users/ning.ln/anaconda3/envs/ibis-dev-arm64/lib/python3.11/site-packages/pyspark/sql/pandas/functions.py:407: UserWarning: In Python 3.6+ and Spark 3.0+, it is preferred to specify type hints for pandas UDF instead of specifying pandas UDF type which will be deprecated in the future releases. See SPARK-28264 for more details.
warnings.warn(
SELECT `t2`.`unnest`.`f1` AS `f1`, `t2`.`unnest`.`f2` AS `f2` FROM (SELECT `t1`.`zipped`, EXPLODE(`t1`.`zipped`) AS `unnest` FROM (SELECT ARRAYS_ZIP(`t0`.`array1`, `t0`.`array2`) AS `zipped` FROM `source` AS `t0`) AS `t1`) AS `t2`
/Users/ning.ln/anaconda3/envs/ibis-dev-arm64/lib/python3.11/site-packages/pyspark/sql/pandas/functions.py:407: UserWarning: In Python 3.6+ and Spark 3.0+, it is preferred to specify type hints for pandas UDF instead of specifying pandas UDF type which will be deprecated in the future releases. See SPARK-28264 for more details.
warnings.warn(
Traceback (most recent call last):
File "/Users/ning.ln/Java/ibis_demo/test.py", line 19, in<module>
print(df)
File "/Users/ning.ln/Java/ibis/ibis/expr/types/core.py", line 77, in __repr__
return self._interactive_repr()
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/Java/ibis/ibis/expr/types/core.py", line 64, in _interactive_repr
console.print(self)
File "/Users/ning.ln/anaconda3/envs/ibis-dev-arm64/lib/python3.11/site-packages/rich/console.py", line 1700, in print
extend(render(renderable, render_options))
File "/Users/ning.ln/anaconda3/envs/ibis-dev-arm64/lib/python3.11/site-packages/rich/console.py", line 1312, in render
render_iterable = renderable.__rich_console__(self, _options) # type: ignore[union-attr]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/Java/ibis/ibis/expr/types/core.py", line 115, in __rich_console__
raise e
File "/Users/ning.ln/Java/ibis/ibis/expr/types/core.py", line 96, in __rich_console__
rich_object = to_rich(self, console_width=console_width)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/Java/ibis/ibis/expr/types/pretty.py", line 271, in to_rich
return _to_rich_table(
^^^^^^^^^^^^^^^
File "/Users/ning.ln/Java/ibis/ibis/expr/types/pretty.py", line 342, in _to_rich_table
result = table.limit(max_rows + 1).to_pyarrow()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/Java/ibis/ibis/expr/types/core.py", line 483, in to_pyarrow
return self._find_backend(use_default=True).to_pyarrow(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/Java/ibis/ibis/backends/pyspark/__init__.py", line 793, in to_pyarrow
self.execute(table_expr, params=params, limit=limit, **kwargs),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/Java/ibis/ibis/backends/sql/__init__.py", line 301, in execute
with self._safe_raw_sql(sql) as cur:
^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/Java/ibis/ibis/backends/pyspark/__init__.py", line 311, in _safe_raw_sql
return self.raw_sql(query)
^^^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/Java/ibis/ibis/backends/pyspark/__init__.py", line 316, in raw_sql
query = self._session.sql(query)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/anaconda3/envs/ibis-dev-arm64/lib/python3.11/site-packages/pyspark/sql/session.py", line 1631, in sql
return DataFrame(self._jsparkSession.sql(sqlQuery, litArgs), self)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/anaconda3/envs/ibis-dev-arm64/lib/python3.11/site-packages/py4j/java_gateway.py", line 1322, in __call__
return_value = get_return_value(
^^^^^^^^^^^^^^^^^
File "/Users/ning.ln/anaconda3/envs/ibis-dev-arm64/lib/python3.11/site-packages/pyspark/errors/exceptions/captured.py", line 185, in deco
raise converted from None
pyspark.errors.exceptions.captured.AnalysisException: [FIELD_NOT_FOUND] No such struct field `f1` in `array1`, `array2`.; line 1 pos 7
Process finished with exit code 1
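The error makes sense once you look at how Spark names the struct fields produced by arrays_zip: they come from the input column names, not f1/f2. A small standalone illustration (not part of the original report; the data and column names are made up, and the printSchema output is abridged):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Two array columns named like the ones in the failing query.
df = spark.createDataFrame(
    [([1, 2], ["a", "b"])],
    "array1 array<int>, array2 array<string>",
)
df.selectExpr("arrays_zip(array1, array2) AS zipped").printSchema()
# root
#  |-- zipped: array
#  |    |-- element: struct
#  |    |    |-- array1: integer
#  |    |    |-- array2: string
#
# The element struct has fields `array1`/`array2`, which is why selecting
# `unnest`.`f1` fails and hand-editing the SQL to use the column names works.
```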
Code of Conduct
I agree to follow this project's Code of Conduct
… schema (#9052)
Fix PySpark zip implementation to ensure that its output matches the
schema expected by Ibis. Fixes #9049.
---------
Co-authored-by: Gil Forsyth <gforsyth@users.noreply.github.com>
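With that fix on main, one way to check the behavior is to recompile and re-execute the same expression. This is a sketch reusing the hypothetical `df` from the reproduction above, not part of the original report:

```python
# The compiled query should now agree with the schema that ARRAYS_ZIP actually
# produces, and execution should no longer raise FIELD_NOT_FOUND.
print(ibis.to_sql(df))   # inspect the generated Spark SQL
print(df.to_pyarrow())   # executes on the PySpark backend
```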