ERROR: test_with_key_complex (pyspark.sql.tests.test_pandas_cogrouped_map.CogroupedMapInPandasTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/spark/python/pyspark/sql/tests/test_pandas_cogrouped_map.py", line 160, in test_with_key_complex
    result = self.data1 \
  File "/spark/python/pyspark/sql/pandas/conversion.py", line 168, in toPandas
    pandas_type = PandasConversionMixin._to_corrected_pandas_type(field.dataType)
  File "/spark/python/pyspark/sql/pandas/conversion.py", line 238, in _to_corrected_pandas_type
    return np.bool
  File "/opt/conda/envs/arrow/lib/python3.8/site-packages/numpy/__init__.py", line 284, in __getattr__
    raise AttributeError("module {!r} has no attribute "
AttributeError: module 'numpy' has no attribute 'bool'
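For context, NumPy 1.20 deprecated the plain-type aliases such as `np.bool` (which simply pointed at the builtin `bool`), and NumPy 1.24 removed them, so `return np.bool` now raises the `AttributeError` above. A minimal sketch of the forward-compatible spelling (illustrative only, not PySpark's actual patch):

```python
import numpy as np

# `np.bool` was removed in NumPy 1.24; the builtin `bool` (what the alias
# pointed to) or NumPy's scalar type `np.bool_` work across versions.
def corrected_pandas_bool_type():
    # Illustrative stand-in for the branch of PySpark's
    # `_to_corrected_pandas_type` that used to `return np.bool`.
    return np.bool_

print(np.dtype(corrected_pandas_bool_type()))  # prints: bool
```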
Component(s)
Continuous Integration, Python
…buteError on numpy.bool (#33714)
### Rationale for this change
Fixes the nightly integration test failure with PySpark 3.2.0.
### What changes are included in this PR?
A NumPy version pin in `docker-compose.yml`.
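The pinned version itself is not quoted in this issue; as a hedged illustration, PySpark 3.2.0 predates the `np.bool` removal, so the integration environment needs a NumPy release older than 1.24. A hypothetical version check along those lines:

```python
# Illustrative sketch only (assumption: the PR's actual fix is a version
# pin in docker-compose.yml, whose exact value is not quoted here).
# NumPy 1.24 removed np.bool, so PySpark 3.2.0 needs an older release.
def numpy_ok_for_pyspark_320(numpy_version: str) -> bool:
    """True if this NumPy release predates the 1.24 removal of np.bool."""
    major, minor = (int(part) for part in numpy_version.split(".")[:2])
    return (major, minor) < (1, 24)

print(numpy_ok_for_pyspark_320("1.23.5"))  # prints: True
print(numpy_ok_for_pyspark_320("1.24.0"))  # prints: False
```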
### Are these changes tested?
These changes will be tested by CI on the open PR.
### Are there any user-facing changes?
No.
* Closes: #33697
Lead-authored-by: Alenka Frim <frim.alenka@gmail.com>
Co-authored-by: Alenka Frim <AlenkaF@users.noreply.github.com>
Co-authored-by: Sutou Kouhei <kou@cozmixng.org>
Signed-off-by: Raúl Cumplido <raulcumplido@gmail.com>
Describe the bug, including details regarding any error messages, version, and platform.
Nightly integration tests with PySpark 3.2.0 are failing in the `test-conda-python-3.8-spark-v3.2.0` job with the error shown above.