```
=================================== FAILURES ===================================
_________________________ test_invalid_non_join_column _________________________

    def test_invalid_non_join_column():
        NUM_ITEMS = 30
        t1 = pa.Table.from_pydict({
            'id': range(NUM_ITEMS),
            'array_column': [[z for z in range(3)] for x in range(NUM_ITEMS)],
        })
        t2 = pa.Table.from_pydict({
            'id': range(NUM_ITEMS),
            'value': [x for x in range(NUM_ITEMS)]
        })
        # check as left table
        with pytest.raises(pa.lib.ArrowInvalid) as excinfo:
>           t1.join(t2, 'id', join_type='inner')

pyarrow/tests/test_table.py:2440:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
pyarrow/table.pxi:4738: in pyarrow.lib.Table.join
    return _pac()._perform_join(
pyarrow/lib.pyx:137: in pyarrow.lib._pac
    import pyarrow.acero as pac
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    # Licensed to the Apache Software Foundation (ASF) under one
    # or more contributor license agreements.  See the NOTICE file
    # distributed with this work for additional information
    # regarding copyright ownership.  The ASF licenses this file
    # to you under the Apache License, Version 2.0 (the
    # "License"); you may not use this file except in compliance
    # with the License.  You may obtain a copy of the License at
    #
    #   http://www.apache.org/licenses/LICENSE-2.0
    #
    # Unless required by applicable law or agreed to in writing,
    # software distributed under the License is distributed on an
    # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
    # KIND, either express or implied.  See the License for the
    # specific language governing permissions and limitations
    # under the License.
    # ---------------------------------------------------------------------
    # Implement Internal ExecPlan bindings
    # cython: profile=False
    # distutils: language = c++
    # cython: language_level = 3
    from pyarrow.lib import Table
    from pyarrow.compute import Expression, field
    try:
        from pyarrow._acero import (  # noqa
            Declaration,
            ExecNodeOptions,
            TableSourceNodeOptions,
            FilterNodeOptions,
            ProjectNodeOptions,
            AggregateNodeOptions,
            OrderByNodeOptions,
            HashJoinNodeOptions,
        )
    except ImportError as exc:
>       raise ImportError(
            f"The pyarrow installation is not built with support for 'acero' ({str(exc)})"
        ) from None
E       ImportError: The pyarrow installation is not built with support for 'acero' (No module named 'pyarrow._acero')

pyarrow/acero.py:40: ImportError
```
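The root cause visible in the traceback is that `pyarrow.acero` re-raises an `ImportError` when the optional `pyarrow._acero` extension module was not built. As a minimal, stdlib-only sketch (the helper name is illustrative, not part of pyarrow), the usual way to probe for such an optional module without importing it is `importlib.util.find_spec`:

```python
import importlib.util


def extension_available(name: str) -> bool:
    """Return True if the named top-level module can be located."""
    return importlib.util.find_spec(name) is not None


# On a pyarrow build without Acero, a check like
# extension_available-style probing of the missing extension would
# come back False, and tests depending on it should be skipped.
print(extension_available("json"))                   # stdlib module: True
print(extension_available("not_a_real_module_xyz"))  # missing: False
```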
### Rationale for this change
`test_invalid_non_join_column` depends on Acero.
### What changes are included in this PR?
Add `@pytest.mark.acero`.
### Are these changes tested?
Yes.
### Are there any user-facing changes?
No.
* Closes: #36680
Authored-by: Sutou Kouhei <kou@clear-code.com>
Signed-off-by: Raúl Cumplido <raulcumplido@gmail.com>
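A sketch of the fix described above: applying the `acero` marker to the failing test so pytest can skip or deselect it on builds without Acero (the test body is elided here; the wiring that turns the marker into a skip condition lives in pyarrow's test configuration):

```python
import pytest


# Mark the test as requiring Acero; on pyarrow builds without the
# _acero extension, marker-based selection/skipping keeps it from
# failing with ImportError.
@pytest.mark.acero
def test_invalid_non_join_column():
    ...  # body unchanged; it exercises Table.join, which needs Acero
```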
### Describe the bug, including details regarding any error messages, version, and platform.
example-python-minimal-fedora-conda: https://github.com/ursacomputing/crossbow/actions/runs/5549202130/jobs/10133027759#step:3:7841
### Component(s)
Continuous Integration, Python