BUG: Concat multiple different ExtensionArray types #22997

Merged
merged 9 commits on Oct 18, 2018
1 change: 1 addition & 0 deletions doc/source/whatsnew/v0.24.0.txt
@@ -602,6 +602,7 @@ update the ``ExtensionDtype._metadata`` tuple to match the signature of your
- :meth:`Series.astype` and :meth:`DataFrame.astype` now dispatch to :meth:`ExtensionArray.astype` (:issue:`21185`).
- Slicing a single row of a ``DataFrame`` with multiple ExtensionArrays of the same type now preserves the dtype, rather than coercing to object (:issue:`22784`)
- Added :meth:`pandas.api.types.register_extension_dtype` to register an extension type with pandas (:issue:`22664`)
- Bug in concatenating a ``Series`` with two different extension dtypes not casting to object dtype (:issue:`22994`)
Review comment (Contributor):
> double back ticks

- Series backed by an ``ExtensionArray`` now work with :func:`util.hash_pandas_object` (:issue:`23066`)
- Updated the ``.type`` attribute for ``PeriodDtype``, ``DatetimeTZDtype``, and ``IntervalDtype`` to be instances of the dtype (``Period``, ``Timestamp``, and ``Interval`` respectively) (:issue:`22938`)
- :func:`ExtensionArray.isna` is allowed to return an ``ExtensionArray`` (:issue:`22325`).
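The first entry above can be sketched at the user level with two readily available extension dtypes (``Int64`` and ``category`` stand in here; the PR's own test uses an integer array and the decimal test array):

```python
import pandas as pd

# Two Series backed by different extension dtypes: concatenating them
# upcasts to object rather than raising or silently picking one dtype.
a = pd.Series([1, 2], dtype="Int64")
b = pd.Series(["x", "y"], dtype="category")
result = pd.concat([a, b], ignore_index=True)
assert result.dtype == object
assert list(result) == [1, 2, "x", "y"]
```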
5 changes: 0 additions & 5 deletions pandas/core/dtypes/concat.py
@@ -560,11 +560,6 @@ def _concat_sparse(to_concat, axis=0, typs=None):

fill_values = [x.fill_value for x in to_concat
if isinstance(x, SparseArray)]

if len(set(fill_values)) > 1:
    raise ValueError("Cannot concatenate SparseArrays with different "
                     "fill values")

Review comment (Contributor Author):
> I think this was leftover from an early implementation that didn't handle multiple fill values. And I think @jorisvandenbossche noticed this in his review, but I misunderstood.
>
> Regardless, we do allow concatenating Series[sparse] with different fill_values; we just change them all to have the first one's fill value (in SparseSeries._concat_same_type).

fill_value = fill_values[0]

# TODO: Fix join unit generation so we aren't passed this.
3 changes: 1 addition & 2 deletions pandas/core/internals/managers.py
@@ -1636,8 +1636,7 @@ def concat(self, to_concat, new_axis):
# check if all series are of the same block type:
if len(non_empties) > 0:
blocks = [obj.blocks[0] for obj in non_empties]

-            if all(type(b) is type(blocks[0]) for b in blocks[1:]):  # noqa
+            if len({b.dtype for b in blocks}) == 1:
new_block = blocks[0].concat_same_type(blocks)
else:
values = [x.values for x in blocks]
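A sketch of why the change above compares dtypes rather than block types: two different extension dtypes (``Int64`` and ``Int32`` here, as an assumed example) are held in the same block class, so the old `type(...)` check could wrongly take the `concat_same_type` fast path; comparing dtypes distinguishes them.

```python
import pandas as pd

# Same block class, different dtypes: the block-type check cannot tell
# these apart, but the dtype-set check can. (Modern pandas may find a
# common integer dtype for this pair rather than falling back to object.)
a = pd.Series([1, 2], dtype="Int64")
b = pd.Series([3, 4], dtype="Int32")
assert a.dtype != b.dtype  # dtype comparison catches the difference
result = pd.concat([a, b], ignore_index=True)
assert len(result) == 4
```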
13 changes: 13 additions & 0 deletions pandas/tests/reshape/test_concat.py
@@ -1,6 +1,7 @@
from warnings import catch_warnings, simplefilter
from itertools import combinations
from collections import deque
from decimal import Decimal

import datetime as dt
import dateutil
@@ -19,6 +20,7 @@
from pandas.util import testing as tm
from pandas.util.testing import (assert_frame_equal,
makeCustomDataframe as mkdf)
from pandas.tests.extension.decimal import to_decimal

import pytest

@@ -2361,6 +2363,17 @@ def test_concat_datetime_timezone(self):
index=idx1.append(idx1))
tm.assert_frame_equal(result, expected)

def test_concat_different_extension_dtypes_upcasts(self):
a = pd.Series(pd.core.arrays.integer_array([1, 2]))
b = pd.Series(to_decimal([1, 2]))

result = pd.concat([a, b], ignore_index=True)
expected = pd.Series([
1, 2,
Decimal(1), Decimal(2)
], dtype=object)
tm.assert_series_equal(result, expected)
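As a counterpart to the new test above: concatenating Series of the *same* extension dtype preserves that dtype instead of upcasting. A sketch using the nullable ``Int64`` dtype:

```python
import pandas as pd

# Same extension dtype on both sides: the concat_same_type fast path
# keeps the dtype (including missing values) instead of falling back
# to object.
a = pd.Series([1, 2], dtype="Int64")
b = pd.Series([3, None], dtype="Int64")
result = pd.concat([a, b], ignore_index=True)
assert result.dtype == "Int64"
assert result.isna().tolist() == [False, False, False, True]
```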


@pytest.mark.parametrize('pdt', [pd.Series, pd.DataFrame, pd.Panel])
@pytest.mark.parametrize('dt', np.sctypes['float'])