Squashed commit of the following:
commit 11a0d93
Author: Tom Augspurger <tom.w.augspurger@gmail.com>
Date:   Wed Oct 3 14:26:34 2018 -0500

    typerror

commit a0cd5e7
Author: Tom Augspurger <tom.w.augspurger@gmail.com>
Date:   Wed Oct 3 14:25:38 2018 -0500

    TypeError for Series

commit 2247461
Author: Tom Augspurger <tom.w.augspurger@gmail.com>
Date:   Wed Oct 3 13:29:29 2018 -0500

    Test op(Series[EA], EA])

commit c9fe5d3
Author: Tom Augspurger <tom.w.augspurger@gmail.com>
Date:   Wed Oct 3 13:21:33 2018 -0500

    make strict

commit 7ef697c
Author: Tom Augspurger <tom.w.augspurger@gmail.com>
Date:   Wed Oct 3 13:14:52 2018 -0500

    Use super

commit 35d4213
Merge: 0671e7d ee80803
Author: Tom Augspurger <tom.w.augspurger@gmail.com>
Date:   Wed Oct 3 13:11:05 2018 -0500

    Merge remote-tracking branch 'upstream/master' into ea-divmod

commit ee80803
Author: Matthew Roeschke <emailformattr@gmail.com>
Date:   Wed Oct 3 08:25:44 2018 -0700

     BUG: Correctly weekly resample over DST (pandas-dev#22941)

    * test resample fix

    * move the localization until needed

    * BUG: Correctly weekly resample over DST

    * Move whatsnew to new section

commit fea27f0
Author: Tom Augspurger <TomAugspurger@users.noreply.github.com>
Date:   Wed Oct 3 08:49:44 2018 -0500

    CI: pin moto to 1.3.4 (pandas-dev#22959)

commit 15d32bb
Author: jbrockmendel <jbrockmendel@gmail.com>
Date:   Wed Oct 3 04:32:35 2018 -0700

    [CLN] Dispatch (some) Frame ops to Series, avoiding _data.eval (pandas-dev#22019)

    * avoid casting to object dtype in mixed-type frames

    * Dispatch to Series ops in _combine_match_columns

    * comment

    * docstring

    * flake8 fixup

    * dont bother with try_cast_result

    * revert non-central change

    * simplify

    * revert try_cast_results

    * revert non-central changes

    * Fixup typo syntaxerror

    * simplify assertion

    * use dispatch_to_series in combine_match_columns

    * Pass unwrapped op where appropriate

    * catch correct error

    * whatsnew note

    * comment

    * whatsnew section

    * remove unnecessary tester

    * doc fixup

commit 3e3256b
Author: alimcmaster1 <alimcmaster1@gmail.com>
Date:   Wed Oct 3 12:23:22 2018 +0100

    Allow passing a mask to NanOps (pandas-dev#22865)

commit e756e99
Author: jbrockmendel <jbrockmendel@gmail.com>
Date:   Wed Oct 3 02:19:27 2018 -0700

    CLN: Use is_period_dtype instead of ABCPeriodIndex checks (pandas-dev#22958)

commit 03181f0
Author: Wenhuan <lixx0880@gmail.com>
Date:   Wed Oct 3 15:28:07 2018 +0800

    BUG: fix Series(extension array) + extension array values addition (pandas-dev#22479)

commit 04ea51d
Author: Joris Van den Bossche <jorisvandenbossche@gmail.com>
Date:   Wed Oct 3 09:24:36 2018 +0200

    CLN: small clean-up of IntervalIndex (pandas-dev#22956)

commit b0f9a10
Author: Tony Tao <34781056+tonytao2012@users.noreply.github.com>
Date:   Tue Oct 2 19:01:08 2018 -0500

    DOC GH22893 Fix docstring of groupby in pandas/core/generic.py (pandas-dev#22920)

commit 08ecba8
Author: jbrockmendel <jbrockmendel@gmail.com>
Date:   Tue Oct 2 14:22:53 2018 -0700

    BUG: fix DataFrame+DataFrame op with timedelta64 dtype (pandas-dev#22696)

commit c44bad2
Author: Pamela Wu <pambot@users.noreply.github.com>
Date:   Tue Oct 2 17:16:25 2018 -0400

    CLN GH22873 Replace base excepts in pandas/core (pandas-dev#22901)

commit 8e749a3
Author: Pamela Wu <pambot@users.noreply.github.com>
Date:   Tue Oct 2 17:14:48 2018 -0400

    CLN GH22874 replace bare excepts in pandas/io/pytables.py (pandas-dev#22919)

commit 1102a33
Author: Joris Van den Bossche <jorisvandenbossche@gmail.com>
Date:   Tue Oct 2 22:31:36 2018 +0200

    DOC/CLN: clean-up shared_docs in generic.py (pandas-dev#20074)

commit 9caf048
Author: Tom Augspurger <TomAugspurger@users.noreply.github.com>
Date:   Tue Oct 2 13:25:22 2018 -0500

    CI: change windows vm image (pandas-dev#22948)

commit 0671e7d
Author: Tom Augspurger <tom.w.augspurger@gmail.com>
Date:   Tue Oct 2 11:10:42 2018 -0500

    Fixup

commit 1b4261f
Merge: c92a4a8 1d9f76c
Author: Tom Augspurger <tom.w.augspurger@gmail.com>
Date:   Tue Oct 2 10:58:43 2018 -0500

    Merge remote-tracking branch 'upstream/master' into ea-divmod

commit 1d9f76c
Author: Joris Van den Bossche <jorisvandenbossche@gmail.com>
Date:   Tue Oct 2 17:11:11 2018 +0200

    CLN: remove Index._to_embed (pandas-dev#22879)

    * CLN: remove Index._to_embed

    * pep8

commit 6247da0
Author: Tom Augspurger <TomAugspurger@users.noreply.github.com>
Date:   Tue Oct 2 08:50:41 2018 -0500

    Provide default implementation for `data_repated` (pandas-dev#22935)

commit c92a4a8
Author: Tom Augspurger <tom.w.augspurger@gmail.com>
Date:   Mon Oct 1 16:56:15 2018 -0500

    Update old test

commit 52538fa
Author: Tom Augspurger <tom.w.augspurger@gmail.com>
Date:   Mon Oct 1 16:51:48 2018 -0500

    BUG: divmod return type

commit 5ce06b5
Author: Matthew Roeschke <emailformattr@gmail.com>
Date:   Mon Oct 1 14:22:20 2018 -0700

     BUG: to_datetime preserves name of Index argument in the result (pandas-dev#22918)

    * BUG: to_datetime preserves name of Index argument in the result

    * correct test
TomAugspurger committed Oct 3, 2018
1 parent 012be1c commit ff7c06c
Showing 41 changed files with 849 additions and 319 deletions.
2 changes: 1 addition & 1 deletion ci/travis-27.yaml
@@ -44,7 +44,7 @@ dependencies:
# universal
- pytest
- pytest-xdist
- moto
- moto==1.3.4
- hypothesis>=3.58.0
- pip:
- backports.lzma
15 changes: 12 additions & 3 deletions doc/source/extending.rst
@@ -160,9 +160,18 @@ your ``MyExtensionArray`` class, as follows:
MyExtensionArray._add_arithmetic_ops()
MyExtensionArray._add_comparison_ops()
Note that since ``pandas`` automatically calls the underlying operator on each
element one-by-one, this might not be as performant as implementing your own
version of the associated operators directly on the ``ExtensionArray``.
.. note::

Since ``pandas`` automatically calls the underlying operator on each
element one-by-one, this might not be as performant as implementing your own
version of the associated operators directly on the ``ExtensionArray``.

This implementation will try to reconstruct a new ``ExtensionArray`` with the
result of the element-wise operation. Whether or not that succeeds depends on
whether the operation returns a result that's valid for the ``ExtensionArray``.
If an ``ExtensionArray`` cannot be reconstructed, a list containing the scalars
is returned instead.

.. _extending.extension.testing:

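The note added above describes the default behavior that ``_add_arithmetic_ops()`` and ``_add_comparison_ops()`` install: apply the operator scalar by scalar, then try to rebuild an ``ExtensionArray`` from the results, falling back to a list of scalars. A minimal sketch of that fallback logic in plain Python (not the actual pandas internals; ``int_only`` is a made-up stand-in for ``_from_sequence``):

    import operator

    def elementwise_op(left, right, op, reconstruct):
        # Apply ``op`` element by element, then try to rebuild an array-like.
        # ``reconstruct`` plays the role of ``type(self)._from_sequence``.
        results = [op(a, b) for a, b in zip(left, right)]
        try:
            return reconstruct(results)   # result is valid for the ExtensionArray
        except TypeError:
            return results                # otherwise the list of scalars is returned

    def int_only(values):
        # Toy "array" constructor that only accepts integers.
        if not all(isinstance(v, int) for v in values):
            raise TypeError("not representable")
        return list(values)

    print(elementwise_op([1, 2], [3, 4], operator.add, int_only))      # [4, 6]
    print(elementwise_op([1, 2], [3, 4], operator.truediv, int_only))  # [0.333..., 0.5]
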
33 changes: 31 additions & 2 deletions doc/source/whatsnew/v0.24.0.txt
@@ -579,6 +579,35 @@ Current Behavior:
...
OverflowError: Trying to coerce negative values to unsigned integers

.. _whatsnew_0240.api.crosstab_dtypes:

Crosstab Preserves Dtypes
^^^^^^^^^^^^^^^^^^^^^^^^^

:func:`crosstab` will now preserve dtypes in some cases that previously would
cast from integer dtype to floating dtype (:issue:`22019`)

Previous Behavior:

.. code-block:: ipython

In [3]: df = pd.DataFrame({'a': [1, 2, 2, 2, 2], 'b': [3, 3, 4, 4, 4],
...: 'c': [1, 1, np.nan, 1, 1]})
In [4]: pd.crosstab(df.a, df.b, normalize='columns')
Out[4]:
b 3 4
a
1 0.5 0.0
2 0.5 1.0

Current Behavior:

.. code-block:: ipython

In [3]: df = pd.DataFrame({'a': [1, 2, 2, 2, 2], 'b': [3, 3, 4, 4, 4],
...: 'c': [1, 1, np.nan, 1, 1]})
In [4]: pd.crosstab(df.a, df.b, normalize='columns')

Datetimelike API Changes
^^^^^^^^^^^^^^^^^^^^^^^^

@@ -713,7 +742,7 @@ Timedelta
- Bug in :class:`Index` with numeric dtype when multiplying or dividing an array with dtype ``timedelta64`` (:issue:`22390`)
- Bug in :class:`TimedeltaIndex` incorrectly allowing indexing with ``Timestamp`` object (:issue:`20464`)
- Fixed bug where subtracting :class:`Timedelta` from an object-dtyped array would raise ``TypeError`` (:issue:`21980`)
-
- Fixed bug in adding a :class:`DataFrame` with all-``timedelta64[ns]`` dtypes to a :class:`DataFrame` with all-integer dtypes returning incorrect results instead of raising ``TypeError`` (:issue:`22696`)
-

Timezones
@@ -841,6 +870,7 @@ Groupby/Resample/Rolling
- Bug in :meth:`Resampler.asfreq` when frequency of ``TimedeltaIndex`` is a subperiod of a new frequency (:issue:`13022`).
- Bug in :meth:`SeriesGroupBy.mean` when values were integral but could not fit inside of int64, overflowing instead. (:issue:`22487`)
- :func:`RollingGroupby.agg` and :func:`ExpandingGroupby.agg` now support multiple aggregation functions as parameters (:issue:`15072`)
- Bug in :meth:`DataFrame.resample` and :meth:`Series.resample` when resampling by a weekly offset (``'W'``) across a DST transition (:issue:`9119`, :issue:`21459`)

Sparse
^^^^^^
@@ -881,4 +911,3 @@ Other
- :meth:`DataFrame.nlargest` and :meth:`DataFrame.nsmallest` now returns the correct n values when keep != 'all' also when tied on the first columns (:issue:`22752`)
- :meth:`~pandas.io.formats.style.Styler.bar` now also supports tablewise application (in addition to rowwise and columnwise) with ``axis=None`` and setting clipping range with ``vmin`` and ``vmax`` (:issue:`21548` and :issue:`21526`). ``NaN`` values are also handled properly.
- Logical operations ``&, |, ^`` between :class:`Series` and :class:`Index` will no longer raise ``ValueError`` (:issue:`22092`)
-
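
The GH22696 entry above (adding an all-``timedelta64[ns]`` DataFrame to an all-integer DataFrame) can be made concrete with a small sketch; the data is made up and only illustrates the situation the entry describes:

    import pandas as pd

    tdf = pd.DataFrame({'a': pd.to_timedelta([1, 2, 3], unit='D')})  # timedelta64[ns]
    idf = pd.DataFrame({'a': [1, 2, 3]})                             # int64

    # Per the whatsnew entry, this mixed addition now raises TypeError instead of
    # silently returning incorrect results.
    try:
        tdf + idf
    except TypeError as err:
        print('raised:', err)
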
16 changes: 12 additions & 4 deletions pandas/core/arrays/base.py
@@ -775,10 +775,18 @@ def convert_values(param):
res = [op(a, b) for (a, b) in zip(lvalues, rvalues)]

if coerce_to_dtype:
try:
res = self._from_sequence(res)
except TypeError:
pass
if op.__name__ in {'divmod', 'rdivmod'}:
try:
a, b = zip(*res)
res = (self._from_sequence(a),
self._from_sequence(b))
except TypeError:
pass
else:
try:
res = self._from_sequence(res)
except TypeError:
pass

return res

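The branch added above special-cases ``divmod`` and ``rdivmod`` because the element-wise loop produces a list of ``(quotient, remainder)`` pairs rather than a flat list of scalars, so the pairs have to be unzipped before each half can go through ``_from_sequence``. A small plain-Python sketch of that step (not pandas internals):

    lvalues = [7, 8, 9]
    rvalues = [2, 3, 4]

    res = [divmod(a, b) for a, b in zip(lvalues, rvalues)]
    # res == [(3, 1), (2, 2), (2, 1)]

    quotients, remainders = zip(*res)   # split the pairs into two sequences
    # quotients == (3, 2, 2); remainders == (1, 2, 1)
    # Each half can then be rebuilt separately, so divmod on an ExtensionArray
    # returns a tuple of two arrays rather than one array of 2-tuples.
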
7 changes: 1 addition & 6 deletions pandas/core/arrays/interval.py
@@ -108,12 +108,7 @@ class IntervalArray(IntervalMixin, ExtensionArray):
_na_value = _fill_value = np.nan

def __new__(cls, data, closed=None, dtype=None, copy=False,
fastpath=False, verify_integrity=True):

if fastpath:
return cls._simple_new(data.left, data.right, closed,
copy=copy, dtype=dtype,
verify_integrity=False)
verify_integrity=True):

if isinstance(data, ABCSeries) and is_interval_dtype(data):
data = data.values
2 changes: 1 addition & 1 deletion pandas/core/arrays/period.py
@@ -539,7 +539,7 @@ def asfreq(self, freq=None, how='E'):
if self.hasnans:
new_data[self._isnan] = iNaT

return self._simple_new(new_data, freq=freq)
return self._shallow_copy(new_data, freq=freq)

# ------------------------------------------------------------------
# Arithmetic Methods
2 changes: 1 addition & 1 deletion pandas/core/computation/pytables.py
@@ -411,7 +411,7 @@ def visit_Subscript(self, node, **kwargs):
slobj = self.visit(node.slice)
try:
value = value.value
except:
except AttributeError:
pass

try:
2 changes: 1 addition & 1 deletion pandas/core/dtypes/common.py
@@ -468,7 +468,7 @@ def is_timedelta64_dtype(arr_or_dtype):
return False
try:
tipo = _get_dtype_type(arr_or_dtype)
except:
except (TypeError, ValueError, SyntaxError):
return False
return issubclass(tipo, np.timedelta64)

8 changes: 4 additions & 4 deletions pandas/core/dtypes/dtypes.py
@@ -360,11 +360,11 @@ def construct_from_string(cls, string):
try:
if string == 'category':
return cls()
except:
else:
raise TypeError("cannot construct a CategoricalDtype")
except AttributeError:
pass

raise TypeError("cannot construct a CategoricalDtype")

@staticmethod
def validate_ordered(ordered):
"""
@@ -514,7 +514,7 @@ def __new__(cls, unit=None, tz=None):
if m is not None:
unit = m.groupdict()['unit']
tz = m.groupdict()['tz']
except:
except TypeError:
raise ValueError("could not construct DatetimeTZDtype")

elif isinstance(unit, compat.string_types):
22 changes: 10 additions & 12 deletions pandas/core/frame.py
@@ -3260,7 +3260,7 @@ def _ensure_valid_index(self, value):
if not len(self.index) and is_list_like(value):
try:
value = Series(value)
except:
except (ValueError, NotImplementedError, TypeError):
raise ValueError('Cannot set a frame with no defined index '
'and a value that cannot be converted to a '
'Series')
@@ -3629,7 +3629,8 @@ def align(self, other, join='outer', axis=None, level=None, copy=True,
fill_axis=fill_axis,
broadcast_axis=broadcast_axis)

@Appender(_shared_docs['reindex'] % _shared_doc_kwargs)
@Substitution(**_shared_doc_kwargs)
@Appender(NDFrame.reindex.__doc__)
@rewrite_axis_style_signature('labels', [('method', None),
('copy', True),
('level', None),
@@ -4479,7 +4480,8 @@ def f(vals):
# ----------------------------------------------------------------------
# Sorting

@Appender(_shared_docs['sort_values'] % _shared_doc_kwargs)
@Substitution(**_shared_doc_kwargs)
@Appender(NDFrame.sort_values.__doc__)
def sort_values(self, by, axis=0, ascending=True, inplace=False,
kind='quicksort', na_position='last'):
inplace = validate_bool_kwarg(inplace, 'inplace')
@@ -4521,7 +4523,8 @@ def sort_values(self, by, axis=0, ascending=True, inplace=False,
else:
return self._constructor(new_data).__finalize__(self)

@Appender(_shared_docs['sort_index'] % _shared_doc_kwargs)
@Substitution(**_shared_doc_kwargs)
@Appender(NDFrame.sort_index.__doc__)
def sort_index(self, axis=0, level=None, ascending=True, inplace=False,
kind='quicksort', na_position='last', sort_remaining=True,
by=None):
@@ -4886,7 +4889,7 @@ def _arith_op(left, right):
left, right = ops.fill_binop(left, right, fill_value)
return func(left, right)

if this._is_mixed_type or other._is_mixed_type:
if ops.should_series_dispatch(this, other, func):
# iterate over columns
return ops.dispatch_to_series(this, other, _arith_op)
else:
@@ -4896,7 +4899,6 @@ def _arith_op(left, right):
copy=False)

def _combine_match_index(self, other, func, level=None):
assert isinstance(other, Series)
left, right = self.align(other, join='outer', axis=0, level=level,
copy=False)
assert left.index.equals(right.index)
@@ -4916,11 +4918,7 @@ def _combine_match_columns(self, other, func, level=None, try_cast=True):
left, right = self.align(other, join='outer', axis=1, level=level,
copy=False)
assert left.columns.equals(right.index)

new_data = left._data.eval(func=func, other=right,
axes=[left.columns, self.index],
try_cast=try_cast)
return self._constructor(new_data)
return ops.dispatch_to_series(left, right, func, axis="columns")

def _combine_const(self, other, func, errors='raise', try_cast=True):
if lib.is_scalar(other) or np.ndim(other) == 0:
@@ -7747,7 +7745,7 @@ def convert(v):
values = np.array([convert(v) for v in values])
else:
values = convert(values)
except:
except (ValueError, TypeError):
values = convert(values)

else:
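
The ``_combine_match_columns`` change above drops the block-level ``_data.eval`` path in favor of ``ops.dispatch_to_series(left, right, func, axis="columns")``: the operation is applied column by column as Series ops and the pieces are reassembled into a DataFrame. A rough stand-in for that idea (illustrative only, not the actual pandas helper):

    import pandas as pd

    def dispatch_columnwise(left, right, op):
        # Assumes left.columns already equals right.index, as asserted in the diff;
        # each column of ``left`` is combined with the matching entry of ``right``.
        pieces = {col: op(left[col], right[col]) for col in left.columns}
        return pd.DataFrame(pieces, index=left.index, columns=left.columns)

    left = pd.DataFrame({'x': [1, 2], 'y': [3, 4]})
    right = pd.Series({'x': 10, 'y': 100})
    print(dispatch_columnwise(left, right, lambda a, b: a + b))
    #     x    y
    # 0  11  103
    # 1  12  104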