BUG: Inconsistency between behavior of cumsum when applied directly, and when applied within groupby. #44009
Labels: API - Consistency (internal consistency of API/behavior), Bug, Groupby, Nested Data (data where the values are collections: lists, sets, dicts, objects, etc.)
I have checked that this issue has not already been reported.
I have confirmed this bug exists on the latest version of pandas.
I have confirmed this bug exists on the master branch of pandas.
Reproducible Example
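A minimal sketch of the behavior described below (the DataFrame, its column names, and values are illustrative, not taken from the original report):

```python
import pandas as pd

df = pd.DataFrame(
    {
        "key": ["a", "a", "b", "b"],
        "vals": [[1], [2], [3], [4]],  # object-dtype column of Python lists
    }
)

# Applied directly, cumsum() progressively concatenates the lists:
# [1], [1, 2], [1, 2, 3], [1, 2, 3, 4]
print(df["vals"].cumsum())

# Applied within groupby(), the same call raises:
# NotImplementedError: function is not implemented for this dtype:
# [how->cumsum,dtype->object]
print(df.groupby("key")["vals"].cumsum())
```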
Issue Description
There is an inconsistency between the behavior of cumsum() when applied directly and when applied within groupby().
When used directly on a DataFrame column that contains lists, cumsum() progressively concatenates the lists.
When used as part of groupby() on the same DataFrame, cumsum() raises "NotImplementedError: function is not implemented for this dtype: [how->cumsum,dtype->object]".
Expected Behavior
I'd expect cumsum() to progressively concatenate the lists across the rows within each group defined by groupby(), analogous to the way it accumulates numeric values.
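For illustration, the expected per-group result can be emulated today with a plain-Python cumulative concatenation (a workaround sketch reusing the illustrative DataFrame from above, not a proposed implementation):

```python
import itertools
import pandas as pd

df = pd.DataFrame({"key": ["a", "a", "b", "b"], "vals": [[1], [2], [3], [4]]})

# Expected result of df.groupby("key")["vals"].cumsum(): lists concatenated
# cumulatively within each group, mirroring what Series.cumsum() already does
# on the whole column.
for key, group in df.groupby("key"):
    print(key, list(itertools.accumulate(group["vals"])))
# a [[1], [1, 2]]
# b [[3], [3, 4]]
```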
Installed Versions
INSTALLED VERSIONS
commit : 73c6825
python : 3.8.2.final.0
python-bits : 64
OS : Linux
OS-release : 4.15.0-147-generic
Version : #151-Ubuntu SMP Fri Jun 18 19:21:19 UTC 2021
machine : x86_64
processor : x86_64
byteorder : little
LC_ALL : None
LANG : en_US.UTF-8
LOCALE : en_US.UTF-8
pandas : 1.3.3
numpy : 1.19.2
pytz : 2020.1
dateutil : 2.8.1
pip : 20.1.1
setuptools : 46.1.3
Cython : 0.29.21
pytest : 6.2.2
hypothesis : None
sphinx : None
blosc : None
feather : None
xlsxwriter : 1.2.9
lxml.etree : 4.5.2
html5lib : None
pymysql : None
psycopg2 : None
jinja2 : 2.11.2
IPython : 7.16.1
pandas_datareader: None
bs4 : 4.9.1
bottleneck : 1.3.2
fsspec : None
fastparquet : None
gcsfs : None
matplotlib : 3.2.2
numexpr : 2.7.1
odfpy : None
openpyxl : None
pandas_gbq : None
pyarrow : 0.16.0
pyxlsb : None
s3fs : None
scipy : 1.5.0
sqlalchemy : 1.3.18
tables : 3.6.1
tabulate : None
xarray : None
xlrd : 1.2.0
xlwt : None
numba : 0.53.1