BUG: AttributeError: type object 'object' has no attribute 'dtype' with numpy 1.20.x and pandas versions 1.0.4 and earlier #39520
Comments
It started happening to me as well, using Python 3.8.7 and pandas 1.0.2, in code that used to work.
FWIW it seems like upgrading Pandas (in my case to …)
Found the same issue this morning due to some CI tests failing. Python==3.8.2 and pandas==1.0.1. CI tests ran again successfully after bumping to pandas==1.2.1.
pls show_versions as instructed; this is most likely because of the numpy 1.20 release
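In case it helps, a minimal way to produce that report (nothing version-specific here; it just prints the environment details):

```python
import pandas as pd

# Prints the installed pandas/numpy/Python versions and platform details;
# paste this output into the issue so the maintainers can reproduce.
pd.show_versions()
```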
I got the same errors in my environment. As @jreback pointed out, it is the combination of an older version of pandas (at least 1.0.1) with numpy 1.20; the snippet fails on pandas==1.0.1 and works after upgrading:

```
$ pip install -U pandas==1.0.1
$ pip install -U pandas==1.2.0
>>> import pandas as pd
>>> pd.DataFrame(columns=['a'])
```
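For comparison, the behaviour I would expect on a fixed combination (an assumption on my part: pandas >= 1.0.5 or a recent 1.2.x together with numpy 1.20):

```python
import pandas as pd

# On a fixed pandas/numpy combination this prints an empty frame instead of
# raising AttributeError.
df = pd.DataFrame(columns=['a'])
print(df)
# Empty DataFrame
# Columns: [a]
# Index: []
```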
The following also blows up our code:

```
>>> import pandas as pd
>>> f = lambda x: x
>>> pd.DataFrame.from_records(map(f, []), columns=['a'])
...
AttributeError: type object 'object' has no attribute 'dtype'
>>> pd.DataFrame.from_records(list(map(f, [])), columns=['a'])
Empty DataFrame
Columns: [a]
Index: []
```
Yes, I tried `pip install -U` …
we'd actually like to see if we can patch this
What is the root cause of the problem? If numpy broke backward compatibility unintentionally, it should be fixed in …
the numpy break is intentional (though I am unsure why this didn't come up in our prior testing), as dtype definitions are being updated
Hmm, I think I saw this recently on a NumPy issue. I admit the break is annoying, but trying to swing in a direction that doesn't use a metaclass for dtype now is not something I can do easily or lightly (it would probably be a huge change in direction). I guess we got into a position where pandas fixed this before I actually made the change in NumPy.
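A minimal sketch of the change being described, assuming numpy >= 1.20 is installed; the `isinstance` check is the same one that appears in the traceback at the end of this thread:

```python
import numpy as np

# With numpy >= 1.20, np.dtype has a dedicated metaclass, so type(np.dtype) is
# no longer the plain builtin `type`.
print(type(np.dtype))

# Older pandas passed the bare builtin `object` through this check; it used to
# be True (isinstance(object, type)), but is now False, so pandas falls through
# to `object.dtype`, which raises AttributeError.
print(isinstance(object, (np.dtype, type(np.dtype))))
```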
FWIW, if you are stuck on …
I am not too happy that you have to pin NumPy, but I guess having an upstream package almost a year newer than the downstream package can be problematic more generally (if there had been a proper …). If it helps, in this case I suspect the incredibly minimal fix will be to just replace …
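A hypothetical illustration of the kind of minimal replacement being hinted at (my guess only, not the actual patch): coerce the bare builtin `object` through `np.dtype()` so the value is a real dtype instance.

```python
import numpy as np

# The bare builtin type has no `.dtype` attribute:
# object.dtype  # AttributeError: type object 'object' has no attribute 'dtype'

# A proper numpy dtype instance behaves as expected (hypothetical replacement):
nan_dtype = np.dtype(object)            # instead of the bare builtin `object`
print(nan_dtype, nan_dtype.kind)        # prints: object O
print(isinstance(nan_dtype, np.dtype))  # True, so the isinstance check passes
```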
I have tried multiple different versions of pandas and numpy to no avail; I still get this bug. If I understood your post correctly, pandas 1.0.5 with numpy 1.19.5 (the earliest version of numpy I can see is 1, so I assume there is a typo) should fix it, and it does not. I tried 1.0.5 with the latest version of numpy and it still fails. I have tried several other combinations and it still persists. Please suggest how I can fix this bug. For reference I am using:
@PaulBremner can you post a copy/paste-able example? I can't reproduce the error using any of the examples in this thread.
Here is a simplified example that generates the error. I am using Python 3.9, pandas 1.2.4, numpy 1.19.5 (though any combination of numpy and pandas versions I tried gives the same result).
You want …
No, I definitely want dtype. When I try to run an ANOVA on my data it throws the error message mentioned by the OP (AttributeError: type object 'object' has no attribute 'dtype'). My example code just uses the print statement to cause the error message to occur; I don't actually want to print anything.
Thanks for clarifying. When I copy/paste the snippet I get …
Is there a workaround for this bug? As far as I can see I have tried the solutions suggested above of rolling back numpy. I have tested: pandas==0.24.2 (Python 3.6). Nothing seems to work; I still get the attribute error.
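When nothing seems to help, one generic sanity check (not specific to this report) is to confirm which versions the failing interpreter actually imports, since a stale environment or kernel is a common culprit:

```python
import numpy as np
import pandas as pd

# If these differ from the versions you think you installed, the failing code
# is running in a different environment/kernel than the one you pinned.
print("pandas:", pd.__version__)
print("numpy: ", np.__version__)
```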
`.dtype` is NOT a property on a DataFrame but on a Series.
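A minimal sketch of that distinction (the toy frame is only for illustration):

```python
import pandas as pd

df = pd.DataFrame({"a": [1, 2], "b": ["x", "y"]})

print(df.dtypes)      # a DataFrame exposes .dtypes: one dtype per column
print(df["a"].dtype)  # a single column (a Series) exposes .dtype
# df.dtype            # AttributeError: 'DataFrame' object has no attribute 'dtype'
```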
I did not know that. Though I am still none the wiser about how to get my code to work. |
If you didn't know that, then we should revisit the idea that you should use …
Isn't it the case that the Pandas dev team simply needs to backport this commit to the 0.24.x branch (or any version older than 1.0)? I did it locally and it seems to work. I'm surprised that this important fix wasn't mentioned in https://pandas.pydata.org/docs/whatsnew/v1.0.5.html
Yes, probably that is all that is needed. Note that pandas fixed it before NumPy made the upstream change that turned the bug from a small issue into a serious one. My guess is that 0.24.x was end-of-life before the change in NumPy even happened.
Many end-users are stuck on pandas 0.24.x because of msgpack. It might be worthwhile to cut one patch release with this backport. |
closing - we do not backport to older branches that are this old |
```
root@548977c7dc-62l72:/app# pip list | grep pandas
pandas 1.0.3
```

In IPython, I try initializing an empty DataFrame:

```
In [1]: import pandas as pd

In [2]: pd.DataFrame([],columns=['a','b','c'])
AttributeError                            Traceback (most recent call last)
<ipython-input-2-...> in <module>
----> 1 pd.DataFrame([],columns=['clicks', 'uclicks', 'impressions'])

/usr/local/lib/python3.8/site-packages/pandas/core/frame.py in __init__(self, data, index, columns, dtype, copy)
    488                 mgr = init_ndarray(data, index, columns, dtype=dtype, copy=copy)
    489             else:
--> 490                 mgr = init_dict({}, index, columns, dtype=dtype)
    491         else:
    492             try:

/usr/local/lib/python3.8/site-packages/pandas/core/internals/construction.py in init_dict(data, index, columns, dtype)
    237             else:
    238                 nan_dtype = dtype
--> 239             val = construct_1d_arraylike_from_scalar(np.nan, len(index), nan_dtype)
    240             arrays.loc[missing] = [val] * missing.sum()
    241

/usr/local/lib/python3.8/site-packages/pandas/core/dtypes/cast.py in construct_1d_arraylike_from_scalar(value, length, dtype)
   1438     else:
   1439         if not isinstance(dtype, (np.dtype, type(np.dtype))):
-> 1440             dtype = dtype.dtype
   1441
   1442         if length and is_integer_dtype(dtype) and isna(value):

AttributeError: type object 'object' has no attribute 'dtype'
```