Using a series full of `None` with `fillna` did not work as expected either:

```python
import pandas as pd
import numpy as np

s = pd.Series([1, 2, 3, np.nan, 5, 6], dtype=object)
all_nones = pd.Series([None] * len(s), index=s.index, dtype=object)
s.fillna(all_nones)  # Not as expected
```
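For anyone hitting the same problem, `Series.replace` with a dict is a commonly suggested alternative; a minimal sketch (whether this is the intended API for the conversion is an assumption, but it does produce real `None` values on an object-dtype series):

```python
import numpy as np
import pandas as pd

s = pd.Series([1, 2, 3, np.nan, 5, 6], dtype=object)

# replace() with a dict maps each NaN to an actual Python None,
# which fillna() does not do because it treats None as a missing fill value.
out = s.replace({np.nan: None})
print(out.iloc[3] is None)  # True
```

The non-missing entries are left untouched, so the result differs from the input only at the NaN positions.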
Pandas version checks
- [x] I have checked that this issue has not already been reported.
- [x] I have confirmed this bug exists on the latest version of pandas.
- [x] I have confirmed this bug exists on the main branch of pandas.
Reproducible Example
Issue Description
If I have a series of generic object dtype with missing values represented as NumPy NaN, I cannot use `fillna` to convert them to Python `None`. A related issue says this should have been fixed in 1.4, but it isn't: #45490
Expected Behavior
The code example should convert the NaN values to `None`.
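Until `fillna` handles this, the expected result can be produced manually with `numpy.where`; a sketch under the assumption that an object-dtype series holding Python `None` is the desired output:

```python
import numpy as np
import pandas as pd

s = pd.Series([1, 2, 3, np.nan, 5, 6], dtype=object)

# Build an object array where every missing slot becomes a real None,
# then wrap it back into a Series with the original index.
converted = pd.Series(np.where(s.isna(), None, s), index=s.index, dtype=object)
print(converted.iloc[3] is None)  # True
```

Because the array is constructed outside pandas' missing-value machinery, the `None` survives instead of being coerced back to NaN.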
Installed Versions
INSTALLED VERSIONS
commit : 4bfe3d0
python : 3.9.12.final.0
python-bits : 64
OS : Linux
OS-release : 5.18.0-2-amd64
Version : #1 SMP PREEMPT_DYNAMIC Debian 5.18.5-1 (2022-06-16)
machine : x86_64
processor :
byteorder : little
LC_ALL : None
LANG : en_US.UTF-8
LOCALE : en_US.UTF-8
pandas : 1.4.2
numpy : 1.22.3
pytz : 2022.1
dateutil : 2.8.2
pip : 21.2.4
setuptools : 61.2.0
Cython : 0.29.28
pytest : 7.1.2
hypothesis : None
sphinx : None
blosc : None
feather : None
xlsxwriter : None
lxml.etree : None
html5lib : None
pymysql : None
psycopg2 : None
jinja2 : None
IPython : 7.28.0
pandas_datareader: None
bs4 : None
bottleneck : None
brotli : None
fastparquet : None
fsspec : None
gcsfs : None
markupsafe : None
matplotlib : 3.5.2
numba : None
numexpr : 2.8.3
odfpy : None
openpyxl : None
pandas_gbq : None
pyarrow : None
pyreadstat : None
pyxlsb : None
s3fs : None
scipy : 1.7.3
snappy : None
sqlalchemy : None
tables : None
tabulate : None
xarray : None
xlrd : None
xlwt : None
zstandard : None