Yep, the issue is in `core.arrays.datetimes._sub_datelike_dti`, where instead of `new_values = self_i8 - other_i8` we should be using `checked_add_with_arr`. @shengpu1126 want to try a PR?
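To illustrate the kind of overflow check being suggested, here is a simplified sketch of overflow-checked int64 subtraction. This is not pandas' actual `checked_add_with_arr` implementation, just a stand-in showing the idea: detect the wraparound and raise instead of silently returning a negative value.

```python
import numpy as np

def checked_sub_i8(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Subtract int64 arrays, raising OverflowError instead of wrapping.

    Simplified illustration of overflow-checked arithmetic; not pandas'
    actual checked_add_with_arr helper.
    """
    result = a - b  # numpy int64 array arithmetic wraps silently on overflow
    # Overflow occurred iff a and b have opposite signs and the result's
    # sign disagrees with a's sign.
    overflowed = ((a < 0) != (b < 0)) & ((result < 0) != (a < 0))
    if overflowed.any():
        raise OverflowError("Overflow in int64 subtraction")
    return result
```

With such a check in place, a span too large for int64 nanoseconds would surface as an `OverflowError` rather than a wrong (negative) timedelta.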
Problem description
Below I present two methods to calculate the time difference between two `pd.Timestamp`s. The first way is native Python `datetime` subtraction, whereas the second way wraps things in a `pd.Series`. They should be equivalent. In Example 1 the output is correct, but in Example 2 the resulting timedelta is negative. This seems to be the case when the time difference is greater than ~300 years.

Code Sample
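For reference, the ~300-year threshold matches the largest span a signed 64-bit nanosecond value can hold, which is easy to check:

```python
# Maximum span representable as signed 64-bit nanoseconds,
# which is the unit pandas uses internally for datetime64 arithmetic.
max_ns = 2**63 - 1
ns_per_year = 365.25 * 24 * 3600 * 1e9  # nanoseconds per (average) year
print(max_ns / ns_per_year)  # ≈ 292.3 years
```

Any difference larger than roughly 292 years cannot be represented in `timedelta64[ns]`, which is where the wraparound comes from.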
Example 1:
Output 1:
Whereas Example 2:
Output 2:
Output of `pd.show_versions()`:
INSTALLED VERSIONS
commit: None
python: 3.6.5.final.0
python-bits: 64
OS: Linux
OS-release: 4.4.0-130-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
pandas: 0.23.4
pytest: None
pip: 18.0
setuptools: 40.2.0
Cython: None
numpy: 1.15.1
scipy: 1.1.0
pyarrow: None
xarray: None
IPython: 6.5.0
sphinx: None
patsy: None
dateutil: 2.7.2
pytz: 2018.5
blosc: None
bottleneck: None
tables: None
numexpr: None
feather: None
matplotlib: 2.2.3
openpyxl: None
xlrd: 1.1.0
xlwt: None
xlsxwriter: None
lxml: None
bs4: None
html5lib: 0.9999999
sqlalchemy: None
pymysql: None
psycopg2: None
jinja2: 2.10
s3fs: None
fastparquet: None
pandas_gbq: None
pandas_datareader: None