
date_range() with closed=left and sub-second granularity returns wrong number of elements #24110

Closed
gabrielreid opened this issue Dec 5, 2018 · 2 comments

@gabrielreid
Contributor

commented Dec 5, 2018

Code Sample, a copy-pastable example if possible

import pandas as pd
# Good case: a closed=left date_range call should return 'periods' - 1 entries
>>> pd.date_range(start='2018-01-01T00:00:01.000Z', end='2018-01-03T00:00:01.000Z', periods=2, closed='left')
DatetimeIndex(['2018-01-01 00:00:01+00:00'], dtype='datetime64[ns, UTC]', freq=None)

# Bad case: if the start and end have sub-second granularity (in some cases), the 
# returned DatetimeIndex has too many entries (2 instead of 1). The returned dates are also
# not correctly aligned to the start/end dates
>>> pd.date_range(start='2018-01-01T00:00:00.010Z', end='2018-01-03T00:00:00.010Z', periods=2, closed='left')
DatetimeIndex(['2018-01-01 00:00:00.009999872+00:00', '2018-01-03 00:00:00.009999872+00:00'], dtype='datetime64[ns, UTC]', freq=None)

# Unexpected case: this appears to be dependent on the date being used: it doesn't happen with older
# date ranges (using 2001 instead of 2018 as the year "resolves" the problem)
>>> pd.date_range(start='2001-01-01T00:00:00.010Z', end='2001-01-03T00:00:00.010Z', periods=2, closed='left')
DatetimeIndex(['2001-01-01 00:00:00.010000+00:00'], dtype='datetime64[ns, UTC]', freq=None)

Problem description

As far as I understand it, calling date_range with two absolute endpoints, a number of periods, and closed='left' should return a DatetimeIndex with periods - 1 entries. This works as expected in most cases, but supplying more recent dates with sub-second granularity (e.g. 2018-01-01T00:00:00.010Z) triggers an issue that causes the returned DatetimeIndex to contain periods entries instead of periods - 1.

I've verified this on a number of older (pre-0.24) versions, as well as on the current HEAD of the master branch, and it is present in all of them.

Expected Output

>>> pd.date_range(start='2018-01-01T00:00:00.010Z', end='2018-01-03T00:00:00.010Z', periods=2, closed='left')
DatetimeIndex(['2018-01-01 00:00:00.010000+00:00'], dtype='datetime64[ns, UTC]', freq=None)

Output of pd.show_versions()

INSTALLED VERSIONS

commit: d7e96d8
python: 3.7.1.final.0
python-bits: 64
OS: Darwin
OS-release: 17.7.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: en_US.UTF-8
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8

pandas: 0.24.0.dev0+1208.gd7e96d830
pytest: 4.0.1
pip: 18.1
setuptools: 40.6.2
Cython: 0.29
numpy: 1.15.4
scipy: 1.1.0
pyarrow: 0.11.1
xarray: 0.11.0
IPython: 7.2.0
sphinx: 1.8.2
patsy: 0.5.1
dateutil: 2.7.5
pytz: 2018.7
blosc: None
bottleneck: 1.2.1
tables: 3.4.4
numexpr: 2.6.8
feather: None
matplotlib: 3.0.1
openpyxl: 2.5.11
xlrd: 1.1.0
xlwt: 1.3.0
xlsxwriter: 1.1.2
lxml.etree: 4.2.5
bs4: 4.6.3
html5lib: 1.0.1
sqlalchemy: 1.2.14
pymysql: None
psycopg2: None
jinja2: 2.10
s3fs: None
fastparquet: 0.1.6
pandas_gbq: None
pandas_datareader: None
gcsfs: None

@mroeschke

Member

commented Dec 5, 2018

Thanks for the report! Your second example is indeed bizarre behavior. Investigations and PRs are always welcome.

gabrielreid added a commit to gabrielreid/pandas that referenced this issue Dec 6, 2018

BUG: date_range issue with sub-second granularity
Improves (but doesn't completely resolve) pandas-dev#24110, to avoid rounding
issues with sub-second granularity timestamps when creating a
date range.

@gabrielreid

Contributor Author

commented Dec 6, 2018

This appears to be due to the limited integer resolution of double-precision floats in Python, brought on by the use of numpy.linspace.

I've opened PR #24129 with a fix that largely reduces (but doesn't completely resolve) the occurrence of this issue.
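To make the cause concrete, here is a small standalone demonstration. The nanosecond-epoch constants are computed by hand from the example timestamps (seconds since the epoch times 10**9, plus 10 ms); they are not taken from pandas internals.

```python
import numpy as np

# Nanosecond-epoch values for the example endpoints.
start_2018 = 1514764800010000000  # 2018-01-01 00:00:00.010 UTC
end_2018 = 1514937600010000000    # 2018-01-03 00:00:00.010 UTC
start_2001 = 978307200010000000   # 2001-01-01 00:00:00.010 UTC

# A float64 has a 53-bit significand, so near 1.5e18 adjacent floats are
# 256 ns apart; an int64 timestamp only survives the round trip through
# float if it happens to land on a representable value.
print(int(float(start_2018)) - start_2018)  # -128: not representable
print(int(float(start_2001)) - start_2001)  # 0: representable, by luck

# numpy.linspace works in float64, so both 2018 endpoints come back
# 128 ns low -- exactly the ...009999872 values in the bug report.
print(np.linspace(start_2018, end_2018, num=2).astype("int64"))
# [1514764800009999872 1514937600009999872]
```

This also explains why the 2001 dates behave correctly: whether a given endpoint is damaged depends only on whether its nanosecond value is a multiple of the local float spacing, not on anything date-specific.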

gabrielreid added a commit to gabrielreid/pandas that referenced this issue Dec 7, 2018

BUG: date_range issue with sub-second granularity
Fixes pandas-dev#24110, by avoiding floating-point rounding issues with
sub-second granularity timestamps when creating a date range.

@jreback jreback added this to the 0.24.0 milestone Dec 7, 2018

gabrielreid added a commit to gabrielreid/pandas that referenced this issue Dec 9, 2018

BUG: date_range issue with millisecond resolution
Fixes pandas-dev#24110, by avoiding floating-point rounding issues with
millisecond resolution or higher timestamps when creating a date
range.

mroeschke added a commit that referenced this issue Dec 9, 2018

BUG: date_range issue with millisecond resolution (#24129)
Fixes #24110, by avoiding floating-point rounding issues with
millisecond resolution or higher timestamps when creating a date
range.
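The merged fix avoids float64 by computing the breakpoints with integer arithmetic. Below is a hedged sketch of that idea only; `int_breaks` is a hypothetical helper, not the actual pandas implementation (the real change in #24129 also handles non-divisible ranges and both closed endpoints).

```python
def int_breaks(start, end, periods):
    """Evenly spaced int64 nanosecond values from start to end inclusive,
    computed in integer space so no precision is lost.
    Hypothetical illustration, not the actual pandas code."""
    step = (end - start) // (periods - 1)
    return [start + i * step for i in range(periods)]

# The endpoints survive exactly, so closed='left' can simply drop the
# final value and return the expected periods - 1 entries.
breaks = int_breaks(1514764800010000000, 1514937600010000000, periods=2)
print(breaks)  # [1514764800010000000, 1514937600010000000]
```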

Pingviinituutti added a commit to Pingviinituutti/pandas that referenced this issue Feb 28, 2019

BUG: date_range issue with millisecond resolution (pandas-dev#24129)
Fixes pandas-dev#24110, by avoid floating-point rounding issues with
millisecond resolution or higher timestamps when creating a date
range.
