Time zone issues should be handled at this point, but a mailing list post recently noted the issue with the sub-second component. After a (very) brief look, it seems like the milliseconds component was doubled, which would be explained by the reporter's hypothesis.

Thanks for the issue @clarsen. Such a simple-to-fix issue should have been resolved immediately; I'd like to apologize that this somehow did not get triaged earlier (and hence was overlooked until now).
READ AND FOLLOW THESE INSTRUCTIONS CAREFULLY
ISSUES THAT DO NOT CONTAIN NECESSARY INFORMATION MAY BE CLOSED, IMMEDIATELY
The issue tracker is NOT the place for general support. For questions and
technical assistance, come ask the Bokeh mailing list or join the chat on Gitter. For feature requests, please provide a detailed description or proposal of the new capability or behavior.
For defects or deficiencies, please provide ALL OF THE FOLLOWING:
ALL software version info (bokeh, python, notebook, OS, browser, any other relevant packages)
```python
version_json = '''
{
 "date": "2017-06-13T09:22:29-0500",
 "dirty": false,
 "error": null,
 "full-revisionid": "6070bf96d9367150f423a4fa792a7d242421236b",
 "version": "0.12.6"
}
'''
```
Description of expected behavior and the observed behavior
The function (snippet shown)

```python
def convert_datetime_type(obj):
    ...
    # Datetime (datetime is a subclass of date)
    elif isinstance(obj, dt.datetime):
        return (obj - DT_EPOCH).total_seconds() * 1000. + obj.microsecond / 1000.
```

that converts a datetime to JSON for use in plotting does not seem to compute the timestamp properly.
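The double-counting can be seen by evaluating the formula by hand (a minimal sketch; `DT_EPOCH` is assumed here to be the naive datetime epoch, `dt.datetime(1970, 1, 1)`):

```python
import datetime as dt

DT_EPOCH = dt.datetime(1970, 1, 1)  # assumed naive epoch, as used by the converter

d = dt.datetime(2017, 7, 27, 4, 58, 1, 11143)  # 11143 microseconds

# total_seconds() already includes the fractional seconds ...
base = (d - DT_EPOCH).total_seconds() * 1000.
# ... so adding obj.microsecond / 1000. counts the 11.143 ms a second time
buggy = base + d.microsecond / 1000.

print(buggy - base)  # overshoots by roughly 11.143 ms
```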
Complete, minimal, self-contained example code that reproduces the issue
When I do this by hand with a datetime that has non-zero microseconds, e.g. for the Unix timestamp 1501131481.011143, the converted value does not match the timestamp: the sub-second component gets counted twice.
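A hypothetical reconstruction of that by-hand check (the exact repro code was not included in the report; `DT_EPOCH` is assumed to be the naive epoch):

```python
import datetime as dt

DT_EPOCH = dt.datetime(1970, 1, 1)
ts = 1501131481.011143  # Unix timestamp with non-zero microseconds

# Convert via a naive UTC datetime; dt.datetime.fromtimestamp(ts) would
# instead give local time, which introduces a large time-zone offset on top.
d = dt.datetime.utcfromtimestamp(ts)

converted = (d - DT_EPOCH).total_seconds() * 1000. + d.microsecond / 1000.

# converted should equal ts * 1000, but overshoots by the doubled 11.143 ms
print(converted - ts * 1000.)
```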
If anything, the microseconds should be zero since we're just comparing milliseconds_since_epoch to seconds_since_epoch * 1000. The ~25,200,000 ms difference is due to the naive datetime conversion. That's a 7 hour offset from GMT.
It seems that total_seconds() already gives the fractional seconds, so no need to add in microseconds.
While this doesn't make much difference for data points with second resolution, subsecond resolution results in shifting.
(I'll submit a patch if it isn't already fixed.)
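A sketch of that one-line fix, under the same `DT_EPOCH` naive-epoch assumption: since `total_seconds()` already carries the fractional seconds, the extra microsecond term is simply dropped.

```python
import datetime as dt

DT_EPOCH = dt.datetime(1970, 1, 1)  # assumed naive epoch

def convert_datetime_type(obj):
    # total_seconds() already includes the sub-second part,
    # so no separate microsecond term is needed
    return (obj - DT_EPOCH).total_seconds() * 1000.

d = dt.datetime(2017, 7, 27, 4, 58, 1, 11143)
assert abs(convert_datetime_type(d) - 1501131481011.143) < 1e-3
```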
Stack traceback and/or browser JavaScript console output
Screenshots or screencasts of the bug in action