ChainAPI shouldn't fall back to local time #90

Open
ssfrr opened this issue Mar 12, 2018 · 2 comments
ssfrr commented Mar 12, 2018

Looks like we still have some kind of time-zone problem. This is somewhat different from #62, where the issue was the ambiguity converting local to UTC in the fall. In this issue somehow we're getting a time that doesn't actually exist.

@bmayton do the tidpost scripts post in UTC or local time?

Traceback (most recent call last):

  File "/usr/local/lib/python2.7/dist-packages/django/core/handlers/base.py", line 114, in get_response
    response = wrapped_callback(request, *callback_args, **callback_kwargs)

  File "/usr/local/lib/python2.7/dist-packages/django/views/decorators/csrf.py", line 57, in wrapped_view
    return view_func(*args, **kwargs)

  File "/home/sfr/chain-api/chain/core/api.py", line 791, in create_view
    return cls.create_single(data, request)

  File "/home/sfr/chain-api/chain/core/api.py", line 809, in create_single
    response_data = cls.create_resource(data, request)

  File "/home/sfr/chain-api/chain/core/api.py", line 796, in create_resource
    new_resource = cls(data=data, request=request, filters=obj_params)

  File "/home/sfr/chain-api/chain/core/resources.py", line 148, in __init__
    self.timestamp = self.sanitize_field_value('timestamp', self._data.get('timestamp'))

  File "/home/sfr/chain-api/chain/core/resources.py", line 176, in sanitize_field_value
    return timezone.make_aware(timestamp, timezone.get_current_timezone())

  File "/usr/local/lib/python2.7/dist-packages/django/utils/timezone.py", line 304, in make_aware
    return timezone.localize(value, is_dst=None)

  File "/usr/lib/python2.7/dist-packages/pytz/tzinfo.py", line 327, in localize
    raise NonExistentTimeError(dt)

NonExistentTimeError: 2018-03-11 02:34:33.467103


<WSGIRequest
path:/scalar_data/create,
GET:<QueryDict: {u'sensor_id': [u'1267']}>,
POST:<QueryDict: {}>,
COOKIES:{},
META:{'CONTENT_LENGTH': '57',
 u'CSRF_COOKIE': u'SpZORQGimBPZoQCCuO6LFT24c8sc6Zn6',
 'HTTP_ACCEPT': '*/*',
 'HTTP_ACCEPT_ENCODING': 'gzip, deflate',
 'HTTP_AUTHORIZATION': 'Basic Y2hhaW5jb2xsZWN0b3JzOlY4VGlPRTZ5VndwaklpbTEzeWJoakE5Tw==',
 'HTTP_CONNECTION': 'close',
 'HTTP_HOST': 'chain-api.media.mit.edu',
 'HTTP_USER_AGENT': 'python-requests/2.3.0 CPython/2.7.3 Linux/3.2.0-4-amd64',
 'HTTP_X_FORWARDED_FOR': '18.85.58.99',
 'PATH_INFO': u'/scalar_data/create',
 'QUERY_STRING': 'sensor_id=1267',
 'RAW_URI': '/scalar_data/create?sensor_id=1267',
 'REMOTE_ADDR': '18.85.58.99',
 'REMOTE_PORT': '80',
 'REQUEST_METHOD': 'POST',
 'SCRIPT_NAME': u'',
 'SERVER_NAME': 'chain-api.media.mit.edu',
 'SERVER_PORT': '80',
 'SERVER_PROTOCOL': 'HTTP/1.0',
 'SERVER_SOFTWARE': 'gunicorn/18.0',
 'gunicorn.socket': <socket._socketobject object at 0x7f09e364e210>,
 'wsgi.errors': <open file '<stderr>', mode 'w' at 0x7f09f2e4f1e0>,
 'wsgi.file_wrapper': <class gunicorn.http.wsgi.FileWrapper at 0x7f09f09c2598>,
 'wsgi.input': <gunicorn.http.body.Body object at 0x7f09e35daed0>,
 'wsgi.multiprocess': True,
 'wsgi.multithread': False,
 'wsgi.run_once': False,
 'wsgi.url_scheme': 'http',
 'wsgi.version': (1, 0)}>

bmayton commented Mar 12, 2018

I think I found the issue. The script that posts the data from the atrium sensors uses my chainpost.py wrapper to look up the devices and metrics and post the data. The latest version is 0.4, but the version installed on the doppeldb.media server is 0.1. Version 0.4's timestamp logic:

if timestamp is None:
    timestamp = datetime.datetime.utcnow()
    tzoffset = "+00:00"
sensor = self.find_sensor(dev_name, metric, unit)

if tzoffset is not None:
    ts_str = timestamp.isoformat() + tzoffset
else:
    ts_str = timestamp.isoformat()

sensor_data = dict(
    value=value,
    timestamp=ts_str
)

Somewhat messy, but in the end ts_str should end up as an ISO-8601 timestamp with a timezone offset appended if at all possible. In the case of the atrium script, no timestamp is passed into the function, so ts_str should end up as datetime.datetime.utcnow().isoformat() + "+00:00".
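The manual offset concatenation can be avoided entirely by producing an aware datetime in the first place. A minimal sketch (not chainpost's actual code) showing the difference between utcnow() and an aware UTC timestamp:

```python
from datetime import datetime, timezone

# datetime.utcnow() returns a *naive* datetime: tzinfo is None, so
# isoformat() emits no offset suffix and the string is ambiguous.
naive = datetime.utcnow()
assert naive.tzinfo is None
assert "+" not in naive.isoformat()

# datetime.now(timezone.utc) returns an *aware* datetime whose
# isoformat() already ends in "+00:00" -- no string surgery needed.
aware = datetime.now(timezone.utc)
assert aware.isoformat().endswith("+00:00")
```

With an aware timestamp, ts_str could simply be timestamp.isoformat() in both branches.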

In version 0.1, the same logic is followed for assigning ts_str, before it is helpfully dropped on the floor and timestamp.isoformat() is used instead, without a time zone offset string (unless the timestamp object is already timezone-aware, which the result of datetime.datetime.utcnow() is not):

sensor_data = dict(
    value=value,
    timestamp=timestamp.isoformat()
)

This leads to a request being sent to Chain with a UTC timestamp but no offset specified, which Chain treats as a local time. 2018-03-11 02:34:33+00:00 is a valid timestamp, but 2018-03-11 02:34:33 is not if it's interpreted as local time in a timezone that observes current US DST rules: it falls inside the hour the clocks skip at the spring-forward transition.
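The skipped hour can be demonstrated with the stdlib zoneinfo module (Python 3.9+; this Django version used pytz, which raises NonExistentTimeError outright, whereas zoneinfo silently normalizes, so a round-trip check exposes the gap instead):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+, needs system tzdata

eastern = ZoneInfo("America/New_York")

# The wall-clock time from the traceback, interpreted as US Eastern.
# On 2018-03-11 local clocks jumped from 02:00 straight to 03:00,
# so 02:34:33 never existed on any local clock.
wall = datetime(2018, 3, 11, 2, 34, 33, tzinfo=eastern)

# Round-tripping through UTC lands at 03:34:33, not 02:34:33,
# revealing that the original wall time was nonexistent.
roundtrip = wall.astimezone(timezone.utc).astimezone(eastern)
assert roundtrip.hour == 3

# The same digits with an explicit UTC offset are perfectly valid.
utc = datetime(2018, 3, 11, 2, 34, 33, tzinfo=timezone.utc)
```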

I just updated chainclient and chainpost on doppeldb.media to the latest versions. This does unfortunately mean that all existing data from the atrium sensors is probably offset by 4 or 5 hours, depending on the time of year.

ssfrr changed the title from "Nonexistent time getting created during Spring DST switchover" to "ChainAPI shouldn't fall back to local time" on Mar 12, 2018

ssfrr commented Mar 12, 2018

OK - I'm renaming the issue to reflect the underlying bad behavior, which is that ChainAPI shouldn't interpret timestamps without time zones as being in its own local time. Either treating them as UTC or being strict and requiring a time zone suffix would be a reasonable behavior.
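Both proposed behaviors can be sketched in a few lines. This is hypothetical replacement logic, not ChainAPI's actual sanitize_field_value; the function name and assume_utc flag are illustrative:

```python
from datetime import datetime, timezone

def sanitize_timestamp(ts_str, assume_utc=False):
    """Parse an ISO-8601 timestamp, refusing or reinterpreting naive ones.

    Hypothetical sketch of the two behaviors proposed above.
    """
    dt = datetime.fromisoformat(ts_str)
    if dt.tzinfo is None:
        if assume_utc:
            # Option 1: treat offset-less timestamps as UTC.
            dt = dt.replace(tzinfo=timezone.utc)
        else:
            # Option 2: be strict and require an explicit offset.
            raise ValueError("timestamp must include a UTC offset: %r" % ts_str)
    # Normalize everything to UTC before storage.
    return dt.astimezone(timezone.utc)

sanitize_timestamp("2018-03-11T02:34:33+00:00")             # accepted
sanitize_timestamp("2018-03-11T02:34:33", assume_utc=True)  # accepted as UTC
```

Either option would have turned the bad chainpost 0.1 requests into an immediate 4xx error or a correct UTC interpretation, rather than silently shifted data.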
