I'm selecting the first and last values from the last minute of an InfluxDB time series. According to the docs, "Last returns the newest value (determined by the timestamp) of a single field."

Here are the requests:

But I do not understand the returned results. For the first and last values, the value itself is correct, but I get a time that does not match any timestamp in my measurements (see the last request, which queries values from the last 2 minutes). I would expect these timestamps:

2016-05-19T08:42:11.69Z for the first element
2016-05-19T08:44:07.991Z for the last element

Where do the times of the first and last values come from? Why are they not correct?
Since this is 0.12, those times represent the beginning of the interval. In 0.12, selectors were all normalized to return the beginning of the interval for every aggregate function.

We reverted that in 0.13 so that selectors in this specific situation return the actual time of the point (#5890). This should work as you expect in 0.13, but if you use a GROUP BY time(...) clause, the returned time will still be the beginning of the interval rather than the timestamp of the actual point.
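To make the difference concrete, here is a minimal Python sketch of the two behaviours (this is a simulation for illustration, not InfluxDB's actual implementation): without an interval, `first()`/`last()` report the selected point's real timestamp, as in 0.13; with a GROUP BY time(...) interval (or in 0.12 generally), the reported time is floored to the start of the bucket. The point timestamps and values below are taken from the question.

```python
from datetime import datetime, timedelta, timezone

UTC = timezone.utc
EPOCH = datetime(1970, 1, 1, tzinfo=UTC)

def interval_start(ts, interval):
    # Floor a timestamp to the start of its GROUP BY time() bucket.
    return EPOCH + (ts - EPOCH) // interval * interval

def first_last(points, interval=None):
    """Return ((time, value) for first, (time, value) for last) over a
    list of (timestamp, value) points.

    interval=None      -> 0.13 behaviour with no GROUP BY: actual timestamps.
    interval=timedelta -> 0.12 behaviour, or any GROUP BY time(...): the
                          reported time is the start of the bucket.
    """
    first_pt = min(points, key=lambda p: p[0])
    last_pt = max(points, key=lambda p: p[0])
    if interval is None:
        return first_pt, last_pt
    return ((interval_start(first_pt[0], interval), first_pt[1]),
            (interval_start(last_pt[0], interval), last_pt[1]))

# Sample points matching the timestamps from the question (values invented).
points = [
    (datetime(2016, 5, 19, 8, 42, 11, 690000, tzinfo=UTC), 1.0),
    (datetime(2016, 5, 19, 8, 43, 30, 0, tzinfo=UTC), 2.0),
    (datetime(2016, 5, 19, 8, 44, 7, 991000, tzinfo=UTC), 3.0),
]

# Without GROUP BY (0.13+): the actual point timestamps come back.
print(first_last(points))
# With GROUP BY time(1m) (or anywhere in 0.12): bucket starts come back,
# e.g. 08:42:00 instead of 08:42:11.69.
print(first_last(points, interval=timedelta(minutes=1)))
```

This is why the returned times in 0.12 land on round interval boundaries rather than matching any stored measurement.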