We would like to use OpenTSDB to ingest sensor data that arrives on change of value. For example, if a sensor reports 27.1 and then 27.5 29 seconds later, for us this means the value 27.1 should be held for 29 seconds (until the next point), not that the temperature rises linearly from 27.1 to 27.5 over that time. How difficult would it be to support this type of interpolation?
Thanks
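As a sketch of the two semantics being contrasted (not code from OpenTSDB itself), assuming a toy sorted list of `(timestamp, value)` pairs; the function name and series are hypothetical:

```python
import bisect

def interpolate(points, t, mode="zoh"):
    """Estimate the series value at time t from sorted (timestamp, value) pairs.

    mode="zoh"    : zero-order hold -- carry the last value forward.
    mode="linear" : interpolate linearly between the surrounding points.
    """
    times = [ts for ts, _ in points]
    i = bisect.bisect_right(times, t) - 1
    if i < 0:
        raise ValueError("t precedes the first datapoint")
    t0, v0 = points[i]
    if mode == "zoh" or i + 1 >= len(points):
        return v0
    t1, v1 = points[i + 1]
    return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# The change-of-value series from the issue: 27.1, then 27.5 29 seconds later.
series = [(0, 27.1), (29, 27.5)]
print(interpolate(series, 14, mode="zoh"))     # 27.1 -- value held until the next point
print(interpolate(series, 14, mode="linear"))  # ~27.293 -- a linear ramp instead
```

The zero-order-hold reading keeps each reported value flat until the next report arrives, which matches change-of-value sensors that only emit when the reading actually changes.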
I've thought this through a bit more, and it does involve a lot of changes, unless I'm missing something:
1. 'Upsampling' in addition to 'downsampling': the downsampler should use the last datapoint before the interval and the next datapoint after it to work out the value during the interval.
2. When a query covers an interval that contains no datapoints, do a lookback to find the previous datapoint and include it instead of returning nothing. This probably involves a 'lookback' parameter similar to the last_dp feature.
3. Implement a custom zero-order-hold mean aggregator that computes a weighted average of the points in the interval, where each point's weight is how long its value is held.
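The weighted average in the last point can be sketched roughly as follows. This is a hypothetical illustration, not OpenTSDB code; it assumes the lookback datapoint from point 2 is already present as the first element, and the function name and example values are made up:

```python
def zoh_mean(points, start, end):
    """Time-weighted mean over [start, end) under zero-order-hold semantics.

    `points` are (timestamp, value) pairs sorted by time; the first point may
    lie before `start` (the lookback datapoint), so its value covers the head
    of the interval. Each value is weighted by how long it is held.
    """
    total = 0.0
    # Pair each point with the arrival time of the next one (or the interval end).
    for (t0, v), (t1, _) in zip(points, points[1:] + [(end, None)]):
        held_from = max(t0, start)
        held_to = min(t1, end)
        if held_to > held_from:
            total += v * (held_to - held_from)
    return total / (end - start)

# Interval [0, 60) with a lookback point at t=-10 holding 20.0 until t=30,
# then 40.0 for the remaining 30 s: mean = (30*20.0 + 30*40.0) / 60 = 30.0.
print(zoh_mean([(-10, 20.0), (30, 40.0)], 0, 60))  # 30.0
```

A plain (unweighted) mean of the same interval would give 40.0, since only one datapoint falls inside it, which is why the hold durations have to enter as weights.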
I've looked through the source code and have an idea of how to implement these, but they're probably best broken up into separate issues/PRs and will almost certainly need some discussion, so I'll post to the mailing list instead.