Aggregation to calculate the moving average on a histogram aggregation #10002
Comments
colings86 referenced this issue on Mar 5, 2015: Add ability to perform computations on aggregations #9876 (Closed)
polyfractal added the >feature, v2.0.0-beta1, and :Search/Aggregations labels on Mar 5, 2015
polyfractal self-assigned this on Mar 5, 2015
polyfractal referenced this issue on Mar 6, 2015: Aggregations: Add moving average aggregation #10024 (Closed)
Added in #10024
polyfractal closed this on Apr 8, 2015
colings86 referenced this issue on Apr 13, 2015: Pipeline aggregations: Ability to perform computations on aggregations #10568 (Merged)
added a commit that referenced this issue on Apr 29, 2015
elnur commented on May 21, 2017:
Anything like that for moving maximum?
evanceheallyg commented on Aug 28, 2017:
@polyfractal Is there any workaround to specify a dynamic window? In our use case, we need to calculate the moving average over a dynamic sliding window based on the number of months selected by the user. For example, if the data set spans 1 year, the window should be 12; for 2 years, 24; and so on. Any thoughts?
polyfractal (Member) commented on Aug 29, 2017:
@evanceheallyg At the moment you'll have to determine the range of the data up front; there's no way to specify the number of partitions rather than the size of each partition.
I'm not sure we'd be able to support that kind of functionality though. Moving average works on discrete buckets, so if your histogram has 10 buckets but you request "9 partitions", we'd have to put 0.9 buckets into each partition... which isn't doable. The only way it would work is if the number of partitions is a multiple/divisor of the histogram interval, which starts to get very unintuitive.
I think the best thing to do is just run a pre-aggregation to find the min/max of your dataset and then scale the window size accordingly.
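As a rough sketch of that last suggestion (plain Python, with illustrative timestamps rather than real aggregation output): fetch the dataset's min/max timestamps with a small pre-aggregation, derive how many monthly buckets they span, and use that as the window.

```python
from datetime import datetime

# Hypothetical values returned by a min/max (or stats) pre-aggregation on "@timestamp";
# in practice you would read these out of the aggregation response.
min_ts_millis = 1483228800000   # 2017-01-01
max_ts_millis = 1512086400000   # 2017-12-01

min_dt = datetime.utcfromtimestamp(min_ts_millis / 1000)
max_dt = datetime.utcfromtimestamp(max_ts_millis / 1000)

# Number of monthly buckets spanned by the data; use it as the moving-average window,
# so 1 year of data gives window=12, 2 years gives window=24, and so on.
months = (max_dt.year - min_dt.year) * 12 + (max_dt.month - min_dt.month) + 1
window = max(1, months)

print(window)   # 12 for an inclusive Jan 2017 .. Dec 2017 range
```

The computed window would then be substituted into the moving-average request (or into the transform_vis query template) before the query is sent.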
evanceheallyg commented on Aug 31, 2017:
@polyfractal Thanks for the explanation and support. The problem is that we are using https://github.com/PhaedrusTheGreek/transform_vis for rendering the moving average in the dashboard, and the DASHBOARD_CONTEXT used to filter is fetched automatically from the dashboard filters, so we are not able to control it. Do you foresee any workaround in this situation?
Also, is there any workaround for the edge policies? Since I work in the aerospace domain, it's mandatory for us to calculate the MA with the previous set of data points. For example, if the plotted data set starts with Jan 2017, we still have to sum up the values from Jan 2017 and the 11 months before it, and move on.
polyfractal commented on Mar 5, 2015
This aggregation will calculate the moving average of sibling metrics in histogram-style data (histogram, date_histogram). Moving averages are useful when time series data is locally stationary and has a mean that changes slowly over time. Seasonal data may need a different analysis, as may data that is bimodal, "bursty", or contains frequent extreme values (which are not necessarily outliers).
The `movavg` aggregation supports several configurable options:

Window Size
The user specifies the `window` size they wish to calculate a moving average over. E.g. a user may want a 30-day sliding window over a histogram of 90 days total.
Currently, if there is not enough data to "fill" the window, the moving average is calculated with whatever is available. For example, if a user selects a 30-day window, days 1-29 will calculate the moving average with between 1 and 29 days of data.
We could investigate adding more "edge policies", which determine how to handle gaps at the edge of the moving average.
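For illustration, a minimal sketch of that truncated-window behavior in plain Python (not the actual implementation):

```python
def simple_moving_average(values, window):
    """Trailing moving average; early buckets use a truncated window."""
    out = []
    for i, _ in enumerate(values):
        start = max(0, i - window + 1)   # truncated at the left edge
        chunk = values[start:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# With a 3-bucket window, the first two buckets only see 1 and 2 values.
print(simple_moving_average([1, 2, 4, 5, 8, 9], 3))
# [1.0, 1.5, 2.33..., 3.66..., 5.66..., 7.33...]  (matches the sample response below)
```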
Weighting Type
Currently, the agg supports four types of weighting:
- `simple`: A simple (arithmetic) average. Default.
- `linear`: A linearly weighted average, such that data becomes linearly less important as it gets "older" in the window.
- `single_exp`: Single exponentially weighted average (aka EWMA or Brown's Simple Exp Smoothing), such that data becomes exponentially less important as it gets "older".
- `double_exp`: Double exponentially weighted average (aka Holt-Winters). Uses two exponential terms: first smooth data exponentially like `single_exp`, but then apply a second corrective smoothing to account for a trend.
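A quick sketch of the first two weighting schemes over a single window of values (plain Python, illustrative only):

```python
def simple_avg(window_values):
    """simple: every point in the window carries the same weight."""
    return sum(window_values) / len(window_values)

def linear_avg(window_values):
    """linear: weights 1, 2, ..., n so the newest point counts the most."""
    weights = range(1, len(window_values) + 1)   # oldest -> newest
    total = sum(w * v for w, v in zip(weights, window_values))
    return total / sum(weights)

window = [1, 2, 4]            # oldest to newest
print(simple_avg(window))     # 2.333...
print(linear_avg(window))     # (1*1 + 2*2 + 3*4) / 6 = 2.833...
```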
Todo: Expose alpha and beta

Alpha and beta are parameters which control the behavior of `single_exp` and `double_exp`. Beta is used only by `double_exp`; it is analogous to alpha, but applied to the trend smoothing rather than the data smoothing.
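To make the role of alpha and beta concrete, here is a rough sketch of single- and double-exponential smoothing in plain Python; the parameter values are arbitrary examples, not the agg's defaults:

```python
def single_exp(values, alpha=0.5):
    """EWMA: each new value is blended with the previous smoothed value.
    Larger alpha -> newer data dominates; smaller alpha -> smoother output."""
    smoothed = values[0]
    out = [smoothed]
    for v in values[1:]:
        smoothed = alpha * v + (1 - alpha) * smoothed
        out.append(smoothed)
    return out

def double_exp(values, alpha=0.5, beta=0.5):
    """Holt's linear method: one smoothing pass for the level (alpha)
    and a second corrective pass for the trend (beta)."""
    level, trend = values[0], values[1] - values[0]
    out = [level]
    for v in values[1:]:
        last_level = level
        level = alpha * v + (1 - alpha) * (last_level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        out.append(level)
    return out

data = [1, 2, 4, 5, 8, 9]
print(single_exp(data))   # progressively smoothed values
print(double_exp(data))   # level estimates that track the upward trend
```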
Todo: Investigate metric-weighting

It's sometimes useful to weight a time period not by its distance from the current time, but rather by some metric that happened in that time interval. E.g. weight by the volume of transactions that happened on that day.
It should be possible to weight based on metrics within the bucket, although it could get complicated if the value is missing.
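As a rough illustration of that idea (not part of the proposed API), a bucket could be weighted by a sibling metric such as transaction volume; a missing weight is exactly the complication noted above:

```python
def metric_weighted_avg(window_values, window_weights):
    """Weight each bucket by a sibling metric (e.g. transaction volume)
    instead of by its age in the window. Buckets with a missing weight are
    simply skipped here; the real agg would need an explicit policy for that."""
    pairs = [(v, w) for v, w in zip(window_values, window_weights) if w is not None]
    total_weight = sum(w for _, w in pairs)
    return sum(v * w for v, w in pairs) / total_weight

prices  = [10.0, 12.0, 11.0]   # metric being averaged
volumes = [100, None, 300]     # weighting metric; one bucket is missing it
print(metric_weighted_avg(prices, volumes))   # (10*100 + 11*300) / 400 = 10.75
```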
Sample Request
This will calculate a moving average (sliding window of three days) over the sum of prices in each day:
```
GET /test/_search?search_type=count
{
  "aggs": {
    "my_date_histo": {
      "date_histogram": {
        "field": "@timestamp",
        "interval": "day"
      },
      "aggs": {
        "the_sum": {
          "sum": { "field": "price" }
        },
        "the_movavg": {
          "movavg": {
            "bucketsPath": "the_sum",
            "window": 3
          }
        }
      }
    }
  }
}
```

Sample Response
{ "took": 3, "timed_out": false, "aggregations": { "my_date_histo": { "buckets": [ { "key_as_string": "2014-12-01T00:00:00.000Z", "key": 1417392000000, "doc_count": 1, "the_sum": { "value": 1, "value_as_string": "1.0" }, "the_movavg": { "value": 1 } }, { "key_as_string": "2014-12-02T00:00:00.000Z", "key": 1417478400000, "doc_count": 1, "the_sum": { "value": 2, "value_as_string": "2.0" }, "the_movavg": { "value": 1.5 } }, { "key_as_string": "2014-12-04T00:00:00.000Z", "key": 1417651200000, "doc_count": 1, "the_sum": { "value": 4, "value_as_string": "4.0" }, "the_movavg": { "value": 2.3333333333333335 } }, { "key_as_string": "2014-12-05T00:00:00.000Z", "key": 1417737600000, "doc_count": 1, "the_sum": { "value": 5, "value_as_string": "5.0" }, "the_movavg": { "value": 3.6666666666666665 } }, { "key_as_string": "2014-12-08T00:00:00.000Z", "key": 1417996800000, "doc_count": 1, "the_sum": { "value": 8, "value_as_string": "8.0" }, "the_movavg": { "value": 5.666666666666667 } }, { "key_as_string": "2014-12-09T00:00:00.000Z", "key": 1418083200000, "doc_count": 1, "the_sum": { "value": 9, "value_as_string": "9.0" }, "the_movavg": { "value": 7.333333333333333 } } ] } } }