I know legend values are calculated client side, but even with the client-side data it seems inaccurate.

I have a metric that is mostly null, with only a finite number of real data points (~30). The min (0.85 sec) and max (5.01 sec) are correct, and the total seems plausible (44.65 sec). The average (0.02 sec), however, is nonsensical: it is drastically less than the minimum. The true average as computed server-side by Graphite is 1.28 sec. The date range is 1 week.

Am I misunderstanding something about the limitations of the client-side calculations? Otherwise, it seems to me that dividing the sum of the known data points by the number of known data points would produce a more accurate result. I'm not sure where 0.02 is coming from in that data.
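As a sketch of where 0.02 might come from (the point count, metric values, and function names here are assumptions, not Grafana's actual legend code): if the legend divides the total by the full number of points, nulls included, a week of mostly-null data collapses the average, whereas dividing by only the known points gives a sensible value.

```typescript
// Two ways to average a series containing nulls.
function naiveAvg(points: (number | null)[]): number {
  // Treats nulls as 0 and divides by the TOTAL point count.
  const sum = points.reduce<number>((acc, p) => acc + (p ?? 0), 0);
  return sum / points.length;
}

function nullAwareAvg(points: (number | null)[]): number {
  // Divides only by the number of non-null points.
  const known = points.filter((p): p is number => p !== null);
  const sum = known.reduce((acc, p) => acc + p, 0);
  return sum / known.length;
}

// A one-week range at 5-minute resolution is ~2016 points; spread ~30
// non-null values totalling ~44.65 sec across it (uniform for simplicity).
const series: (number | null)[] = Array(2016).fill(null);
for (let i = 0; i < 30; i++) series[i * 67] = 44.65 / 30;

console.log(naiveAvg(series).toFixed(2));     // "0.02" — matches the reported legend value
console.log(nullAwareAvg(series).toFixed(2)); // "1.49" — in the right ballpark of 1.28
```

The null-aware version doesn't exactly reproduce the server-side 1.28 sec here only because the sketch spreads the total uniformly across the 30 points.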
Thanks!
avg is calculated client side, so if many values are null then the avg is going to seem strange compared to the min value. Also, if you want accurate legend values you need to set a consolidateBy Graphite function (for example, if you want max to be accurate you need to set consolidateBy(max)); otherwise Graphite will automatically use avg as the point-consolidation function.
If you want to ignore nulls, try the keepLastValue function.
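For concreteness, the two suggestions above would look like this as Graphite targets (the metric path `app.requests.response_time` is a hypothetical example): consolidateBy forces the consolidation function Graphite uses when downsampling, and keepLastValue carries the last non-null value forward instead of leaving gaps.

```
consolidateBy(app.requests.response_time, 'max')
keepLastValue(app.requests.response_time)
```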
I don't want to change the graph with keepLastValue; IMHO, the client-side average calculation should ignore nulls unless they are requested to be drawn as zeros.