There are two kinds of counters:
- Events are discrete things you count, such as "cars crossing this line."
- Gauges are specific measurements, such as "amount of electric energy in this car battery."
istatd knows that a counter is an event counter if the name of the counter is prefixed with an asterisk (*). This determination is made only the first time the name of a counter is seen, and is then stored permanently in the file that records the counter data. If you get it wrong and need to change the type of the counter, you have to stop istatd, delete the old file, make sure all data sources generate the correct prefix, and start istatd again.
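The naming rule can be sketched as a tiny helper. This is a hypothetical illustration (istatd is not written in Python, and `classify_counter` is not a real istatd function); it only shows the prefix convention described above.

```python
def classify_counter(name: str) -> str:
    # Hypothetical helper illustrating the rule: a leading '*' marks an
    # event counter; anything else is a gauge. istatd makes this decision
    # once, the first time a name is seen, and stores it permanently.
    return "event" if name.startswith("*") else "gauge"

print(classify_counter("*cars.crossing.line"))   # event
print(classify_counter("battery.energy.level"))  # gauge
```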
For event counters, only the timestamp is actually used for incoming counter data. The incoming data is collated in the istatd process until the current time "bucket" for the finest resolution expires; then a single sample is recorded whose sum is the sum of the values, and whose count is always 1.
The coarser-resolution files are then updated as if they were gauges, except that the count is set to the coarse interval divided by the finest resolution's interval. This means that, when plotting the data as count, you will get "events per second" at all levels of coarseness.
There will be no standard deviation in the finest-resolution file, but in the coarser files the standard deviation, min, and max values will be computed over the finest-resolution buckets.
For example, if you have 10-second and 5-minute buckets, and have one 10-second period with 100 events, but the other periods have no events, then the 10-second file will contain one bucket with the value 100 and count 1, and the 5-minute file will contain one bucket with the value 100, the min 0, the max 100, and the count 30. If you now have another 10-second interval with 500 events, you will end up with one bucket in the 10-second data that has value 500 and count 1. And, when that 10-second bucket merges into the same 5-minute coarser-resolution bucket as the previous 10-second bucket, the values of that 5-minute bucket will be: sum 600, min 0, max 500, count 30. (Additionally, sum-squared is calculated to allow for standard deviation calculation.)
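The worked example above can be reproduced with a short sketch. This is an assumed model of the merge, not istatd's actual code: the `Bucket` type and `merge_event_bucket` helper are hypothetical names, and the 28 empty 10-second slots are folded in as zero-valued samples so the min comes out as 0.

```python
from dataclasses import dataclass

@dataclass
class Bucket:
    sum: float = 0.0
    sumsq: float = 0.0       # kept so standard deviation can be derived later
    min: float = float("inf")
    max: float = float("-inf")
    count: int = 0

def merge_event_bucket(coarse: Bucket, fine_sum: float) -> None:
    # Hypothetical sketch: fold one finest-resolution event bucket
    # (a single sample carrying its sum) into a coarser bucket.
    coarse.sum += fine_sum
    coarse.sumsq += fine_sum * fine_sum
    coarse.min = min(coarse.min, fine_sum)
    coarse.max = max(coarse.max, fine_sum)

# One 5-minute bucket spans 30 ten-second slots; two slots have events
# (100 and 500 events), the remaining 28 are empty (0).
coarse = Bucket()
for fine_sum in [100.0, 500.0] + [0.0] * 28:
    merge_event_bucket(coarse, fine_sum)
coarse.count = 300 // 10   # coarse interval / finest interval, per the rule above

print(coarse.sum, coarse.min, coarse.max, coarse.count)  # 600.0 0.0 500.0 30
```

This matches the numbers in the example: sum 600, min 0, max 500, count 30.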
For gauge counters, each sample received is simply merged into the sum, min, max, sum-squared, and count values. This is what you want for measurements like "percent disk used" or "CPU load." If the statistics aggregate into the same counters, then you can see at a glance what the average is, what the min and max are (given suitable UI), and what the standard deviation is.
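A minimal sketch of gauge aggregation, assuming the five aggregates named above; the `Gauge` class is hypothetical, but it shows how the mean and standard deviation fall out of sum, sum-squared, and count without storing individual samples.

```python
import math

class Gauge:
    # Hypothetical sketch: each incoming sample is merged into
    # sum, sum-squared, min, max, and count.
    def __init__(self) -> None:
        self.sum = 0.0
        self.sumsq = 0.0
        self.min = float("inf")
        self.max = float("-inf")
        self.count = 0

    def record(self, value: float) -> None:
        self.sum += value
        self.sumsq += value * value
        self.min = min(self.min, value)
        self.max = max(self.max, value)
        self.count += 1

    def mean(self) -> float:
        return self.sum / self.count

    def stddev(self) -> float:
        # Population standard deviation recovered from the aggregates alone.
        m = self.mean()
        return math.sqrt(max(self.sumsq / self.count - m * m, 0.0))

g = Gauge()
for pct_disk_used in [40.0, 50.0, 60.0]:
    g.record(pct_disk_used)
print(g.mean(), g.min, g.max)  # 50.0 40.0 60.0
```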
You may also want to check out Counter Retention Intervals.