While logging several histograms to CloudWatch, I expected the max values to match what I was seeing in my server logs, but they did not. I investigated and found that the stageMetricDatum method is used to convert the statistic sets of both timers and histograms. It scales values by a factor derived from the time duration, which makes sense for a timer but not for a histogram's max, min, and sum, since those values are not durations and should not be scaled. I added a simple test to demonstrate this and a simple fix in the code base, and I have opened a pull request that should fix it, if I understand the code correctly.
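To illustrate the distinction, here is a minimal, self-contained sketch (not the library's actual code; the class and method names are hypothetical). It shows why scaling a timer's snapshot by a duration conversion factor is correct, while applying the same factor to a histogram's statistic set distorts its min/max/sum:

```java
// Hypothetical sketch of the conversion logic, assuming a timer records in
// nanoseconds and reports in milliseconds, while a histogram records
// unit-less samples (e.g. payload sizes) that must be reported as-is.
public class StatisticSetScalingDemo {

    // Factor to convert a timer snapshot from nanoseconds to milliseconds.
    static final double DURATION_FACTOR = 1.0 / 1_000_000.0;

    // Correct for timers: raw snapshot values are durations in nanoseconds.
    static double convertTimerValue(double rawNanos) {
        return rawNanos * DURATION_FACTOR;
    }

    // Correct for histograms: values carry no time unit, so no scaling.
    static double convertHistogramValue(double rawValue) {
        return rawValue;
    }

    public static void main(String[] args) {
        double histogramMax = 512.0; // e.g. max request size in bytes

        // Bug: routing a histogram through the timer conversion shrinks
        // the reported max by the duration factor.
        System.out.println("scaled (wrong):   " + histogramMax * DURATION_FACTOR);

        // Fix: histogram statistic sets skip the duration conversion, so
        // CloudWatch shows the same max as the server logs.
        System.out.println("unscaled (right): " + convertHistogramValue(histogramMax));
    }
}
```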