Histograms statistic sets sending the wrong max, min, and sum values #5

Closed

williedoran opened this issue Aug 14, 2017 · 1 comment

@williedoran

While logging several histograms to CloudWatch, I expected the max values to match what I was seeing in my server logs, but they did not. I investigated and found that the stageMetricDatum method is used to convert both timer and histogram statistic sets. It scales values by a duration conversion factor, which makes sense for a timer but not for a histogram's max, min, and sum, since those values are not durations and scaling them by the duration factor should not apply. I added a simple test to demonstrate this and a simple fix in the code base, and I have opened a pull request that should fix this, if I understand the code correctly.

@azagniotov (Owner)

Closing since it was addressed in #6.
