
InvalidParameterValueException #4

Closed · ben-manes opened this issue Jun 21, 2017 · 12 comments

Comments
@ben-manes

ERROR [2017-06-21 21:40:01,884] [io.github.azagniotov.metrics.reporter.cloudwatch.CloudWatchReporter] Error reporting metrics to CloudWatch. The data in this CloudWatch API request may have been discarded, did not make it to CloudWatch.
com.amazonaws.services.cloudwatch.model.InvalidParameterValueException: The value ? for parameter MetricData.member.3.Value is invalid.

According to blacklocus/metrics-cloudwatch#4, this occurs when the sample count is zero. In that implementation those metrics were filtered out. Does this reporter require a custom MetricFilter to do likewise, or could the same fix be incorporated?
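
For reference, the kind of filter I have in mind would look roughly like this (a sketch only; the class name is made up, and whether a filter alone is enough here is exactly my question):

import com.codahale.metrics.Counter;
import com.codahale.metrics.Metric;
import com.codahale.metrics.MetricFilter;

// Rough sketch of a workaround filter: skip counters that are still at zero so
// that no zero-count datum is staged for CloudWatch. Class name is hypothetical.
public class NonZeroCounterFilter implements MetricFilter {
    @Override
    public boolean matches(String name, Metric metric) {
        if (metric instanceof Counter) {
            return ((Counter) metric).getCount() != 0;
        }
        return true; // leave all other metric types untouched
    }
}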

@ben-manes (Author)

I think this happens because the zero filter is only applied to counters. The check should instead be done in stageMetricDatum by inspecting the metricValue. An example request that fails is below, followed by a sketch of the check I have in mind.

INFO  [2017-06-21 23:20:48,719] [io.github.azagniotov.metrics.reporter.cloudwatch.CloudWatchReporter] Dry run - constructed PutMetricDataRequest: {Namespace: LoadDocs/Server,MetricData: [{MetricName: HikariPool-1.pool.ActiveConnections,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:12:56 PDT 2017,Value: 0.0,Unit: None}, {MetricName: HikariPool-1.pool.IdleConnections,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:26 PDT 2017,Value: 30.0,Unit: None}, {MetricName: HikariPool-1.pool.PendingConnections,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:28 PDT 2017,Value: 0.0,Unit: None}, {MetricName: HikariPool-1.pool.TotalConnections,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 30.0,Unit: None}, {MetricName: PS-MarkSweep.count,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 3.0,Unit: None}, {MetricName: PS-MarkSweep.time,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 291.0,Unit: None}, {MetricName: PS-Scavenge.count,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 11.0,Unit: None}, {MetricName: PS-Scavenge.time,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 175.0,Unit: None}, {MetricName: blocked.count,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 0.0,Unit: None}, {MetricName: com.codahale.metrics.jvm.FileDescriptorRatioGauge,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 0.24365234375,Unit: None}, {MetricName: count,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 109.0,Unit: None}, {MetricName: daemon.count,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 53.0,Unit: None}, {MetricName: deadlock.count,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 0.0,Unit: None}, {MetricName: direct.capacity,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 2.69517087E8,Unit: None}, {MetricName: direct.count,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 38.0,Unit: None}, {MetricName: direct.used,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 2.69517088E8,Unit: None}, 
{MetricName: heap.committed,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 7.27711744E8,Unit: None}, {MetricName: heap.init,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 2.68435456E8,Unit: None}, {MetricName: heap.max,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 5.726797824E9,Unit: None}, {MetricName: heap.usage,Dimensions: [{Name: Region,Value: us-west-2}, {Name: , Environment,Value: 'dev'}, {Name: Type,Value: gauge}],Timestamp: Wed Jun 21 16:13:33 PDT 2017,Value: 0.09173450157405103,Unit: None}]}
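
As mentioned above, the guard I have in mind in stageMetricDatum would look roughly like this (the parameter list below is my guess, not the reporter's actual signature):

import java.util.Date;
import java.util.List;
import com.amazonaws.services.cloudwatch.model.MetricDatum;
import com.amazonaws.services.cloudwatch.model.StandardUnit;

// Sketch only: skip zero values before staging the datum, so CloudWatch never
// receives a value it rejects. The signature here is an assumption.
private void stageMetricDatum(final String metricName,
                              final double metricValue,
                              final StandardUnit standardUnit,
                              final List<MetricDatum> metricData) {
    if (metricValue != 0.0) {
        metricData.add(new MetricDatum()
                .withTimestamp(new Date())
                .withMetricName(metricName)
                .withValue(metricValue)
                .withUnit(standardUnit));
    }
}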

@azagniotov (Owner)

Hi, thank you for that. Yes, I saw that they have a fix for this. Could you open a PR?

@ben-manes (Author)

I switched to their library, but I still see the error, so it must not have been fully resolved.

Regarding this library, I'm a little concerned that it does the rollups itself instead of delegating that to the analysis side. Usually one exports monotonically increasing values, reports them into a time-series database, and computes rates on the dashboard. From a quick glance I'm not sure whether you're providing extra information or simply not reporting the raw counts.
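
To illustrate what I mean by exporting raw values (illustrative only, not this reporter's code):

import com.amazonaws.services.cloudwatch.model.MetricDatum;
import com.codahale.metrics.Counter;
import com.codahale.metrics.MetricRegistry;

// Illustration: report the monotonically increasing total and let the
// time-series backend / dashboard derive rates, instead of rolling up
// inside the reporter.
public class RawCounterExample {
    public static void main(String[] args) {
        MetricRegistry registry = new MetricRegistry();
        Counter requests = registry.counter("requests");
        requests.inc(42);

        MetricDatum datum = new MetricDatum()
                .withMetricName("requests")
                .withValue((double) requests.getCount()); // raw cumulative count, no rollup
        System.out.println(datum);
    }
}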

@azagniotov (Owner)

You are right. I just need to find the time to address this. Thank you for the input!

@azagniotov (Owner)

@ben-manes FYI, @williedoran has addressed an issue where histogram statistic sets were reported after being converted by a duration factor. Histograms are now reported as-is:
#6
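
Roughly speaking, the change boils down to this (a sketch based on the PR description, not the exact code from #6):

import com.codahale.metrics.Histogram;
import com.codahale.metrics.MetricRegistry;
import com.codahale.metrics.Snapshot;

// Sketch: previously histogram snapshot values were multiplied by a duration
// conversion factor (which only makes sense for timers); now the raw snapshot
// values are reported unchanged.
public class HistogramAsIsExample {
    public static void main(String[] args) {
        MetricRegistry registry = new MetricRegistry();
        Histogram histogram = registry.histogram("payload-size");
        histogram.update(512);

        Snapshot snapshot = histogram.getSnapshot();
        // old behaviour (wrong for histograms): snapshot.getMean() * durationFactor
        double mean = snapshot.getMean(); // new behaviour: reported as-is
        System.out.println(mean);
    }
}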

Thoughts?

@ben-manes (Author)

Sorry, I'm a bit too swamped to switch contexts and help out. But switching back is pretty quick, so I can give that a shot when you have a release.

@azagniotov (Owner)

Ben, I appreciate your help & time. I will ping you once it is live on Maven Central.

azagniotov added a commit that referenced this issue Aug 20, 2017
- Issue #4: Not reporting zero values.
- PR #6: Histogram snapshot values being converted using a duration factor, instead of reporting raw values (https://github.com/williedoran)
- Checking 'isDebugEnabled' when logging debug information
@azagniotov (Owner)

Hi @ben-manes, version 1.0.4 is live on Maven Central. Feel free to give it a go when you get time. Cheers!

@ben-manes (Author)

Thank you! :)

@azagniotov (Owner)

Actually, I just released 1.0.5. I think it should be on Maven in a few hours. Thanks!

@ben-manes (Author)

Sorry, I forgot about this. I reverted to my original code for using this library, with the latest version of course. It sounds like you fixed this, so I'll reopen if the issue persists.

Thanks!

@ben-manes (Author)

@azagniotov Not sure if this is related, but I'm seeing this strange error in the logs.

INFO  [2017-09-05 18:38:14,518] [com.amazonaws.http.DefaultErrorResponseHandler] Unable to parse HTTP response (Invocation Id:c8ef912c-35d9-197e-0788-dc703bbb1b6f) content to XML document ''
org.xml.sax.SAXParseException: Premature end of file.
	at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
	at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
	at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
	at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:121)
	at com.amazonaws.util.XpathUtils.documentFrom(XpathUtils.java:162)
	at com.amazonaws.util.XpathUtils.documentFrom(XpathUtils.java:169)
	at com.amazonaws.http.DefaultErrorResponseHandler.parseXml(DefaultErrorResponseHandler.java:124)
	at com.amazonaws.http.DefaultErrorResponseHandler.documentFromContent(DefaultErrorResponseHandler.java:105)
	at com.amazonaws.http.DefaultErrorResponseHandler.createAse(DefaultErrorResponseHandler.java:84)
	at com.amazonaws.http.DefaultErrorResponseHandler.handle(DefaultErrorResponseHandler.java:71)
	at com.amazonaws.http.DefaultErrorResponseHandler.handle(DefaultErrorResponseHandler.java:47)
	at com.amazonaws.http.AwsErrorResponseHandler.handleAse(AwsErrorResponseHandler.java:50)
	at com.amazonaws.http.AwsErrorResponseHandler.handle(AwsErrorResponseHandler.java:38)
	at com.amazonaws.http.AwsErrorResponseHandler.handle(AwsErrorResponseHandler.java:24)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleErrorResponse(AmazonHttpClient.java:1569)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1257)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1029)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:741)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:715)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:697)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:665)
	at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:647)
	at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:511)
	at com.amazonaws.services.cloudwatch.AmazonCloudWatchClient.doInvoke(AmazonCloudWatchClient.java:1320)
	at com.amazonaws.services.cloudwatch.AmazonCloudWatchClient.invoke(AmazonCloudWatchClient.java:1296)
	at com.amazonaws.services.cloudwatch.AmazonCloudWatchClient.executePutMetricData(AmazonCloudWatchClient.java:1204)
	at com.amazonaws.services.cloudwatch.AmazonCloudWatchAsyncClient$14.call(AmazonCloudWatchAsyncClient.java:774)
	at com.amazonaws.services.cloudwatch.AmazonCloudWatchAsyncClient$14.call(AmazonCloudWatchAsyncClient.java:768)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
