I have set up "Detailed monitoring" on an AWS RDS instance, and it is logging to CloudWatch Logs.
After setting up the Kinesis filter stream, this is what I see in Graylog:
2016-11-07 18:54:06,894 WARN : org.graylog.aws.inputs.flowlogs.FlowLogCodec - Received FlowLog message with not exactly 15 fields. Skipping. Message was: [14785344844000 {"engine":"Postgres","instanceID":null,"instanceResourceID":"db-...","timestamp":"2016-11-07T18:54:04Z","version":1.00,"uptime":"23:53:00","numVCPUs":4,"cpuUtilization":
Is there some kind of limitation? Is 15 fields some kind of magic number?
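For what it's worth, 15 does look like a deliberate check: the warning comes from `FlowLogCodec`, which is Graylog's parser for AWS VPC Flow Logs, not for RDS Enhanced Monitoring. A default-format VPC Flow Log record has 14 whitespace-separated fields, and the Kinesis record shown above carries a prepended timestamp, which would make 15 tokens. The sketch below is an assumption about what the codec does (split on whitespace, require exactly 15 tokens), not Graylog's actual code; the sample values are illustrative:

```python
# Rough sketch of the field-count check implied by the warning above.
# Assumption: the FlowLog codec splits each record on whitespace and
# expects a default-format VPC Flow Log record -- 14 fields (version,
# account-id, interface-id, srcaddr, dstaddr, srcport, dstport,
# protocol, packets, bytes, start, end, action, log-status) plus the
# prepended arrival timestamp, i.e. 15 tokens in total.

EXPECTED_FIELDS = 15

# A well-formed VPC Flow Log record (illustrative values).
flow_log = ("1478534484000 2 123456789012 eni-abc123de 172.31.16.139 "
            "172.31.16.21 20641 22 6 20 4249 1418530010 1418530070 "
            "ACCEPT OK")

# An RDS Enhanced Monitoring record like the one in the warning:
# a timestamp followed by one JSON blob.
rds_json = ('1478534484000 {"engine":"Postgres","instanceID":null,'
            '"uptime":"23:53:00","numVCPUs":4}')

def parses_as_flow_log(message: str) -> bool:
    # JSON payloads almost never split into exactly 15 tokens,
    # so the codec would skip them.
    return len(message.split()) == EXPECTED_FIELDS

print(parses_as_flow_log(flow_log))  # True
print(parses_as_flow_log(rds_json))  # False
```

If that assumption holds, the limitation is that this input only understands VPC Flow Log records, so RDS Enhanced Monitoring JSON would need a different input or codec.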