
streams: nil record returned by aws-sdk-go? #27

Closed
mumoshu opened this issue May 28, 2018 · 1 comment
Comments

@mumoshu
Collaborator

mumoshu commented May 28, 2018

I occasionally see interesting errors like this:

2018-05-27T13:49:36.652Z        DEBUG   [kinesis]       streams/client.go:56    mapped to records: [{
  Data: <binary> len 1191,
  PartitionKey: "/var/lib/docker/containers/cb718fd1316624b898befeb767dd22af9dd3db4fba34c8b8f8f169cd196c5d85/cb718fd1316624b898befeb767dd22af9dd3db4fba34c8b8f8f169cd196c5d85-json.log"
} {
<records omitted>
} {
  Data: <binary> len 969,
  PartitionKey: "/var/lib/docker/containers/03b7c50f36b202f66f3f08fe267fc6f913f12969da348d4d4e348a9865aaf2a7/03b7c50f36b202f66f3f08fe267fc6f913f12969da348d4d4e348a9865aaf2a7-json.log"
} {
  Data: <binary> len 1131,
  PartitionKey: "/var/lib/docker/containers/6dc39940d6815b9578af296b251ab0f2a53981f318b9703155efd672214f711f/6dc39940d6815b9578af296b251ab0f2a53981f318b9703155efd672214f711f-json.log"
} {
  Data: <binary> len 3367,
  PartitionKey: "/var/lib/docker/containers/beaada804640bfdfd5c5be5cc684d60044b27891e05577b1556af65fb27c805a/beaada804640bfdfd5c5be5cc684d60044b27891e05577b1556af65fb27c805a-json.log"
}]
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x8 pc=0x7efeeda62cd6]

goroutine 76 [running]:
github.com/s12v/awsbeats/streams.processFailedDeliveries(0xc42128f980, 0x155dbe0, 0xc420554280)
        /go/src/github.com/s12v/awsbeats/streams/client.go:129 +0xe6
github.com/s12v/awsbeats/streams.(*client).Publish(0xc4202bad80, 0x155dbe0, 0xc420554280, 0xc420198e40, 0xc42023bf78)
        /go/src/github.com/s12v/awsbeats/streams/client.go:63 +0x386
github.com/elastic/beats/libbeat/outputs.(*backoffClient).Publish(0xc42011d260, 0x155dbe0, 0xc420554280, 0x0, 0x0)
        /go/src/github.com/elastic/beats/libbeat/outputs/backoff.go:43 +0x4b
github.com/elastic/beats/libbeat/publisher/pipeline.(*netClientWorker).run(0xc420082800)
        /go/src/github.com/elastic/beats/libbeat/publisher/pipeline/output.go:90 +0x1a9
created by github.com/elastic/beats/libbeat/publisher/pipeline.makeClientWorker
        /go/src/github.com/elastic/beats/libbeat/publisher/pipeline/output.go:31 +0xf0

In theory, this happens only when some entry `r` in `records` (or its `ErrorCode` field) is nil when dereferenced in processFailedDeliveries:

		for i, r := range records {
			if *r.ErrorCode != "" {
				failedEvents = append(failedEvents, events[i])
			}
		}

I can hardly believe there's such a bug in aws-sdk-go, but to gather information about the issue, I'd like to handle nil here and emit some logs for the record.
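A minimal sketch of what a nil-guarded version of the loop above could look like. The `record` type here is a hypothetical stand-in for the SDK's `*kinesis.PutRecordsResultEntry` (whose `ErrorCode` is a `*string`), and plain `log.Printf` stands in for the beat's logger, just to keep the example self-contained:

```go
package main

import (
	"fmt"
	"log"
)

// record is a hypothetical stand-in for *kinesis.PutRecordsResultEntry:
// both the entry itself and its ErrorCode field may be nil.
type record struct {
	ErrorCode *string
}

// collectFailedIndices returns the indices of records that carry a non-empty
// error code, skipping (and logging) nil entries instead of dereferencing them.
func collectFailedIndices(records []*record) []int {
	var failed []int
	for i, r := range records {
		if r == nil {
			log.Printf("nil record at index %d: skipping", i)
			continue
		}
		if r.ErrorCode == nil {
			// No error code at all normally means the record was delivered.
			continue
		}
		if *r.ErrorCode != "" {
			failed = append(failed, i)
		}
	}
	return failed
}

func main() {
	throughputExceeded := "ProvisionedThroughputExceededException"
	empty := ""
	records := []*record{
		{ErrorCode: &empty},              // delivered
		nil,                              // the suspicious nil entry
		{ErrorCode: nil},                 // delivered, no error code
		{ErrorCode: &throughputExceeded}, // failed
	}
	fmt.Println(collectFailedIndices(records)) // [3]
}
```

With this shape, a nil entry produces a log line for later investigation instead of a SIGSEGV that takes down the whole worker goroutine.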

mumoshu added a commit to mumoshu/awsbeats that referenced this issue May 28, 2018
mumoshu added a commit that referenced this issue May 31, 2018
…e, plus Stream+Firehose+S3 support (#31)

* fix(streams): Handle nil record(s) returned by aws-sdk-go

Ref #27

* fix(doc,example): Filebeat docker example was referring to an invalid tag

* fix(streams): Fix panicking on a retry

Turns out awsbeats was calling `batch.RetryEvents` on different data and timing. Revise the implementation according to the [official beat outputs](https://github.com/elastic/beats/blob/c4af03c51373c1de7daaca660f5d21b3f602771c/libbeat/outputs/elasticsearch/client.go#L234)

Fixes #29

* Support for Kinesis DataStreams->Firehose->S3 pipelines

* Improve error propagation and not give up retrying too aggressively

* Add tests

* More fixes to actually trigger retry on failure. Tested with invalid AWS credential
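The retry fix above follows the pattern of the official beat outputs: after a publish attempt, either ACK the whole batch or hand only the failed events back via `RetryEvents`, never both, and only once the attempt has finished. A simplified sketch of that pattern, using hypothetical `event`/`batch` stand-ins for libbeat's `publisher.Event` and `publisher.Batch`:

```go
package main

import "fmt"

// event and batch are simplified stand-ins for libbeat's publisher.Event and
// publisher.Batch interfaces.
type event struct{ ID int }

type batch struct {
	events  []event
	acked   bool
	retried []event
}

func (b *batch) Events() []event        { return b.events }
func (b *batch) ACK()                   { b.acked = true }
func (b *batch) RetryEvents(ev []event) { b.retried = ev }

// publish mirrors the pattern in the official elasticsearch output: attempt
// delivery, then either ACK the whole batch or hand only the failed subset
// back for retry.
func publish(b *batch, send func([]event) []event) error {
	failed := send(b.Events())
	if len(failed) == 0 {
		b.ACK()
		return nil
	}
	b.RetryEvents(failed)
	return fmt.Errorf("%d events failed and will be retried", len(failed))
}

func main() {
	// A sender that fails every odd-numbered event.
	flaky := func(evs []event) []event {
		var failed []event
		for _, e := range evs {
			if e.ID%2 == 1 {
				failed = append(failed, e)
			}
		}
		return failed
	}
	b := &batch{events: []event{{0}, {1}, {2}, {3}}}
	if err := publish(b, flaky); err != nil {
		fmt.Println(err) // 2 events failed and will be retried
	}
}
```

Calling `RetryEvents` with different data, or before the attempt has settled, is what led to the panic tracked in #29.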
@mumoshu
Collaborator Author

mumoshu commented May 31, 2018

I believe this is addressed in #31. Closing until it reproduces with an actual problem.

@mumoshu mumoshu closed this as completed May 31, 2018