
Add batching to Graphite inputs #2683

Merged: otoolep merged 6 commits into master from batch_integration on May 28, 2015

Conversation

otoolep (Contributor) commented on May 28, 2015

This change integrates the new PointBatcher with the Graphite inputs. Most of the work involves passing batching configuration parameters to the Graphite inputs and having those same inputs use the batcher.

There was also one change made to the batcher itself. Previously client code had to pass in channels for sending and receiving points (and point-batches). Now the batcher creates those same channels itself, and returns them. That can be seen in this commit: a9ba189

All Graphite inputs come up with a batch size of 1 by default. This gives the lowest performance but the maximum safety with regard to data loss.

Notes from the included commits: a batch size of 0 means send immediately, the same as a batch size of 1. Having the batcher create and return its own channels is more useful for client code and is a more common Go pattern; as a result, the batching goroutine is launched within Start.
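
To make the change easier to picture, here is a minimal sketch of the pattern described above: the batcher creates its own input and output channels and hands them back through accessors, Start launches the batching goroutine internally, and a batch size of 0 is treated the same as 1. The method names, the Point type, and the whole implementation are illustrative assumptions, not the actual PointBatcher code in this PR.

package main

import (
	"fmt"
	"time"
)

// Point is a stand-in for a parsed Graphite point; illustrative only.
type Point struct {
	Name  string
	Value float64
}

// PointBatcher sketches the batcher described above: it owns its input and
// output channels and exposes them via In() and Out().
type PointBatcher struct {
	size     int
	duration time.Duration
	in       chan Point
	out      chan []Point
	stop     chan struct{}
}

func NewPointBatcher(size int, d time.Duration) *PointBatcher {
	return &PointBatcher{
		size:     size,
		duration: d,
		in:       make(chan Point),
		out:      make(chan []Point),
		stop:     make(chan struct{}),
	}
}

// In and Out return the channels the batcher created itself, so client code
// no longer has to construct and pass them in.
func (b *PointBatcher) In() chan<- Point    { return b.in }
func (b *PointBatcher) Out() <-chan []Point { return b.out }

// Start launches the batching goroutine internally.
func (b *PointBatcher) Start() {
	size := b.size
	if size == 0 {
		size = 1 // a batch size of 0 means send immediately, like a size of 1
	}
	go func() {
		var batch []Point
		timer := time.NewTimer(b.duration)
		for {
			select {
			case p := <-b.in:
				batch = append(batch, p)
				if len(batch) >= size {
					b.out <- batch
					batch = nil
				}
			case <-timer.C:
				// Time-based flush so small batches do not sit forever.
				if len(batch) > 0 {
					b.out <- batch
					batch = nil
				}
				timer.Reset(b.duration)
			case <-b.stop:
				return
			}
		}
	}()
}

func (b *PointBatcher) Stop() { close(b.stop) }

func main() {
	b := NewPointBatcher(2, time.Second)
	b.Start()
	go func() {
		b.In() <- Point{Name: "cpu", Value: 1}
		b.In() <- Point{Name: "cpu", Value: 2}
	}()
	fmt.Println(<-b.Out()) // a batch of two points
	b.Stop()
}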
otoolep changed the title from "Graphite batch config control" to "Add batching to Graphite inputs" on May 28, 2015
otoolep (Contributor, Author) commented on May 28, 2015

@corylanou

for {
	// Read up to the next newline.
	buf, err := reader.ReadBytes('\n')
	if err != nil {
		close(done)
A contributor commented on this code:

Do we need some type of flusher on the batcher so that when we shut down we can ensure it drains the batch before we exit?

corylanou (Contributor) commented:

This looks good. My only concern is that we don't drain a batch on shutdown, which could mean we lose data that was in flight. We could always add that later as well if we don't want to support that now.
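
For illustration, here is a minimal, self-contained sketch of the drain-on-shutdown idea being discussed. The batchUntilDone function and its names are hypothetical, not the flusher that was later added to the batcher; the point is simply that when the shutdown signal arrives, any partially filled batch is emitted before the goroutine exits, so in-flight points are not lost.

package main

import "fmt"

// batchUntilDone groups incoming Graphite lines into batches of up to size,
// and on shutdown emits whatever is still buffered instead of dropping it.
func batchUntilDone(in <-chan string, done <-chan struct{}, size int) <-chan []string {
	out := make(chan []string, 1)
	go func() {
		defer close(out)
		var batch []string
		for {
			select {
			case p := <-in:
				batch = append(batch, p)
				if len(batch) >= size {
					out <- batch
					batch = nil
				}
			case <-done:
				// Drain: flush the partial batch so no in-flight points are lost.
				if len(batch) > 0 {
					out <- batch
				}
				return
			}
		}
	}()
	return out
}

func main() {
	in := make(chan string)
	done := make(chan struct{})
	out := batchUntilDone(in, done, 10)

	in <- "servers.host1.cpu 42 1432852000"
	in <- "servers.host1.cpu 43 1432852010"
	close(done) // simulate shutdown while a partial batch is still buffered

	for batch := range out {
		fmt.Println("flushed", len(batch), "points on shutdown")
	}
}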

otoolep (Contributor, Author) commented on May 28, 2015

Thanks @corylanou -- I agree that is a good idea. Let me get on that immediately after merging this, OK?

corylanou (Contributor) commented:

+1 🚢

otoolep added a commit that referenced this pull request on May 28, 2015
otoolep merged commit 69d7ec2 into master on May 28, 2015
otoolep deleted the batch_integration branch on May 28, 2015 at 22:22