
feat(event processor): change batching to be compliant with Event Processor #262

Merged
merged 16 commits into master from jae/ep on Sep 7, 2019

Conversation

@jaeopt (Contributor) commented Sep 4, 2019

  • Add flush on {batchSize hit, revision change, project id change} (see the sketch after this list)
  • Clean up batching (batch as many events as possible instead of falling back to single-event mode when a batch failure is detected in an event group)
  • Support maxQueueSize (configurable)
  • Add many more unit tests to increase coverage
  • Event notifications are not included in this PR; they will be prepared in a separate PR.
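
A minimal sketch of the flush triggers described above, assuming hypothetical names (BatchEventProcessorSketch, UserEvent, process, flush, and the queue layout are illustrative, not the SDK's actual API):

// Sketch only: names and structure are illustrative, not the SDK's actual implementation.
struct UserEvent {                       // hypothetical stored-event payload
    let revision: String
    let projectId: String
}

final class BatchEventProcessorSketch {
    let batchSize = 10
    let maxQueueSize = 10_000            // configurable upper bound on queued events

    private var queue = [UserEvent]()
    private var currentRevision: String?
    private var currentProjectId: String?

    func process(_ event: UserEvent) {
        // Drop new events once the queue is full (maxQueueSize).
        guard queue.count < maxQueueSize else { return }

        // Flush pending events when the datafile revision or project id changes,
        // so a single batch never mixes events from different project configs.
        if event.revision != currentRevision || event.projectId != currentProjectId {
            flush()
            currentRevision = event.revision
            currentProjectId = event.projectId
        }

        queue.append(event)

        // Flush as soon as the batch size is reached.
        if queue.count >= batchSize {
            flush()
        }
    }

    func flush() {
        // Batch and dispatch the queued events; cleared here for brevity.
        queue.removeAll()
    }
}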

@coveralls commented Sep 4, 2019

Coverage Status

Coverage decreased (-2.2%) to 96.837% when pulling a9a22dc on jae/ep into 578f987 on master.

@jaeopt closed this Sep 4, 2019
@jaeopt reopened this Sep 4, 2019
@thomaszurkan-optimizely (Contributor) left a comment


It looks good. A few small nits. Another nit: I think "fix" is the wrong commit-type prefix, since this was more of a feature to make it compliant with Event Processor.

    unsubscribe()
}

func addProjectChangeNotificationObservers() {
    NotificationCenter.default.addObserver(forName: .didReceiveProjectIdChange, object: nil, queue: nil) { (notif) in
A contributor commented:

Can we change these to didReceiveOptlyProjectIdChange? Or something with Optimizely in there?
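
For example, a namespaced notification could be declared roughly like this (the names here are assumptions, not necessarily what the SDK settled on):

import Foundation

extension Notification.Name {
    // Hypothetical prefixed identifier to avoid clashing with other libraries' notifications.
    static let didReceiveOptimizelyProjectIdChange =
        Notification.Name("OptimizelyDidReceiveProjectIdChange")
}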

dataStore.save(item: event)

setTimer()
if dataStore.count == batchSize {
A contributor commented:

Greater than or equal would probably be safer.
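
That is, roughly (the flush() call is just a placeholder for whatever the existing batch path does):

if dataStore.count >= batchSize {
    flush()   // placeholder for the existing batch/dispatch path
}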


guard let batchEvent = batched else {
// discard events that create invalid batch and continue
removeStoredEvents(num: numEvents)
A contributor commented:

I'm not sure this makes sense. You are saying that if batching failed, we remove all the events that were attempted to be batched?

@jaeopt (Author) replied:

In fact, it discards only the single invalid event when one is detected. The comment is misleading. What happens (sketched below):

  • if an invalid event is found while batching, it batches all the valid ones before the invalid one and sends them out.
  • on the next batching attempt, it finds the invalid event at the head of the queue. It discards that specific invalid event and continues batching the next ones.

I'll fix the comments to avoid confusion.
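
A rough sketch of that flow, with illustrative names (StoredEvent, batch(_:), send(_:), and removeStoredEvents(num:) are assumptions made for the example):

// Sketch only: illustrates the described discard-and-continue behavior.
struct StoredEvent {                     // hypothetical queued event
    let isValid: Bool
}

final class BatchFlushSketch {
    var storedEvents = [StoredEvent]()

    // Returns the leading run of valid events as one batch, or nil (with numEvents == 1)
    // when the event at the head of the queue is itself invalid.
    func batch(_ events: [StoredEvent]) -> (batched: [StoredEvent]?, numEvents: Int) {
        guard let first = events.first else { return (nil, 0) }
        guard first.isValid else { return (nil, 1) }
        let valid = events.prefix { $0.isValid }
        return (Array(valid), valid.count)
    }

    func flushQueuedEvents() {
        while !storedEvents.isEmpty {
            let (batched, numEvents) = batch(storedEvents)
            guard let batchEvent = batched else {
                // Invalid event at the head: discard just that one and continue.
                removeStoredEvents(num: numEvents)
                continue
            }
            send(batchEvent)
            removeStoredEvents(num: numEvents)
        }
    }

    func send(_ events: [StoredEvent]) { /* dispatch the batch to the event endpoint */ }
    func removeStoredEvents(num: Int) { storedEvents.removeFirst(num) }
}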

@thomaszurkan-optimizely (Contributor) commented Sep 5, 2019:

Ok, I got it. But numEvents should always be 1, right? Can you call that out somehow? You could have passed 1 in here to say "remove the offending one", but with numEvents it's hidden unless you look at the batch function and figure out where it passes back nil. Guard numEvents == 1?
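
Something along these lines at the call site would make the invariant explicit (a fragment only; batched, numEvents, and removeStoredEvents come from the surrounding code shown above):

guard let batchEvent = batched else {
    // batch() reports failure only for the single invalid event at the head of the queue.
    assert(numEvents == 1)
    removeStoredEvents(num: numEvents)
    continue
}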

@jaeopt changed the title from "fix(event processor): change batching to be compliant with Event Processor" to "feat(event processor): change batching to be compliant with Event Processor" on Sep 6, 2019
@thomaszurkan-optimizely (Contributor) left a comment

Ship it! :)

@jaeopt closed this Sep 7, 2019
@jaeopt reopened this Sep 7, 2019
@jaeopt merged commit 7204c06 into master Sep 7, 2019
@jaeopt deleted the jae/ep branch September 7, 2019 01:41