
fix: Start from earliest slice for downscaling scenario #1012

Merged (6 commits into main, Oct 5, 2023)

Conversation

patriknw
Member

  • start from earliest slice when projection key is changed

New iteration of #995 and #1004

Contributor

@pvlugter pvlugter left a comment


Looks good. The earliest offset will always cover it; it may just be further back than necessary.

The test environment no longer exists; should we set up a new one for testing these changes?

```scala
// offset of the earliest slice
val latestBySlice = newState.latestBySlice
val earliest = latestBySlice.minBy(_.timestamp).timestamp
Some(TimestampOffset(earliest, Map.empty))
```
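The quoted snippet depends on the projection's internal state, so here is a minimal self-contained sketch of the same idea; `Record`, `TimestampOffset`, and the shape of `latestBySlice` are assumptions standing in for the actual internal types, not the real API.

```scala
import java.time.Instant

// Hypothetical stand-ins for the projection's internal state types.
final case class Record(slice: Int, pid: String, timestamp: Instant)
final case class TimestampOffset(timestamp: Instant, seen: Map[String, Long])

// When the projection key changes (e.g. after downscaling), restart from the
// timestamp of the earliest slice, so no slice's events can be skipped. This
// may replay already-processed events, which are deduplicated downstream.
def earliestOffset(latestBySlice: Seq[Record]): Option[TimestampOffset] =
  if (latestBySlice.isEmpty) None
  else {
    val earliest = latestBySlice.minBy(_.timestamp).timestamp
    Some(TimestampOffset(earliest, Map.empty))
  }
```

`minBy(_.timestamp)` resolves because `Instant` is `Comparable`, so Scala derives an `Ordering` for it.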
Contributor

For the latest offset, the seen map is populated. When I was experimenting with the earliest or latest-by-split offsets, I also ended up creating the seen map from byPid for the selected offset here. Shouldn't affect behaviour when eventually deduplicated, but I think at least the metrics were confused (inactive projections had growing consumer lag metrics).
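The reconstruction described here can be sketched as follows; `PidRecord`, the `byPid` map, and the sequence-number field are hypothetical names for illustration, not the projection's actual internals.

```scala
import java.time.Instant

// Hypothetical stand-in for a per-persistence-id offset record.
final case class PidRecord(pid: String, seqNr: Long, timestamp: Instant)

// Build the `seen` map for a selected offset timestamp from the by-pid
// records: include every persistence id whose latest record has exactly that
// timestamp, so events at the offset timestamp are treated as already seen
// (keeping consumer-lag metrics from growing for inactive projections).
def seenForOffset(byPid: Map[String, PidRecord], offsetTimestamp: Instant): Map[String, Long] =
  byPid.collect {
    case (pid, rec) if rec.timestamp == offsetTimestamp => pid -> rec.seqNr
  }
```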

Member Author

I don't think it's important to reconstruct it in this case, but I added the obvious one from the earliest record. 4ab18c3

@patriknw
Member Author

patriknw commented Oct 4, 2023

@pvlugter @johanandren Shall we go with this? So that we include it in our testing?

Member

@johanandren johanandren left a comment


LGTM

Contributor

@pvlugter pvlugter left a comment


Sounds good to have it in testing. I'll look to recreate the bigger tests that we've been running before the final release.

@patriknw patriknw merged commit 799d472 into main Oct 5, 2023
26 checks passed
@patriknw patriknw deleted the wip-downscaling2-patriknw branch October 5, 2023 06:19
@patriknw patriknw added this to the 1.5.0-M5 milestone Oct 5, 2023