
Reimplement MaxBatchSize as a pre-check #27696

Merged
1 commit merged into dotnet:main on Mar 26, 2022

Conversation

@roji roji (Member) commented Mar 24, 2022

For better perf on SQLite (#27681)

Also moves the handling of MaxBatchSize to ReaderModificationCommandBatch and does some cleanup.
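As a rough illustration of the pre-check idea (a standalone sketch with hypothetical names — SizeLimitedBatch and its members are illustrative, not the actual ReaderModificationCommandBatch API): instead of appending a command and then validating the batch size, the batch refuses the command up front when the limit is reached, and the caller starts a new batch.

    using System.Collections.Generic;

    // Hypothetical stand-in for a modification command batch with a size limit.
    public class SizeLimitedBatch
    {
        private readonly List<string> _commands = new();

        public SizeLimitedBatch(int maxBatchSize) => MaxBatchSize = maxBatchSize;

        public int MaxBatchSize { get; }

        // Pre-check: refuse the command before any state changes,
        // rather than adding it and validating the batch afterwards.
        public bool TryAddCommand(string command)
        {
            if (_commands.Count >= MaxBatchSize)
            {
                return false; // caller flushes this batch and starts a new one
            }

            _commands.Add(command);
            return true;
        }
    }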


// Clamp the user-configured MaxBatchSize to the hard provider maximum (MaxMaxBatchSize).
_maxBatchSize = Math.Min(
    options.Extensions.OfType<SqlServerOptionsExtension>().FirstOrDefault()?.MaxBatchSize ?? DefaultMaxBatchSize,
    MaxMaxBatchSize);

@roji roji (Member, Author):

@AndriySvyryd we may want to start throwing if MaxMaxBatchSize is exceeded, rather than silently reducing the value (after all, the user explicitly asked for something else). That would be a pretty minor breaking change.

@AndriySvyryd (Member):

I wouldn't worry about that. With the current approach, if MaxMaxBatchSize changes, the user just needs to upgrade.

@roji roji (Member, Author):

My thought was to either allow users to explicitly go beyond the 1000 MaxMaxBatchSize (it's their responsibility...), or to throw if they try to do so, rather than silently reducing to 1000. But it's not very important.
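For illustration, the throw-instead-of-clamp variant described here could look something like the following; the exception type and message are placeholders, not actual EF Core behavior:

    var configuredMaxBatchSize =
        options.Extensions.OfType<SqlServerOptionsExtension>().FirstOrDefault()?.MaxBatchSize
        ?? DefaultMaxBatchSize;

    if (configuredMaxBatchSize > MaxMaxBatchSize)
    {
        // Fail loudly instead of silently clamping the user's explicit setting.
        throw new InvalidOperationException(
            $"The configured MaxBatchSize ({configuredMaxBatchSize}) exceeds the maximum supported value ({MaxMaxBatchSize}).");
    }

    _maxBatchSize = configuredMaxBatchSize;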

@roji roji marked this pull request as ready for review March 26, 2022 16:35
@roji roji requested a review from AndriySvyryd March 26, 2022 16:36
For better perf on SQLite (dotnet#27681)

Also moves the handling of MaxBatchSize to
ReaderModificationCommandBatch and does some cleanup.
ghost commented Mar 26, 2022

Hello @roji!

Because this pull request has the auto-merge label, I will be glad to help merge it once all check-in policies pass.

P.S. You can customize the way I help with merging this pull request, such as holding it until a specific person approves. Simply @mention me (@msftbot) and give me an instruction to get started!

ghost commented Mar 26, 2022

Apologies, while this PR appears ready to be merged, I've been configured to only merge when all checks have explicitly passed. The following integrations have not reported any progress on their checks and are blocking auto-merge:

  1. Azure Pipelines

These integrations are possibly never going to report a check, and unblocking auto-merge likely requires a human being to update my configuration to exempt these integrations from requiring a passing check.


@ghost ghost merged commit 97b9376 into dotnet:main Mar 26, 2022
@roji roji deleted the MaxBatchSize branch March 26, 2022 18:14