Re-add chunked transaction batching support #254

Merged
merged 3 commits into petabridge:dev on Sep 27, 2022

Conversation

Arkatufus
Contributor

@Arkatufus Arkatufus commented Sep 26, 2022

Fixes #251

Re-add transaction batching code.

NOTE

This means that on a very high-load system, atomic writes are NOT atomic anymore: since transactions have to be chunked into batches of at most 100 entries, a partial write can happen if an exception is thrown mid-write.
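
For readers unfamiliar with the limitation, here is a minimal sketch of what the chunking could look like, assuming an extension method over Azure.Data.Tables' TableClient. Only the 100-entry limit and the ExecuteBatchAsLimitedBatches name come from this PR; the class name, signature, and loop below are illustrative assumptions, not the actual implementation.

using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Azure;
using Azure.Data.Tables;

public static class TableClientBatchExtensions
{
    // Azure Table Storage rejects transactions containing more than 100 entities.
    private const int MaxBatchSize = 100;

    public static async Task<List<Response>> ExecuteBatchAsLimitedBatches(
        this TableClient table,
        IReadOnlyList<TableTransactionAction> actions,
        CancellationToken token)
    {
        var responses = new List<Response>();

        // Submit the actions in chunks of at most 100. Each chunk is its own
        // transaction, so a failure part-way through leaves earlier chunks
        // committed (the loss of atomicity described in the NOTE above).
        for (var i = 0; i < actions.Count; i += MaxBatchSize)
        {
            var chunk = new List<TableTransactionAction>();
            for (var j = i; j < Math.Min(i + MaxBatchSize, actions.Count); j++)
                chunk.Add(actions[j]);

            var result = await table.SubmitTransactionAsync(chunk, token);
            responses.AddRange(result.Value);
        }

        return responses;
    }
}

Returning one Response per submitted action would be consistent with the test further down, where submitting 505 entries yields a result count of 505.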

Member

@Aaronontheweb Aaronontheweb left a comment


LGTM, but need to add comments in a new PR


var entities = await _tableClient.QueryAsync<TableEntity>("PartitionKey eq 'test'", null, null, cts.Token)
.ToListAsync(cts.Token);
entities.Count.Should().Be(0);
Member


LGTM


var entities = await _tableClient.QueryAsync<TableEntity>("PartitionKey eq 'test'", null, null, cts.Token)
.ToListAsync(cts.Token);
entities.Count.Should().Be(50);
Member


LGTM


using var cts = new CancellationTokenSource(3.Seconds());
var result = await _tableClient.ExecuteBatchAsLimitedBatches(entries, cts.Token);
result.Count.Should().Be(505);
Member


LGTM

@@ -365,10 +370,10 @@ protected override async Task<IImmutableList<Exception>> WriteMessagesAsync(IEnu
if (_log.IsDebugEnabled && _settings.VerboseLogging)
_log.Debug("Attempting to write batch of {0} messages to Azure storage", batchItems.Count);

-            var response = await Table.SubmitTransactionAsync(batchItems, _shutdownCts.Token);
+            var response = await Table.ExecuteBatchAsLimitedBatches(batchItems, _shutdownCts.Token);
Member


Need to explain somewhere why we call this method and what the historical reasons were, so that we don't accidentally remove this method again in the future. i.e. this is an Azure Table Storage API limitation we're working around, and we know this breaks atomicity for batches over 100 entries, but we can't do anything about that.
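
One possible shape for such a comment (illustrative wording only, not text that was actually added to the repository):

/// <summary>
/// Works around the Azure Table Storage limit of 100 entities per transaction by
/// splitting the supplied actions into chunks of at most 100 and submitting each
/// chunk as its own transaction. NOTE: batches larger than 100 entries are therefore
/// not atomic; a failure mid-way can leave earlier chunks committed. Do not replace
/// this with a single SubmitTransactionAsync call.
/// </summary>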

@Aaronontheweb Aaronontheweb merged commit ac47852 into petabridge:dev Sep 27, 2022