Writing the DynamoDB-specified maximum of 25 items per batch is very slow; however, the docs mention that using multiple threads for batch operations is fine.
Alternatively, the native `batch_writer` seems to be able to accept many more than 25 items at once. More experimentation is needed.
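A minimal sketch of the threaded approach, assuming items are plain dicts and each worker writes its own batch through boto3's `Table.batch_writer()` context manager. The chunking helper is pure Python and illustrative; `write_all`, its `workers` parameter, and the `table` argument are hypothetical names, and in real code each thread should create its own boto3 session/resource, since they are not thread-safe to share:

```python
from concurrent.futures import ThreadPoolExecutor

BATCH_SIZE = 25  # DynamoDB's BatchWriteItem limit per request


def chunk(items, size=BATCH_SIZE):
    """Split a list of items into DynamoDB-sized batches."""
    return [items[i:i + size] for i in range(0, len(items), size)]


def write_all(table, items, workers=4):
    """Hypothetical helper: fan batches out across threads."""
    def write_batch(batch):
        # batch_writer buffers puts and sends BatchWriteItem requests,
        # handling retries of unprocessed items for us.
        with table.batch_writer() as writer:
            for item in batch:
                writer.put_item(Item=item)

    with ThreadPoolExecutor(max_workers=workers) as pool:
        # list() forces the map so exceptions surface here.
        list(pool.map(write_batch, chunk(items)))
```

Whether this actually speeds things up depends on table write capacity; parallel writers can just trade client-side latency for throttling.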
As a follow-up: it looks like even if you pass `batch_writer` N > 25 items, it still processes them in chunks of 25. That said, I did (anecdotally) notice that data copying seemed to go slightly quicker, maybe a couple of minutes faster per chunk of ~7000 items from the AWS instance.
I didn't robustly profile the operation, though, so it could just be a fluke.
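The chunking behavior observed above can be illustrated with a toy simulation: boto3's batch writer buffers `put_item` calls and flushes a request whenever the buffer reaches the service limit (25, if I read the implementation correctly). The class below is a hypothetical stand-in for illustration, not the real `BatchWriter`:

```python
class FakeBatchWriter:
    """Toy model (not boto3's actual class): put_item buffers items
    and flushes one simulated BatchWriteItem request per 25 items."""
    FLUSH_AMOUNT = 25  # mirrors DynamoDB's per-request limit

    def __init__(self):
        self.buffer = []
        self.requests = []  # each entry simulates one BatchWriteItem call

    def put_item(self, item):
        self.buffer.append(item)
        if len(self.buffer) >= self.FLUSH_AMOUNT:
            self.flush()

    def flush(self):
        if self.buffer:
            self.requests.append(self.buffer)
            self.buffer = []


writer = FakeBatchWriter()
for i in range(60):
    writer.put_item({"id": i})
writer.flush()  # final partial batch, like the real context manager's exit
# 60 items still go out as requests of 25, 25, and 10 items
```

So passing N > 25 items mostly saves you the bookkeeping of chunking yourself; the number of requests on the wire stays the same, which would explain why any speedup is marginal.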