[BUG] ProvisionedThroughputExceededException when performing VACUUM using S3DynamoDBLogStore #3423

@scottsand-db

Description

Bug

Which Delta project/connector is this regarding?

  • Spark
  • Standalone
  • Flink
  • Kernel
  • Other (fill in here)

Describe the problem

Steps to reproduce

Perform VACUUM on a table configured to use the S3DynamoDBLogStore
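
A minimal reproduction sketch in Spark/Scala, following the documented multi-cluster LogStore setup; the bucket path s3a://my-bucket/tables/events, the DynamoDB table name delta_log, and the region are placeholders:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("vacuum-ddb-repro")
  // Enable Delta SQL support (needed for the VACUUM statement below).
  .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  // Route s3a:// paths through the multi-cluster S3DynamoDBLogStore.
  .config("spark.delta.logStore.s3a.impl", "io.delta.storage.S3DynamoDBLogStore")
  .config("spark.io.delta.storage.S3DynamoDBLogStore.ddb.tableName", "delta_log")
  .config("spark.io.delta.storage.S3DynamoDBLogStore.ddb.region", "us-east-1")
  .getOrCreate()

// VACUUM lists the _delta_log, which on this LogStore reads the DynamoDB table.
spark.sql("VACUUM delta.`s3a://my-bucket/tables/events` RETAIN 168 HOURS")
```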

Observed results

The VACUUM job fails with a ProvisionedThroughputExceededException from DynamoDB. The relevant stack frames:

at io.delta.storage.S3DynamoDBLogStore.getLatestExternalEntry(S3DynamoDBLogStore.java:185)
at io.delta.storage.BaseExternalLogStore.listFrom(BaseExternalLogStore.java:113)

Each listFrom() on the Delta log triggers a getLatestExternalEntry() read against the DynamoDB table, so a VACUUM, which lists the log repeatedly, appears to exhaust the table's provisioned read capacity.

Expected results

VACUUM should complete without triggering a ProvisionedThroughputExceededException on the DynamoDB table.
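
As a capacity workaround (not a fix), the DynamoDB table's read capacity can be raised. A sketch assuming the LogStore creates the table itself: the rcu/wcu settings below only take effect at table-creation time (the documented defaults are 5/5), so for an existing table the capacity would instead have to be raised, or billing switched to on-demand, on the DynamoDB side:

```scala
import org.apache.spark.sql.SparkSession

// Workaround sketch, not a fix: these settings only apply when
// S3DynamoDBLogStore creates the DynamoDB table; an existing table keeps
// whatever capacity it was created with.
val spark = SparkSession.builder()
  .config("spark.io.delta.storage.S3DynamoDBLogStore.provisionedThroughput.rcu", "50")
  .config("spark.io.delta.storage.S3DynamoDBLogStore.provisionedThroughput.wcu", "50")
  // ... same LogStore configuration as in the reproduction sketch above ...
  .getOrCreate()
```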

Environment information

  • Delta Lake version: 3.2.0
  • Spark version: 3.5.1
  • Scala version: 2.12

Willingness to contribute

The Delta Lake Community encourages bug fix contributions. Would you or another member of your organization be willing to contribute a fix for this bug to the Delta Lake code base?

  • Yes. I can contribute a fix for this bug independently.
  • Yes. I would be willing to contribute a fix for this bug with guidance from the Delta Lake community.
  • No. I cannot contribute a bug fix at this time.

Labels

bug (Something isn't working)
