## Parent Issue

Part of #124 (support partitioned table)

Depends on #126, #127, #128, #129
## Background

Currently, the only integration test (`read_log_tables.rs`) covers reading a non-partitioned, non-primary-key append-only table. We need integration tests specifically for partitioned tables.
## What needs to be done

- Prepare test fixtures
  - Create a partitioned-table warehouse fixture (or provide instructions to generate one using Java Paimon / Flink)
  - Tables should cover:
    - A single partition key (e.g., `dt STRING`)
    - Multiple partition keys (e.g., `dt STRING, hr INT`)
    - Different partition key types (String, Int, Date)
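Until a real fixture generated with Java Paimon / Flink is checked in, the expected on-disk shape can be sketched directly. This is a hedged sketch, not crate code: it only creates the Hive-style `key=value` partition directories and `bucket-N` subdirectories Paimon uses, with illustrative database, table, and partition names, and writes no actual data or manifest files.

```rust
use std::fs;
use std::path::Path;

/// Create the directory skeleton of a partitioned-table warehouse fixture:
/// Hive-style `key=value` partition directories, each holding `bucket-N`
/// subdirectories. All names here are illustrative.
fn create_fixture_layout(warehouse: &Path) -> std::io::Result<()> {
    let table = warehouse.join("default.db").join("partitioned_append");
    for dt in ["dt=2024-01-01", "dt=2024-01-02"] {
        for bucket in ["bucket-0", "bucket-1"] {
            fs::create_dir_all(table.join(dt).join(bucket))?;
        }
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    let warehouse = std::env::temp_dir().join("paimon_fixture_demo");
    create_fixture_layout(&warehouse)?;
    let bucket_dir = warehouse.join("default.db/partitioned_append/dt=2024-01-01/bucket-0");
    println!("{}", bucket_dir.is_dir()); // prints "true"
    Ok(())
}
```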
- Integration tests
  - Read a single-partition-key table and verify that all partitions are returned with correct data
  - Read a multi-partition-key table
  - Verify that partition column values are correct in the returned RecordBatches
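The partition-value check in the last bullet reduces to one small helper. A minimal sketch, assuming the test has already flattened the returned RecordBatches into plain rows; the `Row` struct below is a hypothetical stand-in for that flattened form, not a type in the crate.

```rust
use std::collections::BTreeSet;

/// Hypothetical flattened form of one row read back from a
/// `dt STRING, hr INT` partitioned table; in the real test these values
/// would be extracted from the Arrow RecordBatches returned by the scan.
struct Row {
    dt: String,
    hr: i32,
}

/// Collect the distinct (dt, hr) partition values present in the scan
/// result, so the test can compare them against the partitions it wrote.
fn distinct_partitions(rows: &[Row]) -> BTreeSet<(String, i32)> {
    rows.iter().map(|r| (r.dt.clone(), r.hr)).collect()
}

fn main() {
    let rows = vec![
        Row { dt: "2024-01-01".into(), hr: 0 },
        Row { dt: "2024-01-01".into(), hr: 0 },
        Row { dt: "2024-01-01".into(), hr: 1 },
    ];
    let expected: BTreeSet<_> = [("2024-01-01".to_string(), 0), ("2024-01-01".to_string(), 1)]
        .into_iter()
        .collect();
    assert_eq!(distinct_partitions(&rows), expected);
    println!("partition values verified");
}
```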
- Unit tests
  - Test `TableScan.plan()` on a partitioned table: verify that each `DataSplit` has the correct `bucket_path` with partition segments
  - Test that the number of splits matches the expected `(partition, bucket)` groups
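The `bucket_path` assertion can be written against a small helper that builds the path the test expects. This is a sketch under the assumption that the path is the table root plus one Hive-style `key=value` segment per partition column, then `bucket-N`; the helper itself is hypothetical, not part of the crate.

```rust
/// Build the bucket path the unit test expects for a given split: table root,
/// one `key=value` segment per partition column, then the bucket directory.
fn expected_bucket_path(table_root: &str, partition: &[(&str, &str)], bucket: u32) -> String {
    let mut path = table_root.trim_end_matches('/').to_string();
    for (key, value) in partition {
        path.push_str(&format!("/{key}={value}"));
    }
    path.push_str(&format!("/bucket-{bucket}"));
    path
}

fn main() {
    let path = expected_bucket_path(
        "/warehouse/default.db/t",
        &[("dt", "2024-01-01"), ("hr", "3")],
        0,
    );
    assert_eq!(path, "/warehouse/default.db/t/dt=2024-01-01/hr=3/bucket-0");
    // The split-count check follows the same idea: the number of DataSplits
    // returned by plan() should equal the number of distinct
    // (partition, bucket) pairs that were written.
    println!("{path}");
}
```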
## Affected files

- `crates/integration_tests/tests/` — new test file(s)
- `crates/paimon/tests/fixtures/` — possible new fixtures