Skip Ordered Append when only 1 child node is present #5547

Merged: 1 commit merged on Apr 12, 2023
17 changes: 9 additions & 8 deletions CHANGELOG.md
@@ -9,29 +9,30 @@ accidentally triggering the load of a previous DB version.**
**Features**
* #5212 Allow pushdown of reference table joins
* #5221 Improve Realtime Continuous Aggregate performance
* #5312 Add timeout support to the ping_data_node()
* #5361 Add parallel support for partialize_agg()
* #5252 Improve unique constraint support on compressed hypertables
* #5312 Add timeout support to ping_data_node()
* #5454 Add support for ON CONFLICT DO UPDATE for compressed hypertables
* #5312 Add timeout support to the ping_data_node()
* #5339 Support UPDATE/DELETE on compressed hypertables
* #5344 Enable JOINS for Hierarchical Continuous Aggregates
* #5361 Add parallel support for partialize_agg()
* #5417 Refactor and optimize distributed COPY
* #5339 Support UPDATE/DELETE on compressed hypertables
* #5454 Add support for ON CONFLICT DO UPDATE for compressed hypertables
* #5547 Skip Ordered Append when only 1 child node is present

**Bugfixes**
* #5233 Out of on_proc_exit slots on guc license change
* #5396 Fix SEGMENTBY columns predicates to be pushed down
* #5410 Fix file trailer handling in the COPY fetcher
* #5233 Out of on_proc_exit slots on guc license change
* #5427 Handle user-defined FDW options properly
* #5428 Use consistent snapshots when scanning metadata
* #5442 Decompression may have lost DEFAULT values
* #5446 Add checks for malloc failure in libpq calls
* #5470 Ensure superuser perms during copy/move chunk
* #5459 Fix issue creating dimensional constraints
* #5499 Do not segfault on large histogram() parameters
* #5462 Fix segfault after column drop on compressed table
* #5470 Ensure superuser perms during copy/move chunk
* #5497 Allow named time_bucket arguments in Cagg definition
* #5499 Do not segfault on large histogram() parameters
* #5500 Fix when no FROM clause in continuous aggregate definition
* #5462 Fix segfault after column drop on compressed table

**Thanks**
* @nikolaps for reporting an issue with the COPY fetcher
19 changes: 19 additions & 0 deletions src/planner/planner.c
@@ -899,6 +899,25 @@ should_chunk_append(Hypertable *ht, PlannerInfo *root, RelOptInfo *rel, Path *pa
if (ht && ts_chunk_get_osm_chunk_id(ht->fd.id) != INVALID_CHUNK_ID)
return false;

/*
* If we only have 1 child node there is no need for the
* ordered append optimization. We might still benefit from
* a ChunkAppend node here due to runtime chunk exclusion
* when we have non-immutable constraints.
*/
if (list_length(merge->subpaths) == 1)
{
foreach (lc, rel->baserestrictinfo)
{
RestrictInfo *rinfo = (RestrictInfo *) lfirst(lc);

if (contain_mutable_functions((Node *) rinfo->clause) ||
ts_contain_param((Node *) rinfo->clause))
return true;
}
return false;
}

/*
* Check for partial compressed chunks.
*
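The intent of the new check can be seen with EXPLAIN on a hypertable where only one chunk survives planning-time exclusion. The sketch below is illustrative only and is not part of this PR: the conditions table, its columns, and the sample data are hypothetical, and exact plan shapes depend on the PostgreSQL version and planner settings.

-- Hypothetical single-chunk hypertable (names chosen for illustration).
CREATE TABLE conditions (time timestamptz NOT NULL, device int, temp float);
SELECT create_hypertable('conditions', 'time');
INSERT INTO conditions
SELECT t, 1, 20.0
FROM generate_series('2023-01-01'::timestamptz,
                     '2023-01-01 06:00'::timestamptz,
                     interval '1 hour') t;

-- Constant predicate: only one child node remains and nothing can be
-- excluded at runtime, so the ordered-append ChunkAppend wrapper can be
-- skipped in favor of a plain (backward) index scan on the chunk.
EXPLAIN (COSTS OFF)
SELECT * FROM conditions WHERE time > '2023-01-01' ORDER BY time;

-- Non-immutable predicate: now() is evaluated at execution time, so a
-- ChunkAppend node may still be kept even with a single child, because
-- runtime chunk exclusion can prune the chunk later.
EXPLAIN (COSTS OFF)
SELECT * FROM conditions WHERE time > now() - interval '1 week' ORDER BY time;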
46 changes: 19 additions & 27 deletions test/expected/agg_bookends-12.out
@@ -51,36 +51,30 @@ SELECT setting, current_setting(setting) AS value from (VALUES ('timescaledb.ena
(1 row)

:PREFIX SELECT time, gp, temp FROM btest ORDER BY time;
QUERY PLAN
-------------------------------------------------------------------------------------------------------------
Custom Scan (ChunkAppend) on btest (actual rows=6 loops=1)
Order: btest."time"
-> Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=6 loops=1)
(3 rows)
QUERY PLAN
-------------------------------------------------------------------------------------------------------
Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=6 loops=1)
(1 row)

:PREFIX SELECT last(temp, time) FROM btest;
QUERY PLAN
------------------------------------------------------------------------------------------------------------------
QUERY PLAN
------------------------------------------------------------------------------------------------------------
Result (actual rows=1 loops=1)
InitPlan 1 (returns $0)
-> Limit (actual rows=1 loops=1)
-> Custom Scan (ChunkAppend) on btest (actual rows=1 loops=1)
Order: btest."time" DESC
-> Index Scan using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(7 rows)
-> Index Scan using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(5 rows)

:PREFIX SELECT first(temp, time) FROM btest;
QUERY PLAN
---------------------------------------------------------------------------------------------------------------------------
QUERY PLAN
---------------------------------------------------------------------------------------------------------------------
Result (actual rows=1 loops=1)
InitPlan 1 (returns $0)
-> Limit (actual rows=1 loops=1)
-> Custom Scan (ChunkAppend) on btest (actual rows=1 loops=1)
Order: btest."time"
-> Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(7 rows)
-> Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(5 rows)

:PREFIX SELECT last(temp, time_alt) FROM btest;
QUERY PLAN
@@ -281,16 +275,14 @@ INSERT INTO btest VALUES('2020-01-20T09:00:43', '2020-01-20T09:00:43', 2, 35.3);
--Previously, some bugs were found with NULLS and numeric types, so test that
INSERT INTO btest_numeric VALUES ('2019-01-20T09:00:43', NULL);
:PREFIX SELECT last(quantity, time) FROM btest_numeric;
QUERY PLAN
--------------------------------------------------------------------------------------------------------------------------
QUERY PLAN
--------------------------------------------------------------------------------------------------------------------
Result (actual rows=1 loops=1)
InitPlan 1 (returns $0)
-> Limit (actual rows=1 loops=1)
-> Custom Scan (ChunkAppend) on btest_numeric (actual rows=1 loops=1)
Order: btest_numeric."time" DESC
-> Index Scan using _hyper_2_6_chunk_btest_numeric_time_idx on _hyper_2_6_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(7 rows)
-> Index Scan using _hyper_2_6_chunk_btest_numeric_time_idx on _hyper_2_6_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(5 rows)

--check non-null element "overrides" NULL because it comes after.
INSERT INTO btest_numeric VALUES('2020-01-20T09:00:43', 30.5);
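The plan changes above follow directly from how first() and last() are executed: each becomes an InitPlan containing a LIMIT 1 over an index scan ordered on the time column, and with btest holding only the single chunk _hyper_1_1_chunk the ChunkAppend wrapper is no longer needed, shrinking those plans from 7 to 5 rows. The same diff repeats in the PostgreSQL 13 and 14 expected files below. The queries that follow are an illustrative rewrite only, not the literal transformation the planner applies.

-- last(temp, time) behaves like an ordered LIMIT 1 lookup on the newest
-- non-NULL time value; first(temp, time) is the ascending counterpart.
SELECT (SELECT temp FROM btest
        WHERE time IS NOT NULL
        ORDER BY time DESC
        LIMIT 1) AS last;

SELECT (SELECT temp FROM btest
        WHERE time IS NOT NULL
        ORDER BY time ASC
        LIMIT 1) AS first;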
46 changes: 19 additions & 27 deletions test/expected/agg_bookends-13.out
@@ -51,36 +51,30 @@ SELECT setting, current_setting(setting) AS value from (VALUES ('timescaledb.ena
(1 row)

:PREFIX SELECT time, gp, temp FROM btest ORDER BY time;
QUERY PLAN
-------------------------------------------------------------------------------------------------------------
Custom Scan (ChunkAppend) on btest (actual rows=6 loops=1)
Order: btest."time"
-> Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=6 loops=1)
(3 rows)
QUERY PLAN
-------------------------------------------------------------------------------------------------------
Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=6 loops=1)
(1 row)

:PREFIX SELECT last(temp, time) FROM btest;
QUERY PLAN
------------------------------------------------------------------------------------------------------------------
QUERY PLAN
------------------------------------------------------------------------------------------------------------
Result (actual rows=1 loops=1)
InitPlan 1 (returns $0)
-> Limit (actual rows=1 loops=1)
-> Custom Scan (ChunkAppend) on btest (actual rows=1 loops=1)
Order: btest."time" DESC
-> Index Scan using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(7 rows)
-> Index Scan using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(5 rows)

:PREFIX SELECT first(temp, time) FROM btest;
QUERY PLAN
---------------------------------------------------------------------------------------------------------------------------
QUERY PLAN
---------------------------------------------------------------------------------------------------------------------
Result (actual rows=1 loops=1)
InitPlan 1 (returns $0)
-> Limit (actual rows=1 loops=1)
-> Custom Scan (ChunkAppend) on btest (actual rows=1 loops=1)
Order: btest."time"
-> Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(7 rows)
-> Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(5 rows)

:PREFIX SELECT last(temp, time_alt) FROM btest;
QUERY PLAN
@@ -288,16 +282,14 @@ INSERT INTO btest VALUES('2020-01-20T09:00:43', '2020-01-20T09:00:43', 2, 35.3);
--Previously, some bugs were found with NULLS and numeric types, so test that
INSERT INTO btest_numeric VALUES ('2019-01-20T09:00:43', NULL);
:PREFIX SELECT last(quantity, time) FROM btest_numeric;
QUERY PLAN
--------------------------------------------------------------------------------------------------------------------------
QUERY PLAN
--------------------------------------------------------------------------------------------------------------------
Result (actual rows=1 loops=1)
InitPlan 1 (returns $0)
-> Limit (actual rows=1 loops=1)
-> Custom Scan (ChunkAppend) on btest_numeric (actual rows=1 loops=1)
Order: btest_numeric."time" DESC
-> Index Scan using _hyper_2_6_chunk_btest_numeric_time_idx on _hyper_2_6_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(7 rows)
-> Index Scan using _hyper_2_6_chunk_btest_numeric_time_idx on _hyper_2_6_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(5 rows)

--check non-null element "overrides" NULL because it comes after.
INSERT INTO btest_numeric VALUES('2020-01-20T09:00:43', 30.5);
46 changes: 19 additions & 27 deletions test/expected/agg_bookends-14.out
@@ -51,36 +51,30 @@ SELECT setting, current_setting(setting) AS value from (VALUES ('timescaledb.ena
(1 row)

:PREFIX SELECT time, gp, temp FROM btest ORDER BY time;
QUERY PLAN
-------------------------------------------------------------------------------------------------------------
Custom Scan (ChunkAppend) on btest (actual rows=6 loops=1)
Order: btest."time"
-> Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=6 loops=1)
(3 rows)
QUERY PLAN
-------------------------------------------------------------------------------------------------------
Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=6 loops=1)
(1 row)

:PREFIX SELECT last(temp, time) FROM btest;
QUERY PLAN
------------------------------------------------------------------------------------------------------------------
QUERY PLAN
------------------------------------------------------------------------------------------------------------
Result (actual rows=1 loops=1)
InitPlan 1 (returns $0)
-> Limit (actual rows=1 loops=1)
-> Custom Scan (ChunkAppend) on btest (actual rows=1 loops=1)
Order: btest."time" DESC
-> Index Scan using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(7 rows)
-> Index Scan using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(5 rows)

:PREFIX SELECT first(temp, time) FROM btest;
QUERY PLAN
---------------------------------------------------------------------------------------------------------------------------
QUERY PLAN
---------------------------------------------------------------------------------------------------------------------
Result (actual rows=1 loops=1)
InitPlan 1 (returns $0)
-> Limit (actual rows=1 loops=1)
-> Custom Scan (ChunkAppend) on btest (actual rows=1 loops=1)
Order: btest."time"
-> Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(7 rows)
-> Index Scan Backward using _hyper_1_1_chunk_btest_time_idx on _hyper_1_1_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(5 rows)

:PREFIX SELECT last(temp, time_alt) FROM btest;
QUERY PLAN
@@ -288,16 +282,14 @@ INSERT INTO btest VALUES('2020-01-20T09:00:43', '2020-01-20T09:00:43', 2, 35.3);
--Previously, some bugs were found with NULLS and numeric types, so test that
INSERT INTO btest_numeric VALUES ('2019-01-20T09:00:43', NULL);
:PREFIX SELECT last(quantity, time) FROM btest_numeric;
QUERY PLAN
--------------------------------------------------------------------------------------------------------------------------
QUERY PLAN
--------------------------------------------------------------------------------------------------------------------
Result (actual rows=1 loops=1)
InitPlan 1 (returns $0)
-> Limit (actual rows=1 loops=1)
-> Custom Scan (ChunkAppend) on btest_numeric (actual rows=1 loops=1)
Order: btest_numeric."time" DESC
-> Index Scan using _hyper_2_6_chunk_btest_numeric_time_idx on _hyper_2_6_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(7 rows)
-> Index Scan using _hyper_2_6_chunk_btest_numeric_time_idx on _hyper_2_6_chunk (actual rows=1 loops=1)
Index Cond: ("time" IS NOT NULL)
(5 rows)

--check non-null element "overrides" NULL because it comes after.
INSERT INTO btest_numeric VALUES('2020-01-20T09:00:43', 30.5);