Autoanalyze cache tables #6599

Merged 2 commits on Jun 12, 2024
1 change: 1 addition & 0 deletions CHANGELOG.md
@@ -9,6 +9,7 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).
### Added

### Changed
- FlowDB now triggers an ANALYZE on newly created cache tables to generate statistics rather than waiting for autovacuum

### Fixed

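The changelog entry above amounts to running ANALYZE in the same transaction that creates the cache table, so the query planner gets fresh statistics immediately instead of waiting for autoanalyze. A minimal sketch of that idea, outside FlowKit code and using a hypothetical connection string:

```python
# Minimal sketch (not FlowKit code): materialise a table and ANALYZE it
# in the same transaction so the planner has statistics straight away.
import sqlalchemy

engine = sqlalchemy.create_engine("postgresql://flowdb")  # hypothetical DSN


def materialise_and_analyze(select_sql: str, schema: str, name: str) -> None:
    with engine.begin() as trans:
        # Create the table from the query, then gather statistics for it.
        trans.exec_driver_sql(f"CREATE TABLE {schema}.{name} AS ({select_sql})")
        trans.exec_driver_sql(f"ANALYZE {schema}.{name};")
```
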
7 changes: 7 additions & 0 deletions flowmachine/flowmachine/core/cache.py
@@ -135,6 +135,7 @@ def write_query_to_cache(
ddl_ops_func: Callable[[str, str], List[str]],
schema: Optional[str] = "cache",
sleep_duration: Optional[int] = 1,
analyze=True,
) -> "Query":
"""
Write a Query object into a postgres table and update the cache metadata about it.
@@ -159,6 +160,8 @@
Name of the schema to write to
sleep_duration : int, default 1
Number of seconds to wait between polls when monitoring a query being written from elsewhere
analyze : bool, default True
Set to False to _disable_ running analyze on the newly created table to generate statistics

Returns
-------
@@ -203,6 +206,10 @@
except Exception as exc:
logger.error(f"Error executing SQL. Error was {exc}")
raise exc
if analyze:
logger.debug(f"Running analyze for {schema}.{name}.")
trans.exec_driver_sql(f"ANALYZE {schema}.{name};")
logger.debug(f"Ran analyze for {schema}.{name}.")
if schema == "cache":
try:
write_cache_metadata(
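A hedged way to observe the effect of the new `analyze` flag (the table name and DSN below are hypothetical, not part of this PR): with the default `analyze=True`, the freshly written cache table should show a non-null `last_analyze` in `pg_stat_user_tables`; with `analyze=False` it stays null until autoanalyze gets to it.

```python
# Hypothetical check, not part of this PR: inspect analyze timestamps
# for a cache table after it has been written.
from sqlalchemy import create_engine, text

engine = create_engine("postgresql://flowdb")  # hypothetical DSN

with engine.connect() as conn:
    row = conn.execute(
        text(
            "SELECT last_analyze, last_autoanalyze "
            "FROM pg_stat_user_tables "
            "WHERE schemaname = 'cache' AND relname = :name"
        ),
        {"name": "x_abc123"},  # hypothetical cache table name
    ).one_or_none()
    print(row)
```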