chore: clean up deprecated utcnow #37538
Conversation
```diff
 return (
     last_success is not None
     and self._report_schedule.grace_period
-    and datetime.utcnow()
+    and datetime.now(timezone.utc)
     - timedelta(seconds=self._report_schedule.grace_period)
     < last_success.end_dttm
 )
```
Suggestion: Comparing an aware datetime produced by datetime.now(timezone.utc) with last_success.end_dttm can raise a TypeError if last_success.end_dttm is a naive datetime or be incorrect if it's None; ensure last_success.end_dttm is present and timezone-aware before the subtraction and comparison. [type error]
Severity Level: Major ⚠️
- ❌ Report grace-period check may raise TypeError.
- ⚠️ Alert grace handling (ReportSuccessState.next) affected.
- ⚠️ Most logs are timezone-aware; issue is rare.

Suggested change:

```diff
-return (
-    last_success is not None
-    and self._report_schedule.grace_period
-    and datetime.now(timezone.utc)
-    - timedelta(seconds=self._report_schedule.grace_period)
-    < last_success.end_dttm
-)
+if last_success is None or not self._report_schedule.grace_period:
+    return False
+now = datetime.now(timezone.utc)
+end_dttm = last_success.end_dttm
+if end_dttm is None:
+    return False
+if end_dttm.tzinfo is None:
+    end_dttm = end_dttm.replace(tzinfo=timezone.utc)
+return now - timedelta(seconds=self._report_schedule.grace_period) < end_dttm
```
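For context, a minimal standalone sketch (not Superset code) of the failure mode this suggestion guards against: comparing an aware datetime with a naive one raises `TypeError`, while normalizing the naive value first makes the comparison safe.

```python
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)          # timezone-aware
legacy_end = datetime(2020, 1, 1, 0, 0)   # naive, as a legacy DB row might store it

try:
    print(now - timedelta(seconds=3600) < legacy_end)
except TypeError as exc:
    print(exc)  # can't compare offset-naive and offset-aware datetimes

# Treating the naive value as UTC restores a valid comparison.
safe_end = legacy_end.replace(tzinfo=timezone.utc)
print(now - timedelta(seconds=3600) < safe_end)
```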
Steps of Reproduction ✅
1. Ensure a ReportSchedule exists with a non-zero grace_period (db row).
2. Have a legacy or manually-inserted ReportExecutionLog row returned by
ReportScheduleDAO.find_last_success_log(self._report_schedule) (called at
superset/commands/report/execute.py:762) whose end_dttm is NULL or a naive datetime (no
tzinfo).
3. Trigger report execution via AsyncExecuteReportScheduleCommand.run
(superset/commands/report/execute.py:1055) so the state machine reaches
ReportSuccessState.next and calls is_in_grace_period()
(superset/commands/report/execute.py:762).
4. At is_in_grace_period(), the expression datetime.now(timezone.utc) - timedelta(...) <
last_success.end_dttm executes and Python raises TypeError when comparing an aware
datetime to a naive one (or the logic yields incorrect result if end_dttm is None). Note:
create_log (superset/commands/report/execute.py:176) sets end_dttm to
datetime.now(timezone.utc) for logs created by this code, so reproducing requires
pre-existing legacy/hand-inserted rows or external writers.

Prompt for AI Agent 🤖
This is a comment left during a code review.
**Path:** superset/commands/report/execute.py
**Line:** 762:768
**Comment:**
*Type Error: Comparing an aware datetime produced by `datetime.now(timezone.utc)` with `last_success.end_dttm` can raise a TypeError if `last_success.end_dttm` is a naive datetime or be incorrect if it's None; ensure `last_success.end_dttm` is present and timezone-aware before the subtraction and comparison.
Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.

```diff
 return (
     self._report_schedule.working_timeout is not None
     and self._report_schedule.last_eval_dttm is not None
-    and datetime.utcnow()
+    and datetime.now(timezone.utc)
     - timedelta(seconds=self._report_schedule.working_timeout)
     > last_working.end_dttm
 )
```
Suggestion: The working-timeout check subtracts a timedelta from datetime.now(timezone.utc) and compares to last_working.end_dttm; if last_working.end_dttm is None or naive, the subtraction/comparison will raise or be incorrect—validate and normalize end_dttm first. [type error]
Severity Level: Major ⚠️
- ❌ Working-timeout check may raise TypeError.
- ⚠️ ReportWorkingState.next may fail unexpectedly.
- ⚠️ Most logs created here are timezone-aware; issue rare.

Suggested change:

```diff
-return (
-    self._report_schedule.working_timeout is not None
-    and self._report_schedule.last_eval_dttm is not None
-    and datetime.now(timezone.utc)
-    - timedelta(seconds=self._report_schedule.working_timeout)
-    > last_working.end_dttm
-)
+if (
+    self._report_schedule.working_timeout is None
+    or self._report_schedule.last_eval_dttm is None
+    or last_working is None
+):
+    return False
+now = datetime.now(timezone.utc)
+end_dttm = last_working.end_dttm
+if end_dttm is None:
+    return False
+if end_dttm.tzinfo is None:
+    end_dttm = end_dttm.replace(tzinfo=timezone.utc)
+return now - timedelta(seconds=self._report_schedule.working_timeout) > end_dttm
```
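Since the grace-period and working-timeout checks need the same normalization, one option is to factor it out. This is an illustrative sketch only; `_as_aware_utc` is a hypothetical helper, not existing Superset code:

```python
from datetime import datetime, timezone
from typing import Optional


def _as_aware_utc(dttm: Optional[datetime]) -> Optional[datetime]:
    """Return a timezone-aware UTC datetime, treating naive values as UTC."""
    if dttm is None:
        return None
    if dttm.tzinfo is None:
        return dttm.replace(tzinfo=timezone.utc)
    return dttm.astimezone(timezone.utc)
```

Both checks could then compare `datetime.now(timezone.utc)` against `_as_aware_utc(log.end_dttm)` after a `None` check, instead of repeating the `tzinfo` handling inline.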
Steps of Reproduction ✅
1. Have a ReportSchedule with working_timeout set and last_state==WORKING so
ReportWorkingState.next runs (ReportWorkingState.next calls is_on_working_timeout at
superset/commands/report/execute.py:796).
2. Ensure ReportScheduleDAO.find_last_entered_working_log(self._report_schedule) (called
in is_on_working_timeout) returns a last_working whose end_dttm is NULL or a naive
datetime (no tzinfo). This can be simulated by inserting/updating a legacy
ReportExecutionLog row outside the code path.
3. Invoke AsyncExecuteReportScheduleCommand.run (superset/commands/report/execute.py:1055)
so the state machine evaluates is_on_working_timeout().
4. Observe the expression datetime.now(timezone.utc) - timedelta(...) >
last_working.end_dttm raising TypeError when comparing aware and naive datetimes, causing
ReportWorkingState.next to error. Note: create_log in this file
(superset/commands/report/execute.py:176) sets end_dttm with timezone, so reproducing
needs legacy/altered rows.

Prompt for AI Agent 🤖
This is a comment left during a code review.
**Path:** superset/commands/report/execute.py
**Line:** 796:802
**Comment:**
*Type Error: The working-timeout check subtracts a timedelta from `datetime.now(timezone.utc)` and compares to `last_working.end_dttm`; if `last_working.end_dttm` is None or naive, the subtraction/comparison will raise or be incorrect—validate and normalize `end_dttm` first.
Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.

```diff
 for report_schedule in db.session.query(ReportSchedule).all():
     if report_schedule.log_retention is not None:
-        from_date = datetime.utcnow() - timedelta(
+        from_date = datetime.now(timezone.utc) - timedelta(
```
Suggestion: Mixing a timezone-aware datetime (created via datetime.now(timezone.utc)) with database timestamps that are naive (no tzinfo) can raise a TypeError when SQLAlchemy/Python compares them; use a naive UTC datetime (datetime.utcnow()) to avoid aware-vs-naive comparison issues unless the DB/columns are known to be timezone-aware. [type error]
Severity Level: Major ⚠️
- ❌ Report schedule log pruning job may raise TypeError.
- ⚠️ ReportScheduleDAO bulk deletion may silently fail.
- ⚠️ Background maintenance task behavior becomes inconsistent.

Suggested change:

```diff
-from_date = datetime.now(timezone.utc) - timedelta(
+from_date = datetime.utcnow() - timedelta(
```
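An alternative that avoids reintroducing the deprecated call is to build a naive UTC cutoff from an aware "now". A sketch, under the assumption that the log timestamps are stored as naive UTC:

```python
from datetime import datetime, timedelta, timezone

retention_days = 90  # illustrative value; the real code uses report_schedule.log_retention

# Aware "now" in UTC, converted to naive so it compares cleanly with naive UTC columns.
from_date = datetime.now(timezone.utc).replace(tzinfo=None) - timedelta(days=retention_days)
print(from_date)
```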
Steps of Reproduction ✅
1. Trigger the prune routine by invoking AsyncPruneReportScheduleLogCommand.run() located
at superset/commands/report/log_prune.py:41 (method run). The method iterates schedules at
superset/commands/report/log_prune.py:42 and enters the branch at line 43 when a schedule
has log_retention set.
2. At superset/commands/report/log_prune.py:44-46 the code computes from_date =
datetime.now(timezone.utc) - timedelta(...). This produces a timezone-aware datetime
object (tzinfo=UTC).
3. The code then calls ReportScheduleDAO.bulk_delete_logs(...) at approximately
superset/commands/report/log_prune.py:48 (call site immediately after the from_date
computation). If ReportScheduleDAO.bulk_delete_logs performs an in-Python comparison (for
example, iterating log records and comparing model.timestamp < from_date) or constructs
SQLAlchemy filters that require Python-level datetime comparison, Python will raise
TypeError: "can't compare offset-naive and offset-aware datetimes".
4. Observe TypeError raised from the comparison inside ReportScheduleDAO.bulk_delete_logs,
causing the prune run to either append the exception to prune_errors
(superset/commands/report/log_prune.py:57) or fail depending on where the exception
occurs. If your DB/timestamps are timezone-aware and the DAO uses DB-side filtering, no
error is raised — the failure is conditional on how timestamps are represented and how
bulk_delete_logs implements the comparison.

Prompt for AI Agent 🤖
This is a comment left during a code review.
**Path:** superset/commands/report/log_prune.py
**Line:** 44:44
**Comment:**
*Type Error: Mixing a timezone-aware `datetime` (created via `datetime.now(timezone.utc)`) with database timestamps that are naive (no tzinfo) can raise a TypeError when SQLAlchemy/Python compares them; use a naive UTC datetime (`datetime.utcnow()`) to avoid aware-vs-naive comparison issues unless the DB/columns are known to be timezone-aware.
Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.

```diff
 "time": datetime_to_epoch(log.dttm),
 "time_delta_humanized": humanize.naturaltime(
-    datetime.utcnow() - log.dttm
+    datetime.now(timezone.utc) - log.dttm
```
Suggestion: Mixing timezone-aware datetime.now(timezone.utc) with a possibly naive log.dttm will raise a TypeError ("can't subtract offset-naive and offset-aware datetimes"). Ensure both sides use the same timezone-awareness before subtracting: detect if log.dttm is naive and make it timezone-aware (or convert both to UTC-aware) before performing the subtraction. [type error]
Severity Level: Major ⚠️
- ❌ Recent activity listing can raise TypeError and fail.
- ⚠️ API endpoints calling get_recent_activity may return 500s.
- ⚠️ Admin/UI "Recent Activity" timeline displays broken responses.

Suggested change:

```diff
-datetime.now(timezone.utc) - log.dttm
+datetime.now(timezone.utc) - (log.dttm if log.dttm.tzinfo is not None else log.dttm.replace(tzinfo=timezone.utc))
```
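A more readable variant of the same idea, shown here as a standalone sketch rather than a change to `LogDAO` (the `humanized_delta` helper is hypothetical):

```python
from datetime import datetime, timezone

import humanize


def humanized_delta(log_dttm: datetime) -> str:
    """Humanize the age of a log timestamp, treating naive values as UTC."""
    if log_dttm.tzinfo is None:
        log_dttm = log_dttm.replace(tzinfo=timezone.utc)
    return humanize.naturaltime(datetime.now(timezone.utc) - log_dttm)


print(humanized_delta(datetime(2020, 1, 1)))  # works even for a naive legacy row
```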
Steps of Reproduction ✅
1. Start the application (or a Python shell) with the PR code so `superset/daos/log.py` is
loaded.
2. Create or ensure a Log row whose `dttm` is stored as an offset-naive datetime (e.g.,
construct `superset.models.core.Log(dttm=datetime(2020,1,1))` and persist it via
`db.session.add(...); db.session.commit()`).
3. Call LogDAO.get_recent_activity(...) in `superset/daos/log.py` (function context where
humanize is invoked; see lines around 142-146) — the query returns that Log instance and
the code reaches the loop that evaluates `humanize.naturaltime(datetime.now(timezone.utc)
- log.dttm)` at line 145.
4. At runtime Python raises TypeError: "can't subtract offset-naive and offset-aware
datetimes" when `datetime.now(timezone.utc)` (aware) is subtracted by a naive `log.dttm`,
causing the request/operation that called get_recent_activity to fail.

Prompt for AI Agent 🤖
This is a comment left during a code review.
**Path:** superset/daos/log.py
**Line:** 145:145
**Comment:**
*Type Error: Mixing timezone-aware `datetime.now(timezone.utc)` with a possibly naive `log.dttm` will raise a TypeError ("can't subtract offset-naive and offset-aware datetimes"). Ensure both sides use the same timezone-awareness before subtracting: detect if `log.dttm` is naive and make it timezone-aware (or convert both to UTC-aware) before performing the subtraction.
Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.

```diff
         return
     try:
-        dttm = datetime.utcnow().isoformat().split(".")[0]
+        dttm = datetime.now(timezone.utc).isoformat().split(".")[0]
```
Suggestion: Type/format change bug: the code stores dttm as a string produced by splitting the ISO string, which changes the data type compared to previously storing a datetime and can break consumers that expect a datetime (or a full ISO with timezone). Use a timezone-aware datetime object when adding dttm to the cache value to preserve type and timezone information. [type error]
Severity Level: Major ⚠️
- ❌ Cached entries via set_and_log_cache hold string dttm.
- ⚠️ Consumers expecting datetime may misbehave.
- ⚠️ ETag/Last-Modified assignments can be inconsistent.

Suggested change:

```diff
-dttm = datetime.now(timezone.utc).isoformat().split(".")[0]
+dttm = datetime.now(timezone.utc)
```
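To make the type difference concrete, a standalone sketch (not the actual `set_and_log_cache` implementation) comparing what each variant would store:

```python
from datetime import datetime, timezone

# Current PR code: a truncated ISO string such as '2024-01-02T15:04:05'
dttm_as_string = datetime.now(timezone.utc).isoformat().split(".")[0]

# Suggested variant: a timezone-aware datetime, usable for .timestamp(),
# tz-aware arithmetic, or response.last_modified.
dttm_as_datetime = datetime.now(timezone.utc)

print(type(dttm_as_string), dttm_as_string)
print(type(dttm_as_datetime), dttm_as_datetime.timestamp())
```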
Steps of Reproduction ✅
1. Open the repository and locate function `set_and_log_cache` in
`superset/utils/cache.py`. The ISO-string assignment is at `superset/utils/cache.py:76`
and the dict merge at `superset/utils/cache.py:77`.
2. From a Python REPL or unit test, import the function and a cache instance:
- from `superset.utils.cache` import set_and_log_cache (function defined around the
hunk starting at line 73)
- from `superset.utils.cache_manager` import cache_manager; use `cache =
cache_manager.cache`.
3. Call set_and_log_cache(cache, "test_key", {"foo": "bar"}, cache_timeout=60). Execution
hits the lines at `superset/utils/cache.py:76-77`, storing `dttm` as a truncated ISO
string (e.g., "2024-01-02T15:04:05+00:00").
4. Read back the cached value via `cache.get("test_key")` (or via any code path that
inspects cached dicts). Observe `dttm` is a string, not a datetime object. If a consumer
expects a datetime (for .timestamp(), tz-aware math, or direct assignment to
response.last_modified), this type mismatch is visible and may break downstream logic.

Prompt for AI Agent 🤖
This is a comment left during a code review.
**Path:** superset/utils/cache.py
**Line:** 76:76
**Comment:**
*Type Error: Type/format change bug: the code stores `dttm` as a string produced by splitting the ISO string, which changes the data type compared to previously storing a datetime and can break consumers that expect a datetime (or a full ISO with timezone). Use a timezone-aware datetime object when adding `dttm` to the cache value to preserve type and timezone information.
Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.

```diff
 """Processes a sql template with $ style macro using regex."""
 # Add custom macros functions.
-macros = {"DATE": partial(DATE, datetime.utcnow())}  # type: Dict[str, Any]
+macros = {"DATE": partial(DATE, datetime.now(timezone.utc))}  # type: Dict[str, Any]
```
Suggestion: Passing raw string args from the regex directly into DATE can raise a ValueError when an argument is empty (e.g. $DATE(,3)) or not an integer-like string; also the current binding fixes the timestamp at process_template call time which may be surprising. Wrap the macro in a small adapter that (1) converts empty strings to 0 and casts numeric strings to int, and (2) computes the current UTC timestamp at call time to avoid stale binding. [type error]
Severity Level: Major ⚠️
- ❌ Template macro expansion raises uncaught ValueError.
- ⚠️ Integration tests using this processor fail.
- ⚠️ Any test utilities using DATE macro fail.

Suggested change:

```diff
-macros = {"DATE": partial(DATE, datetime.now(timezone.utc))}  # type: Dict[str, Any]
+def _safe_DATE(*args):
+    # Convert string args to ints, treat empty or missing args as 0,
+    # and compute the current UTC timestamp at call time.
+    converted = []
+    for a in args:
+        if isinstance(a, str):
+            a = a.strip()
+        if a == "" or a is None:
+            converted.append(0)
+        else:
+            converted.append(int(a))
+    return DATE(datetime.now(timezone.utc), *converted)
+
+macros = {"DATE": _safe_DATE}  # type: Dict[str, Any]
```
Steps of Reproduction ✅
1. Open tests/integration_tests/superset_test_custom_template_processors.py and locate
CustomPrestoTemplateProcessor.process_template (defined starting at the hunk around line
40). Observe the macros binding at line 43 where DATE is pre-bound: `macros = {"DATE":
partial(DATE, datetime.now(timezone.utc))}`.
2. Call CustomPrestoTemplateProcessor().process_template(sql, ...) with a template
containing an empty first DATE argument, e.g. sql = "SELECT $DATE(,3)". The replacer
function (defined immediately after macros.update(kwargs) in the same file) executes for
that match.
3. The replacer (in the same file, directly below the macros block) splits the args_str
into ['','3'] and does not normalize the empty string (note the code path: args =
[a.strip() for a in args_str.split(",")]; the special-case only handles [""]).
4. replacer then calls f = macros["DATE"] and executes f(*args) which invokes
DATE(datetime_obj, '', '3'). Inside DATE (the top-level DATE function in the same file),
day_offset = int(day_offset) is executed, which raises ValueError for the empty string and
causes template processing to crash.

Prompt for AI Agent 🤖
This is a comment left during a code review.
**Path:** tests/integration_tests/superset_test_custom_template_processors.py
**Line:** 43:43
**Comment:**
*Type Error: Passing raw string args from the regex directly into `DATE` can raise a ValueError when an argument is empty (e.g. `$DATE(,3)`) or not an integer-like string; also the current binding fixes the timestamp at `process_template` call time which may be surprising. Wrap the macro in a small adapter that (1) converts empty strings to 0 and casts numeric strings to int, and (2) computes the current UTC timestamp at call time to avoid stale binding.
Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.| # Add custom macros functions. | ||
| macros = { | ||
| "DATE": partial(DATE, datetime.utcnow()) | ||
| "DATE": partial(DATE, datetime.now(timezone.utc)) |
Suggestion: NameError risk: the snippet references timezone but there's no import shown in the surrounding example; if the example uses from datetime import datetime (common in examples) then timezone will be undefined at runtime and raise a NameError. Replace with a construct that does not require timezone to be imported. [possible bug]
Severity Level: Major ⚠️
- ❌ Import error if snippet copied into superset_config.py.
- ⚠️ Confusing documentation causing developer friction.
- ⚠️ Example missing required datetime import statement.

Suggested change:

```diff
-"DATE": partial(DATE, datetime.now(timezone.utc))
+"DATE": partial(DATE, datetime.utcnow())
```
Steps of Reproduction ✅
1. Open the documentation example at docs/docs/configuration/sql-templating.mdx, locate
the code block defining CustomPrestoTemplateProcessor (the macros assignment is at line
192 in the final file state).
2. Copy the shown macros snippet exactly (including the line at 192) into a real Python
file used by Superset (for example, paste into superset_config.py at project root).
3. Restart Superset (or import the module) so Python executes the top-level code in
superset_config.py; Python will evaluate datetime.now(timezone.utc) during import.
4. Observe a NameError: "name 'timezone' is not defined" raised at the inserted line (the
exact location corresponds to the copied line, originating from the docs snippet). Note:
the docs file itself is not imported by Superset; this failure only occurs if a developer
copies the snippet into an actual runtime module without adding the necessary import for
timezone.

Prompt for AI Agent 🤖
This is a comment left during a code review.
**Path:** docs/docs/configuration/sql-templating.mdx
**Line:** 192:192
**Comment:**
*Possible Bug: NameError risk: the snippet references `timezone` but there's no import shown in the surrounding example; if the example uses `from datetime import datetime` (common in examples) then `timezone` will be undefined at runtime and raise a NameError. Replace with a construct that does not require `timezone` to be imported.
Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.| TEST_ID, | ||
| create_report_slack_chart_working.id, | ||
| datetime.utcnow(), | ||
| datetime.now(timezone.utc), |
Suggestion: Mixing timezone-aware datetimes (datetime.now(timezone.utc)) with existing naive datetimes (e.g. fixtures set last_eval_dttm = datetime(2020, 1, 1, 0, 0)) will raise TypeError when the code compares or subtracts them; use a naive UTC datetime (datetime.utcnow()) here so scheduled timestamps are consistent with the fixtures and avoid comparing aware vs naive datetimes. [type error]
Severity Level: Major ⚠️
- ❌ Multiple integration tests in this file fail.
- ⚠️ Report execution test paths break comparisons.
- ⚠️ AsyncExecuteReportScheduleCommand workflows affected.
- ⚠️ ReportScheduleStateMachine datetime logic impacted.

Suggested change:

```diff
-datetime.now(timezone.utc),
+datetime.utcnow(),
```
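If the intent is to finish removing `utcnow()` rather than revert the test, the alternative mentioned in step 4 below is to make the fixture timezone-aware instead. A hypothetical sketch of the idea:

```python
from datetime import datetime, timezone

# Hypothetical fixture tweak: an aware last_eval_dttm keeps comparisons with the
# aware scheduled_dttm on the same footing.
last_eval_dttm = datetime(2020, 1, 1, 0, 0, tzinfo=timezone.utc)
print(datetime.now(timezone.utc) > last_eval_dttm)  # no longer raises TypeError
```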
Steps of Reproduction ✅
1. Open tests/integration_tests/reports/commands_tests.py and locate the test function
test_report_schedule_working (the AsyncExecuteReportScheduleCommand call is at the hunk
starting at line 1733 in the PR). The test invokes AsyncExecuteReportScheduleCommand(...,
datetime.now(timezone.utc)).run() (file:
tests/integration_tests/reports/commands_tests.py:1733-1736).
2. The test uses the fixture create_report_slack_chart_working which sets
report_schedule.last_eval_dttm to a naive datetime literal (report_schedule.last_eval_dttm
= datetime(2020, 1, 1, 0, 0)) inside the same test module (fixture definition present
earlier in tests/integration_tests/reports/commands_tests.py).
3. Running the test leads to AsyncExecuteReportScheduleCommand.run() being executed (see
superset/commands/report/execute.py:1032-1087). That method passes the scheduled_dttm
value to ReportScheduleStateMachine(...).run(), which performs datetime
arithmetic/comparisons involving model.last_eval_dttm.
4. When scheduled_dttm is timezone-aware (datetime.now(timezone.utc)) and
model.last_eval_dttm is naive (datetime(2020, 1, 1, 0, 0)), Python raises a TypeError on
comparison/arithmetic. If this behavior is intentional across the codebase, update
fixtures instead; otherwise, changing the scheduled timestamp to naive (datetime.utcnow())
avoids the TypeError.

Prompt for AI Agent 🤖
This is a comment left during a code review.
**Path:** tests/integration_tests/reports/commands_tests.py
**Line:** 1736:1736
**Comment:**
*Type Error: Mixing timezone-aware datetimes (`datetime.now(timezone.utc)`) with existing naive datetimes (e.g. fixtures set `last_eval_dttm = datetime(2020, 1, 1, 0, 0)`) will raise TypeError when the code compares or subtracts them; use a naive UTC datetime (datetime.utcnow()) here so scheduled timestamps are consistent with the fixtures and avoid comparing aware vs naive datetimes.
Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.| with freeze_time(current_time): | ||
| AsyncExecuteReportScheduleCommand( | ||
| TEST_ID, create_alert_slack_chart_success.id, datetime.utcnow() | ||
| TEST_ID, create_alert_slack_chart_success.id, datetime.now(timezone.utc) |
Suggestion: This call supplies a timezone-aware scheduled_dttm (datetime.now(timezone.utc)) while the related fixture create_alert_slack_chart_success.last_eval_dttm is a naive datetime; comparisons or arithmetic between these will raise TypeError—use datetime.utcnow() to keep datetimes naive and consistent. [type error]
Severity Level: Major ⚠️
- ❌ This integration test can raise TypeError.
- ⚠️ Alert scheduling test paths affected.
- ⚠️ CI test suite stability impacted.
- ⚠️ ReportSchedule comparison logic implicated.

Suggested change:

```diff
-TEST_ID, create_alert_slack_chart_success.id, datetime.now(timezone.utc)
+TEST_ID, create_alert_slack_chart_success.id, datetime.utcnow()
```
Steps of Reproduction ✅
1. Open tests/integration_tests/reports/commands_tests.py and find
test_report_schedule_success_grace where the block starting at line 1781 calls
AsyncExecuteReportScheduleCommand(..., datetime.now(timezone.utc)).run()
(tests/integration_tests/reports/commands_tests.py:1781-1783).
2. The fixture create_alert_slack_chart_success used in this test sets last_eval_dttm to a
naive datetime (report_schedule.last_eval_dttm = datetime(2020, 1, 1, 0, 0)) in the same
test module (fixture create_alert_slack_chart_success definition earlier in
tests/integration_tests/reports/commands_tests.py).
3. Running this test executes AsyncExecuteReportScheduleCommand.run
(superset/commands/report/execute.py:1032-1087), which forwards the scheduled_dttm to the
state machine that performs datetime arithmetic/comparisons with the fixture's
last_eval_dttm.
4. With a timezone-aware scheduled_dttm (datetime.now(timezone.utc)) compared against the
fixture's naive last_eval_dttm, Python raises a TypeError. Replacing the scheduled
argument with datetime.utcnow() avoids mixing aware/naive datetimes and reproduces a
passing run.

Prompt for AI Agent 🤖
This is a comment left during a code review.
**Path:** tests/integration_tests/reports/commands_tests.py
**Line:** 1783:1783
**Comment:**
*Type Error: This call supplies a timezone-aware `scheduled_dttm` (`datetime.now(timezone.utc)`) while the related fixture `create_alert_slack_chart_success.last_eval_dttm` is a naive datetime; comparisons or arithmetic between these will raise TypeError—use `datetime.utcnow()` to keep datetimes naive and consistent.
Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.| database = Database(database_name="my_database", sqlalchemy_uri="sqlite://") | ||
|
|
||
| now = datetime.utcnow() | ||
| now = datetime.now(timezone.utc) |
Suggestion: Mixing a timezone-aware datetime (datetime.now(timezone.utc)) with application/model code that uses naive UTC datetimes (the models use datetime.utcnow by default) can cause TypeError or incorrect comparisons when the test stores changed_on and when DAO code compares timestamps; use a naive UTC datetime to match the model's expectations. [type error]
Severity Level: Major ⚠️
- ❌ Unit test may raise TypeError on datetime comparisons.
- ⚠️ Query DAO timestamp-compare logic may behave inconsistently.
- ⚠️ Tests covering time-based query filters may flake.

Suggested change:

```diff
-now = datetime.now(timezone.utc)
+now = datetime.utcnow()
```
Steps of Reproduction ✅
1. Open the test function `test_query_dao_get_queries_changed_after` in
tests/unit_tests/dao/queries_test.py (hunk header shows the function context at line ~69).
The PR added the line at tests/unit_tests/dao/queries_test.py:72: `now =
datetime.now(timezone.utc)`.
2. The test creates Query objects with `changed_on=now - timedelta(days=3)` and
`changed_on=now - timedelta(days=1)` (the first Query construction starts at
tests/unit_tests/dao/queries_test.py:74 where `old_query_obj = Query(` is created). These
assignments pass a timezone-aware datetime into the model field.
3. Inspect the Query model declaration in superset/models/sql_lab.py: the column is
declared as
`changed_on = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow,
nullable=True)` (see Query class in superset/models/sql_lab.py in the provided
downstream context). The model defaults use naive UTC datetimes (`datetime.utcnow`).
4. Run the unit test (pytest -k test_query_dao_get_queries_changed_after). Because the
test is explicitly passing timezone-aware datetimes while the model defaults and many
codepaths expect naive UTC datetimes, comparisons or arithmetic between naive and aware
datetimes (or DB-layer conversions) can raise TypeError or behave inconsistently. Changing
the test to use `datetime.utcnow()` (naive UTC) makes the test align with the model
definition and avoids mixing tz-aware/naive datetimes.

Prompt for AI Agent 🤖
This is a comment left during a code review.
**Path:** tests/unit_tests/dao/queries_test.py
**Line:** 72:72
**Comment:**
*Type Error: Mixing a timezone-aware datetime (`datetime.now(timezone.utc)`) with application/model code that uses naive UTC datetimes (the models use `datetime.utcnow` by default) can cause TypeError or incorrect comparisons when the test stores `changed_on` and when DAO code compares timestamps; use a naive UTC datetime to match the model's expectations.
Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.
Codecov Report

❌ Patch coverage is

Additional details and impacted files

```diff
@@            Coverage Diff            @@
##           master   #37538       +/-   ##
===========================================
+ Coverage        0   43.66%   +43.66%
===========================================
  Files           0      643      +643
  Lines           0    49051    +49051
  Branches        0     5500     +5500
===========================================
+ Hits            0    21419    +21419
- Misses          0    27334    +27334
- Partials        0      298      +298
```
Flags with carried forward coverage won't be shown. Click here to find out more.

☔ View full report in Codecov by Sentry.
Code Review Agent Run #a2bfe3
Actionable Suggestions - 2
- superset/daos/log.py - 1
  - Runtime TypeError in datetime subtraction · Line 144-146
- tests/unit_tests/dao/queries_test.py - 1
  - Datetime type mismatch risk · Line 72-72
Review Details
- Files reviewed - 12 · Commit Range: 194372a..194372a
- docs/docs/configuration/sql-templating.mdx
- docs/versioned_docs/version-6.0.0/configuration/sql-templating.mdx
- superset/commands/report/execute.py
- superset/commands/report/log_prune.py
- superset/daos/log.py
- superset/utils/cache.py
- superset/utils/dates.py
- tests/integration_tests/queries/api_tests.py
- tests/integration_tests/reports/commands/execute_dashboard_report_tests.py
- tests/integration_tests/reports/commands_tests.py
- tests/integration_tests/superset_test_custom_template_processors.py
- tests/unit_tests/dao/queries_test.py
- Files skipped - 0
- Tools
- Whispers (Secret Scanner) - ✔︎ Successful
- Detect-secrets (Secret Scanner) - ✔︎ Successful
- MyPy (Static Code Analysis) - ✔︎ Successful
- Astral Ruff (Static Code Analysis) - ✔︎ Successful
Bito Usage Guide
Commands
Type the following command in the pull request comment and save the comment.
- /review - Manually triggers a full AI review.
- /pause - Pauses automatic reviews on this pull request.
- /resume - Resumes automatic reviews.
- /resolve - Marks all Bito-posted review comments as resolved.
- /abort - Cancels all in-progress reviews.
Refer to the documentation for additional commands.
Configuration
This repository uses Superset. You can customize the agent settings here or contact your Bito workspace admin at evan@preset.io.
Documentation & Help
| "time_delta_humanized": humanize.naturaltime( | ||
| datetime.utcnow() - log.dttm | ||
| datetime.now(timezone.utc) - log.dttm | ||
| ), |
This change introduces a TypeError at runtime because datetime.now(timezone.utc) is timezone-aware, while log.dttm (from the Log model default=datetime.utcnow) is naive. In Python 3.9+, subtracting aware and naive datetimes raises an error. The model assumes UTC but uses naive datetimes, so the current time should also be naive to match.
Code suggestion
Check the AI-generated fix before applying
| "time_delta_humanized": humanize.naturaltime( | |
| datetime.utcnow() - log.dttm | |
| datetime.now(timezone.utc) - log.dttm | |
| ), | |
| "time_delta_humanized": humanize.naturaltime( | |
| datetime.now(timezone.utc).replace(tzinfo=None) - log.dttm | |
| ), |
Code Review Run #a2bfe3
```diff
 database = Database(database_name="my_database", sqlalchemy_uri="sqlite://")

-now = datetime.utcnow()
+now = datetime.now(timezone.utc)
```
The test now uses timezone-aware datetime for 'now', but the Query model's changed_on defaults to naive datetime.utcnow(), and QueryDAO.get_queries_changed_after creates naive last_updated_dt. This mismatch can cause SQLAlchemy to fail or give incorrect results when comparing aware and naive datetimes in filters, as SQLAlchemy may not handle mixed types reliably.
Code suggestion
Check the AI-generated fix before applying
```diff
- from datetime import datetime
+ from datetime import datetime, timezone
@@ -51,1 +51,1 @@
- last_updated_dt = datetime.utcfromtimestamp(last_updated_ms / 1000)
+ last_updated_dt = datetime.fromtimestamp(last_updated_ms / 1000, tz=timezone.utc)
```
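For reference, a quick standalone check (not part of the suggestion itself) that the replacement produces the same instant as the deprecated call, just timezone-aware:

```python
from datetime import datetime, timezone

last_updated_ms = 1_600_000_000_000  # illustrative epoch milliseconds

naive = datetime.utcfromtimestamp(last_updated_ms / 1000)                 # deprecated, naive UTC
aware = datetime.fromtimestamp(last_updated_ms / 1000, tz=timezone.utc)   # aware UTC

assert aware.replace(tzinfo=None) == naive
print(naive, aware)
```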
Code Review Run #a2bfe3
✅ Deploy Preview for superset-docs-preview ready!
To edit notification comments on pull requests, go to your Netlify project configuration.
SUMMARY
Replace deprecated `datetime.utcnow()` with `datetime.now(timezone.utc)`

BEFORE/AFTER SCREENSHOTS OR ANIMATED GIF
TESTING INSTRUCTIONS
ADDITIONAL INFORMATION