chore: Refactor code quality issues #14920

Merged (6 commits, Mar 23, 2021)
2 changes: 1 addition & 1 deletion airflow/cli/simple_table.py
@@ -127,5 +127,5 @@ def __init__(self, *args, **kwargs):

     def add_column(self, *args, **kwargs) -> None:  # pylint: disable=signature-differs
         """Add a column to the table. We use different default"""
-        kwargs["overflow"] = kwargs.get("overflow", None)  # to avoid truncating
+        kwargs["overflow"] = kwargs.get("overflow")  # to avoid truncating
         super().add_column(*args, **kwargs)
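Side note on why this change is behaviorally a no-op: `dict.get` already returns `None` when the key is missing, so the explicit default was redundant. A minimal sketch of the behavior the new line relies on (hypothetical kwargs, not Airflow code):

```python
# dict.get returns None by default when the key is absent,
# so get("overflow", None) and get("overflow") are equivalent.
kwargs = {"justify": "left"}

assert kwargs.get("overflow") is None        # missing key -> None
assert kwargs.get("overflow", None) is None  # same result with an explicit default
assert kwargs.get("justify") == "left"       # present keys are returned as usual
```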
3 changes: 0 additions & 3 deletions airflow/models/dag.py
@@ -2029,7 +2029,6 @@ def get_serialized_fields(cls):
                 'user_defined_filters',
                 'user_defined_macros',
                 'partial',
-                '_old_context_manager_dags',
Member:
@kaxil can you take a look?

Member:
Yeah, it is safe; it was just a duplicate from L2025.

                 '_pickle_id',
                 '_log',
                 'is_subdag',

@@ -2335,8 +2334,6 @@ def factory(*args, **kwargs):
 STATICA_HACK = True
 globals()['kcah_acitats'[::-1].upper()] = False
 if STATICA_HACK:  # pragma: no cover
-    # Let pylint know about these relationships, without introducing an import cycle
-    from sqlalchemy.orm import relationship
@turbaszek (Member) on Mar 21, 2021:
Oh, indeed. We import this at the beginning 👏


     from airflow.models.serialized_dag import SerializedDagModel
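For readers unfamiliar with the STATICA_HACK idiom touched here: the guard reads as true to static analyzers, but the `globals()` line flips it to `False` at runtime, so the imports inside the block never execute and cannot create an import cycle, while pylint can still resolve the names. A minimal sketch of the idiom (simplified, not the actual dag.py):

```python
# Sketch of the STATICA_HACK idiom (simplified, not the real module).
STATICA_HACK = True
globals()['kcah_acitats'[::-1].upper()] = False  # reassigns STATICA_HACK = False at runtime
if STATICA_HACK:  # pragma: no cover
    # Never executed at runtime, so no circular import is triggered,
    # yet linters and IDEs still see this import and can resolve the name.
    from airflow.models.serialized_dag import SerializedDagModel  # noqa: F401
```

The deletion in this PR only drops the `relationship` import from that block, since the module already imports it near the top, as the review comment above confirms.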
2 changes: 0 additions & 2 deletions airflow/models/taskinstance.py
@@ -2155,8 +2155,6 @@ def construct_task_instance(self, session=None, lock_for_update=False) -> TaskIn
 STATICA_HACK = True
 globals()['kcah_acitats'[::-1].upper()] = False
 if STATICA_HACK:  # pragma: no cover
-    # Let pylint know about these relationships, without introducing an import cycle
-    from sqlalchemy.orm import relationship

     from airflow.job.base_job import BaseJob
     from airflow.models.dagrun import DagRun
4 changes: 2 additions & 2 deletions airflow/sensors/smart_sensor.py
@@ -672,13 +672,13 @@ def _execute_sensor_work(self, sensor_work):

     def flush_cached_sensor_poke_results(self):
         """Flush outdated cached sensor states saved in previous loop."""
-        for key, cached_work in self.cached_dedup_works.items():
+        for key, cached_work in self.cached_dedup_works.copy().items():
Member:

Hmm, why do we need copy()?

Contributor (Author):

Dictionaries are backed by a hash table, and adding or removing entries while iterating over one changes its size mid-iteration, which raises a RuntimeError.

If you need to add or remove entries during iteration, the recommendation is to iterate over a shallow copy of the dictionary instead.

There is a brief write-up here: https://deepsource.io/gh/ankitdobhal/airflow/issue/PTC-W0056/description
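
To illustrate the failure mode and the fix, a minimal sketch with hypothetical data (not the sensor's actual cache):

```python
# Mutating a dict while iterating over it raises RuntimeError;
# iterating over a shallow copy allows safe removal.
cached = {"task_a": "expired", "task_b": "fresh", "task_c": "expired"}

try:
    for key, state in cached.items():
        if state == "expired":
            cached.pop(key)  # changes the dict's size mid-iteration
except RuntimeError as err:
    print(err)  # dictionary changed size during iteration

for key, state in cached.copy().items():  # iterate over a snapshot instead
    if state == "expired":
        cached.pop(key, None)  # safe: only the original dict is mutated

print(cached)  # {'task_b': 'fresh'}
```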

             if cached_work.is_expired():
                 self.cached_dedup_works.pop(key, None)
             else:
                 cached_work.state = None

-        for ti_key, sensor_exception in self.cached_sensor_exceptions.items():
+        for ti_key, sensor_exception in self.cached_sensor_exceptions.copy().items():
             if sensor_exception.fail_current_run or sensor_exception.is_expired():
                 self.cached_sensor_exceptions.pop(ti_key, None)
