
Sourcery refactored develop branch #1

Merged: 1 commit merged into develop from sourcery/develop on Dec 14, 2023

Conversation

@sourcery-ai sourcery-ai bot commented Dec 13, 2023

Branch develop refactored by Sourcery.

If you're happy with these changes, merge this Pull Request using the Squash and merge strategy.

See our documentation here.

Run Sourcery locally

Reduce the feedback loop during development by using the Sourcery editor plugin.

Review changes via command line

To manually merge these changes, make sure you're on the develop branch, then run:

git fetch origin sourcery/develop   # fetch the refactored branch
git merge --ff-only FETCH_HEAD      # fast-forward develop onto the fetched commit
git reset HEAD^                     # step back one commit so the changes stay as uncommitted edits for review

Help us improve this pull request!

@sourcery-ai sourcery-ai bot requested a review from LopeKinz December 13, 2023 08:29

@sourcery-ai sourcery-ai bot left a comment

Due to GitHub API limits, only the first 60 comments can be shown.

Comment on lines -25 to +32
- data = {
-     # The x series is made of random numbers between 1 and 10
-     "x": [random.uniform(1, 10) for i in y],
-     "y": y,
- }
+ data = {"x": [random.uniform(1, 10) for _ in y], "y": y}

  options = {
      "error_x": {
          "type": "data",
-         # Allows for a 'plus' and a 'minus' error data
          "symmetric": False,
-         # The 'plus' error data is a series of random numbers
-         "array": [random.uniform(0, 5) for i in y],
-         # The 'minus' error data is a series of random numbers
-         "arrayminus": [random.uniform(0, 2) for i in y],
-         # Color of the error bar
+         "array": [random.uniform(0, 5) for _ in y],
+         "arrayminus": [random.uniform(0, 2) for _ in y],

Lines 25-40 refactored with the following changes:

This removes the following comments (why?):

# The 'minus' error data is a series of random numbers
# Color of the error bar
# The 'plus' error data is a series of random numbers
# Allows for a 'plus' and a 'minus' error data
# The x series is made of random numbers between 1 and 10
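
For readers skimming these hunks: the refactoring swaps the unused loop variable i for _, the conventional Python name for a value that is intentionally ignored. A minimal standalone sketch of the same pattern (illustrative values, not code from this repository):

    import random

    y = [1, 2, 3]

    # `i` was never used inside the expression, so `_` signals the intent more
    # clearly and silences "unused variable" warnings from linters.
    xs = [random.uniform(1, 10) for _ in y]
    print(xs)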

- data = [random.random() for i in range(500)]
+ data = [random.random() for _ in range(500)]

Lines 21-21 refactored with the following changes:

data = {"Count": [random.random() for i in range(100)]}
data = {"Count": [random.random() for _ in range(100)]}

Lines 21-21 refactored with the following changes:

samples = {"x": [random.gauss() for i in range(100)]}
samples = {"x": [random.gauss() for _ in range(100)]}

Lines 21-21 refactored with the following changes:

- data = [random.random() for i in range(100)]
+ data = [random.random() for _ in range(100)]

Lines 21-21 refactored with the following changes:

Comment on lines -54 to +63
- else:
-     if not (
-         (isinstance(config_value, List) or isinstance(config_value, Set))
-         and all(map(lambda x: isinstance(x, child_config_class), config_value))
-     ):
-         self._error(
-             config_key,
-             config_value,
-             f"{config_key} field of {parent_config_class.__name__} `{config_id}` must be populated with a list "
-             f"of {child_config_class.__name__} objects.",
-         )
+ elif not (
+     (isinstance(config_value, (List, Set)))
+     and all(map(lambda x: isinstance(x, child_config_class), config_value))
+ ):
+     self._error(
+         config_key,
+         config_value,
+         f"{config_key} field of {parent_config_class.__name__} `{config_id}` must be populated with a list "
+         f"of {child_config_class.__name__} objects.",
+     )

Function _ConfigChecker._check_children refactored with the following changes:

cls.__logger.error("ConfigurationUpdateBlocked: " + error_message)
cls.__logger.error(f"ConfigurationUpdateBlocked: {error_message}")

Function _ConfigBlocker._check refactored with the following changes:

- match = re.fullmatch(cls._PATTERN, str(template))
- if match:
+ if match := re.fullmatch(cls._PATTERN, str(template)):

Function _TemplateHandler._replace_template refactored with the following changes:
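
The hunk above replaces a separate assignment-plus-test with Python 3.8's assignment expression (the := "walrus" operator), binding the match object and testing it in a single if. A minimal sketch of the pattern, with an illustrative pattern and input rather than this repository's constants:

    import re

    # Hypothetical pattern and template, for illustration only.
    PATTERN = r"ENV\[(\w+)\]"
    template = "ENV[HOME]"

    # Bind and test in one step: `match` is usable inside the block,
    # and the block is skipped when fullmatch() returns None.
    if match := re.fullmatch(PATTERN, template):
        print("matched variable:", match.group(1))
    else:
        print("no match")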

Comment on lines -80 to +79
- return str.lower(val) == "true" or not (str.lower(val) == "false")
+ return str.lower(val) == "true" or str.lower(val) != "false"

Function _TemplateHandler._to_bool refactored with the following changes:

  • Simplify logical expression using De Morgan identities (de-morgan)
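
De Morgan's identities state that not (A or B) is equivalent to (not A) and (not B), and not (A and B) to (not A) or (not B); for a single comparison, pushing the not inward simply flips == to !=, which is what the hunk above does. An illustrative sketch (not this repository's code, which calls str.lower(val) rather than val.lower()):

    def to_bool_old(val: str) -> bool:
        # Negation wrapped around the comparison.
        return val.lower() == "true" or not (val.lower() == "false")

    def to_bool_new(val: str) -> bool:
        # The `not` pushed into the comparison: not (x == y) becomes x != y.
        return val.lower() == "true" or val.lower() != "false"

    # Both variants agree on every input.
    assert all(to_bool_old(v) == to_bool_new(v) for v in ["true", "false", "TRUE", "maybe"])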

Comment on lines -116 to +118
- return key[:5] + "taipy-" + key[5:]
+ return f"{key[:5]}taipy-{key[5:]}"

- return key[:2] + "taipy-" + key[2:]
+ return f"{key[:2]}taipy-{key[2:]}"

Function _CoreCLI.__add_taipy_prefix refactored with the following changes:

- if parent_entity._MANAGER_NAME in current_parent_dict.keys():
+ if parent_entity._MANAGER_NAME in current_parent_dict:

Function get_parents refactored with the following changes:

- if scenario.cycle in cycles_scenarios.keys():
+ if scenario.cycle in cycles_scenarios:

Function get_cycles_scenarios refactored with the following changes:

Comment on lines -35 to +60
- if preserve_file_path := os.getenv(__BACKUP_FILE_PATH_ENVIRONMENT_VARIABLE_NAME, None):
-     storage_folder = os.path.abspath(Config.core.storage_folder) + os.sep
-     if not os.path.abspath(to_remove_file_path).startswith(storage_folder):
-         try:
-             with open(preserve_file_path, "r+") as f:
-                 old_backup = f.read()
-                 to_remove_file_path = to_remove_file_path + "\n"
-
-                 # To avoid removing the file path of different data nodes that are pointing
-                 # to the same file. We will only replace the file path only once.
-                 if old_backup.startswith(to_remove_file_path):
-                     new_backup = old_backup.replace(to_remove_file_path, "", 1)
-                 else:
-                     new_backup = old_backup.replace("\n" + to_remove_file_path, "\n", 1)
-
-                 if new_backup is not old_backup:
-                     f.seek(0)
-                     f.write(new_backup)
-                     f.truncate()
-         except Exception:
-             pass
+ if not (
+     preserve_file_path := os.getenv(
+         __BACKUP_FILE_PATH_ENVIRONMENT_VARIABLE_NAME, None
+     )
+ ):
+     return
+ storage_folder = os.path.abspath(Config.core.storage_folder) + os.sep
+ if not os.path.abspath(to_remove_file_path).startswith(storage_folder):
+     try:
+         with open(preserve_file_path, "r+") as f:
+             old_backup = f.read()
+             to_remove_file_path += "\n"
+
+             # To avoid removing the file path of different data nodes that are pointing
+             # to the same file. We will only replace the file path only once.
+             if old_backup.startswith(to_remove_file_path):
+                 new_backup = old_backup.replace(to_remove_file_path, "", 1)
+             else:
+                 new_backup = old_backup.replace("\n" + to_remove_file_path, "\n", 1)
+
+             if new_backup is not old_backup:
+                 f.seek(0)
+                 f.write(new_backup)
+                 f.truncate()
+     except Exception:
+         pass

Function _remove_from_backup_file refactored with the following changes:
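
The reshaped function above uses a guard clause: the missing-environment-variable case returns early, so the main body no longer needs an extra level of nesting, and the string concatenation becomes an augmented assignment (+=). A generic sketch of the early-return shape, with illustrative names rather than this repository's helpers:

    import os

    def remove_entry(path: str) -> None:
        # Guard clause: bail out immediately when no backup file is configured,
        # instead of wrapping the whole body in `if backup_file: ...`.
        # "BACKUP_FILE_PATH" is a hypothetical variable name for this sketch.
        if not (backup_file := os.getenv("BACKUP_FILE_PATH")):
            return

        with open(backup_file, "r+") as f:
            content = f.read()
            f.seek(0)
            f.write(content.replace(path + "\n", "", 1))
            f.truncate()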

- self._sorted_nodes = list(nodes for nodes in nx.topological_generations(dag))
+ self._sorted_nodes = list(nx.topological_generations(dag))

Function _DAG.__init__ refactored with the following changes:

- return len(self._sorted_nodes), max([len(i) for i in self._sorted_nodes])
+ return len(self._sorted_nodes), max(len(i) for i in self._sorted_nodes)

Function _DAG.__compute_size refactored with the following changes:

Comment on lines -132 to +136
- return list(nodes for nodes in nx.topological_generations(dag) if (Task in (type(node) for node in nodes)))
+ return [
+     nodes
+     for nodes in nx.topological_generations(dag)
+     if (Task in (type(node) for node in nodes))
+ ]

Function Submittable._get_sorted_tasks refactored with the following changes:

Comment on lines +145 to +148
+ elif elem := [x for x in self._subscribers if x.callback == callback]:
+     self._subscribers.remove(elem[0])
+ else:
+     raise ValueError

Function Submittable._remove_subscriber refactored with the following changes:

Comment on lines -51 to +52
if entity_type == "TASK" and "SCENARIO" in _id:
if entity_id in entity_data["tasks"]:
if entity_id in entity_data["tasks"]:
if entity_type == "TASK" and "SCENARIO" in _id:

Function __search_parent_ids refactored with the following changes:

Comment on lines +63 to -64
+ section_id = f"{entity_id}:SECTION"
  for _id, entity_data in data.items():
-     section_id = f"{entity_id}:SECTION"

Function __search_parent_config refactored with the following changes:

if entity_type in ["JOB", "VERSION"]:
if entity_type in {"JOB", "VERSION"}:

Function __migrate_entities refactored with the following changes:

Comment on lines -80 to +99
- for task in ts:
-     jobs.append(
-         cls._lock_dn_output_and_create_job(
-             task,
-             submission.id,
-             submission.entity_id,
-             callbacks=itertools.chain([submission._update_submission_status], callbacks or []),
-             force=force,  # type: ignore
-         )
+ jobs.extend(
+     cls._lock_dn_output_and_create_job(
+         task,
+         submission.id,
+         submission.entity_id,
+         callbacks=itertools.chain(
+             [submission._update_submission_status], callbacks or []
+         ),
+         force=force,  # type: ignore
+     )
+     for task in ts
+ )
  submission.jobs = jobs  # type: ignore

  cls._orchestrate_job_to_run_or_block(jobs)

  if Config.job_config.is_development:
      cls._check_and_execute_jobs_if_development_mode()
- else:
-     if wait:
-         cls.__wait_until_job_finished(jobs, timeout=timeout)
+ elif wait:
+     cls.__wait_until_job_finished(jobs, timeout=timeout)

Function _Orchestrator.submit refactored with the following changes:
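
Two changes are visible in the hunk above: the append-per-iteration loop becomes a single list.extend() over a generator expression, and the else: if wait: ladder collapses into elif wait:. A minimal sketch of the extend pattern, with generic names rather than this repository's API:

    def make_job(task: str) -> str:
        # Stand-in for the real job-creation call.
        return f"job-for-{task}"

    tasks = ["clean", "train", "report"]

    # Before: append one element per loop iteration.
    jobs = []
    for task in tasks:
        jobs.append(make_job(task))

    # After: feed a generator expression straight into extend().
    jobs2 = []
    jobs2.extend(make_job(task) for task in tasks)

    assert jobs == jobs2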

Comment on lines -144 to +145
- else:
-     if wait:
-         cls.__wait_until_job_finished(job, timeout=timeout)
+ elif wait:
+     cls.__wait_until_job_finished(job, timeout=timeout)

Function _Orchestrator.submit_task refactored with the following changes:

Comment on lines -161 to -165
- job = _JobManagerFactory._build_manager()._create(
-     task, itertools.chain([cls._on_status_change], callbacks or []), submit_id, submit_entity_id, force=force
+ return _JobManagerFactory._build_manager()._create(
+     task,
+     itertools.chain([cls._on_status_change], callbacks or []),
+     submit_id,
+     submit_entity_id,
+     force=force,
  )

- return job

Function _Orchestrator._lock_dn_output_and_create_job refactored with the following changes:

Comment on lines -187 to +188
- if timeout:
-     return (datetime.now() - start).seconds < timeout
- return True
+ return (datetime.now() - start).seconds < timeout if timeout else True

Function _Orchestrator.__wait_until_job_finished refactored with the following changes:

Comment on lines -258 to +257
- to_cancel_or_abandon_jobs = set([job])
+ to_cancel_or_abandon_jobs = {job}

Function _Orchestrator.cancel_job refactored with the following changes:

Comment on lines -193 to +197
- query = query + f" AND {table_name}.version IN ({','.join(['?']*len(versions))})"
+ query += f" AND {table_name}.version IN ({','.join(['?'] * len(versions))})"

Function _SQLRepository.__get_entities_by_config_and_owner refactored with the following changes:

  • Replace assignment with augmented assignment (aug-assign)
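
Augmented assignment rewrites x = x + y as x += y, avoiding the repeated target name. A tiny generic example mirroring the query-building line above (table name and versions invented for illustration):

    table_name = "scenario"
    versions = ["1.0", "1.1"]

    query = f"SELECT * FROM {table_name}"
    # Equivalent to: query = query + " AND ..."
    query += f" AND {table_name}.version IN ({','.join(['?'] * len(versions))})"
    print(query)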

- d = {}
- for idx, col in enumerate(cursor.description):
-     d[col[0]] = row[idx]
- return d
+ return {col[0]: row[idx] for idx, col in enumerate(cursor.description)}

Function dict_factory refactored with the following changes:
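
For context, dict_factory here looks like the classic sqlite3 row factory that maps column names to values, and the refactor replaces its manual loop with a dict comprehension. A self-contained sketch of how such a factory is typically wired up with the standard sqlite3 module (table and data invented for the example, not this repository's schema):

    import sqlite3

    def dict_factory(cursor, row):
        # One dict per row, keyed by column name (cursor.description[n][0]).
        return {col[0]: row[idx] for idx, col in enumerate(cursor.description)}

    conn = sqlite3.connect(":memory:")
    conn.row_factory = dict_factory
    conn.execute("CREATE TABLE task (id TEXT, version TEXT)")
    conn.execute("INSERT INTO task VALUES ('t1', '1.0')")

    print(conn.execute("SELECT * FROM task").fetchall())
    # [{'id': 't1', 'version': '1.0'}]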

Comment on lines -46 to +50
- migration_fct = Config.unique_sections[MigrationConfig.name].migration_fcts.get(version, {}).get(config_id)
- if migration_fct:
+ if (
+     migration_fct := Config.unique_sections[MigrationConfig.name]
+     .migration_fcts.get(version, {})
+     .get(config_id)
+ ):

Function __get_migration_fcts_to_latest refactored with the following changes:

Comment on lines -57 to +63
- if force:
-     cls.__logger.warning(
-         f"Option --force is detected, overriding the configuration of version {id} ..."
-     )
-     version.config = Config._applied_config
- else:
+ if not force:
      raise ConflictedConfigurationError()
+
+ cls.__logger.warning(
+     f"Option --force is detected, overriding the configuration of version {id} ..."
+ )
+ version.config = Config._applied_config

Function _VersionManager._get_or_create refactored with the following changes:

Comment on lines -215 to +223
- def __check_production_migration_config(self):
+ def __check_production_migration_config(cls):
      from ..config.checkers._migration_config_checker import _MigrationConfigChecker

      collector = _MigrationConfigChecker(Config._applied_config, IssueCollector())._check()
      for issue in collector._warnings:
-         self.__logger.warning(str(issue))
+         cls.__logger.warning(str(issue))
      for issue in collector._infos:
-         self.__logger.info(str(issue))
+         cls.__logger.info(str(issue))
      for issue in collector._errors:
-         self.__logger.error(str(issue))
+         cls.__logger.error(str(issue))

Function _VersionManager.__check_production_migration_config refactored with the following changes:
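
The only change in this last hunk is renaming the first parameter from self to cls, the conventional name when a method operates on the class itself; the decorator making it a classmethod sits outside the shown lines, so that part is inferred from context. A generic illustration with hypothetical names:

    import logging

    class _Checker:
        __logger = logging.getLogger("checker")

        @classmethod
        def report(cls, issues):
            # `cls` is the class object, so class-level attributes such as the
            # logger are reachable without creating an instance.
            for issue in issues:
                cls.__logger.warning(issue)

    # Hypothetical issue text, for illustration only.
    _Checker.report(["no migration function declared for config 'my_config'"])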

@LopeKinz LopeKinz merged commit d4c4ff0 into develop Dec 14, 2023
2 checks passed
@LopeKinz LopeKinz deleted the sourcery/develop branch December 14, 2023 07:21