
Add tuning history to OptHistory #228

Draft · MorrisNein wants to merge 17 commits into main

Conversation

@MorrisNein (Collaborator) commented Oct 25, 2023

Adds an option to save a graph to OptHistory after each objective evaluation during tuning (solves #226).

Allows tuners to work with Individual and Fitness for graph comparison and history saving.

Changes OptHistory fields:

  • adds tuning_result
  • renames final_choices -> evolution_results
  • renames archive_history -> evolution_best_archive

Preserves backward compatibility with previously saved histories.

Fixes a previously unknown bug with multi-objective tuning.
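To illustrate the intent of the change, here is a minimal, self-contained sketch of how a tuner could log every evaluated graph into the history and record the final tuning result. All class and method names below are simplified stand-ins, not GOLEM's actual API:

```python
from dataclasses import dataclass, field
from typing import Any, Callable, List, Optional

@dataclass
class Individual:
    """Stand-in for GOLEM's Individual: a graph plus its fitness."""
    graph: Any
    fitness: float

@dataclass
class OptHistory:
    """Stand-in for OptHistory with the new tuning-related field."""
    generations: List[List[Individual]] = field(default_factory=list)
    tuning_result: Optional[Individual] = None  # new field from this PR

    def add_to_history(self, individuals: List[Individual], label: str = '') -> None:
        self.generations.append(list(individuals))

def tune(history: OptHistory, candidates: List[Any],
         objective: Callable[[Any], float]) -> Individual:
    """Evaluate each candidate graph, log the batch, store the best result."""
    evaluated = [Individual(g, objective(g)) for g in candidates]
    history.add_to_history(evaluated, label='tuning')
    best = min(evaluated, key=lambda ind: ind.fitness)  # minimisation
    history.tuning_result = best
    return best
```

The point is that tuning evaluations become first-class history entries (Individuals with Fitness) rather than raw metric values kept only inside the tuner.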

@aim-pep8-bot commented Oct 25, 2023

Hello @MorrisNein! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

There are currently no PEP 8 issues detected in this Pull Request. Cheers! 🍻

Comment last updated at 2023-11-04 12:07:43 UTC

@pep8speaks commented Oct 25, 2023

Hello @MorrisNein! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:

Line 38:25: E131 continuation line unaligned for hanging indent

Comment last updated at 2023-11-04 12:07:51 UTC
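For context, E131 flags a continuation line whose indent does not match the hanging indent already established. The exact code on line 38 is not shown here, so this is a generic illustration with a hypothetical helper:

```python
def evaluate(graph, label):
    # hypothetical stand-in for whatever call triggered the warning
    return f'{graph}:{label}'

# E131 "continuation line unaligned for hanging indent": after an opening
# bracket followed by a newline, every continuation line must share the
# same indent level.
flagged = evaluate(
    'pipeline',
        label='tuning_result')  # second continuation line indented differently

# Fixed: all continuation lines aligned to one hanging indent.
fixed = evaluate(
    'pipeline',
    label='tuning_result',
)
```

Both forms are valid Python and produce the same value; only the first is flagged by pycodestyle.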

@MorrisNein MorrisNein linked an issue Oct 25, 2023 that may be closed by this pull request
@codecov-commenter commented Oct 26, 2023

Codecov Report

Merging #228 (02e3ecd) into main (29cd362) will increase coverage by 0.13%.
Report is 3 commits behind head on main.
The diff coverage is 92.66%.

@@            Coverage Diff             @@
##             main     #228      +/-   ##
==========================================
+ Coverage   72.16%   72.29%   +0.13%     
==========================================
  Files         135      136       +1     
  Lines        8061     8184     +123     
==========================================
+ Hits         5817     5917     +100     
- Misses       2244     2267      +23     
Files Coverage Δ
golem/core/optimisers/genetic/gp_optimizer.py 98.80% <100.00%> (+0.01%) ⬆️
...core/optimisers/opt_history_objects/opt_history.py 76.37% <100.00%> (+0.79%) ⬆️
golem/core/optimisers/populational_optimizer.py 95.74% <100.00%> (+0.04%) ⬆️
golem/core/tuning/hyperopt_tuner.py 97.61% <100.00%> (+0.05%) ⬆️
golem/core/tuning/iopt_tuner.py 100.00% <100.00%> (ø)
golem/core/tuning/optuna_tuner.py 93.10% <100.00%> (+0.08%) ⬆️
golem/core/tuning/sequential.py 95.45% <100.00%> (+0.06%) ⬆️
golem/core/tuning/simultaneous.py 90.14% <100.00%> (+2.81%) ⬆️
...em/serializers/coders/opt_history_serialization.py 98.76% <100.00%> (+0.03%) ⬆️
...em/visualisation/opt_history/graphs_interactive.py 0.00% <ø> (ø)
... and 6 more

... and 21 files with indirect coverage changes

MorrisNein added a commit to aimclub/FEDOT that referenced this pull request Oct 26, 2023
@kasyanovse kasyanovse self-requested a review October 26, 2023 11:06
golem/core/tuning/tuner_interface.py (3 review threads, outdated, resolved)
Comment on lines 157 to 158:

 for tuned_graph in tuned_graphs:
-    obtained_metric = self.get_metric_value(graph=tuned_graph)
+    obtained_metric = self.evaluate_graph(graph=tuned_graph, label='tuning_result')
Collaborator:

I don't quite follow what is going on here)) Will this loop create new generations in the history for each evaluated graph?

Collaborator Author:

> Will this loop create new generations in the history for each evaluated graph?

Yes, but it shouldn't. Fixed by separating graph evaluation from history saving.

Collaborator:

Am I right that there is a test checking that the number of new generations matches the number of tuner steps?
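The fix discussed in this thread (one history generation per tuning step, not one per evaluated graph) can be sketched as follows; the function name and the plain-tuple history structure are illustrative, not GOLEM's API:

```python
from typing import Any, Callable, List, Tuple

# one inner list per recorded generation
History = List[List[Tuple[Any, float]]]

def evaluate_then_log(graphs: List[Any],
                      objective: Callable[[Any], float],
                      history: History) -> List[float]:
    """Evaluate every tuned graph first, then record a single generation,
    instead of appending a new generation inside the evaluation loop."""
    metrics = [objective(g) for g in graphs]
    history.append(list(zip(graphs, metrics)))  # exactly one new generation
    return metrics
```

With this separation, a test can assert that the number of history generations equals the number of tuner steps regardless of how many graphs each step evaluates.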

test/unit/optimizers/test_composing_history.py (2 review threads, outdated, resolved)
test/unit/tuning/test_tuning.py (1 review thread, resolved)
@MorrisNein (Collaborator Author) commented Oct 30, 2023

Added the optimization history classes to the tuners, now that tuners can save their results into the history.

In my opinion, comparing graphs with each other has become more convenient, especially for MultiObjFitness: there is no need to convert metrics into the MultiObjFitness class and back.
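The convenience of comparing graphs through Fitness objects can be illustrated with a minimal stand-in for a multi-objective fitness wrapper (GOLEM's real MultiObjFitness API may differ; all objectives here are minimised):

```python
from typing import Sequence

class MultiObjFitness:
    """Minimal stand-in for a multi-objective fitness wrapper."""
    def __init__(self, values: Sequence[float]):
        self.values = tuple(values)

    def dominates(self, other: 'MultiObjFitness') -> bool:
        # Pareto dominance: no objective worse, at least one strictly better.
        return (all(a <= b for a, b in zip(self.values, other.values))
                and any(a < b for a, b in zip(self.values, other.values)))

# Tuners can compare candidates directly via Fitness objects instead of
# converting raw metric tuples back and forth.
a = MultiObjFitness([0.10, 3.0])
b = MultiObjFitness([0.12, 3.0])
c = MultiObjFitness([0.05, 4.0])
```

Here a dominates b, while a and c are mutually non-dominated, which is exactly the comparison logic a multi-objective tuner needs.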


 import numpy as np
 from iOpt.method.listener import ConsoleFullOutputListener
 from iOpt.problem import Problem
 from iOpt.solver import Solver
 from iOpt.solver_parametrs import SolverParameters
-from iOpt.trial import Point, FunctionValue
+from iOpt.trial import FunctionValue, Point
Collaborator:

Why hasn't the option to save to the history been added to this tuner?

Collaborator Author:

For now, the initial and final graphs are saved via initial_check and final_check.

But a function for saving intermediate results could be passed into GolemProblem here. That's probably what I'll do.
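The callback idea described above could look roughly like this; the class below is a hypothetical sketch of the iOpt problem wrapper, not its actual interface:

```python
from typing import Any, Callable, Optional

class GolemProblemSketch:
    """Hypothetical sketch: the problem wrapper receives an optional
    callback and reports every intermediate objective evaluation."""
    def __init__(self, objective: Callable[[Any], float],
                 on_evaluate: Optional[Callable[[Any, float], None]] = None):
        self.objective = objective
        self.on_evaluate = on_evaluate

    def calculate(self, graph: Any) -> float:
        value = self.objective(graph)
        if self.on_evaluate is not None:
            # e.g. the tuner passes a closure that appends to OptHistory
            self.on_evaluate(graph, value)
        return value
```

The tuner would supply on_evaluate, so the problem wrapper itself stays unaware of history internals.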

@@ -36,8 +43,7 @@ def __init__(self,
                  default_save_dir: Optional[os.PathLike] = None):
         self._objective = objective or ObjectiveInfo()
         self._generations: List[Generation] = []
-        self.archive_history: List[List[Individual]] = []
-        self._tuning_result: Optional[Graph] = None
+        self.evolution_best_archive: List[List[Individual]] = []
Collaborator:

I don't quite understand why this renaming is needed.

Collaborator Author:

I haven't implemented archive_history tracking for tuning yet, because during evolution it is maintained by the Pareto front inside the optimizer. So the field name explicitly reflects that it belongs to the evolution history, not to tuning.

Honestly, I don't know what the best approach is here.

Collaborator:

There definitely shouldn't be a reference to evolution, for the sake of abstraction.
I think the idea that archive_history is the main history, with a separate one for fine-tuning, will make sense to the user.

class OptHistoryLabels(str, Enum):
    initial_assumptions = 'initial_assumptions'
    extended_initial_assumptions = 'extended_initial_assumptions'
    evolution_results = 'evolution_results'
Collaborator:

Likewise, there's no need to mention evolution explicitly.

@@ -61,8 +67,8 @@ def add_to_history(self, individuals: Sequence[Individual], generation_label: Op
         generation = Generation(individuals, self.generations_count, generation_label, generation_metadata)
         self.generations.append(generation)

-    def add_to_archive_history(self, individuals: Sequence[Individual]):
-        self.archive_history.append(list(individuals))
+    def add_to_evolution_best_archive(self, individuals: Sequence[Individual]):
Collaborator:

Maybe make a single method and just pass the needed label as a parameter?
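The suggestion above (one generic method keyed by a label instead of separate add_to_archive_history / add_to_evolution_best_archive methods) could be sketched like this; the class and method names are illustrative only:

```python
from typing import Any, Dict, List, Sequence

class HistorySketch:
    """Sketch of a single label-keyed archive method."""
    def __init__(self) -> None:
        self.archives: Dict[str, List[List[Any]]] = {}

    def add_to_archive(self, individuals: Sequence[Any], label: str) -> None:
        # One method serves every archive kind; the label picks the bucket.
        self.archives.setdefault(label, []).append(list(individuals))
```

This keeps the public surface small and lets new archive kinds (e.g. a future tuning archive) be added without new methods.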

        self.init_metric = None
        self.obtained_metric = None
        self.init_individual = None
        self.obtained_individual = None
Collaborator:

final would be better.

@nicl-nno (Collaborator):

It would be good to show in one of the examples how to use the new functionality. @YamLyubov had an example with graph coloring, for instance.

@MorrisNein MorrisNein marked this pull request as draft December 14, 2023 18:43
Successfully merging this pull request may close these issues.

Save tuning history as historical individuals
7 participants