
Allow the user to reach the search algorithm's internal optimizer #89

Merged: 1 commit merged into master on Oct 2, 2023

Conversation

engintoklu (Collaborator)

With this pull request, algorithms inheriting from the base class `GaussianSearchAlgorithm` now have a read-only property (named `optimizer`) that allows the user to reach the internal optimizer object employed by the search algorithm.

By reaching this internal optimizer, the user can read or modify its hyperparameters.

For example, the learning rate of an internal optimizer (e.g. of Adam or ClipUp) can be retrieved like this:

from evotorch.algorithms import PGPE

my_search_algorithm = PGPE(
    optimizer="adam",  # can also be "sgd" or "clipup"
    ...
)

lr = my_search_algorithm.optimizer.param_groups[0]["lr"]

If one wishes to update the learning rate, the following is possible:

my_search_algorithm.optimizer.param_groups[0]["lr"] = new_lr
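The read-then-write pattern above can be combined into a simple per-generation learning-rate schedule. The sketch below assumes the internal optimizer exposes a `torch.optim`-style `param_groups` list of dicts, as the examples above suggest; `FakeOptimizer` is a hypothetical minimal stand-in used only so the sketch runs without evotorch installed (with evotorch, `my_search_algorithm.optimizer` would take its place).

```python
# Hypothetical stand-in mimicking the torch.optim-style `param_groups`
# interface that the search algorithm's internal optimizer appears to expose.
# With evotorch installed, `my_search_algorithm.optimizer` would be used here.
class FakeOptimizer:
    def __init__(self, lr: float):
        # torch.optim-style optimizers store hyperparameters in
        # `param_groups`: a list of dicts, one dict per parameter group.
        self.param_groups = [{"lr": lr}]


optimizer = FakeOptimizer(lr=0.1)

for generation in range(5):
    # ... here one would run a generation, e.g. my_search_algorithm.step() ...
    # Decay the learning rate by 10% after each generation:
    optimizer.param_groups[0]["lr"] *= 0.9

print(round(optimizer.param_groups[0]["lr"], 6))  # 0.1 * 0.9**5
```

The same assignment works on any optimizer that follows the `param_groups` convention, which is why reading and writing `param_groups[0]["lr"]` is sufficient for scheduling without any extra API surface.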

Algorithms inheriting from the base class
`GaussianSearchAlgorithm` now have a property
that allows the user to reach the internal
optimizer object employed by the search algorithm.

By reaching this internal optimizer, the user can
now read or modify its hyperparameters.
@engintoklu engintoklu added the enhancement New feature or request label Sep 30, 2023
codecov bot commented Sep 30, 2023

Codecov Report

Attention: 20 lines in your changes are missing coverage. Please review.

Comparison is base (9d31d59) 77.83% compared to head (e79e1bc) 77.66%.

Additional details and impacted files
@@            Coverage Diff             @@
##           master      #89      +/-   ##
==========================================
- Coverage   77.83%   77.66%   -0.18%     
==========================================
  Files          49       49              
  Lines        7332     7373      +41     
==========================================
+ Hits         5707     5726      +19     
- Misses       1625     1647      +22     
Files                                            Coverage Δ
src/evotorch/algorithms/distributed/gaussian.py  83.00% <66.66%> (-0.25%) ⬇️
src/evotorch/optimizers.py                       77.20% <50.00%> (-10.55%) ⬇️

... and 1 file with indirect coverage changes


@Higgcz Higgcz merged commit d2bec95 into master Oct 2, 2023
4 checks passed
@Higgcz Higgcz deleted the feature/reach-optimizer branch October 2, 2023 15:22
2 participants