
Optimizers notes #21

Open · PrzeChoj opened this issue Aug 6, 2022 · 0 comments
Labels: enhancement (New feature or request), good first issue (Good for newcomers), help wanted (Extra attention is needed)

PrzeChoj (Owner) commented Aug 6, 2022

In this issue, I will share my thoughts on optimizers in find_MAP().

For an introduction to the topic, see vignette("Optimizers", package = "gips"), also available as a pkgdown page.
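For orientation, a typical call looks roughly like this (a sketch following the interface described in the package documentation; the toy data and parameter values are made up, not taken from this issue):

```r
library(gips)

# Toy data: 6 observations of a 4-dimensional vector.
Z <- matrix(rnorm(6 * 4), nrow = 6)
S <- cov(Z)

g <- gips(S, number_of_observations = 6)

# The optimizers discussed below are selected via the `optimizer` argument:
g_map <- find_MAP(g, max_iter = 10, optimizer = "MH")
```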

We encourage anyone to comment on these notes.

Our MH "walks" on permutations, not on groups

We want to find the cyclic group. In the reference paper, the process of walking on groups is referred to as the First approach in Section 4.1.1, and the process of walking on permutations as the Second approach in Section 4.1.2.

In the gips package, the Second approach is implemented as the "MH" optimizer. We examined the First approach thoroughly and decided not to implement it. We have not rejected it, though, and may still implement it in the future.
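For intuition, the Second approach boils down to a random walk on permutations whose proposal is a random transposition. A minimal sketch of one such step (an illustration only, not the package's internal code; `log_post` is a hypothetical stand-in for the a posteriori function):

```r
# One Metropolis-Hastings step on the space of permutations.
mh_step <- function(perm, log_post) {
  ij <- sample(length(perm), 2)   # propose a uniformly random transposition
  proposal <- perm
  proposal[ij] <- perm[rev(ij)]   # swap the two chosen elements
  # accept with probability min(1, posterior ratio):
  if (log(runif(1)) < log_post(proposal) - log_post(perm)) proposal else perm
}
```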

"BF" optimization

For perm_size < 10, find_MAP(optimizer = "BF") calculates the a posteriori function only for the group generators, while for 10 <= perm_size it calculates it for all permutations of the given size. Computing it for fewer permutations is faster, but we had to stop somewhere and decided to stop at perm_size == 10. We can always extend the perm_group_generators_list object generated in data-raw/perm_group_generators_list.R later.
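The core of the idea is a plain argmax over a candidate set. A minimal sketch (again with the hypothetical `log_post` stand-in; in the package, the candidate set would be the precomputed generators or all permutations, as described above):

```r
# Brute force: evaluate the a posteriori function on every candidate
# permutation and keep the argmax.
brute_force <- function(candidates, log_post) {
  scores <- vapply(candidates, log_post, numeric(1))
  candidates[[which.max(scores)]]
}

# Toy usage with a hand-listed candidate set of size 3:
cands <- list(c(1, 2, 3), c(2, 3, 1), c(3, 1, 2))
best <- brute_force(cands, function(perm) -sum((perm - c(2, 3, 1))^2))
```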

Other optimizers

We have implemented other optimizers that we decided not to add to gips yet. Those include:

  1. Random - In every iteration, draw a random permutation uniformly from all possible ones. It was spectacularly worse than the other optimizers.
  2. Simulated Annealing - A generalization of MH; it has an additional parameter that makes it easier or harder to move to a worse permutation (see the sketch after this list). The results were promising, but choosing the additional hyperparameter well was hard. We tested only small spaces (p < 30).
  3. Evolutionary Optimization - The one we implemented gave promising results in small spaces (for p < 30, the Evolutionary Optimization was better than "MH"). Still, for bigger spaces (p = 100), it was a complete disappointment. It also added a massive number of hyperparameters.
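A minimal sketch of the Simulated Annealing step from item 2, with the same hypothetical `log_post` stand-in; `temperature` is the additional hyperparameter that proved hard to tune (temperature == 1 recovers plain MH):

```r
# One Simulated Annealing step on permutations.
sa_step <- function(perm, log_post, temperature) {
  ij <- sample(length(perm), 2)
  proposal <- perm
  proposal[ij] <- perm[rev(ij)]   # propose a random transposition
  # dividing by the temperature makes downhill moves easier (high temperature)
  # or harder (low temperature):
  log_ratio <- (log_post(proposal) - log_post(perm)) / temperature
  if (log(runif(1)) < log_ratio) proposal else perm
}
```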

Ideas for future optimizers

  1. Greedy - In an iteration, it would go through all the transpositions, just like Hill Climbing, but would move to the first better permutation found (see the sketch below).
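A minimal sketch of that first-improvement rule (again with a hypothetical `log_post`; Hill Climbing would instead evaluate all transpositions and move to the best one):

```r
# One Greedy iteration: scan transpositions in a fixed order and move to
# the first one that improves the a posteriori value.
greedy_step <- function(perm, log_post) {
  current <- log_post(perm)
  p <- length(perm)
  for (i in seq_len(p - 1)) {
    for (j in seq(i + 1, p)) {
      proposal <- perm
      proposal[c(i, j)] <- perm[c(j, i)]  # apply transposition (i, j)
      if (log_post(proposal) > current) {
        return(proposal)                  # first improvement found
      }
    }
  }
  perm  # no improving transposition: a local maximum
}
```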
PrzeChoj added the enhancement label on Aug 6, 2022
PrzeChoj changed the title from "Possibly faster way of doing "full" optimization" to "Optimizers notes" on Sep 3, 2022
PrzeChoj added the help wanted and good first issue labels on Sep 3, 2022