
[DRAFT] Powell's algorithm #234

Draft · wants to merge 1 commit into base: main
1 change: 1 addition & 0 deletions argmin/src/solver/mod.rs
@@ -15,6 +15,7 @@ pub mod linesearch;
pub mod neldermead;
pub mod newton;
pub mod particleswarm;
pub mod powell;
pub mod quasinewton;
pub mod simulatedannealing;
pub mod trustregion;
97 changes: 97 additions & 0 deletions argmin/src/solver/powell/mod.rs
@@ -0,0 +1,97 @@
use crate::core::{
Member commented:
Minor: Copyright notice is missing

Author replied:
Thanks, will add this before the PR is final. Leaving this unresolved as a reminder for myself :)

ArgminFloat, CostFunction, DeserializeOwnedAlias, Executor, IterState, LineSearch,
OptimizationResult, SerializeAlias, Solver, State,
};
use argmin_math::{ArgminAdd, ArgminDot, ArgminSub, ArgminZeroLike};
#[cfg(feature = "serde1")]
use serde::{Deserialize, Serialize};
use std::mem;

#[derive(Clone)]
#[cfg_attr(feature = "serde1", derive(Serialize, Deserialize))]
pub struct PowellLineSearch<P, L> {
search_vectors: Vec<P>,
linesearch: L,
}

impl<P, L> PowellLineSearch<P, L> {
pub fn new(initial_search_vectors: Vec<P>, linesearch: L) -> Self {
PowellLineSearch {
search_vectors: initial_search_vectors,
linesearch,
}
}
}

impl<O, P, F, L> Solver<O, IterState<P, (), (), (), F>> for PowellLineSearch<P, L>
where
O: CostFunction<Param = P, Output = F>,
Member commented:
Since the problem is passed to a line search as well, you'll probably also require O to implement Gradient. But I'm not 100% sure how these requirements of the LS are passed along. Maybe we just need to give it a try.

Author replied:
I believe one of the advantages of Powell's method is that it doesn't need gradients. As long as the line search is given a direction to search along, it shouldn't need gradients to be computed; the calculation of gradients should be independent of the line search algorithm, which only searches along the provided search direction. I could be wrong, but I can write tests and see if it works. I will also implement a version of this using golden-section search and Brent's method.

P: Clone
+ SerializeAlias
+ DeserializeOwnedAlias
+ ArgminAdd<P, P>
+ ArgminZeroLike
+ ArgminSub<P, P>
+ ArgminDot<P, F>,
F: ArgminFloat,
L: Clone + LineSearch<P, F> + Solver<O, IterState<P, (), (), (), F>>,
{
const NAME: &'static str = "Powell-LS";

fn next_iter(
&mut self,
problem: &mut crate::core::Problem<O>,
mut state: IterState<P, (), (), (), F>,
) -> Result<(IterState<P, (), (), (), F>, Option<crate::core::KV>), anyhow::Error> {
(Trombach marked this conversation as resolved.)
let param = state
.take_param()
.ok_or_else(argmin_error_closure!(NotInitialized, "not initialized"))?; // TODO add Error message

// new displacement vector created from displacement vectors of line searches
let mut new_displacement = param.zero_like();
let mut best_direction: (usize, F) = (0, float!(0.0));

// init line search
let (ls_state, _) = self.linesearch.init(problem, state.clone())?;

// Loop over all search vectors and perform line optimization along each search direction
for (i, search_vector) in self.search_vectors.iter().enumerate() {
self.linesearch.search_direction(search_vector.clone());

let line_cost = ls_state.get_cost();

// Run solver
let OptimizationResult {
problem: _sub_problem,
state: sub_state,
..
} = Executor::new(problem.take_problem().unwrap(), self.linesearch.clone())
.configure(|state| state.param(param.clone()).cost(line_cost))
.ctrlc(false)
.run()?;

// update new displacement vector
let displacement = sub_state.get_best_param().unwrap().sub(&param);
let displacement_magnitude = displacement.dot(&displacement).sqrt();
new_displacement = new_displacement.add(&displacement);

// store the index of the search direction that produced the largest displacement
if best_direction.1 < displacement_magnitude {
best_direction.0 = i;
best_direction.1 = displacement_magnitude;
}
}

// replace best performing search direction with new search direction
let _ = mem::replace(
&mut self.search_vectors[best_direction.0],
new_displacement.clone(),
);

// set new parameters
let param = param.add(&new_displacement);
let cost = problem.cost(&param);

Ok((state.param(param).cost(cost?), None))
}
}
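To illustrate the loop structure the PR implements — line-optimize along each search vector, track the direction of largest displacement, replace it with the net displacement — here is a minimal, dependency-free sketch of Powell's method over `Vec<f64>`. It is not the argmin API: the helper names (`golden_section_step`, `powell`) are hypothetical, and a simple golden-section line search stands in for the `LineSearch` solver, which also demonstrates the point raised in review that no gradients are ever evaluated.

```rust
// Minimize f(x + t * dir) over t in [-10, 10] with golden-section search.
// No gradient of f is required, only function evaluations.
fn golden_section_step(f: &dyn Fn(&[f64]) -> f64, x: &[f64], dir: &[f64]) -> Vec<f64> {
    let phi = (5f64.sqrt() - 1.0) / 2.0;
    let eval = |t: f64| {
        let p: Vec<f64> = x.iter().zip(dir).map(|(xi, di)| xi + t * di).collect();
        f(&p)
    };
    let (mut a, mut b) = (-10.0, 10.0);
    while b - a > 1e-10 {
        let c = b - phi * (b - a);
        let d = a + phi * (b - a);
        if eval(c) < eval(d) { b = d } else { a = c }
    }
    let t = 0.5 * (a + b);
    x.iter().zip(dir).map(|(xi, di)| xi + t * di).collect()
}

// One Powell iteration per loop pass: line searches along every search
// vector, then the direction of largest displacement is replaced by the
// net displacement of the whole pass.
fn powell(f: &dyn Fn(&[f64]) -> f64, mut x: Vec<f64>, iters: usize) -> Vec<f64> {
    let n = x.len();
    // Initial search vectors: the coordinate directions.
    let mut dirs: Vec<Vec<f64>> = (0..n)
        .map(|i| (0..n).map(|j| if i == j { 1.0 } else { 0.0 }).collect())
        .collect();
    for _ in 0..iters {
        let x_start = x.clone();
        let mut best = (0usize, 0.0f64); // (index, displacement magnitude)
        for (i, d) in dirs.iter().enumerate() {
            let x_old = x.clone();
            x = golden_section_step(f, &x, d);
            let mag: f64 = x
                .iter()
                .zip(&x_old)
                .map(|(a, b)| (a - b).powi(2))
                .sum::<f64>()
                .sqrt();
            if mag > best.1 {
                best = (i, mag);
            }
        }
        // Replace the best-performing direction with the net displacement.
        let new_dir: Vec<f64> = x.iter().zip(&x_start).map(|(a, b)| a - b).collect();
        dirs[best.0] = new_dir;
    }
    x
}

fn main() {
    // A quadratic with its minimum at (1, 2); only f itself is evaluated.
    let f = |p: &[f64]| (p[0] - 1.0).powi(2) + (p[1] - 2.0).powi(2);
    let x = powell(&f, vec![0.0, 0.0], 5);
    println!("{:.6} {:.6}", x[0], x[1]);
}
```

For this separable quadratic the first pass already lands on the minimizer, which is consistent with the author's claim that a derivative-free line search is sufficient; the argmin version additionally threads the cost-function problem and iteration state through the `Executor`.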