This work was part of a project I did during my undergraduate research internship in the summer of 2018 at the Centre for Process Integration, The University of Manchester, on stochastic optimization.
Simulated Annealing (SA) is inspired by the physical process performed on metals or glass called annealing. This is a heat treatment in which a solid is heated to a high temperature and then cooled down very slowly. This process causes the internal configuration of the solid to be rearranged until it reaches its minimum lattice energy state (Jacobson, 2010). The algorithm is usually used to address discrete optimization problems; however, it can also be applied to continuous problems.
The general idea behind SA is to store the "best" visited point at each iteration. However, instead of always taking the actual best point found from iteration to iteration, some worse points are accepted according to the probability given by:

P = exp(-Δf / T)

where P stands for the probability of accepting a worse point, Δf is the difference in function value between the proposed point and the current best point, and T is the temperature, which is reduced from iteration to iteration.
The reason for accepting "worse" steps (with less fitness than the current "best" point) is to escape from local minima; otherwise, if only the fitter points are accepted, the algorithm will be easily stuck in a local optimum. The proposed point to move to is chosen randomly in the neighborhood of the current point. As can be seen in the previous equation, the probability is dependent on the temperature, which is progressively reduced. In this way, as the iterations progress, the probability of accepting worse points decreases.
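The acceptance rule described above is the classic Metropolis criterion, and can be sketched in a few lines. Note that `accept` is a hypothetical helper name used for illustration, not part of this repository's API:

```python
import math
import random

def accept(delta_f, temperature):
    """Metropolis criterion: always accept an improvement; accept a worse
    point with probability exp(-delta_f / temperature)."""
    if delta_f <= 0:  # proposed point is at least as good as the current one
        return True
    return random.random() < math.exp(-delta_f / temperature)
```

As the temperature shrinks, `exp(-delta_f / temperature)` approaches zero for any worsening step, so the algorithm gradually stops accepting worse points.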
The function requires Python 3.0 or a more recent version.
`SA(f, p_init, bounds, t_init, t_red, rep_M, radii)`
1. The function to be optimized. The function needs to be of the form `f(x)`, where `x` is the point at which it is evaluated.
2. The initial location of the first point for the first iteration.
3. The bounds for each dimension of the function. This has to be a list of the form `[(lb1, ub1), (lb2, ub2), ...]`.
4. The initial temperature, which is reduced to zero by successive subtraction of `t_red`.
5. The temperature reduction applied at each iteration, of the form `T_new = T - t_red`.
6. The number of randomly proposed locations tried at each temperature value.
7. The radius parameter used to define the neighborhood of the current best point. This value divides the total extension of the bounds. For instance, if the distance between the lower bound and upper bound is 10, and the radii is set to 2, the neighborhood will be defined with a distance of 5 from the current point.
Optimum: (class) Results with:
Optimum.f: (float) The best value of the function found in the optimization
Optimum.x: (array) The best point in which the function was evaluated
Optimum.traj_f: (array) Trajectory of function values
Optimum.traj_x: (array) Trajectory of positions
Optimum.traj_t: (array) Trajectory of temperatures
- In this version, the neighborhood of the current best point is always restricted to remain within the bounds provided by the user.
- The stopping criterion in this version is the temperature value reaching zero.
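The behavior described above (random neighbors within a bounded radius, Metropolis acceptance, and temperature reduced by subtraction until it reaches zero) can be sketched as a minimal self-contained implementation. This is an illustration of the documented interface under those assumptions, not the repository's actual code:

```python
import math
import random

class Optimum:
    """Result container mirroring the fields described above."""
    def __init__(self):
        self.f = float("inf")
        self.x = None
        self.traj_f, self.traj_x, self.traj_t = [], [], []

def SA(f, p_init, bounds, t_init, t_red, rep_M, radii):
    result = Optimum()
    current = list(p_init)
    f_current = f(current)
    result.f, result.x = f_current, list(current)
    t = t_init
    while t > 0:                      # stop once the temperature reaches zero
        for _ in range(rep_M):        # rep_M proposals per temperature value
            # Propose a random neighbor within (range / radii) of the
            # current point, clipped so it stays inside the bounds.
            proposal = []
            for xi, (lb, ub) in zip(current, bounds):
                step = (ub - lb) / radii
                proposal.append(min(ub, max(lb, xi + random.uniform(-step, step))))
            f_prop = f(proposal)
            delta = f_prop - f_current
            # Metropolis rule: accept improvements always, worse points
            # with probability exp(-delta / t).
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current, f_current = proposal, f_prop
            if f_current < result.f:  # keep the best point visited so far
                result.f, result.x = f_current, list(current)
        result.traj_f.append(result.f)
        result.traj_x.append(list(result.x))
        result.traj_t.append(t)
        t -= t_red                    # successive subtraction of t_red
    return result
```

For example, `SA(lambda x: x[0]**2, [4.0], [(-5.0, 5.0)], 10.0, 0.5, 20, 4)` should drive the best point close to the minimum at `x = 0`.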
Jacobson, A. G. (2010). Simulated Annealing. In M. Gendreau & J.-Y. Potvin (Eds.), Handbook of Metaheuristics (pp. 1-39). International Series in Operations Research & Management Science.
This repository is distributed under an MIT License.