ruby-mhl - A Ruby metaheuristics library


ruby-mhl is a scientific library that provides a fairly large array of advanced computational intelligence methods for continuous optimization.

More specifically, ruby-mhl currently supports several implementations of Genetic Algorithms (bitstring and integer vector genotype representations) and Particle Swarm Optimization (constrained PSO, quantum-inspired PSO, and a multi-swarm version of quantum-inspired PSO), extended with adaptation mechanisms to provide support for dynamic optimization problems.

ruby-mhl was designed for heavy-duty target functions, whose evaluation typically involves one or more long simulation runs, possibly defined on very complex domains (or search spaces). To this end, ruby-mhl lets you evaluate the target function concurrently, automatically taking advantage of the full parallelism offered by the processor if the application runs on a Ruby interpreter without a GIL, such as JRuby. (ruby-mhl will work with any Ruby interpreter, but JRuby is strongly recommended for performance reasons.)
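As a sketch of the idea (plain Ruby threads with illustrative names, not the mhl API), a population of candidate solutions can be evaluated concurrently like this:

```ruby
# Evaluate a (stand-in) heavy target function for 8 candidates concurrently.
# On JRuby these threads run in parallel; on MRI the GIL serializes them.
candidates = Array.new(8) { Array.new(2) { rand(-100..100) } }

# Stand-in for an expensive simulation-backed target function
target = lambda {|x| -(x[0] ** 2 + x[1] ** 2) }

threads = candidates.map do |c|
  Thread.new { [c, target.call(c)] }
end
evaluated = threads.map(&:value)

# Pick the candidate with the highest fitness
best = evaluated.max_by {|_, fitness| fitness }
```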

Installation

Stable version (recommended)

You can get the stable version of ruby-mhl by installing the mhl gem from RubyGems:

```
gem install mhl
```

or by adding:

```ruby
gem 'mhl'
```

to your application's Gemfile and running:

```
bundle install
```

Development version

If you want to try the development version of ruby-mhl, instead, just place this line:

```ruby
gem 'mhl', git: 'https://github.com/mtortonesi/ruby-mhl.git'
```

in your Gemfile and run:

```
bundle install
```

Genetic Algorithm (GA)

ruby-mhl provides a GA solver capable of working with either the traditional bitstring chromosome representation or an integer vector representation variant.

Example: Solving the parabola function with an integer vector GA

Here is an example demonstrating how to find the argument that minimizes the 2-dimensional parabola f(x) = x1² + x2² with a genetic algorithm:

```ruby
require 'mhl'

solver = MHL::GeneticAlgorithmSolver.new(
  population_size: 80,
  genotype_space_type: :integer,
  mutation_probability: 0.5,
  recombination_probability: 0.5,
  genotype_space_conf: {
    dimensions: 2,
    recombination_type: :intermediate,
    random_func: lambda { Array.new(2) { rand(100) } }
  },
  exit_condition: lambda {|generation,best| best[:fitness] == 0 }
)
# The solver maximizes fitness, so we negate the parabola to minimize it.
solver.solve(Proc.new {|x| -(x[0] ** 2 + x[1] ** 2) })
```

Particle Swarm Optimization (PSO)

ruby-mhl implements the constrained version of PSO, defined by equation 4.30 of [SUN11], which we report here for full clarity. The velocity and position update equation for particle i are:

```
V_{i,j}(t+1) = χ ( V_{i,j}(t) + C1 r_{i,j}(t) (P_{i,j}(t) - X_{i,j}(t)) + C2 R_{i,j}(t) (G_j(t) - X_{i,j}(t)) )
X_{i,j}(t+1) = X_{i,j}(t) + V_{i,j}(t+1)
```

In which X_i(t) = (X_{i,1}(t), ..., X_{i,N}(t)) is the particle location, whose components X_{i,j}(t) represent the decision variables of the problem; V_i(t) = (V_{i,1}(t), ..., V_{i,N}(t)) is a velocity vector which captures the movement of the particle; P_i(t) = (P_{i,1}(t), ..., P_{i,N}(t)) is a particle attractor representing the 'highest' (best) position that the particle has encountered so far; G(t) is the swarm attractor, representing the 'highest' (best) position that the entire swarm has encountered so far; r_{i,j}(t) and R_{i,j}(t) are random sequences uniformly sampled in the (0,1) interval; and C1 and C2 are constants.

Note that, in order to guarantee convergence, we must have:

```
χ = 2 / |2 - φ - sqrt(φ² - 4φ)|,   with φ = C1 + C2 and φ > 4
```

As a result, by default ruby-mhl sets C1 = C2 = 2.05 and calculates χ accordingly (approximately 0.72984), which is considered the best practice [BLACKWELL04]. For more information about this (much more than you'll ever want to know, believe me) please refer to [CLERC02].
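The constriction coefficient can be computed directly from C1 and C2; here is a minimal sketch in plain Ruby (a standalone calculation, not part of the mhl API):

```ruby
# Constriction coefficient for the Clerc-Kennedy constrained PSO,
# using the default ruby-mhl constants.
c1 = c2 = 2.05
phi = c1 + c2                                          # must be > 4 for convergence
chi = 2.0 / (2.0 - phi - Math.sqrt(phi ** 2 - 4.0 * phi)).abs
# chi is approximately 0.72984
```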

Example: Solving the parabola function with PSO

Here is an example demonstrating how to find the argument that minimizes the 2-dimensional parabola f(x) = x1² + x2² with PSO:

```ruby
require 'mhl'

solver = MHL::ParticleSwarmOptimizationSolver.new(
  swarm_size: 40, # 40 is the default swarm size
  constraints: {
    min: [ -100, -100 ],
    max: [  100,  100 ],
  },
  exit_condition: lambda {|iteration,best| best[:height].abs < 0.001 },
)
solver.solve(Proc.new {|x| -(x[0] ** 2 + x[1] ** 2) })
```

Quantum-Inspired Particle Swarm Optimization (QPSO)

Quantum-inspired PSO is another particularly interesting PSO variant. It aims at simulating interactions among a group of individuals by borrowing concepts from (the uncertainty typical of) quantum mechanics.

ruby-mhl implements the Quantum-inspired version of PSO (QPSO), Type 2, as defined by equation 4.82 of [SUN11], which we report here for full clarity.

```
X_{i,j}(t+1) = p_{i,j}(t) ± α |X_{i,j}(t) - C_j(t)| ln(1 / u_{i,j}(t+1))
p_{i,j}(t)   = φ_{i,j}(t) P_{i,j}(t) + (1 - φ_{i,j}(t)) G_j(t)
```

(α is the contraction-expansion coefficient.)

where P_i(t) is the personal best of particle i; C(t) is the mean of the personal bests of all the particles in the swarm; G(t) is the swarm attractor; and φ_{i,j}(t) and u_{i,j}(t+1) are sequences of random numbers uniformly distributed on the (0,1) interval.

Example: Solving the parabola function with QPSO

Here is an example demonstrating how to find the argument that minimizes the 2-dimensional parabola f(x) = x1² + x2² with QPSO:

```ruby
require 'mhl'

solver = MHL::QuantumPSOSolver.new(
  swarm_size: 40, # 40 is the default swarm size
  constraints: {
    min: [ -100, -100 ],
    max: [  100,  100 ],
  },
  exit_condition: lambda {|iteration,best| best[:height].abs < 0.001 },
)
solver.solve(Proc.new {|x| -(x[0] ** 2 + x[1] ** 2) })
```

License

MIT

Publications

ruby-mhl was used in the following scientific publications:

[TORTONESI16] M. Tortonesi, L. Foschini, "Business-driven Service Placement for Highly Dynamic and Distributed Cloud Systems", IEEE Transactions on Cloud Computing, 2016 (in print).

[TORTONESI15] M. Tortonesi, "Exploring Continuous Optimization Solutions for Business-driven IT Management Problems", in Proceedings of the 14th IFIP/IEEE Integrated Network Management Symposium (IM 2015) - Short papers track, 11-15 May 2015, Ottawa, Canada.

[GRABARNIK14] G. Grabarnik, L. Shwartz, M. Tortonesi, "Business-Driven Optimization of Component Placement for Complex Services in Federated Clouds", in Proceedings of the 14th IEEE/IFIP Network Operations and Management Symposium (NOMS 2014) - Mini-conference track, 5-9 May 2014, Krakow, Poland.

[FOSCHINI13] L. Foschini, M. Tortonesi, "Adaptive and Business-driven Service Placement in Federated Cloud Computing Environments", in Proceedings of the 8th IFIP/IEEE International Workshop on Business-driven IT Management (BDIM 2013), 27 May 2013, Ghent, Belgium.

If you are interested in ruby-mhl, please consider reading and citing them.

Acknowledgements

The research work that led to the development of ruby-mhl was supported in part by the DICET - INMOTO - ORganization of Cultural HEritage for Smart Tourism and Real-time Accessibility (OR.C.HE.S.T.R.A.) project, funded by the Italian Ministry of University and Research on Axis II of the National operative programme (PON) for Research and Competitiveness 2007-13 within the call 'Smart Cities and Communities and Social Innovation' (D.D. n.84/Ric., 2 March 2012).

References

[MUHLENBEIN93] H. Mühlenbein, D. Schlierkamp-Voosen, "Predictive Models for the Breeder Genetic Algorithm - I. Continuous Parameter Optimization", Journal of Evolutionary Computation, Vol. 1, No. 1, pp. 25-49, March 1993.

[SUN11] J. Sun, C.-H. Lai, X.-J. Wu, "Particle Swarm Optimisation: Classical and Quantum Perspectives", CRC Press, 2011.

[CLERC02] M. Clerc, J. Kennedy, "The particle swarm - explosion, stability, and convergence in a multidimensional complex space", IEEE Transactions on Evolutionary Computation, Vol. 6, No. 1, pp. 58-73, 2002, DOI: 10.1109/4235.985692

[BLACKWELL04] T. Blackwell, J. Branke, "Multi-swarm Optimization in Dynamic Environments", Applications of Evolutionary Computing, pp. 489-500, Springer, 2004. DOI: 10.1007/978-3-540-24653-4_50

[REZAEEJORDEHI13] A. Rezaee Jordehi, J. Jasni, "Parameter selection in particle swarm optimisation: a survey", Journal of Experimental & Theoretical Artificial Intelligence, Vol. 25, No. 4, pp. 527-542, 2013. DOI: 10.1080/0952813X.2013.782348

[CLERC12] M. Clerc, "Standard Particle Swarm Optimisation - From 2006 to 2011", available at: http://clerc.maurice.free.fr/pso/SPSO_descriptions.pdf

[LUKE15] S. Luke, "Essentials of Metaheuristics", 2nd Edition, available at: https://cs.gmu.edu/~sean/book/metaheuristics/, Online Version 2.2, October 2015.

[DEEP07] K. Deep, M. Thakur, "A new mutation operator for real coded genetic algorithms", Applied Mathematics and Computation, Vol. 193, No. 1, pp. 211-230, October 2007.