Python Implementation and Visualization of Numerical Optimization Techniques


rlax59us/Numerical_Optimization

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

25 Commits
 
 
 
 
 
 

Repository files navigation

Numerical_Optimization

Numerical optimization is a mathematical formulation that allows one to minimize or maximize a particular objective function subject to constraints on its variables (Nocedal and Wright, 2006).

Since generality is a key property of AI models, understanding numerical optimization techniques is important for finding generally optimal solutions to given AI problems.

This repository covers various optimization techniques: from univariate to multivariate problems, from unconstrained to constrained optimization, plus some stochastic global optimization techniques.

Optimization Techniques

  • Univariate Optimization (code)

    • Root-Finding Techniques (Bisection method, Newton's method, Regula falsi method, Secant method)
    • Comparison-Based Search Techniques (Fibonacci search method, Golden section search method)
  • Multivariate Optimization (code)

    • Derivative-Free Methods for Non-Smooth Functions (Nelder-Mead method, Powell's method)
    • Gradient-Based Methods (Steepest Descent, Newton's method, Quasi-Newton methods (SR1, BFGS))
    • Conjugate-Gradient Methods (linear; nonlinear: CG-FR, CG-PR, CG-HS)
    • Least-Squares Methods
      • Gauss-Newton method
      • Levenberg-Marquardt method
  • Constrained Optimization

    • I will add this later
  • Global Optimization

    • I will add Genetic Algorithms to this repository later
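The repository's own implementations are in the linked code; as an illustration only, here is a minimal sketch of the bisection method from the root-finding list above. The function name and interface are hypothetical, not the repository's API.

```python
def bisection(f, a, b, tol=1e-8, max_iter=200):
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = 0.5 * (a + b)          # midpoint of the current bracket
        fm = f(m)
        if fm == 0.0 or 0.5 * (b - a) < tol:
            return m
        if fa * fm < 0:            # root lies in [a, m]
            b, fb = m, fm
        else:                      # root lies in [m, b]
            a, fa = m, fm
    return 0.5 * (a + b)

# Example: the positive root of x^2 - 2 on [0, 2]
root = bisection(lambda x: x * x - 2.0, 0.0, 2.0)
```

Each iteration halves the bracket, so convergence is linear but guaranteed once a sign change is bracketed.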
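The comparison-based searches above bracket a minimum instead of a root. A minimal sketch of golden-section search (again hypothetical code, not the repository's):

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Minimize a unimodal function f on [a, b] by golden-section search."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0   # 1/phi, about 0.618
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:          # minimum is in [a, d]: old c becomes the new d
            b, d, fd = d, c, fc
            c = b - invphi * (b - a)
            fc = f(c)
        else:                # minimum is in [c, b]: old d becomes the new c
            a, c, fc = c, d, fd
            d = a + invphi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)

# Example: minimum of (x - 1)^2 on [0, 3]
xmin = golden_section(lambda x: (x - 1.0) ** 2, 0.0, 3.0)
```

The golden ratio spacing lets each iteration reuse one of the two interior evaluations, so only one new function evaluation is needed per step.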
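For the gradient-based methods listed above, a minimal sketch of steepest descent with a backtracking (Armijo) line search, assuming NumPy; the repository's versions may differ:

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=10000):
    """Minimize f by steepest descent with backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:     # stop when the gradient is near zero
            break
        t = 1.0
        # Backtrack until the Armijo sufficient-decrease condition holds.
        while f(x - t * g) > f(x) - 1e-4 * t * (g @ g):
            t *= 0.5
        x = x - t * g
    return x

# Example: minimize f(x, y) = x^2 + 10 y^2 (minimum at the origin)
f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
xmin_sd = steepest_descent(f, grad, [3.0, 1.0])
```

On ill-conditioned problems like this one, steepest descent zigzags and converges only linearly, which motivates the Newton and conjugate-gradient methods in the list.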
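The quasi-Newton entry (SR1, BFGS) replaces the exact Hessian with an approximation built from gradient differences. A hedged sketch of BFGS with an inverse-Hessian update and Armijo backtracking, assuming NumPy:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimize f with the BFGS inverse-Hessian update."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                       # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton search direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5                    # backtracking line search
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        yv = g_new - g
        sy = s @ yv
        if sy > 1e-12:                  # curvature condition: keep H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, yv)) @ H @ (I - rho * np.outer(yv, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Example: the same quadratic f(x, y) = x^2 + 10 y^2
f = lambda x: x[0] ** 2 + 10.0 * x[1] ** 2
grad = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
xmin_bfgs = bfgs(f, grad, [3.0, 1.0])
```

Skipping the update when the curvature condition fails is a common safeguard; SR1 differs only in the rank-one update formula and does not guarantee positive definiteness.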
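For the least-squares entry, a minimal sketch of Levenberg-Marquardt as damped Gauss-Newton with a simple accept/reject rule for the damping parameter. The fitting problem below is synthetic data invented for illustration, assuming NumPy:

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-3, tol=1e-10, max_iter=500):
    """Minimize 0.5 * ||residual(x)||^2 by damped Gauss-Newton steps."""
    x = np.asarray(x0, dtype=float)
    cost = 0.5 * np.sum(residual(x) ** 2)
    for _ in range(max_iter):
        r, J = residual(x), jacobian(x)
        # Solve the damped normal equations (J^T J + lam * I) p = -J^T r.
        p = np.linalg.solve(J.T @ J + lam * np.eye(x.size), -J.T @ r)
        new_cost = 0.5 * np.sum(residual(x + p) ** 2)
        if new_cost < cost:        # step accepted: move and trust the model more
            x, cost = x + p, new_cost
            lam *= 0.5
            if np.linalg.norm(p) < tol:
                break
        else:                      # step rejected: increase the damping
            lam *= 2.0
    return x

# Synthetic example: fit y = a * exp(b * t) with true parameters (a, b) = (2, 0.5)
t = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * np.exp(0.5 * t)
residual = lambda x: x[0] * np.exp(x[1] * t) - y
jacobian = lambda x: np.column_stack([np.exp(x[1] * t),
                                      x[0] * t * np.exp(x[1] * t)])
params = levenberg_marquardt(residual, jacobian, np.array([1.5, 0.3]))
```

With lam near zero this reduces to the Gauss-Newton step; with large lam it approaches a short steepest-descent step, which is what makes the method robust far from the solution.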