Numerical Optimization Techniques

A set of Jupyter notebooks that investigate and compare the performance of:

  • Numerical unconstrained optimization techniques (univariate search, Powell's method, and gradient descent with both fixed and optimal step sizes) against these benchmark functions: De Jong's function in 2D, Rosenbrock's valley in 2D, Rastrigin's function in 2D, Easom's function, and Branin's function.
  • A numerical constrained optimization technique, the Exterior Penalty method, against several self-constructed objective functions to be minimized or maximized.
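As a minimal sketch of one of the unconstrained techniques, fixed-step gradient descent can be applied to Rosenbrock's valley in 2D. The function names, step size, and iteration counts below are illustrative choices, not taken from the notebooks:

```python
import numpy as np

# Rosenbrock's valley in 2D: f(x, y) = (1 - x)^2 + 100 (y - x^2)^2,
# with its global minimum f(1, 1) = 0 at the bottom of a curved valley.
def rosenbrock(p):
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

def rosenbrock_grad(p):
    x, y = p
    return np.array([
        -2 * (1 - x) - 400 * x * (y - x ** 2),  # df/dx
        200 * (y - x ** 2),                     # df/dy
    ])

def gradient_descent(grad, x0, step=1e-3, tol=1e-6, max_iter=200_000):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop when the gradient is nearly zero
            break
        x = x - step * g
    return x

x_min = gradient_descent(rosenbrock_grad, x0=[-1.0, 1.0])
print(x_min)  # slowly approaches the minimizer (1, 1)
```

The optimal-step variant replaces the fixed `step` with a one-dimensional line search along the negative gradient at each iteration, which is where the univariate search methods come into play.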
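The Exterior Penalty method can likewise be sketched on a small self-contained problem. Both the example problem and the use of SciPy's Nelder-Mead solver for the inner unconstrained minimizations are assumptions for illustration, not details from the notebooks:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem (hypothetical, not from the notebooks):
# minimize f(x, y) = (x - 2)^2 + (y - 2)^2  subject to  x + y <= 2.
# The constrained minimum is at (1, 1) with f = 2.
def f(p):
    return (p[0] - 2) ** 2 + (p[1] - 2) ** 2

def g(p):  # constraint written in the standard form g(p) <= 0
    return p[0] + p[1] - 2

def exterior_penalty(f, g, x0, r0=1.0, growth=10.0, n_outer=8):
    """Exterior penalty: minimize f(x) + r * max(0, g(x))^2 for growing r."""
    x = np.asarray(x0, dtype=float)
    r = r0
    for _ in range(n_outer):
        # Penalized objective; infeasible points pay a quadratic penalty.
        pen = lambda p, r=r: f(p) + r * max(0.0, g(p)) ** 2
        x = minimize(pen, x, method="Nelder-Mead").x  # warm-started inner solve
        r *= growth  # tighten the penalty each outer iteration
    return x

x_star = exterior_penalty(f, g, x0=[0.0, 0.0])
print(x_star)  # approaches the constrained minimizer (1, 1)
```

Because the penalty is zero inside the feasible region, each penalized minimizer lies slightly outside it and is driven toward the boundary as r grows, which is what makes the method "exterior".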

This project was developed for the course Optimization Techniques in the Fall 2022 semester at the Computer and Communications Engineering department, Faculty of Engineering, Alexandria University, under the supervision of Dr. Yasmine Abou Elseoud.

Prerequisites

This project was developed in the following environment:

  • Jupyter Notebook
  • Miniconda
  • Python 3.11.5

Installing

1- Clone the repository to your local machine:

git clone https://github.com/MohEsmail143/numerical-optimization-techniques.git

2- Launch Jupyter Notebook.

3- Open the Jupyter notebooks part1.ipynb and part2.ipynb.

License

This project is licensed under the MIT License - see the LICENSE.md file for details.