This repository explores several optimization techniques for both unconstrained and constrained problems. The methods are implemented and visualized in a Jupyter Notebook, providing insight into their convergence behavior and efficiency. They were studied as part of the MASD course.
The notebook is divided into two main parts:
- **Unconstrained Optimization Methods**
  - Gradient Descent with Fixed Step Size
  - Gradient Descent with Optimal Step Size
  - Nesterov’s Accelerated Gradient Descent
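As a minimal sketch of the three unconstrained methods (illustrative only, not the notebook's actual code), consider a 2-D convex quadratic `f(x) = 0.5 xᵀAx - bᵀx`; all names and values below are assumptions made for this example:

```python
import numpy as np

# Illustrative 2-D test problem: f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 0.5], [0.5, 1.0]])  # symmetric positive definite
b = np.array([1.0, 2.0])
grad = lambda x: A @ x - b              # gradient of f

def gd_fixed(x0, alpha=0.1, iters=200):
    """Gradient descent with a fixed step size alpha (needs alpha < 2/L)."""
    x = x0.copy()
    for _ in range(iters):
        x = x - alpha * grad(x)
    return x

def gd_optimal(x0, iters=200):
    """Steepest descent with the exact line-search step: for a quadratic,
    alpha_k = g^T g / (g^T A g) minimizes f along the gradient direction."""
    x = x0.copy()
    for _ in range(iters):
        g = grad(x)
        if g @ g < 1e-24:               # converged; avoid division by zero
            break
        x = x - (g @ g) / (g @ A @ g) * g
    return x

def nesterov(x0, alpha=0.1, iters=200):
    """Nesterov's accelerated gradient with the classical t_k momentum."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_next = y - alpha * grad(y)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + (t - 1.0) / t_next * (x_next - x)
        x, t = x_next, t_next
    return x
```

With a fixed step, convergence requires `alpha < 2/L` where `L` is the largest eigenvalue of `A`; the line-search variant removes that tuning at the cost of one extra quadratic form per iteration.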
- **Constrained Optimization Methods**
  - Projected Gradient Descent
  - Exterior Penalization Method
  - Uzawa’s Algorithm
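A compact sketch of the three constrained methods (again illustrative, not the notebook's code): projected gradient descent is shown with a unit-ball constraint, while the penalty and Uzawa sketches handle a linear equality constraint `cᵀx = d` on the same quadratic; all names and values are assumptions for this example:

```python
import numpy as np

# Same illustrative quadratic f(x) = 0.5 x^T A x - b^T x as above,
# now with a constraint: ||x|| <= 1 (projected GD) or c^T x = d (others).
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, 2.0])
c = np.array([1.0, 1.0])
d = 1.0
grad = lambda x: A @ x - b

def projected_gd(x0, alpha=0.1, iters=500):
    """Gradient step followed by Euclidean projection onto the unit ball."""
    x = x0.copy()
    for _ in range(iters):
        x = x - alpha * grad(x)
        x = x / max(1.0, np.linalg.norm(x))  # projection onto {||x|| <= 1}
    return x

def exterior_penalty(mu=1e6):
    """Minimize f(x) + (mu/2)(c^T x - d)^2; for a quadratic this is a single
    linear solve, and x(mu) -> constrained minimizer as mu -> infinity."""
    return np.linalg.solve(A + mu * np.outer(c, c), b + mu * d * c)

def uzawa(rho=0.5, iters=200):
    """Dual ascent: minimize the Lagrangian in x (closed form here),
    then take a gradient ascent step on the multiplier."""
    lam = 0.0
    for _ in range(iters):
        x = np.linalg.solve(A, b - lam * c)  # argmin_x L(x, lam)
        lam += rho * (c @ x - d)             # ascent step on the dual
    return x
```

The exterior penalty only satisfies the constraint in the limit `mu → ∞`, whereas Uzawa's algorithm converges to the exact KKT point for suitable `rho`.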
Each method's performance is evaluated using contour plots and numerical results, comparing:
- Convergence behavior
- Number of iterations
- Accuracy of the computed solution
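As a minimal illustration of the iteration-count comparison (an assumed setup, not the notebook's actual experiments), one can count how many fixed-step gradient-descent iterations a given step size needs to reach a gradient-norm tolerance:

```python
import numpy as np

# Illustrative quadratic test problem, as in the sketches above.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, 2.0])

def iterations_to_tol(alpha, tol=1e-6, max_iters=10_000):
    """Count fixed-step gradient-descent iterations until ||grad f|| <= tol."""
    x = np.zeros(2)
    for k in range(max_iters):
        g = A @ x - b
        if np.linalg.norm(g) <= tol:
            return k
        x = x - alpha * g
    return max_iters
```

A larger (still stable) step size reaches the tolerance in markedly fewer iterations, which is the kind of trade-off the contour plots make visible.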