This repository contains Python implementations of convex optimization algorithms based on gradient descent and Newton's method. Convex optimization is a fundamental problem in many fields, including machine learning, operations research, and control systems.
The implemented algorithms efficiently minimize convex objective functions and can be applied to problems such as linear regression, logistic regression, and support vector machines.
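As an illustration of the kind of problem these algorithms solve, the sketch below uses plain gradient descent to fit a least-squares linear regression on synthetic data. It is a minimal example, not the code from the notebooks; the function name, learning rate, and iteration count are illustrative choices.

```python
import numpy as np

def gradient_descent(X, y, lr=0.1, n_iters=1000):
    """Minimize the least-squares objective f(w) = (1/2n) * ||Xw - y||^2."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        # Gradient of the least-squares loss with respect to w
        grad = X.T @ (X @ w - y) / n
        # Step in the direction of the negative gradient
        w -= lr * grad
    return w

# Synthetic noiseless data: y = 1 + 2*x, with an intercept column in X
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x

w = gradient_descent(X, y)
print(w)  # close to [1.0, 2.0]
```

Because the data is noiseless and the objective is strongly convex, the iterates converge to the exact coefficients; on real data one would also monitor the objective value and tune the step size.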
The problem descriptions are in Coding_Assignment_Problem.pdf.
The following algorithms are implemented in this repository:
- Gradient Descent: A first-order optimization algorithm that iteratively updates the parameters in the direction of the negative gradient of the objective function.
- Newton's Method: A second-order optimization algorithm that approximates the objective function locally as a quadratic and updates the parameters accordingly.
These algorithms are commonly used in convex optimization due to their efficiency and effectiveness in finding optimal solutions.
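The Newton update described above can be sketched as follows for a one-dimensional strictly convex function. This is a hedged illustration, not the assignment's code: the test function f(x) = exp(x) + x², the tolerance, and the iteration cap are all assumptions made for the example.

```python
import numpy as np

def newton_method(grad, hess, x0, tol=1e-10, max_iter=50):
    """Newton's method for a twice-differentiable convex function of one variable.

    Each step solves the local quadratic model exactly: x <- x - f'(x) / f''(x).
    """
    x = x0
    for _ in range(max_iter):
        step = grad(x) / hess(x)
        x -= step
        if abs(step) < tol:  # stop once updates are negligibly small
            break
    return x

# Example: minimize f(x) = exp(x) + x^2, whose minimizer satisfies exp(x) + 2x = 0
x_star = newton_method(lambda x: np.exp(x) + 2 * x,   # f'(x)
                       lambda x: np.exp(x) + 2,       # f''(x) > 0 everywhere
                       x0=0.0)
print(x_star)
```

For a multivariate objective the division by the Hessian becomes a linear solve of the Newton system H(x) d = -∇f(x), which is why Newton's method trades a higher per-iteration cost for far fewer iterations than gradient descent.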
Please refer to Problem1.ipynb, Problem2.ipynb, and Assignment_Report.pdf for the algorithm implementations and saved outputs.