This project explores finding minima of functions in one variable using gradient descent in Python. We implement gradient descent from scratch, experiment with different learning rates and starting points, visualize convergence, and extend the method using random restarts to better approximate the global minimum.
The project is split into two main tasks:
- Minimizing a cubic function: $f(x) = x^3 - 6x^2 + 9x + 6$
- Minimizing a trigonometric function: $f(x) = 2 + \cos(x) + \frac{\cos(2x - 1/2)}{2}$, using gradient descent with random restarts.
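Both objectives are simple enough to differentiate by hand. As a point of reference, here is a minimal Python sketch of the two functions and their analytic derivatives; the function names are illustrative and not necessarily those used in the notebook:

```python
import numpy as np

def f_cubic(x):
    return x**3 - 6*x**2 + 9*x + 6

def df_cubic(x):
    # f'(x) = 3x^2 - 12x + 9 = 3(x - 1)(x - 3):
    # local maximum at x = 1, local minimum at x = 3
    return 3*x**2 - 12*x + 9

def f_trig(x):
    return 2 + np.cos(x) + np.cos(2*x - 0.5) / 2

def df_trig(x):
    # chain rule: d/dx [cos(2x - 1/2) / 2] = -sin(2x - 1/2)
    return -np.sin(x) - np.sin(2*x - 0.5)
```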
Authors: Marc Esteller Sanchez & José Pablo Del Moral García
- Implementation of gradient descent in one dimension (see the sketch after this list)
- Adjustable learning rate and starting guess
- Visualization of gradient descent paths
- Experiments showing the effect of the learning rate and the initial guess on convergence
- Gradient descent with random restarts to find global minima
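As a rough illustration of the approach (not necessarily the notebook's exact code), a 1-D gradient descent loop with an adjustable learning rate and starting point, plus a random-restart wrapper, might look like this:

```python
import numpy as np

def gradient_descent(df, x0, lr=0.05, tol=1e-8, max_iter=10_000):
    """1-D gradient descent; returns the final point and the visited path."""
    x = x0
    path = [x]
    for _ in range(max_iter):
        step = lr * df(x)    # move against the derivative
        x -= step
        path.append(x)
        if abs(step) < tol:  # updates negligible -> converged
            break
    return x, path

def random_restarts(f, df, n_restarts=20, low=-10.0, high=10.0, seed=0, **kwargs):
    """Run gradient descent from several random starts; keep the lowest minimum found."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_restarts):
        x, _ = gradient_descent(df, x0=rng.uniform(low, high), **kwargs)
        if best is None or f(x) < f(best):
            best = x
    return best
```

For example, `gradient_descent(df_cubic, x0=5.0, lr=0.05)` converges to the local minimum near $x = 3$, while a learning rate that is too large (roughly above $1/3$ for this cubic near $x = 3$, since $f''(3) = 6$) can overshoot and diverge; this is the kind of behaviour the learning-rate experiments explore. The returned `path` can be plotted over the function curve to visualize the descent.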
```
Homework3_GradientDescent/
├── README.md
├── homework3.ipynb          # Full code for tasks, plots, and experiments
├── utils/
│   └── gradient_descent.py  # Optional: helper functions for gradient calculation
└── plots/
    └── *.png                # Optional: exported plots
```
- Clone the repository:

```bash
git clone https://github.com/yourusername/Homework3_GradientDescent.git
```
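- Then open the notebook (this assumes Jupyter, NumPy, and Matplotlib are installed; the exact dependency list is not stated here):

```bash
cd Homework3_GradientDescent
jupyter notebook homework3.ipynb
```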