Homework: Gradient Descent and Optimization in Python

Project Overview

This project explores finding minima of functions in one variable using gradient descent in Python. We implement gradient descent from scratch, experiment with different learning rates and starting points, visualize convergence, and extend the method using random restarts to better approximate the global minimum.

The project is split into two main tasks (a minimal sketch of the core descent loop follows this list):

  1. Minimizing a cubic function: f(x) = x^3 - 6x^2 + 9x + 6
  2. Minimizing a trigonometric function: f(x) = 2 + cos(x) + cos(2x - 1/2)/2, using gradient descent with random restarts.
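
As an illustration only (the loop, learning rate, and stopping rule in homework3.ipynb may differ), the core update x_new = x - learning_rate * f'(x) applied to the cubic from task 1 can be sketched as:

  # Minimal gradient descent sketch for task 1: f(x) = x^3 - 6x^2 + 9x + 6.
  # Learning rate, starting point, and stopping rule are illustrative choices.
  def f(x):
      return x**3 - 6 * x**2 + 9 * x + 6

  def df(x):
      # Analytic derivative: f'(x) = 3x^2 - 12x + 9 = 3(x - 1)(x - 3)
      return 3 * x**2 - 12 * x + 9

  def gradient_descent(x0, lr=0.01, tol=1e-8, max_iter=10_000):
      x = x0
      path = [x]                    # trajectory, useful for plotting convergence
      for _ in range(max_iter):
          step = lr * df(x)
          x = x - step
          path.append(x)
          if abs(step) < tol:       # stop once the update is negligible
              break
      return x, path

  x_min, path = gradient_descent(x0=4.0, lr=0.01)
  print(x_min, f(x_min))            # approaches the local minimum at x = 3

Starting anywhere to the right of x = 1 the iterates settle at the local minimum x = 3; starting to the left of x = 1 they diverge toward negative infinity, since the cubic is unbounded below. This is exactly why the choice of starting point matters in this task.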

Authors: Marc Esteller Sanchez & José Pablo Del Moral García

Features

  • Implementation of gradient descent in one dimension
  • Adjustable learning rate and starting guess
  • Visualization of gradient descent paths
  • Experiments showing the effect of learning rate and initial guess on convergence
  • Gradient descent with random restarts to find global minima (a minimal sketch follows this list)
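
The random-restart extension used in task 2 can be sketched as follows; the restart count, search interval, learning rate, and function names below are illustrative assumptions, not the repository's exact code:

  import numpy as np

  # f(x) = 2 + cos(x) + cos(2x - 1/2)/2 from task 2, and its derivative.
  def f(x):
      return 2 + np.cos(x) + np.cos(2 * x - 0.5) / 2

  def df(x):
      return -np.sin(x) - np.sin(2 * x - 0.5)

  def gradient_descent(x0, lr=0.05, tol=1e-8, max_iter=10_000):
      x = x0
      for _ in range(max_iter):
          step = lr * df(x)
          x = x - step
          if abs(step) < tol:
              break
      return x

  def random_restarts(n_restarts=20, low=-6.0, high=6.0, seed=0):
      # Run plain gradient descent from several random starting points and keep
      # the lowest local minimum found as an estimate of the global minimum.
      rng = np.random.default_rng(seed)
      best_x, best_f = None, np.inf
      for _ in range(n_restarts):
          x = gradient_descent(rng.uniform(low, high))
          if f(x) < best_f:
              best_x, best_f = x, f(x)
      return best_x, best_f

  x_star, f_star = random_restarts()
  print(x_star, f_star)

Because the trigonometric function has several local minima, a single run can get stuck in whichever basin the starting point falls into; restarting from random points and keeping the lowest value found is a simple way to approximate the global minimum.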

Repository Structure

Homework3_GradientDescent/

  • README.md
  • homework3.ipynb # Full code for tasks, plots, and experiments
  • utils/
    • gradient_descent.py # Optional: helper functions for gradient calculation (a possible helper is sketched after this list)
  • plots/
    • *.png # Optional: exported plots
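
The contents of utils/gradient_descent.py are not shown here; if the file is used, one plausible helper for gradient calculation is a central-difference numerical derivative, sketched below as a guess rather than a copy of the file:

  def numerical_derivative(f, x, h=1e-6):
      # Central-difference approximation of f'(x), handy when no analytic
      # derivative is passed to the descent routine.
      return (f(x + h) - f(x - h)) / (2 * h)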

Getting Started

  1. Clone the repository:
git clone https://github.com/josepablodmg/Gradient-Descent-and-Optimization-in-Python.git
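  2. Install the dependencies (the notebook is assumed to use NumPy and Matplotlib for the computations and plots; adjust if it imports anything else):
pip install numpy matplotlib jupyter
  3. Run the notebook:
jupyter notebook homework3.ipynb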
