# gradient-descent-implementation

Here are 13 public repositories matching this topic...

Gradient descent minimizes a function by following the gradient of the cost. This requires knowing the form of the cost as well as its derivative, so that from a given point you can compute the gradient and step in the opposite direction, i.e. downhill towards the minimum value.
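To make the description above concrete, here is a minimal Python sketch of the idea, assuming a simple quadratic cost f(x) = x² whose derivative 2x is known in closed form; the function names, step size, and iteration count are illustrative choices, not taken from any repository listed under this topic.

```python
# A minimal sketch of gradient descent, assuming the illustrative cost
# f(x) = x**2 with known derivative f'(x) = 2*x (chosen purely for
# demonstration, not taken from any repository on this page).

def gradient_descent(grad, x0, learning_rate=0.1, n_steps=50):
    """Step repeatedly against the gradient, starting from x0."""
    x = x0
    for _ in range(n_steps):
        x -= learning_rate * grad(x)  # move downhill along the slope
    return x

cost = lambda x: x ** 2   # assumed cost function
grad = lambda x: 2 * x    # its closed-form derivative

x_min = gradient_descent(grad, x0=5.0)
print(x_min, cost(x_min))  # x_min shrinks towards 0, the true minimizer
```

With a learning rate of 0.1 on this quadratic, each update multiplies the iterate by 0.8, so after 50 steps the result is very close to the minimizer at 0.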

