Newton and Quasi-Newton optimization with PyTorch
Hessian-based stochastic optimization in TensorFlow and Keras
Implementations and visualizations (with demos) of search and optimization algorithms.
A package dedicated to higher-order optimization methods; all of them can be used like standard PyTorch optimizers.
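The closure-based interface such packages typically mirror is the one PyTorch's built-in quasi-Newton optimizer, torch.optim.LBFGS, already uses. A minimal sketch of that usage pattern (the Rosenbrock objective is only an illustration):

    import torch

    # Minimize the Rosenbrock function; second-order optimizers in
    # PyTorch take a closure that re-evaluates the loss and gradients.
    x = torch.tensor([-1.0, 1.0], requires_grad=True)
    optimizer = torch.optim.LBFGS([x], lr=1.0, max_iter=20,
                                  line_search_fn="strong_wolfe")

    def rosenbrock(v):
        return (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2

    def closure():
        optimizer.zero_grad()
        loss = rosenbrock(x)
        loss.backward()
        return loss

    for _ in range(10):
        optimizer.step(closure)

    print(x)  # approaches the minimizer (1, 1)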
Newton’s second-order optimization methods in Python
A Unified PyTorch Optimizer for Numerical Optimization
Implementations of unconstrained minimization algorithms.
A compilation of assignments and their solutions from the course COL726: Numerical Algorithms (Spring 2021).
In numerical analysis, Newton's method, also known as the Newton–Raphson method, named after Isaac Newton and Joseph Raphson, is a root-finding algorithm which produces successively better approximations to the roots (or zeroes) of a real-valued function.
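In symbols, each iteration replaces the current guess x_n with x_{n+1} = x_n - f(x_n) / f'(x_n). A minimal self-contained sketch (the example function is illustrative):

    def newton(f, fprime, x0, tol=1e-10, max_iter=50):
        """Find a root of f by Newton-Raphson iteration."""
        x = x0
        for _ in range(max_iter):
            step = f(x) / fprime(x)
            x -= step
            if abs(step) < tol:
                return x
        raise RuntimeError("Newton's method did not converge")

    # Example: the positive root of x**2 - 2 is sqrt(2).
    print(newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0))  # 1.41421356...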
Project II from CISC.820.01 - Quantitative Foundations
Code for the AISTATS 2024 paper: "Krylov Cubic Regularized Newton: A Subspace Second-Order Method with Dimension-Free Convergence Rate".
A Python implementation of various optimization methods for functions, with a Streamlit interface.
A nonlinear optimization course project on iterative methods for finding an optimal point of a function in R^n.
This repository contains the files for a graphical interface to a simulator of damped projectile motion.
Global Optimization with Gradients: a Python implementation of an experimental non-local Newton method
Solutions to problems from a course on the fundamentals of computational physics
Simple code for solving algebraic equations and systems of equations
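For a system F(x) = 0, Newton's method generalizes by solving the linear system J(x) Δx = -F(x) with the Jacobian J at each step and updating x by Δx. A minimal NumPy sketch (the example system is an assumption, not code from the repository above):

    import numpy as np

    def newton_system(F, J, x0, tol=1e-10, max_iter=50):
        """Solve F(x) = 0 by Newton's method for systems."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            delta = np.linalg.solve(J(x), -F(x))
            x += delta
            if np.linalg.norm(delta) < tol:
                return x
        raise RuntimeError("Newton's method did not converge")

    # Example: intersect the unit circle with the line y = x.
    F = lambda v: np.array([v[0]**2 + v[1]**2 - 1, v[1] - v[0]])
    J = lambda v: np.array([[2 * v[0], 2 * v[1]], [-1.0, 1.0]])
    print(newton_system(F, J, [1.0, 0.5]))  # ~[0.7071, 0.7071]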
Dynamics of Newton's Method for a Nonanalytical Perturbation on a Complex Quadratic Function
This project is a graphical calculator that solves equations with Newton's method and finds function minima with gradient descent, built with PySide6, SymPy, and Matplotlib.
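A sketch of how SymPy can drive the gradient-descent half of such a tool, with the derivative computed symbolically (hypothetical usage; the app's actual internals may differ):

    import sympy as sp

    x = sp.symbols("x")
    expr = (x - 3) ** 2 + 1                  # function to minimize
    grad = sp.lambdify(x, sp.diff(expr, x))  # symbolic derivative -> callable

    # Plain gradient descent on the scalar function.
    point, lr = 0.0, 0.1
    for _ in range(100):
        point -= lr * grad(point)
    print(point)  # ~3.0, the minimizer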
An application for visualizing and comparing multidimensional optimization methods that use the derivative of the objective function: the Gradient Descent, Newton, and Fletcher-Reeves methods.