Check the Optimization in Deep Learning and Engineering material.
Optymus is part of the quantsci project.
This library provides a comprehensive collection of optimization methods, both with and without constraints. The main goal is to provide a simple structure that supports research and development in optimization problems.
| Method | Description |
|---|---|
| `bfgs` | Broyden-Fletcher-Goldfarb-Shanno (BFGS) |
| `steepdesc` | Steepest Descent |
| `newton_raphson` | Newton-Raphson Method |
| `powell` | Powell's Method |
| `fletcher_reeves` | Fletcher-Reeves |
To begin using optymus, follow these steps:
- Install optymus:

  ```bash
  pip install optymus
  ```

- Explore the documentation: visit the official documentation to understand the available optimization methods and how to use them effectively.

- Get started (a sketch showing how to switch between methods follows this list):

  ```python
  import jax.numpy as jnp
  from optymus.optim import Optimizer
  from optymus.utils import sphere_function

  # Built-in test objective and a starting point
  f = sphere_function()
  initial_point = jnp.array([2., 2.])

  # Minimize with BFGS, then print a summary and plot the results
  opt = Optimizer(f_obj=f, x0=initial_point, method='bfgs')
  opt.report()
  opt.plot()
  ```
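To try a different solver, only the `method` string changes. A minimal sketch, assuming each identifier in the methods table above is accepted as the `method` argument (only `bfgs` appears in the quick start, so verify the other identifiers against the documentation):

```python
import jax.numpy as jnp
from optymus.optim import Optimizer
from optymus.utils import sphere_function

# Same test problem as in the quick start
f = sphere_function()
x0 = jnp.array([2., 2.])

# Assumption: each identifier from the methods table is a valid `method` string.
for method in ['bfgs', 'newton_raphson', 'powell']:
    opt = Optimizer(f_obj=f, x0=x0, method=method)
    opt.report()
```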
Refer to the documentation for detailed information on each method and its application.
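You will often want to minimize your own objective rather than a built-in test function. The sketch below assumes that `Optimizer` accepts any Python callable mapping a `jax.numpy` array to a scalar; the `quadratic` function here is purely illustrative, so check the documentation for the exact interface:

```python
import jax.numpy as jnp
from optymus.optim import Optimizer

# Hypothetical custom objective: a shifted quadratic with its minimum at (1, -2).
# Assumption: Optimizer accepts any callable x -> scalar built from jax.numpy operations.
def quadratic(x):
    return jnp.sum((x - jnp.array([1.0, -2.0])) ** 2)

x0 = jnp.array([0.0, 0.0])
opt = Optimizer(f_obj=quadratic, x0=x0, method='bfgs')
opt.report()
```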
We are working on a simple way to add your own optimization methods.
Contributions to Optymus are highly appreciated. If you have additional optimization methods, improvements, or bug fixes, please submit a pull request following the contribution guidelines.
If you use Optymus in your research, please consider citing the library using the following BibTeX entry:
```bibtex
@misc{optymus2024,
  author = {Costa, Kleyton and Menezes, Ivan},
  title  = {Optymus: Optimization Methods Library for Python},
  year   = {2024},
  note   = {GitHub Repository},
  url    = {https://github.com/quantsci/optymus}
}
```