Accelerated Optimization via Geometric Numerical Integration

Authors: Valentin Duruisseaux and Melvin Leok


This repository provides the code for our paper

Practical Perspectives on Symplectic Accelerated Optimization.
Valentin Duruisseaux and Melvin Leok.
Optimization Methods and Software, Vol. 38, Issue 6, pages 1230-1268, 2023.

It contains simple implementations of the optimization algorithms in MATLAB, Julia, and Python, as well as more sophisticated Python implementations that allow the optimizers to be called conveniently within the TensorFlow and PyTorch frameworks.




Table of Contents

	Simple MATLAB Codes
	Simple Julia Codes
	Simple Python Codes
	PyTorch Codes
	TensorFlow Codes
	Additional Information

Simple MATLAB Codes

See the directory ./Simple_MATLAB/

Simple MATLAB implementations of the ExpoSLC-RTL and PolySLC-RTL algorithms from Practical Perspectives on Symplectic Accelerated Optimization are given in

	ExpoSLC_RTL.m     and     PolySLC_RTL.m

We also provide two simple MATLAB scripts to show how these optimizers can be used:

	Expo_Script.m     and     Poly_Script.m





Simple Julia Codes

See the directory ./Simple_Julia/

Simple Julia implementations of the ExpoSLC-RTL and PolySLC-RTL algorithms from Practical Perspectives on Symplectic Accelerated Optimization are given in

	ExpoSLC_RTL.jl     and     PolySLC_RTL.jl

We also provide two simple Julia scripts to show how these optimizers can be used:

	Expo_Script.jl     and     Poly_Script.jl





Simple Python Codes

See the directory ./Simple_Python/

Simple Python implementations of the ExpoSLC-RTL and PolySLC-RTL algorithms from Practical Perspectives on Symplectic Accelerated Optimization are collected in

	SLC_Optimizers.py

We also provide two simple Python scripts showing how these optimizers can be used once SLC_Optimizers.py is imported:

	ExpoScript.py     and     PolyScript.py

Usage:

	python ./Simple_Python/ExpoScript.py
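
As a rough sketch of the intended workflow (the function signature, arguments, and objective below are hypothetical, chosen only for illustration; the actual calling convention is the one defined in SLC_Optimizers.py and demonstrated in ExpoScript.py and PolyScript.py):

	import numpy as np

	from SLC_Optimizers import ExpoSLC_RTL   # hypothetical import; see ExpoScript.py for the real usage

	# Illustrative objective: a simple convex quadratic with its minimizer at the origin.
	def f(x):
	    return 0.5 * np.dot(x, x)

	def grad_f(x):
	    return x

	x0 = np.ones(10)   # initial guess

	# Hypothetical call; the actual arguments (step size, number of iterations,
	# algorithm parameters) are those defined in SLC_Optimizers.py.
	x_opt = ExpoSLC_RTL(grad_f, x0)
	print(f(x_opt))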





PyTorch Codes

See the directory ./PyTorch_Codes/

BrAVO Algorithms

We have implemented the BrAVO algorithms in BrAVO_torch.py within the PyTorch framework.

They can be called in much the same way as the Adam optimizer. For instance,

	optimizer = torch.optim.Adam(model.parameters(), lr = 0.01)

can be replaced by

	optimizer = BrAVO_torch.eBravo(model.parameters(), lr = 1)
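
As a minimal sketch of how this swap fits into a standard PyTorch training loop (the model, data, and hyperparameters below are illustrative assumptions, not taken from the repository's scripts, and eBravo is assumed to follow the usual torch.optim.Optimizer interface, as the replacement above suggests):

	import torch
	import torch.nn as nn

	import BrAVO_torch   # provided in ./PyTorch_Codes/

	# Illustrative model, data, and loss (assumptions for this sketch).
	model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
	inputs = torch.randn(64, 10)
	targets = torch.randn(64, 1)
	loss_fn = nn.MSELoss()

	# Swap the usual Adam optimizer for the BrAVO optimizer.
	# optimizer = torch.optim.Adam(model.parameters(), lr = 0.01)
	optimizer = BrAVO_torch.eBravo(model.parameters(), lr = 1)

	for epoch in range(100):
	    optimizer.zero_grad()                     # reset accumulated gradients
	    loss = loss_fn(model(inputs), targets)    # evaluate the loss to minimize
	    loss.backward()                           # backpropagate
	    optimizer.step()                          # apply the optimizer update

The rest of the training loop is unchanged; only the optimizer line differs.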

Applications

We have tested the BrAVO algorithms on a collection of optimization problems from machine learning, covering a variety of model architectures, loss functions, and applications.

Fashion-MNIST Image Classification

We consider the popular multi-class image classification problem based on the Fashion-MNIST dataset.

"Fashion-MNIST is a dataset of Zalando's article images consisting of a training set of 60,000 examples and a test set of 10,000 examples. Each example is a 28x28 grayscale image, associated with a label from 10 classes (t-shirt/top, trouser, pullover, dress, coat, sandal, shirt, sneaker, bag, ankle boot)."

We learn the 55,050 parameters of a Neural Network classification model.

Usage:

	python ./PyTorch_Codes/FashionMNIST.py

CIFAR-10 Image Classification

We consider the popular multi-class image classification problem based on the CIFAR-10 dataset.

"The CIFAR-10 dataset consists of 60000 32x32 colour images in 10 mutually exclusive classes (airplane, automobile, bird, cat, deer, dog, frog, horse, ship, truck), with 6000 images per class."

We learn the 62,006 parameters of a Convolutional Neural Network classification model which is very similar to the LeNet-5 architecture.

Usage:

	python ./PyTorch_Codes/CIFAR10.py

Dynamics Learning and Control

We have also tested our algorithms on dynamics learning and control. We consider a Hamiltonian-based neural ODE network (with 231,310 parameters) for dynamics learning and control on the SO(3) manifold, applied to a fully-actuated pendulum.

More details about this application and the code used can be found at

Hamiltonian-based Neural ODE Networks on the SE(3) Manifold For Dynamics Learning and Control. Thai Duong and Nikolay Atanasov. Proceedings of Robotics: Science and Systems, July 2021.




TensorFlow Codes

See the directory ./TensorFlow_Codes/

BrAVO Algorithms

We have implemented the BrAVO algorithms in BrAVO_tf.py within the TensorFlow framework.

They can be called in much the same way as the Adam optimizer. For instance,

	optimizer = tf.keras.optimizers.Adam(learning_rate = 0.001)

can be replaced by

	optimizer = BrAVO_tf.eBravo(learning_rate = 1)
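
As a minimal sketch of the corresponding swap in a Keras workflow (the model and hyperparameters are illustrative assumptions, and eBravo is assumed to implement the standard Keras optimizer interface, as the replacement above suggests):

	import tensorflow as tf

	import BrAVO_tf   # provided in ./TensorFlow_Codes/

	# Illustrative Keras model (an assumption for this sketch).
	model = tf.keras.Sequential([
	    tf.keras.layers.Dense(32, activation = "relu", input_shape = (10,)),
	    tf.keras.layers.Dense(1),
	])

	# Swap the usual Adam optimizer for the BrAVO optimizer.
	# optimizer = tf.keras.optimizers.Adam(learning_rate = 0.001)
	optimizer = BrAVO_tf.eBravo(learning_rate = 1)

	model.compile(optimizer = optimizer, loss = "mse")
	# model.fit(x_train, y_train, epochs = 10)   # then train as usual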

Applications

We have tested the BrAVO algorithms on a collection of optimization problems from machine learning, covering a variety of model architectures, loss functions, and applications.

Binary Classification

Given a set of feature vectors with associated binary labels, we want to find a parameter vector such that the resulting model is a good predictor of the labels.

Usage:

	python ./TensorFlow_Codes/BinaryClassification.py
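
To make the objective concrete, here is a hedged sketch of a logistic-regression-style formulation in TensorFlow (the data, model, and optimizer settings are assumptions for illustration; the actual setup is the one in BinaryClassification.py):

	import tensorflow as tf

	# Illustrative data: feature vectors with binary labels (assumed, not the script's data).
	X = tf.random.normal((200, 5))
	y = tf.cast(tf.random.uniform((200, 1)) > 0.5, tf.float32)

	# Parameter vector and bias defining the model sigmoid(X w + b).
	w = tf.Variable(tf.zeros((5, 1)))
	b = tf.Variable(0.0)

	def loss():
	    logits = tf.matmul(X, w) + b
	    # Cross-entropy between the model's predictions and the labels.
	    return tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(labels = y, logits = logits))

	optimizer = tf.keras.optimizers.Adam(learning_rate = 0.1)
	for _ in range(500):
	    with tf.GradientTape() as tape:
	        current_loss = loss()
	    grads = tape.gradient(current_loss, [w, b])
	    optimizer.apply_gradients(zip(grads, [w, b]))

The BrAVO optimizer from BrAVO_tf.py can be substituted for Adam here, exactly as in the replacement shown above.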

Fermat-Weber Location Problem

Given a set of points and associated positive weights, we want to find the location that minimizes the weighted sum of distances to the points.

Usage:

	python ./TensorFlow_Codes/LocationProblem.py
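
As a hedged sketch of this objective in TensorFlow (the points, weights, and optimizer settings are illustrative assumptions; the actual setup is the one in LocationProblem.py):

	import tensorflow as tf

	# Illustrative data: anchor points with positive weights (assumed, not the script's data).
	points = tf.random.normal((20, 2))
	weights = tf.random.uniform((20,), minval = 0.5, maxval = 2.0)

	x = tf.Variable(tf.zeros(2))   # candidate location to optimize

	def loss():
	    # Weighted sum of Euclidean distances from the candidate location to the points.
	    return tf.reduce_sum(weights * tf.norm(points - x, axis = 1))

	optimizer = tf.keras.optimizers.Adam(learning_rate = 0.05)
	for _ in range(1000):
	    with tf.GradientTape() as tape:
	        current_loss = loss()
	    grads = tape.gradient(current_loss, [x])
	    optimizer.apply_gradients(zip(grads, [x]))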

Data Fitting

Given data points sampled from a noisy version of a chosen function, we learn the 4,355 parameters of a neural network to obtain a model that fits the data points as well as possible.

Usage:

	python ./TensorFlow_Codes/DataFitting.py

Natural Language Processing: arXiv Classification

We consider the Natural Language Processing problem of constructing a multi-label text classifier which can provide suggestions for the most appropriate subject areas for arXiv papers based on their abstracts.

Usage:

	python ./TensorFlow_Codes/NLP_arXivClassification.py

Timeseries Forecasting for Weather Prediction

We consider time series forecasting for weather prediction, using a Long Short-Term Memory (LSTM) model with 5,153 parameters.

Usage:

	python ./TensorFlow_Codes/WeatherForecasting.py





Additional Information

If you use this code in your research, please consider citing:

@Article{Duruisseaux2023Practical,
	author    = {V. Duruisseaux and M. Leok},
	title     = {Practical Perspectives on Symplectic Accelerated Optimization},
	journal   = {Optimization Methods and Software},
	year      = {2023},
	volume    = {38},
	number    = {6},
	pages     = {1230--1268},
	publisher = {Taylor \& Francis},
	url       = {https://doi.org/10.1080/10556788.2023.2214837},
}

The software is available under the MIT License.

If you have any questions, feel free to contact Valentin Duruisseaux or Melvin Leok.
