DiffTune: Auto-Tuning through Auto-Differentiation

This is the DiffTune toolset for controller auto-tuning via sensitivity propagation. It is intended to facilitate DiffTune applications in two ways. First, it automatically generates the partial derivatives required for sensitivity propagation, so a user only needs to specify the dynamics and the controller, with no need to derive partial derivatives by hand. Second, we provide examples that demonstrate DiffTune in action, along with a template for quick deployment to other applications.
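At its core, sensitivity propagation runs the chain rule forward along the closed-loop trajectory: for dynamics x_{k+1} = f(x_k, u_k) and controller u_k = h(x_k, θ), the state sensitivity ∂x_k/∂θ is propagated alongside the simulation and accumulated into the gradient of the tracking loss. The toolset does this in MATLAB with CasADi-generated Jacobians; the following is only a minimal NumPy sketch on a hypothetical scalar system (the system, gain, and loss are illustrative, not taken from the toolset):

```python
import numpy as np

# Hypothetical scalar system x_{k+1} = x_k + dt*u_k with P-controller
# u_k = -kp*(x_k - x_ref); tune the gain kp to minimize the tracking
# loss L = sum_k (x_k - x_ref)^2 via sensitivity propagation.
dt, x_ref, N = 0.1, 1.0, 50

def loss_and_gradient(kp):
    x = 0.0           # state
    s = 0.0           # sensitivity dx/dkp
    L, dL = 0.0, 0.0
    for _ in range(N):
        e = x - x_ref
        L += e**2
        dL += 2.0 * e * s              # accumulate dL/dkp
        u = -kp * e
        du_dx, du_dkp = -kp, -e        # partials of the controller
        # dynamics f(x, u) = x + dt*u:  df/dx = 1, df/du = dt
        x = x + dt * u
        # chain rule: s+ = df/dx*s + df/du*(du/dx*s + du/dkp)
        s = s + dt * (du_dx * s + du_dkp)
    return L, dL

# plain gradient descent on the gain
kp = 0.5
L0, _ = loss_and_gradient(kp)
for _ in range(100):
    L, g = loss_and_gradient(kp)
    kp -= 0.05 * g
```

In the toolset, the hand-written partials above (df/dx, df/du, du/dx, du/dθ) are exactly what CasADi generates and compiles automatically.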

Details of DiffTune can be found in:
DiffTune: Auto-Tuning through Auto-Differentiation
DiffTune+: Hyperparameter-Free Auto-Tuning using Auto-Differentiation
DiffTune-MPC: Closed-Loop Learning for Model Predictive Control

If you find this toolset helpful for your research or application, please cite:

@article{cheng2022difftune,
  title={DiffTune: Auto-Tuning through Auto-Differentiation},
  author={Cheng, Sheng and Kim, Minkyung and Song, Lin and Yang, Chengyu and Jin, Yiquan and Wang, Shenlong and Hovakimyan, Naira},
  journal={IEEE Transactions on Robotics},
  year={2024}
}
@inproceedings{cheng2023difftunePlus,
  title={DiffTune+: Hyperparameter-Free Auto-Tuning using Auto-Differentiation},
  author={Cheng, Sheng and Song, Lin and Kim, Minkyung and Wang, Shenlong and Hovakimyan, Naira},
  booktitle={Learning for Dynamics and Control Conference},
  pages={170--183},
  year={2023},
  organization={PMLR}
}
@article{tao2024difftunempc,
  title={{DiffTune-MPC}: Closed-loop learning for model predictive control},
  author={Tao, Ran and Cheng, Sheng and Wang, Xiaofeng and Wang, Shenlong and Hovakimyan, Naira},
  journal={IEEE Robotics and Automation Letters},
  year={2024},
  publisher={IEEE}
}

Prerequisites

You need to install CasADi on your computer. Make sure you add CasADi's directory to your MATLAB path by running addpath('<yourpath>/<casadi-folder-name>'); savepath;. We will use the C code autogenerated by CasADi and compile it into MEX in MATLAB, so make sure to configure MATLAB's C compiler with mex -setup c.

Run examples

We offer two examples: a quadrotor model and a Dubins car model. Simply navigate to one of the folders under /examples. Taking the quadrotor case as an illustration, first run QuadrotorAutoGeneration.m. This script automatically generates the C code for evaluating the Jacobians used in sensitivity propagation and compiles it into MEX files, which appear under the subfolder /mex once the script finishes. Now you can run runDiffTune.m under /examples/quadrotor. You should see the loss printed at each iteration while a figure updates the controller's tracking performance and RMSE. You can turn on the video-generation option to record the tracking performance and RMSE reduction at run time (as illustrated below). We use the geometric controller by Taeyoung Lee et al. and have modified its source code for our use case.

(Video: quadrotor.mp4)

Use the template

A template for applying DiffTune to custom systems and controllers is provided in /template. We recommend first filling in dynamics.m and controllers.m and running runDiffTune.m (with the DiffTune-related components commented out) to make sure the basic simulation runs as expected. The next step is to fill in templateAutoGeneration.m with the discretized dynamics and run it to generate and compile the functions for online Jacobian evaluation. Once done, you should see *.mex and *.c files under the directory /template/mex. Now you can uncomment the DiffTune-related sections in runDiffTune.m and run this script for your own DiffTune application.

Implicitly differentiable model predictive controllers

If your application involves a model predictive controller, the control input is defined implicitly as the solution of an optimization problem, so differentiating it requires special treatment. Our implementation is available in DiffTune-MPC. Details can be found in our paper DiffTune-MPC: Closed-Loop Learning for Model Predictive Control.
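The underlying idea is implicit differentiation: instead of unrolling the solver, one differentiates the optimizer's KKT optimality conditions with respect to the parameters. The sketch below (NumPy, on a hypothetical equality-constrained QP whose linear cost depends on a parameter θ; it is not the DiffTune-MPC code itself) shows this on the smallest possible case:

```python
import numpy as np

# Hypothetical QP: min_z 0.5*z'Hz + (theta*c)'z  s.t.  A z = b.
# The solver's output z*(theta) is differentiated implicitly through
# the KKT system rather than by unrolling the solver iterations.
H = np.diag([2.0, 1.0, 3.0])
c = np.array([1.0, -2.0, 0.5])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])

def solve_qp(theta):
    """Solve the equality-constrained QP via its KKT linear system."""
    n, m = H.shape[0], A.shape[0]
    K = np.block([[H, A.T], [A, np.zeros((m, m))]])
    sol = np.linalg.solve(K, np.concatenate([-theta * c, b]))
    return sol[:n], K

z_star, K = solve_qp(0.7)
# Differentiate K [z; lam] = [-theta*c; b] w.r.t. theta:
#   K d[z; lam]/dtheta = [-c; 0]  =>  re-solve with the same KKT matrix.
dz_dtheta = np.linalg.solve(K, np.concatenate([-c, [0.0]]))[:3]
```

For a general MPC (inequality constraints, nonlinear dynamics) the KKT system is larger and active-set handling is needed, which is what the DiffTune-MPC repository implements.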

Issues/Questions/Suggestions

Feel free to open up an issue if you run into trouble.

Authors

Sheng Cheng, Lin Song, Minkyung Kim

License

This project is licensed under the GPL-3.0 License; see the LICENSE file for details.
