Generalizing Gaussian Smoothing for Random Search

This repository contains code implementing the algorithms proposed in the paper "Generalizing Gaussian Smoothing for Random Search" (Gao and Sener, ICML 2022).

In particular, we provide the code used to obtain the experimental results on linear regression and the Nevergrad benchmark. For online RL, we used the ARS repository; our proposed algorithms may be implemented by modifying the sampling distribution of the shared noise table. Please see the paper for additional details and the hyperparameters used.
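The core idea is that the perturbation directions used by random search need not be Gaussian. As a minimal sketch (not code from this repository, and not necessarily the distributions analyzed in the paper), the snippet below shows an antithetic random-search gradient estimator where the smoothing distribution is a user-supplied sampler; the names `rs_gradient_estimate` and the example samplers are illustrative only.

```python
# Minimal sketch: antithetic random-search gradient estimate with a
# configurable perturbation distribution (assumption: not the repo's code).
import numpy as np

def rs_gradient_estimate(f, x, sampler, sigma=0.1, num_directions=8, rng=None):
    """Estimate grad f(x) from antithetic random-search samples.

    `sampler(rng, d)` draws a d-dimensional perturbation direction; swapping
    it out (Gaussian vs. another zero-mean, unit-variance distribution) is
    where the smoothing distribution is generalized.
    """
    rng = np.random.default_rng(rng)
    d = x.shape[0]
    grad = np.zeros(d)
    for _ in range(num_directions):
        u = sampler(rng, d)
        grad += (f(x + sigma * u) - f(x - sigma * u)) * u
    return grad / (2.0 * sigma * num_directions)

# Example samplers: a standard Gaussian and, purely as an illustration of a
# non-Gaussian alternative, a unit-variance uniform distribution.
gaussian = lambda rng, d: rng.standard_normal(d)
uniform = lambda rng, d: rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=d)

if __name__ == "__main__":
    f = lambda z: np.sum(z ** 2)   # simple quadratic test objective
    x = np.ones(5)
    print(rs_gradient_estimate(f, x, gaussian, rng=0))
    print(rs_gradient_estimate(f, x, uniform, rng=0))
```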

Requirements

The code is written in Python 3. Beyond the standard library, NumPy and Matplotlib are required. The linear regression experiments additionally require SciPy, and the Nevergrad experiments require the nevergrad package.

Running the experiments

Please see the READMEs in the LinearRegression and benchmarks folders for further instructions.

Citation

To cite this repository in your research, please reference the following paper:

Gao, Katelyn, and Ozan Sener. "Generalizing Gaussian Smoothing for Random Search." International Conference on Machine Learning. PMLR, 2022.

@inproceedings{gao2022generalizing,
  title={Generalizing Gaussian Smoothing for Random Search},
  author={Gao, Katelyn and Sener, Ozan},
  booktitle={International Conference on Machine Learning},
  pages={7077--7101},
  year={2022},
  organization={PMLR}
}

Contact

If you have questions, please contact katelyn.gao@intel.com.
