
Discovering modular solutions that generalize compositionally

Official code to reproduce the experiments in "Discovering modular solutions that generalize compositionally". The code is based on metax, a meta-learning research library written in JAX.

Installation

Install jax according to the instructions for your platform, after which you can install the remaining dependencies with:

pip install -r requirements.txt
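
A minimal example setup, assuming a CUDA 12 machine (the JAX extra below is illustrative; follow the official JAX installation guide for your platform):

```bash
# Install JAX with CUDA 12 support (for CPU-only: pip install -U jax)
pip install -U "jax[cuda12]"

# Install the remaining dependencies of this repository
pip install -r requirements.txt
```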

Structure

All experiments have a corresponding sweep file in sweeps/ and can be run using

`wandb sweep sweeps/[folder]/[name].yaml`

where [folder] and [name] need to be replaced accordingly.
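
For example, assuming the sweep files are organized by experiment (the folder and file names below are illustrative; check sweeps/ for the actual ones), launching and running a sweep could look like:

```bash
# Register the sweep with W&B (prints a sweep ID)
wandb sweep sweeps/hyperteacher/hnet_linear.yaml

# Start an agent that executes runs for that sweep
wandb agent <entity>/<project>/<sweep_id>
```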

Hyperparameters for all methods and experiments can be found in configs/. If you'd like to run a specific experiment directly for a single seed, you can use the following command (a filled-in example appears after the lists below):

python run_fewshot.py --config 'configs/[experiment].py:[method]'

where [experiment] can be one of

  • compositional_grid
  • hyperteacher
  • preference_grid

and [method] can be one of

  • hnet_linear
  • hnet_deepmlp
  • anil512
  • learned_init384
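
For example, combining entries from the two lists above, a single-seed run of the hyperteacher experiment with the linear hypernetwork would be:

```bash
# Single-seed run: hyperteacher experiment, hnet_linear method
python run_fewshot.py --config 'configs/hyperteacher.py:hnet_linear'
```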

For the empirical validation of the theory, see run_theory.py.

Citation

If you use this code in your research, please cite the paper:

@article{2023discovering,
  title={Discovering modular solutions that generalize compositionally},
  author={Simon Schug and Seijin Kobayashi and Yassir Akram and Maciej Wołczyk and Alexandra Proca and Johannes von Oswald and Razvan Pascanu and João Sacramento and Angelika Steger},
  year={2023},
  url={https://arxiv.org/abs/2312.15001},
}

Acknowledgements

Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC).
