Learning Where to Learn


This repository contains code to reproduce experiments from the paper. It includes:

  • Bilevel kernel-based function approximation experiments
  • Neumann-to-Dirichlet (NtD) and Darcy flow examples

The two folders are self-contained, but they share the goal of finding optimal training distributions for out-of-distribution (OOD) generalization.


Repository Structure

./
├── function_approximation/         # Kernel-based function approximation experiments
│   ├── driver.py
│   ├── plot.py
│   ├── Project.yml
│   └── ...
│
├── operator_learning/              # NtD and Darcy flow examples
│   ├── NtDExample.py
│   ├── DarcyFlowGPU.py
│   ├── PlotResults.ipynb
│   ├── requirements.txt
│   └── ...

Installation

For Bilevel Experiments

conda env create -f function_approximation/Project.yml
conda activate bilevel

For NtD/Darcy Flow Experiments

pip install -r operator_learning/requirements.txt

Note: Both environments will use a GPU if one is available.
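A quick sanity check before launching the GPU-heavy scripts is to confirm which device is visible. This is a minimal sketch assuming the operator-learning environment includes PyTorch (DarcyFlowGPU.py suggests GPU acceleration, but the exact framework is an assumption here):

import torch  # assumption: PyTorch is among the installed dependencies

# Pick a CUDA device if one is visible; otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"Running on: {device}")
if device.type == "cuda":
    print(f"GPU: {torch.cuda.get_device_name(device)}")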


Usage

Bilevel Function Approximation

  1. Run experiments:

    cd function_approximation
    python -u driver.py

    Outputs: errors.npy and other intermediate files (see the inspection sketch after this list).

  2. Plot results:

    python plot.py
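
Before plotting, you can sanity-check a finished run by loading the saved errors directly. This is a minimal sketch assuming errors.npy holds a numeric array; its exact shape and meaning are not documented in this README:

import numpy as np

# Load the error array written by driver.py (run from function_approximation/).
# The array's layout is an assumption.
errors = np.load("errors.npy")
print("shape:", errors.shape)
print("mean:", errors.mean(), "min:", errors.min(), "max:", errors.max())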

NtD and Darcy Flow Experiments

  1. Run NtD example:

    cd operator_learning
    python NtDExample.py

  2. Run Darcy flow example:

    python DarcyFlowGPU.py

  3. Plot results:

    jupyter notebook PlotResults.ipynb
    • Make sure NtD_results.pkl and DarcyFlow_results.pkl are in the same directory as the notebook.
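
To inspect the results outside the notebook, the pickle files can be loaded directly. A minimal sketch, assuming they are standard pickles (their internal structure is not documented here):

import pickle

# Load the serialized NtD results; the object layout inside is an assumption.
with open("NtD_results.pkl", "rb") as f:
    ntd_results = pickle.load(f)
print(type(ntd_results))  # e.g., a dict of arrays, depending on the script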

Notes & Recommendations

  • Hyperparameters (e.g., sample sizes, training iterations) can be modified within each script.
  • Ensure all required dependencies are installed before executing code.
  • For plotting, a LaTeX distribution may be required (see the sketch after this list).
  • The GitHub repository for the AMINO architecture used in Figure 1 of our paper can be found here.
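
If LaTeX-rendered labels fail on your machine, matplotlib's LaTeX text rendering can be toggled explicitly. A minimal sketch, assuming the plotting code uses matplotlib (suggested by plot.py and PlotResults.ipynb, though that is an assumption):

import matplotlib
import matplotlib.pyplot as plt

# Route all text through a system LaTeX installation (requires latex on PATH).
matplotlib.rcParams["text.usetex"] = True

fig, ax = plt.subplots()
ax.plot([1, 2, 3], [1, 4, 9])
ax.set_xlabel(r"$n$")
ax.set_ylabel(r"relative error")  # hypothetical labels, for illustration only
fig.savefig("latex_check.pdf")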
