google/autol2

This repository has been archived by the owner on Dec 29, 2022. It is now read-only.
This code trains a Wide ResNet on different datasets and includes the AutoL2 algorithm described in the paper. Implemented by Aitor Lewkowycz, based on code by Sam Schoenholz. Requirements can be installed from requirements.txt. The code is built to run on TPUs; it can also run on GPU by adding the -noTPU flag and installing the GPU jaxlib package from https://github.com/google/jax.

Commands to generate data used for figures

Figure 1a.

```shell
for L2 in $L2LIST; do
  python3 jax_wideresnet_exps.py -L2=$L2 -epochs=200 -std_wrn_sch
  # Evolved for a time 0.02/(eta*lambda) = 0.1/lambda epochs.
  python3 jax_wideresnet_exps.py -L2=$L2 -physicalL2 -epochs=0.02 -std_wrn_sch
done
```
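The comment above says the -physicalL2 run evolves for 0.02/(eta*lambda) = 0.1/lambda epochs, which implicitly assumes eta = 0.2 (the learning rate quoted for Figure 1c). A quick sanity check of that arithmetic, under that assumption:

```python
import math

# Assumed convention: physical time = epochs / (eta * lambda), with eta = 0.2.
eta = 0.2
for lam in [1e-4, 1e-3, 1e-2]:
    t_phys = 0.02 / (eta * lam)
    # The two expressions in the comment agree up to floating point:
    assert math.isclose(t_phys, 0.1 / lam)
    print(f"lambda={lam}: {t_phys} epochs")
```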

Figure 1b is generated by comparing the performance of models with our prediction.

```shell
for L2 in $L2LIST; do
  python3 jax_wideresnet_exps.py -L2=$L2 -epochs=2000
done
```

To obtain the t* prediction, we run the following.

```shell
python3 jax_wideresnet_exps.py -L2=0.01 -epochs=2
```

Figure 1c: evolve with lr=0.2 for 200 epochs, comparing the L2 schedule starting at L0=0.1 (-L2_sch) against a fixed L2=0.0001 (-noL2_sch).

```shell
python3 jax_wideresnet_exps.py -L2=0.1 -L2_sch
python3 jax_wideresnet_exps.py -L2=0.0001 -noL2_sch
```

The Wide ResNet experiments in Figure 2 are similar.

```shell
for lr in $LRLIST; do
  for L2 in $L2LIST; do
    python3 jax_wideresnet_exps.py -L2=$L2 -physicalL2 -epochs=0.1 -nomomentum -noaugment
  done
done
```
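The loop above sweeps a grid over learning rates and L2 coefficients (note that the command as shown only varies L2; the learning-rate flag is not listed here). A sketch of enumerating that grid, with placeholder lists that are illustrative only, not the values used in the paper:

```python
import itertools

# Placeholder sweep values, NOT the paper's actual grids.
lrs = [0.05, 0.1, 0.2]
l2s = [1e-4, 1e-3, 1e-2]

cmds = [
    f"python3 jax_wideresnet_exps.py -L2={l2} -physicalL2 "
    "-epochs=0.1 -nomomentum -noaugment"
    for lr, l2 in itertools.product(lrs, l2s)
]
print(len(cmds))  # one command per (lr, L2) pair
```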
