Stochastic Collapse: How Gradient Noise Attracts SGD Dynamics Towards Simpler Subnetworks
Feng Chen*, Daniel Kunin*, Atsushi Yamamura*, Surya Ganguli (*equal contribution, ordered alphabetically)


In this work, we reveal a strong implicit bias of stochastic gradient descent (SGD) that drives overly expressive networks to much simpler subnetworks, thereby dramatically reducing the number of independent parameters and improving generalization. To reveal this bias, we identify invariant sets, or subsets of parameter space that remain unmodified by SGD. We focus on two classes of invariant sets that correspond to simpler (sparse or low-rank) subnetworks and commonly appear in modern architectures. Our analysis uncovers that SGD exhibits a property of stochastic attractivity towards these simpler invariant sets. We establish a sufficient condition for stochastic attractivity based on a competition between the loss landscape's curvature around the invariant set and the noise introduced by stochastic gradients. Remarkably, we find that an increased level of noise strengthens attractivity, leading to the emergence of attractive invariant sets associated with saddle points or local maxima of the train loss. We empirically observe the existence of attractive invariant sets in trained deep neural networks, implying that SGD dynamics often collapse to simple subnetworks with either vanishing or redundant neurons. We further demonstrate how this simplifying process of stochastic collapse benefits generalization in a linear teacher-student framework. Finally, through this analysis, we mechanistically explain why early training with large learning rates for extended periods benefits subsequent generalization.
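To make the mechanism concrete, here is a minimal one-dimensional sketch (our illustration, not the paper's code; the learning rate, curvature statistics, and step count are assumptions chosen for visibility). Each per-sample loss is l_i(u) = (1/2) s_i u^2, so u = 0 is an invariant set; the mean curvature is negative, making u = 0 a local maximum of the average loss, yet large curvature noise renders it stochastically attractive under SGD, while full-batch gradient descent escapes it:

    import numpy as np

    rng = np.random.default_rng(0)
    eta = 0.1          # learning rate (illustrative)
    mean_curv = -0.1   # E[s_i] < 0: u = 0 is a local max of the mean loss
    noise_std = 5.0    # std of the random per-sample curvature s_i
    u_sgd = u_gd = 1.0

    for _ in range(500):
        s = mean_curv + noise_std * rng.standard_normal()
        u_sgd *= 1.0 - eta * s          # SGD step: u <- u - eta * s * u
        u_gd *= 1.0 - eta * mean_curv   # full-batch GD on the mean loss

    print(f"SGD: |u| = {abs(u_sgd):.3e}")  # collapses toward the invariant set u = 0
    print(f"GD:  |u| = {abs(u_gd):.3e}")   # grows away from the local maximum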

Dependencies and Installation of Python Packages

  1. Before you start: make sure your CUDA version is 11.3+.
  2. Clone the GitHub Repository:
    git clone git@github.com:ccffccffcc/stochastic_collapse.git
    
  3. Create Conda Environment: It is recommended to create a new virtual environment. You can replace stochastic_collapse with your preferred environment name.
    conda create -y --name stochastic_collapse python=3.10
    conda activate stochastic_collapse
    
  4. Install Dependencies: Please make sure you install the correct versions of PyTorch and MosaicML.
    conda install -c conda-forge cudatoolkit
    pip3 install torch==1.11.0 torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu113
    pip install --upgrade mosaicml==0.7.1
    pip install black GPUtil isort ipython jupyter matplotlib pandas python-dotenv seaborn scipy wandb ffcv numba opencv-python cupy-cuda113
    pip install -e .
    
  5. Test installation: Set the environment variable DATA_DIR to the desired dataset directory (see the example after this list). Running the command below will either download the CIFAR dataset or verify an existing copy.
    python run/make_cifar.py
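For example, on Linux or macOS you could set the variable like this (the path is a placeholder):

    export DATA_DIR=/path/to/datasets
    python run/make_cifar.py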
    

Run Experiments with Python

Assign the environment variables DATA_DIR and EXP_DIR to your data directory and your logging/output directory. The experiment configurations can be found in /configs.
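For example (the paths below are placeholders):

    export DATA_DIR=/path/to/datasets
    export EXP_DIR=/path/to/experiment_logs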

To run an experiment, for example, you can use the following command:

python run/exp.py train -f configs/cifar10_sgd/train_sce_gelu.yaml

To fine-tune a trained model, for example, you can use the following command:

python run/exp.py finetune -f configs/cifar10_sgd/train_sce_gelu_finetune.yaml

Code for the toy-model experiments can be found in /nbs.

Julia Dependencies

Figure 2 in our paper is generated by an experiment run with Julia 1.7.3. The package dependencies are specified in /Project.toml and /Manifest.toml.
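If you have Julia installed, the standard Pkg workflow should instantiate this environment from the repository root (a generic Julia command, not a project-specific script):

    julia --project=. -e 'using Pkg; Pkg.instantiate()'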

Citation

If you find our results or code useful for your research, please consider citing our paper:

@misc{chen2023stochastic,
      title={Stochastic Collapse: How Gradient Noise Attracts SGD Dynamics Towards Simpler Subnetworks},
      author={Feng Chen and Daniel Kunin and Atsushi Yamamura and Surya Ganguli},
      year={2023},
      eprint={2306.04251},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
