Big Cooperative Learning

The official code for the paper "Big Cooperative Learning" by Yulai Cong.

[Video: big-learning exploration on the 25-GMM simulation]

Abstract

Cooperation plays a pivotal role in the evolution of human intelligence; it also underlies the recent revolutionary advances in artificial intelligence (AI) driven by foundation models. Specifically, we reveal that the training of foundation models can be interpreted as a form of big cooperative learning (abbreviated as big learning), where massive learning individuals/tasks cooperate to approach the unique essence of the data from diverse data-prediction perspectives, leveraging a universal model. Big learning therefore unifies most training objectives of foundation models within a consistent framework while simultaneously exposing their underlying assumptions. We design tailored simulations to demonstrate the principle of big learning, based on which we provide learning-perspective justifications for the successes of foundation models, along with interesting side-products. Furthermore, we reveal that big learning is a new dimension for upgrading conventional machine-learning paradigms, valuable for reinvigorating their associated applications; as an illustrative example, we propose BigLearn-GAN, a novel adversarially trained foundation model with versatile data-sampling capabilities.
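To make the masked-prediction view of big learning concrete, below is a minimal PyTorch sketch on toy 2-D Gaussian-mixture data: one universal model is trained on many randomly masked prediction tasks (predict the masked dimensions from the observed ones), so that diverse "learning individuals" cooperate on shared weights. The architecture, loss, and all hyperparameters are illustrative assumptions, not the paper's official implementation (see the simulation directories below for that).

import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy data: samples from a 2-component 2-D Gaussian mixture.
n = 4096
comp = torch.randint(0, 2, (n, 1)).float()
data = comp * torch.tensor([2.0, 2.0]) + (1 - comp) * torch.tensor([-2.0, -2.0])
data = data + 0.3 * torch.randn(n, 2)

# Universal model: input = (masked sample, mask); output = full reconstruction.
model = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(2000):
    x = data[torch.randint(0, n, (128,))]
    # Each random mask defines one cooperating prediction task:
    # observed dims (mask=1) are kept, masked dims (mask=0) must be predicted.
    mask = (torch.rand(128, 2) < 0.5).float()
    inp = torch.cat([x * mask, mask], dim=1)
    pred = model(inp)
    # Train only on the masked (to-be-predicted) dimensions.
    loss = (((pred - x) ** 2) * (1 - mask)).sum() / (1 - mask).sum().clamp(min=1.0)
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, the same weights serve every conditional-prediction task,
# e.g. predicting dim 1 given dim 0:
probe = torch.tensor([[2.0, 0.0, 1.0, 0.0]])  # observe dim 0, mask dim 1
print(model(probe))

In this sketch each (observed, masked) split plays the role of one learning individual/task; all tasks share the same model, which is the cooperative aspect the abstract describes.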

Directory Explanation

├── Section3.3_2GMM_simulation      (2-GMM simulation, paper Section 3.3)
├── Section4.1_25GMM_simulation     (25-GMM simulation, paper Section 4.1)
├── Section4.2_BigLearn_GAN         (BigLearn-GAN, paper Section 4.2)
└── Section4.3_BigLearn_multimodal  (multimodal big learning, paper Section 4.3)

Reference

Please consider citing our paper if you use this code in your research.

@misc{cong2024big,
      title={Big Cooperative Learning},
      author={Yulai Cong},
      year={2024},
      eprint={24...},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}
