
Model Agnostic Peer-to-peer Learning (MAPL)

This is the official repository for our paper "MAPL: Model Agnostic Peer-to-peer Learning".

Model Agnostic Peer-to-peer Learning
Sayak Mukherjee, Andrea Simonetto, Hadi Jamali-Rad

[arXiv: 2403.19792]

Overview

Effective collaboration among heterogeneous clients in a decentralized setting is a largely unexplored avenue in the literature. To address this structurally, we introduce Model Agnostic Peer-to-peer Learning (MAPL), a novel approach that simultaneously learns heterogeneous personalized models and a collaboration graph through peer-to-peer communication among neighboring clients. MAPL comprises two main modules: (i) local-level Personalized Model Learning (PML), which leverages a combination of intra- and inter-client contrastive losses, and (ii) network-wide decentralized Collaborative Graph Learning (CGL), which dynamically refines collaboration weights in a privacy-preserving manner based on local task similarities. Our extensive experiments demonstrate the efficacy of MAPL and its competitive (or, in most cases, superior) performance compared to centralized model-agnostic counterparts, without relying on any central server.

Architecture

Owing to model heterogeneity, the local models cannot be directly aggregated to facilitate collaboration. We address this with PML, which incorporates learnable prototypes into a local contrastive learning setting to promote the learning of unbiased representations.

(Figure: local-level PML module)
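
As a rough illustration of this idea (not the authors' exact PML objective: the class name PrototypeContrastiveLoss, the temperature default, and the loss form below are our assumptions), learnable class prototypes can drive a local contrastive loss in PyTorch as follows:

import torch
import torch.nn as nn
import torch.nn.functional as F

class PrototypeContrastiveLoss(nn.Module):
    # Minimal sketch: pull each feature toward its class prototype and
    # push it away from all other prototypes.
    def __init__(self, num_classes: int, feat_dim: int, temperature: float = 0.1):
        super().__init__()
        # Learnable class prototypes, trained jointly with the encoder.
        self.prototypes = nn.Parameter(torch.randn(num_classes, feat_dim))
        self.temperature = temperature

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Cosine similarity between L2-normalized features and prototypes.
        feats = F.normalize(features, dim=1)
        protos = F.normalize(self.prototypes, dim=1)
        logits = feats @ protos.t() / self.temperature
        # Temperature-scaled cross-entropy acts as the contrastive objective.
        return F.cross_entropy(logits, labels)

# Example usage: 128-dim features for a 10-class task.
# criterion = PrototypeContrastiveLoss(num_classes=10, feat_dim=128)
# loss = criterion(encoder(images), labels)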

MAPL identifies relevant neighbors in a fully decentralized manner using CGL, which is advantageous in two key respects: (1) it reduces communication cost by sparsifying the collaboration graph, and (2) it improves overall performance by facilitating collaboration among clients with similar data distributions. To infer similarity without violating data privacy, CGL uses the weight vectors of the locally trained classifier heads as a proxy measure for client similarity.

(Figure: network-wide CGL module)
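
Below is a minimal sketch of this proxy, assuming all classifier heads share a common shape (e.g., a shared embedding dimension); the function name collaboration_weights and the top-k sparsification rule are illustrative assumptions, not the exact CGL update from the paper:

import torch
import torch.nn.functional as F

def collaboration_weights(head_weights: list, k: int = 2) -> torch.Tensor:
    # Flatten each client's classifier-head weight matrix into one vector;
    # only these vectors (not raw data) would be exchanged between peers.
    vecs = torch.stack([w.flatten() for w in head_weights])
    # Pairwise cosine similarity between all clients.
    sims = F.cosine_similarity(vecs.unsqueeze(1), vecs.unsqueeze(0), dim=2)
    sims = sims.clamp(min=0.0)      # ignore negatively correlated heads
    sims.fill_diagonal_(0.0)        # no self-collaboration
    # Keep only the top-k most similar neighbors (graph sparsification).
    topk = sims.topk(k, dim=1)
    graph = torch.zeros_like(sims).scatter_(1, topk.indices, topk.values)
    # Row-normalize to obtain per-client mixing weights.
    return graph / graph.sum(dim=1, keepdim=True).clamp(min=1e-8)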

Installation

This code is written in Python 3.9 and requires the packages listed in environment.yml.

To run the code, set up a virtual environment using conda:

cd <path-to-cloned-directory>
conda env create --file environment.yml
conda activate mapl

Running experiments

To run an experiment, create a new configuration file in the configs directory. Experiments can then be run with the following commands:

cd <path-to-cloned-directory>\src
python main.py --exp_config ..\configs\<config-file-name>.json

We provide configuration files for running MAPL with heterogeneous models for scenarios 1-4. A skeleton for creating your own configuration is sketched below.
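
For illustration only, the snippet below writes a skeleton configuration file; every key shown is a hypothetical placeholder rather than the repo's schema, so copy one of the provided files in the configs directory for the actual format:

import json
from pathlib import Path

config = {
    "dataset": "cifar10",    # hypothetical key/value, not the repo's schema
    "num_clients": 10,       # hypothetical
    "rounds": 100,           # hypothetical
}
# Written next to the provided examples so main.py can pick it up.
Path("../configs/my_experiment.json").write_text(json.dumps(config, indent=4))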

Citation

If you make use of the code, please cite the following work:

@misc{mukherjee2024mapl,
      title={MAPL: Model Agnostic Peer-to-peer Learning}, 
      author={Sayak Mukherjee and Andrea Simonetto and Hadi Jamali-Rad},
      year={2024},
      eprint={2403.19792},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

License

This project is licensed under the MIT License.

Remarks

Our code is partly based on the following open-source projects: FedProto, FedClassAvg, and the Federated Learning Toolkit. We are grateful to the developers of these resources.
