Conversation

@sanaAyrml (Collaborator) commented Apr 9, 2024

PR Type

[Feature]

Short Description

Clickup Ticket(s): Link

This is the implementation of MR-MTL: On Privacy and Personalization in Cross-Silo Federated Learning.
The method is closely related to FedProx and Ditto. At the start of each client training round, we do not overwrite the local model weights with the global model weights; instead, we constrain the local weights to stay close to the initial global weights for that round. The initial global weights are computed by averaging the client model weights from the end of the previous round. This mean-regularized training is implemented by adding a penalty term to the loss function that pulls the local model weights toward the initial global model weights.
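As a rough illustration of the objective described above, here is a minimal sketch of the mean-regularized loss. The names (`mr_mtl_loss`, `lam`) and the flat-list weight representation are illustrative assumptions, not the repository's actual API:

```python
# Hypothetical sketch of the MR-MTL training objective: task loss plus a
# penalty keeping local weights near the round's initial global weights.

def l2_squared_distance(local_weights, initial_global_weights):
    """Squared L2 distance between local weights and the frozen initial
    global weights (weights modeled as flat lists of floats for simplicity)."""
    return sum((w - g) ** 2 for w, g in zip(local_weights, initial_global_weights))

def mr_mtl_loss(task_loss, local_weights, initial_global_weights, lam=1.0):
    """Task loss plus the mean-regularization penalty, weighted by lam."""
    return task_loss + (lam / 2.0) * l2_squared_distance(
        local_weights, initial_global_weights
    )

# At the start of a round, the average of last round's client weights
# becomes the anchor; local training then drifts from it.
initial_global = [0.5, -1.0, 2.0]  # server-side average from previous round
local = [0.6, -0.8, 2.1]           # local weights after some local steps
loss = mr_mtl_loss(0.30, local, initial_global, lam=1.0)  # ≈ 0.30 + 0.03
```

Note that, unlike Ditto, there is no separate personal model here: the same local model is both trained and regularized toward the global anchor.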

Tests Added

Three tests are added to tests/clients/test_mr_mtl_client.py: setting the global weights (which are not applied to the local model but saved for computing the MR loss), forming the MR loss, and computing the total loss.

@@ -1,4 +1,4 @@
# FedProx Federated Learning Example
# Ditto Federated Learning Example
@sanaAyrml (Collaborator, Author) commented:

I think this was left over from a copy-paste, so I fixed it.

@sanaAyrml sanaAyrml changed the title Mr mtl client Add MR-MTL Method Apr 9, 2024
@emersodb emersodb requested review from emersodb and fatemetkl April 15, 2024 17:38
# Shutdown the client gracefully
client.shutdown()

client.metrics_reporter.dump()
Collaborator commented:

@lotif: Do you think we should just add this client.metrics_reporter.dump() to the shutdown so that it always happens?
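The suggestion above could be folded into the client like the following sketch, so callers cannot forget the dump. `MetricsReporter` and `Client` here are stand-ins for illustration, not the repository's actual classes:

```python
# Hypothetical sketch: make metrics persistence part of graceful shutdown.

class MetricsReporter:
    """Stand-in reporter that records metrics and tracks whether dump ran."""

    def __init__(self):
        self.records = []
        self.dumped = False

    def add(self, record):
        self.records.append(record)

    def dump(self):
        # In the real client this would write the metrics to disk.
        self.dumped = True
        return list(self.records)

class Client:
    def __init__(self):
        self.metrics_reporter = MetricsReporter()

    def shutdown(self):
        # Always persist metrics as part of shutting down gracefully.
        self.metrics_reporter.dump()

client = Client()
client.metrics_reporter.add({"round": 1, "loss": 0.33})
client.shutdown()  # no separate dump() call needed at the call site
```

The trade-off is that a caller who wants to shut down without persisting metrics would then need an explicit opt-out.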

device: torch.device,
loss_meter_type: LossMeterType = LossMeterType.AVERAGE,
checkpointer: Optional[TorchCheckpointer] = None,
lam: float = 1.0,
Collaborator commented:

This isn't part of the original implementation (and shouldn't be in this PR), but it would be interesting to see what effect adapting this value, either via the adaptive FedProx approach or via the generalization gap of FedDG-GA, would have on the method. A similar idea applies to the Ditto parameter.

@sanaAyrml (Collaborator, Author) commented:

I am confused by this comment; which part do you mean exactly?

Collaborator commented:

Apologies, this was just me recording a thought about something to investigate in the future; that is, would MR-MTL or Ditto benefit from an adaptive implementation of the FedProx-like parameter $\lambda$? It could be adjusted as suggested in the original paper, or perhaps based on a measure similar to the generalization gap proposed in FedDG-GA.

TL;DR: No need to do anything; I was just thinking about potential future experimentation.
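For the record, the adaptive-$\lambda$ idea discussed in this thread could look like the following sketch, in the spirit of adaptive FedProx: regularize harder when the loss rises between rounds and relax when it falls. The function name and step size are illustrative assumptions, not anything implemented in this PR:

```python
# Hypothetical sketch of an adaptive lambda schedule (adaptive-FedProx style).

def adapt_lam(lam, previous_loss, current_loss, step=0.1, min_lam=0.0):
    """Adjust the mean-regularization weight based on the loss trend."""
    if current_loss > previous_loss:
        # Training is diverging from the anchor: regularize harder.
        return lam + step
    # Loss is improving: relax the constraint, but never below min_lam.
    return max(min_lam, lam - step)
```

A FedDG-GA-style variant would instead drive the adjustment from a generalization-gap measure rather than the raw training-loss trend.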

@emersodb emersodb self-requested a review April 15, 2024 19:16
@emersodb (Collaborator) commented:
Overall, I think the implementation looks good. Just a few small comments before we can merge.

@emersodb (Collaborator) left a review:
Good to go for me.

@sanaAyrml sanaAyrml merged commit 50f73a0 into main Apr 16, 2024
@sanaAyrml sanaAyrml deleted the mr_mtl_client branch April 16, 2024 15:51
