Adding a README to the examples folder. #226
Conversation
scarere left a comment:
John might have better comments, as he is more well versed in FL strategies, but overall it was pretty clear to me, someone with limited knowledge of the subject, which I think is a good thing.
examples/README.MD (outdated)
[dp_fed_examples](dp_fed_examples): The examples in this folder are basic implementations of differentially private (DP) FL training. There are two levels of DP guarantees. Client-level DP refers to guarantees protecting <b>client</b> participation in FL training. Instance-level DP refers to guarantees protecting <b>datapoint</b> participation in FL training. Instance-level DP is classically achieved using DP-SGD on the client side. The client-level examples include adaptive clipping implementations.
Maybe one sentence explaining what differential privacy is? What does it mean to 'protect client participation' or 'protect datapoint participation'? This is also partially my own curiosity. Is it guarantees against information leakage from the client dataset to the server/other clients?
Sure. Will add a sentence there. In brief, there are different "threat-models" that change the privacy you're guaranteeing. However, most of what we consider is a "third-party" perspective, which is an outside observer who only receives the final model. Client-level DP prevents this "third-party" observer from being able to tell if a particular client participated in training the model with certainty. Instance-level DP prevents this observer from being able to tell if a particular data point was used during training.
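For context on the DP-SGD mechanism the README mentions, here is a minimal illustrative sketch (not part of this PR) of a single instance-level DP update: per-example gradients are clipped to a fixed L2 norm and Gaussian noise scaled to that clipping bound is added before averaging. The function name `dp_sgd_step` and the `clip_norm` and `noise_multiplier` hyperparameters are hypothetical; real implementations typically rely on a library such as Opacus rather than hand-rolled numpy.

```python
# Illustrative sketch only: one DP-SGD step for logistic regression in numpy.
# dp_sgd_step, clip_norm, and noise_multiplier are hypothetical names, not
# identifiers from this repository.
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """One DP-SGD update: clip each per-example gradient, then add noise."""
    rng = np.random.default_rng() if rng is None else rng
    preds = 1.0 / (1.0 + np.exp(-X @ w))          # sigmoid predictions
    per_example_grads = (preds - y)[:, None] * X  # log-loss gradient per example
    # Clip each example's gradient to L2 norm <= clip_norm.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads / np.maximum(1.0, norms / clip_norm)
    # Sum, add Gaussian noise scaled to the clipping bound, then average.
    noisy_sum = clipped.sum(axis=0) + rng.normal(
        scale=noise_multiplier * clip_norm, size=w.shape
    )
    return w - lr * noisy_sum / len(X)

# Toy usage: 32 examples, 5 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 5))
y = rng.integers(0, 2, size=32).astype(float)
w = dp_sgd_step(np.zeros(5), X, y, rng=rng)
```

Because each example's influence on the update is bounded by the clipping norm, the added noise masks whether any single datapoint participated, which is exactly the instance-level guarantee described above.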
jewelltaylor left a comment:
Other than the small changes mentioned by Shawn and me, this looks good!
PR Type
Documentation
Short Description
Clickup Ticket(s): Link(s) if applicable.
Adding a README to the examples folder, along the lines specified in the ticket.
Tests Added
NA