Distributed Privacy-Preserving Empirical Risk Minimization

This work combines differential privacy with a multi-party computation protocol to enable distributed machine learning. It is based on the paper "Distributed Learning without Distress: Privacy-Preserving Empirical Risk Minimization" (http://papers.nips.cc/paper/7871-distributed-learning-without-distress-privacy-preserving-empirical-risk-minimization), accepted at NIPS 2018.
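For intuition only, the sketch below shows one common way to combine the two ingredients: parties additively secret-share their local model vectors so that only the (noise-perturbed) sum is ever revealed. This is a minimal conceptual illustration, not the protocol implemented in this repository; all names, parameters, and the noise scale are assumptions.

```python
# Illustrative sketch only: additive secret sharing of local model vectors,
# so the aggregator learns only the noised sum, not any party's input.
# Parameters and noise calibration are placeholders, not this repo's protocol.
import numpy as np

rng = np.random.default_rng(0)

def share(vector, n_parties):
    """Split a vector into n additive shares that sum back to the original vector."""
    shares = [rng.normal(size=vector.shape) for _ in range(n_parties - 1)]
    shares.append(vector - sum(shares))
    return shares

# Each of 3 parties holds a local model/gradient vector of dimension 5.
local_models = [rng.normal(size=5) for _ in range(3)]

# Every party shares its vector; party j collects the j-th share from everyone.
all_shares = [share(m, 3) for m in local_models]
partial_sums = [sum(all_shares[i][j] for i in range(3)) for j in range(3)]

# Combining the partial sums reveals only the aggregate of the local models.
aggregate = sum(partial_sums)

# Add Gaussian noise to the aggregate for differential privacy (toy scale).
noisy_aggregate = aggregate + rng.normal(scale=0.1, size=aggregate.shape)

print(np.allclose(aggregate, sum(local_models)))  # True: shares reconstruct the sum
print(noisy_aggregate)
```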

The code contains privacy-preserving implementations of L2-regularized logistic regression and linear regression models.
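As a rough, standalone illustration of releasing a regularized model under differential privacy, the sketch below computes the ridge (L2-regularized linear regression) solution and perturbs the released weights with Gaussian or Laplace noise, mirroring the two noise variants in the repository's directories. The noise scale here is a placeholder, not a calibrated privacy guarantee, and this code is not part of the repository.

```python
# Minimal sketch of output perturbation for L2-regularized linear regression.
# The noise scale is a placeholder; real differential privacy requires it to be
# calibrated to the sensitivity of the ridge solution and to (epsilon, delta).
import numpy as np

rng = np.random.default_rng(0)

def ridge_fit(X, y, lam):
    """Closed-form L2-regularized (ridge) linear regression."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def perturb(weights, scale, mechanism="gaussian"):
    """Release noisy weights using Gaussian or Laplace noise."""
    if mechanism == "gaussian":
        return weights + rng.normal(scale=scale, size=weights.shape)
    return weights + rng.laplace(scale=scale, size=weights.shape)

# Toy data: y is a noisy linear function of X.
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

w = ridge_fit(X, y, lam=1.0)
print("ridge weights:      ", w)
print("gaussian-perturbed: ", perturb(w, scale=0.05, mechanism="gaussian"))
print("laplace-perturbed:  ", perturb(w, scale=0.05, mechanism="laplace"))
```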

Requirements

Code Execution

Run make in the model_aggregate_gaussian and model_aggregate_laplace directories to build the respective a.out executables.

Then run python model_wrapper.py.
