News in Privacy-Preserving Machine Learning

February 2020

Papers

January 2020

Papers

February 2019

Papers

Bonus

January 2019

Papers

News

Bonus

31 December 2018

Papers

News

14 December 2018

Papers

News

Bonus

30 November 2018

Papers

News

31 October 2018

Papers

28 September 2018

Papers

27 July 2018

Papers

27 June 2018

25 May 2018

Papers

News

Bonus

18 May 2018

Small but good: we only dug up one paper this week, but it comes with very interesting claims.

Papers

  • SecureNN: Efficient and Private Neural Network Training
    Following recent approaches but reporting significant performance improvements via specialized protocols for the 3- and 4-server settings: the claimed cost of encrypted training is in some cases only 13-33 times that of training on cleartext data. A big factor in this is the avoidance of bit-decomposition and garbled circuits when computing comparisons and ReLUs.
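For intuition about the setting, here is a toy sketch of 2-out-of-2 additive secret sharing over the ring Z_{2^64} that such protocols compute in (an illustration only, not SecureNN's actual multi-server protocol):

```python
import random

MOD = 2 ** 64  # values live in the ring Z_{2^64}

def share(x):
    """Split x into two additive shares; each share alone is uniformly random."""
    r = random.randrange(MOD)
    return r, (x - r) % MOD

def reconstruct(a, b):
    return (a + b) % MOD

x0, x1 = share(20)
y0, y1 = share(22)
# Linear operations need no interaction: each party adds its own shares.
z0, z1 = (x0 + y0) % MOD, (x1 + y1) % MOD
assert reconstruct(z0, z1) == 42
```

Comparisons and ReLUs are exactly the operations that cannot be done locally like this: each share is uniformly random, so neither party can see the sign of the underlying value, which is why these steps classically require bit-decomposition or garbled circuits — the machinery SecureNN replaces with cheaper specialized protocols.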

11 May 2018

If anyone had any doubt that private machine learning is a growing area, then this week might take care of that.

Papers

Secure multiparty computation:

Homomorphic encryption:

  • Unsupervised Machine Learning on Encrypted Data
    Implements K-means privately using fully homomorphic encryption and a bit-wise rational encoding, with suggestions for tweaking K-means to make it more practical for this setting. The TFHE library (see below) is used for experiments.

  • TFHE: Fast Fully Homomorphic Encryption over the Torus
    Claimed to be the fastest FHE library currently available; the paper is the extended version of earlier descriptions of the underlying scheme and its optimizations.

  • Homomorphic Secret Sharing: Optimizations and Applications
    Further work on a hybrid of homomorphic encryption and secret sharing: operations can be performed locally by each share holder (as in the former), yet a final combination is needed to recover the result (as in the latter): "this enables a level of compactness and efficiency of reconstruction that is impossible to achieve via standard FHE".
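The split highlighted in the last bullet — local evaluation at each share holder, a single combination at the end — already exists for linear functions under plain additive sharing, which makes for a simple toy illustration (ring size and function are arbitrary choices here; real HSS schemes support much richer programs):

```python
import random

MOD = 2 ** 32

def share(x):
    r = random.randrange(MOD)
    return r, (x - r) % MOD

def local_eval(shares, coeffs):
    """Apply the public linear map f(x) = sum_i c_i * x_i to one share vector."""
    return sum(c * s for c, s in zip(coeffs, shares)) % MOD

xs = [3, 5, 7]
coeffs = [2, 4, 6]
shares0, shares1 = zip(*(share(x) for x in xs))
# Each holder evaluates f locally, with no interaction...
out0, out1 = local_eval(shares0, coeffs), local_eval(shares1, coeffs)
# ...and one addition reconstructs f(x), however large f's description is.
assert (out0 + out1) % MOD == 2 * 3 + 4 * 5 + 6 * 7
```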

Secure enclaves:

Differential privacy:

Bonus

27 April 2018

Papers

  • Towards Dependable Deep Convolutional Neural Networks (CNNs) with Out-distribution Learning
    "in this paper we propose to add an additional dustbin class containing natural out-distribution samples" "We show that such an augmented CNN has a lower error rate in the presence of adversarial examples because it either correctly classifies adversarial samples or rejects them to a dustbin class."

  • Weak labeling for crowd learning
    "weak labeling for crowd learning is proposed, where the annotators may provide more than a single label per instance to try not to miss the real label"

  • Decentralized learning with budgeted network load using Gaussian copulas and classifier ensembles
    "In this article, we place ourselves in a context where the amount of transferred data must be anticipated but a limited portion of the local training sets can be shared. We also suppose a minimalist topology where each node can only send information unidirectionally to a single central node which will aggregate models trained by the nodes" "Using shared data on the central node, we then train a probabilistic model to aggregate the base classifiers in a second stage."

  • Securing Distributed Machine Learning in High Dimensions
    Results addressing input pollution in federated learning, where a fraction of gradient providers may supply arbitrarily malicious inputs to the aggregation protocol. "The core of our method is a robust gradient aggregator based on the iterative filtering algorithm for robust mean estimation".
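For a feel of what robust aggregation buys, here is a much simpler stand-in than the paper's iterative filtering algorithm — a coordinate-wise trimmed mean — showing how a single malicious gradient barely moves the aggregate:

```python
def trimmed_mean(grads, trim):
    """Aggregate gradient vectors by dropping the `trim` smallest and
    largest values in each coordinate before averaging the rest."""
    out = []
    for j in range(len(grads[0])):
        col = sorted(g[j] for g in grads)
        kept = col[trim:len(col) - trim]
        out.append(sum(kept) / len(kept))
    return out

honest = [[1.0, 1.0], [1.2, 0.8], [0.8, 1.2]]
poisoned = honest + [[1e6, -1e6]]  # one arbitrarily malicious worker
agg = trimmed_mean(poisoned, trim=1)
# The aggregate stays close to the honest mean of roughly [1.0, 1.0],
# whereas a plain average would be dragged off to the order of 1e5.
```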

20 April 2018

Papers

News

  • Sharemind, one of the biggest and earliest players pushing MPC to industry, has launched a new privacy service based on secure computation using secure enclaves with the promise that it can handle big data. Via @positium.

  • Interesting interview with Lea Kissner, the head of Google's privacy team NightWatch. Few details are given but "She recently tried to obscure some data using cryptography, so that none of it would be visible to Google upon upload ... but it turned out that [it] would require more spare computing power than Google has" sounds like techniques that could be related to MPC or HE. Via @rosa.

  • Google had two AI presentations at this year's RSA conference, one on fraud detection and one on adversarial techniques. Via @goodfellow_ian.

Bonus

13 April 2018

Papers

News

Bonus

30 March 2018

Papers

16 March 2018

Papers

Bonus

9 March 2018

News

Papers

Blogs

2 March 2018

News

Papers

  • Scalable Private Learning with PATE
    Follow-up work to the celebrated Student-Teacher approach of ensuring privacy of training data via differential privacy, now with better privacy bounds and hence less added noise. This is achieved partly by switching to Gaussian noise and partly by more advanced (trusted) aggregation mechanisms.

  • Privacy-Preserving Logistic Regression Training
    Fitting a logistic model from homomorphically encrypted data using the Newton-Raphson iterative method, but with a fixed and approximated Hessian matrix. Performance is evaluated on the iDASH cancer detection scenario.

  • Privacy-Preserving Boosting with Random Linear Classifiers for Learning from User-Generated Data
    Presents the SecureBoost framework for combining boosting algorithms with secure computation. The boosting side uses randomly generated linear classifiers as base learners, and the secure computation side comes in three variants: RLWE+GC, Paillier+GC, and SecretSharing+GC. Performance experiments are provided both for the model itself and for the secure versions.

  • Machine learning and genomics: precision medicine vs. patient privacy
    Non-technical paper illustrating that secure computation techniques are finding their way into otherwise unrelated research areas, and hitting a home run with "data access restrictions are a burden for researchers, particularly junior researchers or small labs that do not have the clout to set up collaborations with major data curators".
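Of the techniques above, the fixed-Hessian Newton-Raphson trick from the logistic regression paper is easy to demonstrate in the clear. The sketch below uses the standard bound H = (1/4)·XᵀX (an assumption on our part; the paper's exact approximation may differ), so the Hessian is inverted once rather than at every iteration:

```python
import numpy as np

def fit_logistic_fixed_hessian(X, y, iters=20):
    """Newton-Raphson for logistic regression with the fixed Hessian
    approximation H = (1/4) X^T X, inverted once up front."""
    H_inv = np.linalg.inv(X.T @ X / 4.0)
    w = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))   # predicted probabilities
        w = w + H_inv @ (X.T @ (y - p))    # ascent step with the fixed H
    return w

# Toy data: intercept column plus one feature; labels flip at x = 1.5.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([0.0, 0.0, 1.0, 1.0])
w = fit_logistic_fixed_hessian(X, y)
# The fitted model separates the two label groups.
```

Because H does not depend on w, the expensive inversion happens once, which is what keeps the per-iteration work small — the property that makes the method attractive when the iterations must run over homomorphically encrypted data.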

Blogs

23 February 2018

Papers
