info.json
{
  "abstract": "The scale of modern datasets necessitates the development of efficient distributed optimization methods for machine learning. We present a general-purpose framework for distributed computing environments, CoCoA, that has an efficient communication scheme and is applicable to a wide variety of problems in machine learning and signal processing. We extend the framework to cover general non-strongly-convex regularizers, including L1-regularized problems like lasso, sparse logistic regression, and elastic net regularization, and show how earlier work can be derived as a special case. We provide convergence guarantees for the class of convex regularized loss minimization objectives, leveraging a novel approach in handling non-strongly-convex regularizers and non-smooth loss functions. The resulting framework has markedly improved performance over state-of-the-art methods, as we illustrate with an extensive set of experiments on real distributed datasets.",
"authors": [
"Virginia Smith",
"Simone Forte",
"Chenxin Ma",
    "Martin Tak{\\'a}{\\v{c}}",
"Michael I. Jordan",
"Martin Jaggi"
],
"id": "16-512",
"issue": 230,
"pages": [
1,
49
],
"title": "{CoCoA}: A General Framework for Communication-Efficient Distributed Optimization",
"title_html": "CoCoA: A General Framework for Communication-Efficient Distributed Optimization",
"volume": 18,
"year": 2018
}