info.json
{
"abstract": "We study a decomposition-based scalable approach to kernel ridge regression, and show that it achieves minimax optimal convergence rates under relatively mild conditions. The method is simple to describe: it randomly partitions a dataset of size $N$ into $m$ subsets of equal size, computes an independent kernel ridge regression estimator for each subset using a careful choice of the regularization parameter, then averages the local solutions into a global predictor. This partitioning leads to a substantial reduction in computation time versus the standard approach of performing kernel ridge regression on all $N$ samples. Our two main theorems establish that despite the computational speed-up, statistical optimality is retained: as long as $m$ is not too large, the partition-based estimator achieves the statistical minimax rate over all estimators using the set of $N$ samples. As concrete examples, our theory guarantees that the number of subsets $m$ may grow nearly linearly for finite-rank or Gaussian kernels and polynomially in $N$ for Sobolev spaces, which in turn allows for substantial reductions in computational cost. We conclude with experiments on both simulated data and a music-prediction task that complement our theoretical results, exhibiting the computational and statistical benefits of our approach.",
"authors": [
"Yuchen Zhang",
"John Duchi",
"Martin Wainwright"
],
"id": "zhang15d",
"issue": 102,
"pages": [
3299,
3340
],
"title": "Divide and Conquer Kernel Ridge Regression: A Distributed Algorithm with Minimax Optimal Rates",
"volume": 16,
"year": 2015
}
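
For illustration, here is a minimal sketch of the divide-and-conquer estimator the abstract describes: randomly partition the $N$ samples into $m$ equal subsets, fit an independent kernel ridge regression estimator on each, and average the local predictions into the global predictor. The Gaussian (RBF) kernel, the regularization scaling, and the parameter values below are assumptions for demonstration, not the paper's tuned or analyzed choices.

```python
# Sketch of divide-and-conquer kernel ridge regression (dc-KRR):
# partition the data into m subsets, fit KRR independently on each,
# and average the m local predictors at test time.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of A and B (assumed kernel)."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def fit_krr(X, y, lam):
    """One local KRR estimator: solve (K + n*lam*I) alpha = y."""
    n = X.shape[0]
    K = rbf_kernel(X, X)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return X, alpha

def dc_krr(X, y, m, lam, seed=None):
    """Randomly partition (X, y) into m near-equal subsets; fit KRR on each."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(y))
    return [fit_krr(X[part], y[part], lam) for part in np.array_split(idx, m)]

def predict(models, X_new):
    """Average the m local predictions into the global estimator."""
    preds = [rbf_kernel(X_new, Xi) @ ai for Xi, ai in models]
    return np.mean(preds, axis=0)

# Tiny usage example on synthetic data (hypothetical settings).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (1024, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(1024)
models = dc_krr(X, y, m=8, lam=1e-3, seed=0)
print(predict(models, np.array([[0.5]])))
```

Each local solve costs $O((N/m)^3)$ instead of the $O(N^3)$ of a single KRR fit on all samples, which is where the computational speed-up in the abstract comes from; the paper's theorems govern how large $m$ may be while the averaged estimator still attains the minimax rate.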