{
"abstract": "We consider the problem of estimating the parameters of a Gaussian or\nbinary distribution in such a way that the resulting undirected\ngraphical model is sparse. Our approach is to solve a maximum\nlikelihood problem with an added <i>l</i><sub>1</sub>-norm penalty term. The\nproblem as formulated is convex but the memory requirements and\ncomplexity of existing interior point methods are prohibitive for\nproblems with more than tens of nodes. We present two new algorithms\nfor solving problems with at least a thousand nodes in the Gaussian\ncase. Our first algorithm uses block coordinate descent, and can be\ninterpreted as recursive <i>l</i><sub>1</sub>-norm penalized regression. Our\nsecond algorithm, based on Nesterov's first order method, yields a\ncomplexity estimate with a better dependence on problem size than\nexisting interior point methods. Using a log determinant relaxation\nof the log partition function (Wainwright and Jordan, 2006), we show that these\nsame algorithms can be used to solve an approximate sparse maximum\nlikelihood problem for the binary case. We test our algorithms on\nsynthetic data, as well as on gene expression and senate voting\nrecords data.",
"authors": [
"Onureena Banerjee",
"Laurent El Ghaoui",
"Alexandre d'Aspremont"
],
"id": "banerjee08a",
"issue": 15,
"pages": [
485,
516
],
"title": "Model Selection Through Sparse Maximum Likelihood Estimation for Multivariate Gaussian or Binary Data",
"volume": "9",
"year": "2008"
}