AISTATS 2018: Fast and Scalable Learning of Sparse Changes in High-Dimensional Gaussian Graphical Model Structure
Repository contents:

- diffee-cran — source of the R package (includes DIFFEEK)
- 2017-diffeenips17workshop.pdf — NIPS 2017 workshop paper
- 2018-DIFFEE-talk.pdf — AISTATS 2018 talk slides
- full-DIFFEE-arxiv1710.11223.pdf — full paper (arXiv:1710.11223)


R package "diffee": available on the CRAN website.



Citation (BibTeX):

@InProceedings{wang2018fast,
  title     = {Fast and Scalable Learning of Sparse Changes in High-Dimensional Gaussian Graphical Model Structure},
  author    = {Beilun Wang and Arshdeep Sekhon and Yanjun Qi},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1691--1700},
  year      = {2018},
  editor    = {Amos Storkey and Fernando Perez-Cruz},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  address   = {Playa Blanca, Lanzarote, Canary Islands},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {},
  url       = {},
  abstract  = {We focus on the problem of estimating the change in the dependency structures of two $p$-dimensional Gaussian Graphical models (GGMs). Previous studies for sparse change estimation in GGMs involve expensive and difficult non-smooth optimization. We propose a novel method, DIFFEE, for estimating DIFFerential networks via an Elementary Estimator in a high-dimensional situation. DIFFEE is solved through a faster, closed-form solution that enables it to work in large-scale settings. We conduct a rigorous statistical analysis showing that, surprisingly, DIFFEE achieves the same asymptotic convergence rates as the state-of-the-art estimators that are much more difficult to compute. Our experimental results on multiple synthetic datasets and one real-world dataset about brain connectivity show strong performance improvements over baselines, as well as significant computational benefits.}
}

More details are available on the project website.
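The closed-form idea described in the abstract can be sketched in a few lines: soft-threshold each sample covariance so it is invertible, invert both, take the difference of the two backward mappings, and soft-threshold that difference to obtain a sparse change estimate. The following is a minimal Python illustration of this idea only, not the R package's implementation; the function names and the thresholding parameter `v` are assumptions for illustration.

```python
import numpy as np

def soft_threshold(A, lam):
    """Element-wise soft-thresholding operator S_lam."""
    return np.sign(A) * np.maximum(np.abs(A) - lam, 0.0)

def diffee_sketch(Xc, Xd, lam=0.05, v=0.001):
    """Sketch of a closed-form differential-network estimate.

    Xc, Xd : (n x p) data matrices from the two conditions.
    lam    : sparsity level for the estimated change.
    v      : covariance-thresholding level (illustrative choice).
    """
    # Sample covariances for the two conditions
    Sc = np.cov(Xc, rowvar=False)
    Sd = np.cov(Xd, rowvar=False)
    # Threshold off-diagonal entries so the matrices are well-conditioned;
    # the diagonal is kept intact
    Tc = soft_threshold(Sc, v); np.fill_diagonal(Tc, np.diag(Sc))
    Td = soft_threshold(Sd, v); np.fill_diagonal(Td, np.diag(Sd))
    # Closed form: soft-threshold the difference of the inverted covariances
    return soft_threshold(np.linalg.inv(Td) - np.linalg.inv(Tc), lam)
```

Because the estimate is a single thresholding step on a matrix difference (no iterative non-smooth optimization), it scales to large p, which is the computational benefit the abstract highlights.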