
ucsdyi/competitive


Code for the paper "Doubly-Competitive Distribution Estimation" by Hao & Orlitsky (ICML 2019).

The paper is available at http://proceedings.mlr.press/v97/hao19a.html

Abstract

Distribution estimation is a statistical-learning cornerstone. Its classical min-max formulation minimizes the estimation error for the worst distribution, hence under-performs for practical distributions that, like power-law, are often rather simple. Modern research has therefore focused on two frameworks: structural estimation, which improves learning accuracy by assuming a simple structure of the underlying distribution; and competitive, or instance-optimal, estimation, which achieves the performance of a genie-aided estimator up to a small excess error that vanishes as the sample size grows, regardless of the distribution. This paper combines and strengthens the two frameworks. It designs a single estimator whose excess error vanishes both at a universal rate as the sample size grows and when the (unknown) distribution gets simpler. We show that the resulting algorithm significantly improves the performance guarantees for numerous competitive- and structural-estimation results. The algorithm runs in near-linear time and is robust to model misspecification and domain-symbol permutations.
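
To make the "excess error over a genie-aided estimator" setting concrete, the sketch below compares the KL risk of a plain plug-in (add-constant) estimator against a simple genie-aided benchmark on a power-law (Zipf) distribution. This is not the repository's code or the paper's algorithm; the particular genie benchmark, the Zipf parameters, and the add-1/2 smoothing are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def zipf_distribution(k, s=1.0):
    """Power-law (Zipf) distribution over k symbols with exponent s."""
    p = 1.0 / np.arange(1, k + 1) ** s
    return p / p.sum()

def kl_divergence(p, q):
    """KL divergence D(p || q); assumes q > 0 wherever p > 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

def empirical_estimator(counts, n):
    """Plug-in estimator with add-1/2 smoothing so the KL loss stays finite."""
    return (counts + 0.5) / (n + 0.5 * len(counts))

def genie_estimator(counts, p_true):
    """Hypothetical genie-aided benchmark: it knows the multiset of true
    probabilities and assigns them to symbols in order of observed counts.
    This is an illustrative stand-in, not the paper's genie definition."""
    order = np.argsort(-counts, kind="stable")
    q = np.empty_like(p_true)
    q[order] = np.sort(p_true)[::-1]
    return q

k, n, trials = 1000, 5000, 200
p = zipf_distribution(k)

emp_loss, genie_loss = 0.0, 0.0
for _ in range(trials):
    counts = rng.multinomial(n, p)
    emp_loss += kl_divergence(p, empirical_estimator(counts, n))
    genie_loss += kl_divergence(p, genie_estimator(counts, p))

emp_loss /= trials
genie_loss /= trials
print(f"empirical KL risk:    {emp_loss:.4f}")
print(f"genie-aided KL risk:  {genie_loss:.4f}")
print(f"excess of empirical over genie: {emp_loss - genie_loss:.4f}")
```

In this toy comparison the gap between the two risks plays the role of the "excess error" discussed in the abstract: a competitive estimator is one whose excess over the genie benchmark vanishes as the sample size n grows, for every underlying distribution.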
