TY - GEN
T1 - Density estimation for shift-invariant multidimensional distributions
AU - De, Anindya
AU - Long, Philip M.
AU - Servedio, Rocco A.
PY - 2019/1/1
Y1 - 2019/1/1
AB - We study density estimation for classes of shift-invariant distributions over R^d. A multidimensional distribution is “shift-invariant” if, roughly speaking, it is close in total variation distance to a small shift of itself in any direction. Shift-invariance relaxes the smoothness assumptions commonly used in non-parametric density estimation, allowing jump discontinuities. The different classes of distributions that we consider correspond to different rates of tail decay. For each such class we give an efficient algorithm that learns any distribution in the class from independent samples with respect to total variation distance. As a special case of our general result, we show that d-dimensional shift-invariant distributions which satisfy an exponential tail bound can be learned to total variation distance error ε using Õ_d(1/ε^{d+2}) examples and Õ_d(1/ε^{2d+2}) time. This implies that, for constant d, multivariate log-concave distributions can be learned in Õ_d(1/ε^{2d+2}) time using Õ_d(1/ε^{d+2}) samples, answering a question of [29]. All of our results extend to a model of noise-tolerant density estimation using Huber’s contamination model, in which the target distribution to be learned is a (1 − ε, ε) mixture of some unknown distribution in the class with some other arbitrary and unknown distribution, and the learning algorithm must output a hypothesis distribution with total variation distance error O(ε) from the target distribution. We show that our general results are close to best possible by proving a simple Ω(1/ε^d) information-theoretic lower bound on sample complexity even for learning bounded distributions that are shift-invariant.
KW - Density estimation
KW - Log-concave distributions
KW - Non-parametrics
KW - Unsupervised learning
UR - http://www.scopus.com/inward/record.url?scp=85069444156&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85069444156&partnerID=8YFLogxK
U2 - 10.4230/LIPIcs.ITCS.2019.28
DO - 10.4230/LIPIcs.ITCS.2019.28
M3 - Conference contribution
AN - SCOPUS:85069444156
T3 - Leibniz International Proceedings in Informatics, LIPIcs
BT - 10th Innovations in Theoretical Computer Science, ITCS 2019
A2 - Blum, Avrim
PB - Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
T2 - 10th Innovations in Theoretical Computer Science, ITCS 2019
Y2 - 10 January 2019 through 12 January 2019
ER -