Relative entropy and score function: New information-estimation relationships through arbitrary additive perturbation

Dongning Guo*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This paper establishes new information-estimation relationships pertaining to models with additive noise of arbitrary distribution. In particular, we study the change in the relative entropy between two probability measures when both of them are perturbed by a small amount of the same additive noise. It is shown that the rate of this change with respect to the energy of the perturbation can be expressed in terms of the mean squared difference of the score functions of the two distributions, and, rather surprisingly, is otherwise unrelated to the distribution of the perturbation. The result holds for the classical relative entropy (or Kullback-Leibler distance), as well as for two of its generalizations: Rényi's relative entropy and the f-divergence. The result generalizes a recent relationship between the relative entropy and mean squared errors pertaining to Gaussian noise models, which in turn supersedes many previous information-estimation relationships. A generalization of the de Bruijn identity to non-Gaussian models can also be regarded as a consequence of this new result.
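As a hedged illustration of the identity described in the abstract (a sketch, not a verbatim statement from the paper), suppose P and Q have smooth densities p and q on the real line with score functions ρ_P = (log p)' and ρ_Q = (log q)', and both measures are perturbed by the same independent additive noise √δ W, where W has zero mean and unit variance so that δ is the perturbation energy. The result described above then corresponds to a first-order relationship of the form

\[
\left.\frac{\mathrm{d}}{\mathrm{d}\delta}\, D\!\left(P_{X+\sqrt{\delta}\,W} \,\middle\|\, Q_{X+\sqrt{\delta}\,W}\right)\right|_{\delta=0}
= -\tfrac{1}{2}\,\mathbb{E}_{P}\!\left[\big(\rho_P(X)-\rho_Q(X)\big)^{2}\right],
\]

where the right-hand side does not depend on the distribution of W. The exact constant (1/2 here) and the scalar, unit-variance normalization are assumptions to be checked against the paper; the negative sign is consistent with the fact that passing both measures through the same additive-noise channel can only decrease their relative entropy. With Gaussian W, this kind of identity connects to the relative-entropy/MSE relationship and the de Bruijn identity mentioned in the abstract.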

Original language: English (US)
Title of host publication: 2009 IEEE International Symposium on Information Theory, ISIT 2009
Pages: 814-818
Number of pages: 5
DOIs
State: Published - 2009
Event: 2009 IEEE International Symposium on Information Theory, ISIT 2009 - Seoul, Korea, Republic of
Duration: Jun 28 2009 - Jul 3 2009

Publication series

Name: IEEE International Symposium on Information Theory - Proceedings
ISSN (Print): 2157-8102

Other

Other: 2009 IEEE International Symposium on Information Theory, ISIT 2009
Country/Territory: Korea, Republic of
City: Seoul
Period: 6/28/09 - 7/3/09

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Information Systems
  • Modeling and Simulation
  • Applied Mathematics
