Abstract
This paper establishes new information-estimation relationships pertaining to models with additive noise of arbitrary distribution. In particular, we study the change in the relative entropy between two probability measures when both of them are perturbed by a small amount of the same additive noise. It is shown that the rate of change with respect to the energy of the perturbation can be expressed in terms of the mean squared difference of the score functions of the two distributions, and, rather surprisingly, is otherwise unrelated to the distribution of the perturbation. The result holds for the classical relative entropy (or Kullback-Leibler distance), as well as two of its generalizations: Rényi's relative entropy and the f-divergence. The result generalizes a recent relationship between the relative entropy and mean squared errors pertaining to Gaussian noise models, which in turn supersedes many previous information-estimation relationships. A generalization of the de Bruijn identity to non-Gaussian models can also be regarded as a consequence of this new result.
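To make the central claim concrete, the sketch below states the identity in one possible notation and checks it in a simple Gaussian case. The symbols ($p$, $q$, $\rho_p$, $\rho_q$, $Y_\delta$) and the standing assumptions (smooth densities; zero-mean, unit-variance noise) are our illustration choices, not taken from the paper, which supplies the precise statement and regularity conditions.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Sketch of the identity described in the abstract, in our own notation.
% Let X have density p (under P) or q (under Q) on the real line, and let
% Z be independent noise of arbitrary distribution with zero mean and unit
% variance. Write Y_\delta = X + \sqrt{\delta} Z, so \delta is the energy
% of the perturbation.
With score functions $\rho_p = p'/p$ and $\rho_q = q'/q$, the claimed rate
of change at $\delta = 0$ is
\[
  \frac{\mathrm{d}}{\mathrm{d}\delta}
  D\!\left(P_{Y_\delta} \,\middle\|\, Q_{Y_\delta}\right)\Big|_{\delta=0}
  = -\frac{1}{2}\,
    \mathbb{E}_P\!\left[\bigl(\rho_p(X)-\rho_q(X)\bigr)^{2}\right],
\]
regardless of the distribution of $Z$. A sanity check in the Gaussian case
$P=\mathcal{N}(0,1)$, $Q=\mathcal{N}(\mu,1)$: the score difference is
$\rho_p(x)-\rho_q(x) = -x + (x-\mu) = -\mu$, so the right-hand side equals
$-\mu^{2}/2$. Directly, with Gaussian $Z$ one has
$D\!\left(\mathcal{N}(0,1+\delta)\,\middle\|\,\mathcal{N}(\mu,1+\delta)\right)
= \mu^{2}/\bigl(2(1+\delta)\bigr)$, whose derivative at $\delta=0$ is
indeed $-\mu^{2}/2$, matching the score-function formula.
\end{document}
```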
Original language | English (US) |
---|---|
Title of host publication | 2009 IEEE International Symposium on Information Theory, ISIT 2009 |
Pages | 814-818 |
Number of pages | 5 |
DOIs | |
State | Published - Nov 19 2009 |
Event | 2009 IEEE International Symposium on Information Theory, ISIT 2009 - Seoul, Korea, Republic of. Duration: Jun 28 2009 → Jul 3 2009 |
Other
Other | 2009 IEEE International Symposium on Information Theory, ISIT 2009 |
---|---|
Country | Korea, Republic of |
City | Seoul |
Period | 6/28/09 → 7/3/09 |
ASJC Scopus subject areas
- Theoretical Computer Science
- Information Systems
- Modeling and Simulation
- Applied Mathematics