Abstract
In recent years, a number of new connections between information measures and estimation have been found under various models, most prominently the Gaussian and Poisson models. This paper develops similar results for the binomial and negative binomial models. In particular, it is shown that the derivatives of the relative entropy and of the mutual information for the binomial and negative binomial models can be expressed as the expectation of closed-form expressions whose main arguments are conditional estimates. Under mild conditions, those derivatives take the form of an expected Bregman divergence.
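As context for the final claim, the display below is a generic sketch only: it states the standard definition of the Bregman divergence generated by a convex, differentiable function φ, together with the well-known identity showing why expectations of such divergences simplify when the second argument is the conditional-mean estimate X̂ = E[X | Y]. The particular φ arising in the binomial and negative binomial results is not identified in this abstract, so φ here is a placeholder.

```latex
\begin{align*}
  % Bregman divergence generated by a convex, differentiable \phi
  % (\phi is a placeholder; the paper's specific choice is not
  % given in this abstract):
  D_\phi(x, y) &= \phi(x) - \phi(y) - \phi'(y)\,(x - y) \\
  % With \hat{X} = \mathbb{E}[X \mid Y], the tower property gives
  % \mathbb{E}[\phi'(\hat{X})(X - \hat{X})] = 0, so the expected
  % divergence collapses to a difference of two expectations:
  \mathbb{E}\bigl[D_\phi(X, \hat{X})\bigr]
    &= \mathbb{E}[\phi(X)] - \mathbb{E}[\phi(\hat{X})]
\end{align*}
```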
Original language | English (US) |
---|---|
Article number | 6746122 |
Pages (from-to) | 2630-2646 |
Number of pages | 17 |
Journal | IEEE Transactions on Information Theory |
Volume | 60 |
Issue number | 5 |
State | Published - May 2014 |
Keywords
- Binomial model
- Bregman divergence
- mutual information
- negative binomial model
- relative entropy
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences