This monograph surveys the interactions between information measures and estimation measures, as well as their applications. The emphasis is on formulas that express the major information measures, such as entropy, mutual information, and relative entropy, in terms of the minimum mean-square error (MMSE) achievable when estimating random variables contaminated by Gaussian noise. These relationships lead to a wide range of applications, from a universal relationship in continuous-time nonlinear filtering and optimal power allocation in communication systems to simplified proofs of important results in information theory, such as the entropy power inequality and converses in multiuser information theory.
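The central relationship of this kind, the I-MMSE formula, states that for a scalar Gaussian channel the derivative of the mutual information with respect to the signal-to-noise ratio equals half the MMSE. A minimal numerical sketch, assuming a standard Gaussian input X and channel Y = sqrt(snr)·X + N with N ~ N(0,1), where both sides have closed forms:

```python
import math

def mutual_info(snr):
    # Mutual information I(snr) in nats for a standard Gaussian input:
    # I(snr) = (1/2) ln(1 + snr)
    return 0.5 * math.log(1.0 + snr)

def mmse(snr):
    # MMSE of estimating X from Y = sqrt(snr)*X + N for Gaussian X, N:
    # mmse(snr) = 1 / (1 + snr)
    return 1.0 / (1.0 + snr)

# Verify dI/dsnr = (1/2) * mmse(snr) by central finite difference.
snr, h = 2.0, 1e-6
deriv = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)
print(abs(deriv - 0.5 * mmse(snr)) < 1e-8)  # True
```

For this Gaussian-input case both sides evaluate to 1/(2(1 + snr)), so the finite-difference check agrees to numerical precision; the formula itself holds for arbitrary input distributions.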