Abstract
This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It presents a new formula connecting the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output: the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio (SNR) equals half the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, and for discrete-time as well as continuous-time noncausal MMSE estimation. This fundamental information-theoretic result has an unexpected consequence in continuous-time nonlinear estimation: for any input signal with finite power, the causal filtering MMSE achieved at a given SNR equals the noncausal smoothing MMSE averaged over a channel SNR drawn uniformly between 0 and that SNR.
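In symbols, the main identity reads d/dsnr I(snr) = ½ mmse(snr), and the filtering consequence reads cmmse(snr) = (1/snr) ∫₀^snr mmse(γ) dγ (the symbol names here follow common usage and are assumptions, not quoted from this record). The sketch below is a minimal numerical check of the first identity for a BPSK input X ∈ {−1, +1} over Y = √snr·X + N with N ~ N(0, 1); the closed-form BPSK expressions for I and mmse are standard results assumed for illustration, not code from the paper.

```python
# Minimal numerical sanity check of the I-MMSE identity
# dI/dsnr = (1/2) * mmse(snr), illustrated for a BPSK input
# X in {-1, +1} over Y = sqrt(snr)*X + N, N ~ N(0, 1).
# The closed-form BPSK expressions below are standard results
# assumed here; this is an illustrative sketch, not the paper's code.
import numpy as np

def gauss_expect(f, n=20001, lim=10.0):
    """E[f(N)] for N ~ N(0,1), by a Riemann sum on a uniform grid."""
    z, dz = np.linspace(-lim, lim, n, retstep=True)
    w = np.exp(-z ** 2 / 2.0) / np.sqrt(2.0 * np.pi)
    return float(np.sum(f(z) * w) * dz)

def mmse(snr):
    """BPSK: mmse(snr) = 1 - E[tanh^2(snr + sqrt(snr)*N)]."""
    return 1.0 - gauss_expect(lambda z: np.tanh(snr + np.sqrt(snr) * z) ** 2)

def mutual_info(snr):
    """BPSK: I(snr) = snr - E[log cosh(snr + sqrt(snr)*N)], in nats."""
    logcosh = lambda a: np.logaddexp(a, -a) - np.log(2.0)  # overflow-safe
    return snr - gauss_expect(lambda z: logcosh(snr + np.sqrt(snr) * z))

snr, h = 1.0, 1e-4
lhs = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)  # dI/dsnr
rhs = 0.5 * mmse(snr)
print(f"dI/dsnr ~ {lhs:.6f},  mmse/2 = {rhs:.6f}")  # should agree closely
```

Running the script prints two numbers that agree to several decimal places, matching the theorem's claim that the derivative of the mutual information in nats is half the MMSE at every SNR.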
Original language | English (US) |
---|---|
Pages (from-to) | 1261-1282 |
Number of pages | 22 |
Journal | IEEE Transactions on Information Theory |
Volume | 51 |
Issue number | 4 |
State | Published - Apr 2005 |
Keywords
- Gaussian channel
- Minimum mean-square error (MMSE)
- Mutual information
- Nonlinear filtering
- Optimal estimation
- Smoothing
- Wiener process
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences