Mutual information and minimum mean-square error in Gaussian channels

Dongning Guo*, Shlomo Shamai, Sergio Verdú

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It presents a new formula connecting the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the output: the derivative of the mutual information (in nats) with respect to the signal-to-noise ratio (SNR) is equal to half the MMSE, regardless of the input statistics. This relationship holds for both scalar and vector signals, as well as for discrete-time and continuous-time noncausal MMSE estimation. This fundamental information-theoretic result has an unexpected consequence in continuous-time nonlinear estimation: for any input signal with finite power, the causal filtering MMSE achieved at a given SNR equals the average of the noncausal smoothing MMSE over channel SNRs uniformly distributed between 0 and that SNR.
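In the notation of the abstract, the central identity reads d/dsnr I(snr) = (1/2) mmse(snr), and the filtering-smoothing consequence can be written in integral form as cmmse(snr) = (1/snr) ∫₀^snr mmse(γ) dγ. The short Monte Carlo sketch below is illustrative only and not taken from the paper; the sample size, random seed, test SNR, and difference step are arbitrary choices. It checks the derivative identity for the scalar channel Y = √snr · X + N with unit-variance Gaussian noise and equiprobable binary input X ∈ {−1, +1}, for which the conditional mean is E[X | Y] = tanh(√snr · Y).

import numpy as np

# Scalar Gaussian channel Y = sqrt(snr)*X + N with N ~ N(0, 1) and
# equiprobable binary input X in {-1, +1}. The same (x, z) samples are
# reused at every SNR so the finite difference stays low-variance.
rng = np.random.default_rng(0)
n = 2_000_000
x = rng.choice([-1.0, 1.0], size=n)
z = rng.standard_normal(n)

def mutual_info(snr):
    # I(X; Y) in nats: ln 2 - E[ln(1 + exp(-2*sqrt(snr)*Y*X))].
    s = np.sqrt(snr)
    y = s * x + z
    return np.log(2.0) - np.mean(np.log1p(np.exp(-2.0 * s * y * x)))

def mmse(snr):
    # Mean-square error of the conditional-mean estimator tanh(sqrt(snr)*Y).
    s = np.sqrt(snr)
    y = s * x + z
    return np.mean((x - np.tanh(s * y)) ** 2)

snr, h = 1.0, 1e-3
d_info = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)
print(f"dI/dsnr ~ {d_info:.5f}")
print(f"mmse/2  ~ {mmse(snr) / 2.0:.5f}")

Up to Monte Carlo error the two printed values agree, illustrating that the derivative of the mutual information in nats is half the MMSE at the same SNR. Reusing the same noise samples on both sides of the central difference is what keeps the sampling noise from swamping the small step h.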

Original language: English (US)
Pages (from-to): 1261-1282
Number of pages: 22
Journal: IEEE Transactions on Information Theory
Volume: 51
Issue number: 4
DOIs
State: Published - Apr 2005

Keywords

  • Gaussian channel
  • Minimum mean-square error (MMSE)
  • Mutual information
  • Nonlinear filtering
  • Optimal estimation
  • Smoothing
  • Wiener process

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

