A simple proof of the entropy-power inequality

Sergio Verdú*, Dongning Guo

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

101 Scopus citations

Abstract

This correspondence gives a simple proof of Shannon's entropy-power inequality (EPI) using the relationship between mutual information and minimum mean-square error (MMSE) in Gaussian channels.
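For reference, the inequality in question and the key identity the proof builds on can be stated as follows. This is the standard one-dimensional form with natural logarithms (as the correspondence assumes); the proof's actual argument is in the paper itself.

```latex
% Shannon's entropy-power inequality (EPI): for independent real random
% variables X and Y with densities and differential entropies h(X), h(Y),
e^{2h(X+Y)} \;\ge\; e^{2h(X)} + e^{2h(Y)} .

% The I-MMSE relationship in a Gaussian channel: with N ~ N(0,1)
% independent of X, and mmse(X, gamma) the minimum mean-square error of
% estimating X from sqrt(gamma) X + N,
\frac{\mathrm{d}}{\mathrm{d}\gamma}\, I\!\left(X;\,\sqrt{\gamma}\,X+N\right)
  \;=\; \frac{1}{2}\,\mathrm{mmse}(X,\gamma) .
```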

Original language: English (US)
Pages (from-to): 2165-2166
Number of pages: 2
Journal: IEEE Transactions on Information Theory
Volume: 52
Issue number: 5
DOI: 10.1109/TIT.2006.872978
State: Published - May 2006

Funding

Manuscript received July 13, 2005. This work was supported in part by the National Science Foundation under Grants NCR-0074277 and CCR-0312879. S. Verdú is with the Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 USA (e-mail: [email protected]). D. Guo is with the Department of Electrical Engineering and Computer Science, Northwestern University, Evanston, IL 60208 USA (e-mail: [email protected]). Communicated by Y. Steinberg, Associate Editor for Shannon Theory. Digital Object Identifier 10.1109/TIT.2006.872978

¹For convenience, throughout this correspondence we assume that all logarithms are natural.

Keywords

  • Differential entropy
  • Entropy-power inequality (EPI)
  • Minimum mean-square error (MMSE)

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

