Abstract
This correspondence gives a simple proof of Shannon's entropy-power inequality (EPI) using the relationship between mutual information and minimum mean-square error (MMSE) in Gaussian channels.
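For context, the two ingredients named in the abstract can be stated in their standard forms (the notation below is ours, not quoted from the paper; all logarithms are natural):

```latex
% Entropy-power inequality (Shannon): for independent random vectors X and Y
% in R^n with densities, with h(.) denoting differential entropy,
\[
  e^{\frac{2}{n} h(X+Y)} \;\ge\; e^{\frac{2}{n} h(X)} + e^{\frac{2}{n} h(Y)},
\]
% with equality iff X and Y are Gaussian with proportional covariances.

% I-MMSE relation (Guo-Shamai-Verdu): for the Gaussian channel
% sqrt(snr) X + N with N ~ N(0, I) independent of X,
\[
  \frac{d}{d\,\mathrm{snr}}\, I\bigl(X;\, \sqrt{\mathrm{snr}}\,X + N\bigr)
  \;=\; \tfrac{1}{2}\,\mathrm{mmse}(X,\mathrm{snr}),
\]
% where mmse(X, snr) is the minimum mean-square error of estimating X
% from the channel output at signal-to-noise ratio snr.
```

The proof in the correspondence obtains the EPI by integrating this derivative identity over the signal-to-noise ratio.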
Original language | English (US)
---|---
Pages (from-to) | 2165-2166
Number of pages | 2
Journal | IEEE Transactions on Information Theory
Volume | 52
Issue number | 5
DOIs | 10.1109/TIT.2006.872978
State | Published - May 2006
Funding
Manuscript received July 13, 2005. This work was supported in part by the National Science Foundation under Grants NCR-0074277 and CCR-0312879.

S. Verdú is with the Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 USA (e-mail: [email protected]). D. Guo is with the Department of Electrical Engineering and Computer Science, Northwestern University, Evanston, IL 60208 USA (e-mail: [email protected]).

Communicated by Y. Steinberg, Associate Editor for Shannon Theory. Digital Object Identifier 10.1109/TIT.2006.872978.

¹ For convenience, throughout this correspondence we assume that all logarithms are natural.
Keywords
- Differential entropy
- Entropy-power inequality (EPI)
- Minimum mean-square error (MMSE)
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences