Derivative of mutual information at zero SNR: The Gaussian-noise case

Yihong Wu*, Dongning Guo, Sergio Verdú

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

Assuming additive Gaussian noise, a general sufficient condition on the input distribution is established to guarantee that the ratio of mutual information to signal-to-noise ratio (SNR) goes to one half nat as SNR vanishes. The result allows SNR-dependent input distribution and side information.
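The limit stated in the abstract can be checked numerically in the simplest special case. The sketch below is an illustration only, not code from the paper: it assumes a standard Gaussian input X over the channel Y = sqrt(snr)·X + N with N ~ N(0, 1), for which the mutual information in nats is I(snr) = ½ log(1 + snr), and shows the ratio I(snr)/snr approaching one half nat as the SNR vanishes.

```python
import math

def mutual_information_nats(snr: float) -> float:
    """Mutual information (in nats) of the scalar Gaussian channel
    Y = sqrt(snr) * X + N with X ~ N(0, 1) and N ~ N(0, 1)."""
    return 0.5 * math.log1p(snr)  # log1p(x) = log(1 + x), accurate for small snr

if __name__ == "__main__":
    # The ratio I(snr)/snr tends to 1/2 nat as snr -> 0.
    for snr in (1.0, 0.1, 0.01, 0.001):
        print(f"snr={snr:<6} I(snr)/snr = {mutual_information_nats(snr) / snr:.5f}")
```

For this Gaussian input the ratio at snr = 0.001 is already within 0.03% of one half; the paper's contribution is a general sufficient condition under which the same limit holds for (possibly SNR-dependent) non-Gaussian inputs.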

Original language: English (US)
Article number: 5953516
Pages (from-to): 7307-7312
Number of pages: 6
Journal: IEEE Transactions on Information Theory
Volume: 57
Issue number: 11
DOI: 10.1109/TIT.2011.2161752
State: Published - Nov 2011

Funding

Manuscript received December 19, 2010; revised April 22, 2011; accepted June 6, 2011. Date of current version November 11, 2011. This work was supported in part by the NSF under Grants CCF-0644344 and CCF-0635154. Y. Wu and S. Verdú are with the Department of Electrical Engineering, Princeton University, Princeton, NJ 08544 USA. D. Guo is with the Department of Electrical Engineering and Computer Science, Northwestern University, Evanston, IL 60208 USA. Communicated by D. Palomar, Associate Editor for Detection and Estimation. Digital Object Identifier: 10.1109/TIT.2011.2161752

Footnotes:
  1. A finer Taylor series expansion is found in [5].
  2. Throughout the paper, natural logarithms are adopted and information units are nats.

Keywords

  • Gaussian noise
  • low-power regime
  • minimum mean-square error (MMSE)
  • mutual information
  • signal-to-noise ratio (SNR)

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Library and Information Sciences

