It has recently been shown that the derivative of the input-output mutual information of Gaussian noise channels with respect to the signal-to-noise ratio is equal to the minimum mean-square error. This paper considers general additive noise channels, where the noise need not be Gaussian. It is found that, for every fixed input distribution, the derivative of the mutual information with respect to the signal strength is equal to the correlation of two conditional mean estimates, associated with the input and the noise, respectively. Special versions of the result are given for exponentially distributed, Cauchy, Laplace, and Rayleigh noise. The previous result for Gaussian noise channels is recovered as a special case.
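In symbols (a sketch under assumed notation; the paper's exact normalization may differ): write the channel as $Y = \sqrt{\gamma}\,X + N$, where $\gamma \ge 0$ is the signal strength, the input $X$ and the noise $N$ are independent, and $N$ has a differentiable density $p_N$. The general result stated above then reads

\[
\frac{d}{d\gamma}\, I\bigl(X;\, \sqrt{\gamma}\,X + N\bigr)
  = \frac{1}{2\sqrt{\gamma}}\,
    \mathbb{E}\Bigl[\, \mathbb{E}[X \mid Y]\;
      \mathbb{E}\bigl[\rho_N(N) \mid Y\bigr] \Bigr],
\qquad
\rho_N(n) = -\frac{p_N'(n)}{p_N(n)},
\]

where $\rho_N$ is the negated score of the noise density, so that $\mathbb{E}[\rho_N(N) \mid Y]$ is the conditional mean estimate associated with the noise. For standard Gaussian noise, $\rho_N(n) = n$, and a short calculation using $\mathbb{E}[N \mid Y] = Y - \sqrt{\gamma}\,\mathbb{E}[X \mid Y]$ gives $\mathbb{E}\bigl[\mathbb{E}[X \mid Y]\,\mathbb{E}[N \mid Y]\bigr] = \sqrt{\gamma}\,\mathrm{mmse}(\gamma)$, so the right-hand side reduces to $\tfrac{1}{2}\,\mathrm{mmse}(\gamma)$: the Gaussian-channel relation cited in the first sentence.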
Title of host publication: Proceedings of IEEE International Symposium on Information Theory
State: Published - Aug 2005
Event: IEEE International Symposium on Information Theory - Adelaide, Australia
Duration: Aug 1 2005 → …