
Pointwise relations between information and estimation in Gaussian noise
Kartik Venkat and Tsachy Weissman
Proceedings of the IEEE International Symposium on Information Theory, Cambridge, MA, USA, July 2012
Abstract

Many of the classical and recent relations between information and estimation in the presence of Gaussian noise can be viewed as identities between expectations of random quantities. These include the I-MMSE relationship of Guo et al.; the relative entropy and mismatched estimation relationship of Verdú; the relationship between causal estimation and mutual information of Duncan, and its extension to the presence of feedback by Kadota et al.; and the relationship between causal and non-causal estimation of Guo et al., and its mismatched version of Weissman. We dispense with the expectations and explore the nature of the pointwise relations between the respective random quantities. The pointwise relations that we find are as succinctly stated as, and give considerable insight into, the original expectation identities.

As an illustration of our results, consider Duncan's 1970 discovery that the mutual information is equal to half the causal MMSE in the AWGN channel. This can equivalently be expressed by saying that the difference between the input-output information density and half the causal estimation error is a zero-mean random variable (regardless of the distribution of the channel input). We characterize this random variable explicitly, rather than merely its expectation. Classical estimation and information theoretic quantities emerge with new and surprising roles. For example, the variance of this random variable turns out to be given by the causal MMSE (which, in turn, is equal to twice the mutual information by Duncan's result).
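To make the illustration concrete, the two identities can be written out as follows. This is a minimal LaTeX sketch under the standard continuous-time AWGN model dY_t = X_t dt + dW_t on [0, T]; the notation is assumed for illustration (the abstract itself fixes none), and the explicit Itô-integral form of the difference is the standard Girsanov-type computation consistent with the zero-mean and variance statements above.

% Sketch only: model and notation assumed, not taken from the abstract.
% dY_t = X_t dt + dW_t on [0,T], W a standard Brownian motion,
% \hat{X}_t = E[X_t | Y_0^t] the causal (filtering) estimate.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Duncan's expectation identity:
\begin{equation*}
  I\bigl(X_0^T; Y_0^T\bigr)
  = \frac{1}{2}\,\mathbb{E}\!\left[\int_0^T \bigl(X_t - \hat{X}_t\bigr)^2\,dt\right].
\end{equation*}
Pointwise refinement: the information density minus half the causal
squared error is an It\^o integral against the noise,
\begin{equation*}
  i\bigl(X_0^T; Y_0^T\bigr)
  - \frac{1}{2}\int_0^T \bigl(X_t - \hat{X}_t\bigr)^2\,dt
  = \int_0^T \bigl(X_t - \hat{X}_t\bigr)\,dW_t,
\end{equation*}
a zero-mean random variable whose variance, by the It\^o isometry, is
$\mathbb{E}\bigl[\int_0^T (X_t - \hat{X}_t)^2\,dt\bigr]$, i.e., the causal
MMSE, which by Duncan's identity is twice the mutual information.
\end{document}

The zero-mean claim in the abstract then follows from the martingale property of the Itô integral, and the variance claim from the Itô isometry.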