Lower Bounds on the Recovery of Random Signals (e.g., Bayesian Lower Bounds)

Brief Overview

Finding lower bounds on the minimum mean squared error (MMSE), which is often difficult to compute in closed form, is an important aspect of signal estimation, as such lower bounds provide fundamental limits on signal recovery. Broadly speaking, there are three families of approaches for finding lower bounds: 1) Weinstein–Weiss type bounds (of which the Cramér–Rao lower bound is a special case); this family of bounds is derived through the Cauchy–Schwarz inequality. 2) Ziv–Zakai type bounds; this family of bounds connects estimation-theoretic quantities to binary hypothesis testing. 3) Bounds via the maximum entropy principle; the classical bound of this type is the so-called continuous Fano's inequality, which relates conditional entropy and the MMSE.

Recently, in Preprint 1 we have proposed a new approach for finding lower bounds on the MMSE. The approach consists of two parts: i) for the distribution of the signal of interest, find the best Gaussian approximation in terms of some probabilistic distance or divergence (e.g., the Kullback–Leibler divergence); let the best Gaussian distribution be denoted by \(Q_0\) and the approximation error be denoted by \(\epsilon\); and ii) minimize the MMSE over all possible distributions lying in the \(\epsilon\)-ball centered at \(Q_0\), where the \(\epsilon\)-ball is defined with respect to the chosen distance or divergence. For example, the choice of the Kullback–Leibler divergence results in bounds that can be tighter than the Cramér–Rao lower bound and are defined for a larger class of signal distributions. Finally, the lower bounds extend easily to linear combinations of MMSEs and are useful in the study of the efficiency of decentralized estimation.

Journal Papers
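As a concrete illustration of the quantities discussed above (a minimal sketch, not the bounds of Preprint 1), consider the scalar Gaussian case, where the MMSE is known in closed form: for \(X \sim \mathcal{N}(0, \sigma^2)\) observed as \(Y = X + N\) with \(N \sim \mathcal{N}(0, 1)\), the conditional-mean estimator is linear and the MMSE equals \(\sigma^2/(1+\sigma^2)\). The signal variance and sample size below are arbitrary choices for the example.

```python
import numpy as np

# Illustrative scalar model (hypothetical parameters, chosen for this sketch):
# X ~ N(0, s2) observed through Y = X + N with N ~ N(0, 1).
# For a Gaussian signal, the conditional mean E[X | Y] = s2 / (1 + s2) * Y
# is the optimal estimator, and the MMSE equals s2 / (1 + s2) exactly;
# any valid lower bound must sit at or below this value.
rng = np.random.default_rng(0)
s2 = 2.0                       # signal variance (assumed for the example)
n = 200_000                    # Monte Carlo sample size

x = rng.normal(0.0, np.sqrt(s2), size=n)    # signal samples
y = x + rng.normal(0.0, 1.0, size=n)        # noisy observations

x_hat = (s2 / (1.0 + s2)) * y               # Bayes (conditional-mean) estimator
mmse_mc = float(np.mean((x - x_hat) ** 2))  # Monte Carlo MMSE estimate
mmse_exact = s2 / (1.0 + s2)                # closed-form MMSE = 2/3

print(f"Monte Carlo MMSE = {mmse_mc:.4f}, exact MMSE = {mmse_exact:.4f}")
```

For a non-Gaussian signal the conditional mean is generally nonlinear and the MMSE has no closed form, which is exactly the situation where lower bounds of the kind described above become useful.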
Conference Papers
