
A Topological Viewpoint to Exact and Robust Sparse Recovery for \(F\)-minimization

1  Introduction

This work, Robustness of Sparse Recovery via F-minimization: A Topological Viewpoint by Jingbo Liu, Jian Jin, and Yuantao Gu, was published in IEEE Transactions on Information Theory, 61(7):3996-4014, 2015. Part of this work was presented at the IEEE International Symposium on Information Theory (ISIT 2013).

In fields such as compressed sensing and statistical learning, a basic model problem is to recover a sparse signal $\mathbf{\bar{x}}\in\mathbb{R}^n$ from a set of noisy linear measurements $\mathbf{y}=\mathbf{A}\mathbf{\bar{x}}+\mathbf{v}\in\mathbb{R}^m$, where $m\le n$. Ideally, the reconstruction method of choice is $l_0$-norm minimization:
\begin{equation}\label{l0problem}
\min_{\mathbf{x}\in \mathbb{R}^n}~\|\mathbf{x}\|_0\quad\textrm{s.t.}~\bf Ax=y.
\end{equation}
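For intuition, the combinatorial nature of (\ref{l0problem}) can be seen in a brute-force sketch (an illustrative implementation, not from the paper): enumerate supports of increasing size and accept the first one on which the measurements can be matched exactly.

```python
# Brute-force sketch of the l0 problem (illustrative only): try supports T
# of size 1, 2, ... and return the first x supported on T with Ax = y.
# The search is exponential in n, which is why (1) is intractable at scale.
import itertools

import numpy as np

def l0_min(A, y, tol=1e-9):
    """Smallest-support x with Ax = y, found by exhaustive support search."""
    m, n = A.shape
    if np.linalg.norm(y) <= tol:          # the zero vector is 0-sparse
        return np.zeros(n)
    for k in range(1, n + 1):             # sparsity levels 1, 2, ..., n
        for T in itertools.combinations(range(n), k):
            cols = list(T)
            xT, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
            if np.linalg.norm(A[:, cols] @ xT - y) <= tol:
                x = np.zeros(n)
                x[cols] = xT
                return x
    return None

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 8))           # generic 4 x 8 measurement matrix
x_bar = np.zeros(8)
x_bar[[2, 5]] = [1.0, -2.0]               # 2-sparse ground truth
x_hat = l0_min(A, A @ x_bar)              # recovers x_bar almost surely
```

For a generic Gaussian matrix, any $2k$ columns are linearly independent, so a $k$-sparse signal with $2k \le m$ is the unique sparsest solution and the search recovers it exactly.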

Since (\ref{l0problem}) is a hard combinatorial problem, $l_1$-minimization (basis pursuit, BP) is usually adopted as a computationally tractable alternative. Recently, however, there has been a trend toward nonlinear functions in place of the $l_1$ cost function. In other words, a general cost function is employed:
$$
J({\mathbf{x}}):=\sum_{k=1}^n F(|x_k|)
$$

Examples of such an $F$ include the $l_p$ cost function ($0<p<1$) and several approximate $l_0$ cost functions. Various practical algorithms can be adapted to these nonlinear problems, e.g., IRLS, iterative thresholding, and Zero-point Attracting Projection (ZAP). In general, the nonlinear algorithms empirically outperform $l_1$-minimization, because nonlinear cost functions promote sparsity better than the $l_1$ cost function does.

Figure 1: Typical sparseness measures.
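The generic cost $J$ can be made concrete with a small numerical sketch. Here $F(t)=t^p$ is the $l_p$ cost mentioned above, while the exponential measure $F(t)=1-e^{-t/\sigma}$ is one assumed example of an approximate $l_0$ cost (the specific form is an illustration, not taken from the paper). On two vectors with equal $l_1$ norm, only the nonlinear measures favor the sparser one.

```python
# Evaluate J(x) = sum_k F(|x_k|) for a few sparseness measures.
# F_al0 is an assumed approximate-l0 form for illustration.
import numpy as np

def J(x, F):
    return sum(F(abs(t)) for t in x)

F_l1  = lambda t: t                        # l1 cost
F_lp  = lambda t, p=0.5: t ** p            # l_p cost, 0 < p < 1
F_al0 = lambda t, s=0.1: 1.0 - np.exp(-t / s)  # approximate l0 (assumed form)

sparse = np.array([1.0, 0.0, 0.0, 0.0])        # 1-sparse vector
dense  = np.array([0.25, 0.25, 0.25, 0.25])    # same l1 norm, not sparse

print(J(sparse, F_l1),  J(dense, F_l1))    # 1.0  1.0   (l1 cannot tell them apart)
print(J(sparse, F_lp),  J(dense, F_lp))    # 1.0  2.0   (l_{1/2} favors the sparse one)
print(J(sparse, F_al0), J(dense, F_al0))   # approximate l0 also favors the sparse one
```

This is the sense in which, as Figure 1 suggests, concave sparseness measures promote sparsity more strongly than the linear $l_1$ cost.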

In this work, from a topological viewpoint, we reveal the relation between two natural questions for general $F$-minimization:

- the exact recovery condition (ERC), which requires that all sparse signals be exactly recovered in the noiseless case, and

- the robust recovery condition (RRC), which requires that when the measurement is noisy, the reconstruction error be bounded by the energy of the noise vector multiplied by a constant factor.

2  Equivalence Lost: A Topological Characterization of RRC

For Lebesgue-almost all measurement matrices $\mathbf{A}$, the null space of $\mathbf{A}$ has dimension $l=n-m$ and corresponds to an element of the Grassmannian $G_{l}(\mathbb{R}^n)$; we can show that this element is sufficient to determine whether ERC and RRC are satisfied. Hence we examine $\Omega_J, \Omega^r_J\subset G_l(\mathbb{R}^n)$, which denote the collections of null spaces satisfying ERC and RRC, respectively.

For example, if two cost functions induced from the sparseness measures $F,G$ satisfy the following condition
\begin{equation}\label{e_comp}
\Omega_{J_G}\subseteq\Omega_{J_F},
\end{equation}

then ERC for $G$-minimization implies ERC for $F$-minimization, i.e., $F$ is a better sparseness measure than $G$ in the sense of ERC. In light of this, we can describe and compare the ERC performance of different sparseness measures by a simple set-inclusion relation like (\ref{e_comp}).

It is first proved that a necessary and sufficient condition for RRC is that $J({\bf z}_T+{\bf n}_T)<J({\bf z}_{T^c}+{\bf n}_{T^c})$ for all ${\bf n}$ in the $d\|{\bf z}\|$-ball around the $|T|$-sparse vector $\bf z$. Then, with the standard topology on the Grassmannian $G_l({\mathbb{R}^n})$, the following relation is established:
$$
\Omega^r_J=\textrm{int}(\Omega_J).
$$
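The noiseless counterpart of this condition (a null-space property: $J({\bf z}_T)<J({\bf z}_{T^c})$, i.e., the case ${\bf n}={\bf 0}$) can be probed numerically. The sketch below is an assumed heuristic, not the paper's method: randomly sampling null-space vectors can only falsify the condition, never certify it.

```python
# Heuristic check (assumed form) of the noiseless null-space condition:
# J(z_T) < J(z_{T^c}) for every nonzero z in null(A) and every |T| <= k.
# Sampling can find a violating pair (z, T) but cannot prove none exists.
import numpy as np

def violates_nsp(A, k, F, trials=2000, seed=2):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    _, _, Vt = np.linalg.svd(A)
    N = Vt[m:].T                       # orthonormal basis of null(A)
    for _ in range(trials):
        z = N @ rng.standard_normal(N.shape[1])
        Jk = np.array([F(abs(t)) for t in z])
        worst_T = np.argsort(Jk)[-k:]  # support maximizing J(z_T)
        if Jk[worst_T].sum() >= Jk.sum() - Jk[worst_T].sum():
            return True                # violating pair found: ERC fails
    return False                       # no violation found (not a proof)

# Toy case where the condition provably fails for k = 1: null(A) is
# spanned by (1, 1), so J(z_T) = J(z_{T^c}) for either coordinate.
A_bad = np.array([[1.0, -1.0]])
print(violates_nsp(A_bad, 1, lambda t: t ** 0.5))  # True

# Generic Gaussian matrix, l_{1/2} cost, k = 1 (outcome depends on A):
A = np.random.default_rng(3).standard_normal((6, 10))
print(violates_nsp(A, 1, lambda t: t ** 0.5))
```

Checking only the support that maximizes $J({\bf z}_T)$ suffices for each sampled ${\bf z}$, since all other supports of size at most $k$ yield a smaller left-hand side.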

For the special case of $l_p$-minimization, our interior-set characterization leads to a topological proof that $\Omega_{l_p}$ is open, hence $\Omega_{l_p}^r=\Omega_{l_p}$. This recovers an existing result that ERC is equivalent to RRC for $l_p$-minimization.

3  Equivalence Regained: the Probabilistic Equivalence

In practice the measurement matrix $\mathbf{A}$ is usually generated with i.i.d. entries drawn from some continuous distribution. In this case, under the mild (but not redundant) assumption that $F$ is non-decreasing, and using the Lebesgue density theorem from measure theory, we find that
$$
\mu(\Omega_J\setminus\Omega^r_J)=0,
$$

where $\mu$ denotes the Haar measure on the Grassmannian $G_{l}(\mathbb{R}^n)$. In other words, RRC and ERC hold with the same probability.

More generally, suppose $P$ is the probability measure corresponding to the distribution of the null space of $\mathbf{A}$, and $P$ is absolutely continuous with respect to $\mu$; then $P(\Omega_J\setminus\Omega_J^r)=0$. One can show that this absolute continuity holds when the entries of $\mathbf{A}$ are drawn i.i.d. from a continuous distribution.
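As a small illustration of this null-space model (an assumed demo, not from the paper): for an i.i.d. Gaussian $\mathbf{A}$, an orthonormal basis of the null space can be read off the SVD, and its dimension is $l=n-m$ almost surely; rotational invariance of the Gaussian ensemble then makes the null space Haar-distributed on $G_l(\mathbb{R}^n)$.

```python
# Null space of an i.i.d. Gaussian measurement matrix, via the SVD.
# The last n - m right singular vectors span null(A) almost surely,
# since a Gaussian A has full rank m with probability one.
import numpy as np

rng = np.random.default_rng(1)
m, n = 4, 8
A = rng.standard_normal((m, n))

_, _, Vt = np.linalg.svd(A)          # full SVD: Vt is n x n
null_basis = Vt[m:].T                # n x (n - m); columns span null(A)

print(null_basis.shape)              # (8, 4): dimension l = n - m
print(np.abs(A @ null_basis).max())  # ~0: columns really lie in null(A)
```

By rotational invariance, $\mathbf{A}\mathbf{Q}$ has the same distribution as $\mathbf{A}$ for any orthogonal $\mathbf{Q}$, which is why the induced distribution of the null space on $G_l(\mathbb{R}^n)$ is the invariant (Haar) measure $\mu$.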

4  Download the paper


[1] J. Liu, J. Jin, and Y. Gu, Robustness of Sparse Recovery via F-minimization: A Topological Viewpoint, IEEE Transactions on Information Theory, 61(7):3996-4014, 2015.

[2] J. Liu, J. Jin, and Y. Gu, Relation between Exact and Robust Recovery for F-minimization: A Topological Viewpoint, IEEE International Symposium on Information Theory (ISIT), July 7-12, 2013, Istanbul, Turkey.

5  Contact the authors

For any questions or comments about the code and the website, please contact Yuantao Gu or Jingbo Liu, who is now a PhD student at Princeton University.