

Poster

Sparse Support Recovery with Non-smooth Loss Functions

Kévin Degraux · Gabriel Peyré · Jalal Fadili · Laurent Jacques

Area 5+6+7+8 #87

Keywords: [ Information Theory ] [ Convex Optimization ] [ (Other) Optimization ] [ (Other) Regression ] [ Sparsity and Feature Selection ] [ Regularization and Large Margin Methods ]


Abstract: In this paper, we study the support recovery guarantees of underdetermined sparse regression using the $\ell_1$-norm as a regularizer and a non-smooth loss function for data fidelity. More precisely, we focus in detail on the cases of $\ell_1$ and $\ell_\infty$ losses, and contrast them with the usual $\ell_2$ loss. While these losses are routinely used to account for either sparse ($\ell_1$ loss) or uniform ($\ell_\infty$ loss) noise models, a theoretical analysis of their performance is still lacking. In this article, we extend the existing theory from the smooth $\ell_2$ case to these non-smooth cases. We derive a sharp condition which ensures that the support of the vector to recover is stable to small additive noise in the observations, as long as the loss constraint size is tuned proportionally to the noise level. A distinctive feature of our theory is that it also explains what happens when the support is unstable: in that case, we identify an "extended support" and show that this extended support is stable to small additive noise. To exemplify the usefulness of our theory, we give a detailed numerical analysis of the support stability/instability of compressed sensing recovery with these different losses, highlighting different parameter regimes, ranging from total support stability to progressively increasing support instability.
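To make the setting concrete, here is a minimal sketch of the kind of compressed sensing experiment the abstract describes: $\ell_1$-regularized recovery under a non-smooth loss constraint, $\min_x \|x\|_1$ subject to $\|y - \Phi x\|_\alpha \le \tau$ for $\alpha \in \{1, 2, \infty\}$, with $\tau$ tuned proportionally to the noise level. The use of Python with NumPy and CVXPY, the problem sizes, the proportionality factor, the support threshold, and the helper recover() are all illustrative assumptions, not the authors' implementation.

# Sketch of l1-regularized recovery with l1, l2, and l_inf data-fidelity
# constraints (assumed formulation; sizes and tuning are illustrative).
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, p, s = 30, 80, 4                          # measurements, dimension, sparsity
Phi = rng.standard_normal((n, p)) / np.sqrt(n)   # random sensing matrix
x0 = np.zeros(p)
support_true = rng.choice(p, size=s, replace=False)
x0[support_true] = rng.standard_normal(s)        # s-sparse ground truth
noise = 0.01 * rng.standard_normal(n)            # small additive noise
y = Phi @ x0 + noise

def recover(alpha, tau):
    # Solve  min ||x||_1  subject to  ||y - Phi x||_alpha <= tau.
    x = cp.Variable(p)
    problem = cp.Problem(cp.Minimize(cp.norm1(x)),
                         [cp.norm(y - Phi @ x, alpha) <= tau])
    problem.solve()
    return x.value

for alpha in (1, 2, np.inf):
    # Tune the constraint size proportionally to the noise level, as the
    # stability condition in the paper requires (factor 1.5 is arbitrary).
    tau = 1.5 * np.linalg.norm(noise, alpha)
    x_hat = recover(alpha, tau)
    support_hat = np.flatnonzero(np.abs(x_hat) > 1e-5)
    print(f"l_{alpha} loss: support {sorted(support_hat)} vs true {sorted(support_true)}")

Comparing the recovered support against the true one across the three losses, and varying the sparsity or noise level, would reproduce the kind of stability/instability regimes the numerical analysis in the paper explores, including the "extended support" that appears when exact support stability fails.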
