Bias Versus Non-Convexity in Compressed Sensing
Journal article, 2022

Cardinality and rank functions are ideal ways of regularizing under-determined linear systems, but optimizing the resulting formulations is difficult since both penalties are non-convex and discontinuous. The most common remedy is to use the ℓ1- and nuclear norms instead. While these are convex and can therefore be reliably optimized, they suffer from a shrinking bias that degrades the solution quality in the presence of noise. This well-known drawback has given rise to a variety of non-convex alternatives, which usually offer better global minima at the price of possibly getting stuck in undesired local minima. We focus in particular on penalties based on the quadratic envelope, whose global minima have been shown to coincide with the “oracle solution,” i.e., there is no bias at all. So which do we choose: convex with a definite bias, or non-convex with no bias but less predictability? In this article, we develop a framework that allows us to interpolate between these alternatives; that is, we construct sparsity-inducing penalties where the degree of non-convexity/bias can be chosen according to the specifics of the particular problem.
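As a simple illustration of the bias phenomenon described in the abstract (not taken from the paper itself), the sketch below contrasts the proximal operator of the ℓ1 norm, i.e., soft thresholding, which shrinks every surviving entry and therefore introduces bias, with plain hard thresholding (the proximal operator of a scaled cardinality penalty), which leaves surviving entries untouched and so exhibits no bias. The thresholds and example vector are arbitrary choices for demonstration, not values from the article.

```python
import numpy as np

def prox_l1(y, lam):
    """Soft thresholding: elementwise argmin_x 0.5*(x - y)**2 + lam*|x|.
    Surviving entries are shrunk toward zero by lam (shrinking bias)."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def prox_card(y, lam):
    """Hard thresholding: elementwise argmin_x 0.5*(x - y)**2 + lam*(x != 0).
    Entries with |y| > sqrt(2*lam) are kept exactly (no bias)."""
    thresh = np.sqrt(2.0 * lam)
    return np.where(np.abs(y) > thresh, y, 0.0)

y = np.array([-3.0, -0.4, 0.1, 0.9, 2.5])
lam = 0.5
print(prox_l1(y, lam))    # nonzero entries are shrunk by lam
print(prox_card(y, lam))  # nonzero entries are returned unchanged
```

The paper's quadratic-envelope penalties sit between these two extremes; the snippet only shows the endpoint behaviors (biased convex versus unbiased non-convex) that the proposed framework interpolates between.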

Compressed sensing

Quadratic envelopes

Non-convex optimization

Authors

Daniele Gerosa

Lund University

Marcus Carlsson

Lund University

Carl Olsson

Chalmers, Electrical Engineering, Signal Processing and Biomedical Engineering

Lund University

Journal of Mathematical Imaging and Vision

0924-9907 (ISSN) 1573-7683 (eISSN)

Vol. 64, Issue 4, pp. 379–394

Subject Categories

Computational Mathematics

Control Engineering

Mathematical Analysis

DOI

10.1007/s10851-022-01071-5
