Asymptotically exact error analysis for the generalized ℓ₂²-LASSO
Paper in proceedings, 2015

Given an unknown signal x₀ ∈ ℝⁿ and linear noisy measurements y = Ax₀ + σv ∈ ℝᵐ, the generalized ℓ₂²-LASSO solves x̂ := arg minₓ (1/2)‖y − Ax‖₂² + σλ f(x). Here, f is a convex regularization function (e.g., the ℓ₁-norm or the nuclear norm) aiming to promote the structure of x₀ (e.g., sparse or low-rank), and λ ≥ 0 is the regularizer parameter. A related optimization problem, though not as popular or well known, is often referred to as the generalized ℓ₂-LASSO and takes the form x̂ := arg minₓ ‖y − Ax‖₂ + λ f(x); it has been analyzed by Oymak, Thrampoulidis and Hassibi, who further made conjectures about the performance of the generalized ℓ₂²-LASSO. This paper establishes these conjectures rigorously. We measure performance with the normalized squared error NSE(σ) := ‖x̂ − x₀‖₂² / σ². Assuming the entries of A are i.i.d. Gaussian N(0, 1/m) and those of v are i.i.d. N(0, 1), we precisely characterize the 'asymptotic NSE' aNSE := lim_{σ→0} NSE(σ) when the problem dimensions tend to infinity in a proportional manner. The role of λ, f and x₀ is explicitly captured in the derived expression by means of a single geometric quantity, the Gaussian distance to the subdifferential. We conjecture that aNSE = sup_{σ>0} NSE(σ). We include detailed discussions on the interpretation of our result, make connections to relevant literature, and perform computational experiments that validate our theoretical findings.
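To make the quantities above concrete, the following minimal Python sketch sets up a small ℓ₁-regularized instance of the generalized ℓ₂²-LASSO and estimates NSE(σ) empirically. The proximal-gradient (ISTA) solver, the problem sizes, the sparsity level and the choice of λ are illustrative assumptions for this sketch, not the procedure used in the paper.

```python
import numpy as np

def soft_threshold(z, t):
    """Entrywise soft-thresholding: proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l22_lasso(A, y, reg, n_iter=2000):
    """Solve min_x 0.5*||y - A x||_2^2 + reg*||x||_1 by proximal gradient (ISTA)."""
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)           # gradient of 0.5*||y - A x||_2^2
        x = soft_threshold(x - grad / L, reg / L)
    return x

# Illustrative dimensions, sparsity and regularizer weight (hypothetical choices).
rng = np.random.default_rng(0)
n, m, k = 400, 200, 20
sigma, lam = 1e-2, 1.5

x0 = np.zeros(n)                                   # k-sparse signal
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)       # i.i.d. N(0, 1/m) entries
v = rng.standard_normal(m)                         # i.i.d. N(0, 1) noise
y = A @ x0 + sigma * v

x_hat = l22_lasso(A, y, reg=sigma * lam)           # generalized l2^2-LASSO with f = ||.||_1
nse = np.sum((x_hat - x0) ** 2) / sigma ** 2       # normalized squared error NSE(sigma)
print(f"empirical NSE at sigma={sigma}: {nse:.3f}")
```

Re-running this for decreasing σ gives a rough empirical sense of the small-noise limit aNSE that the paper characterizes exactly.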


Squared errors

Computation theory

Convex regularizations

Noisy measurements

Generalized Equations

Optimization problems

Information theory

Computational experiment


Geometric quantities


C. Thrampoulidis

California Institute of Technology

Ashkan Panahi

Signals and Systems, Signal Processing and Biomedical Engineering, Signal Processing

B. Hassibi

California Institute of Technology

2015 IEEE International Symposium on Information Theory (ISIT)

2157-8095 (ISSN)



Electrical Engineering and Electronics