Accelerated proximal incremental algorithm schemes for non-strongly convex functions
Journal article, 2020

There have been a number of recent advances in accelerated gradient and proximal schemes for the optimization of convex finite-sum problems. Defazio introduced a simple accelerated scheme for incremental stochastic proximal algorithms, inspired by gradient-based methods such as SAGA. He was able to prove O(1/k) convergence for non-smooth functions, but only under the assumption of strong convexity of the component terms. We introduce a slight modification of his scheme, called MP-SAGA, for which we can prove O(1/k) convergence without strong convexity, albeit for smooth functions. Numerical results show that our method has better or comparable convergence to Defazio's scheme, even for non-strongly convex functions. As important special cases, we also derive accelerated schemes for a multi-class formulation of SVM as well as for clustering based on the SON regularization. Finally, we introduce a simplification of Point-SAGA, called SP-SAGA, for problems such as SON with a large number of variables and a sparse relation between variables and objective terms.
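For concreteness, below is a minimal sketch of the Point-SAGA iteration (Defazio, 2016) that the paper builds on, applied to a least-squares finite sum f(x) = (1/n) Σᵢ ½(aᵢᵀx − bᵢ)², where each component's proximal operator has a closed form. The step size, problem data, and function names are illustrative assumptions, not taken from the article; MP-SAGA and SP-SAGA are modifications of this template whose exact updates are given in the paper.

```python
import numpy as np

def point_saga(A, b, gamma=0.05, iters=5000, seed=0):
    """Point-SAGA sketch for f(x) = (1/n) * sum_i 0.5*(a_i^T x - b_i)^2.

    Maintains a table of component (sub)gradients g_i and, at each step,
    applies the proximal operator of a single randomly chosen component.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    g = np.zeros((n, d))        # gradient table, one row per component
    g_mean = g.mean(axis=0)     # running average of the table
    for _ in range(iters):
        j = rng.integers(n)
        # Form z = x + gamma * (g_j - average of the gradient table).
        z = x + gamma * (g[j] - g_mean)
        # Closed-form prox of gamma * 0.5*(a_j^T x - b_j)^2 evaluated at z.
        a = A[j]
        x_new = z - gamma * (a @ z - b[j]) / (1.0 + gamma * (a @ a)) * a
        # The prox step yields the new (sub)gradient of f_j at x_new.
        g_new = (z - x_new) / gamma
        g_mean += (g_new - g[j]) / n   # keep the average current in O(d)
        g[j] = g_new
        x = x_new
    return x

# Tiny illustrative problem (random data, not from the paper).
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 5))
b = rng.standard_normal(50)
x_hat = point_saga(A, b)
print(np.linalg.norm(A.T @ (A @ x_hat - b)) / len(b))  # full-gradient norm
```

The per-step cost is that of one proximal evaluation plus an O(d) table update, which is what makes incremental proximal schemes attractive for large n.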

Rate analysis

SAG

Stochastic average gradient

Vector clustering

SAGA

Stochastic optimization

Non-smooth optimization

Support vector machine

Convex optimization

Authors

Ashkan Panahi

North Carolina State University

Morteza Haghir Chehreghani

Chalmers, Computer Science and Engineering, Data Science

Devdatt Dubhashi

Chalmers, Computer Science and Engineering, Data Science

Theoretical Computer Science

0304-3975 (ISSN)

Vol. 812, pp. 203-213

Subject categories

Computational Mathematics

Control Engineering

Mathematical Analysis

DOI

10.1016/j.tcs.2019.10.030

More information

Last updated

2020-12-18