Accelerated proximal incremental algorithm schemes for non-strongly convex functions
Journal article, 2019

There have been a number of recent advances in accelerated gradient and proximal schemes for the optimization of convex finite-sum problems. Defazio introduced a simple accelerated scheme for incremental stochastic proximal algorithms, inspired by gradient-based methods such as SAGA. He proved O(1/k) convergence for non-smooth functions, but only under the assumption of strong convexity of the component terms. We introduce a slight modification of his scheme, called MP-SAGA, for which we prove O(1/k) convergence without strong convexity, but for smooth functions. Numerical results show that our method has better or comparable convergence to Defazio's scheme, even for non-strongly convex functions. As important special cases, we also derive accelerated schemes for a multi-class formulation of the SVM as well as for clustering based on SON regularization. Finally, we introduce a simplification of Point-SAGA, called SP-SAGA, for problems such as SON with a large number of variables and a sparse relation between variables and objective terms.
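For context, the following is a minimal sketch of the generic Point-SAGA iteration (Defazio, 2016) that MP-SAGA and SP-SAGA build on, applied to a toy least-squares finite sum. The toy objective, the step size gamma, and the closed-form prox used here are illustrative assumptions for this sketch; it is not the paper's MP-SAGA.

    import numpy as np

    def point_saga(A, b, gamma=0.1, iters=2000, seed=0):
        """Point-SAGA iteration (Defazio, 2016) on the toy finite sum
        f(x) = (1/n) * sum_i 0.5 * (a_i . x - b_i)^2.
        Each step applies the prox of ONE component at a point shifted by a
        running table of component gradients (the SAGA variance correction)."""
        rng = np.random.default_rng(seed)
        n, d = A.shape
        x = np.zeros(d)
        g = A * (A @ x - b)[:, None]   # gradient table: g_i = a_i * (a_i . x - b_i)
        g_avg = g.mean(axis=0)
        for _ in range(iters):
            j = rng.integers(n)
            z = x + gamma * (g[j] - g_avg)   # SAGA-corrected evaluation point
            a = A[j]
            # closed-form prox: argmin_x 0.5*||x - z||^2 + gamma*0.5*(a.x - b_j)^2
            x_new = z - gamma * a * (a @ z - b[j]) / (1.0 + gamma * (a @ a))
            g_new = (z - x_new) / gamma      # = grad f_j(x_new) by prox optimality
            g_avg += (g_new - g[j]) / n      # keep the running average exact
            g[j] = g_new
            x = x_new
        return x

On well-conditioned data, for example A = rng.standard_normal((50, 5)) and b = A @ x_true, the iterates approach the least-squares solution, and each step touches only one component, so the per-iteration cost stays O(d); this is the appeal of the incremental proximal schemes the abstract discusses.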

Keywords

SAGA

Support vector machine

Rate analysis

Vector clustering

Stochastic average gradient

SAG

Convex optimization

Non-smooth optimization

Stochastic optimization

Authors

Ashkan Panahi

North Carolina State University

Morteza Haghir Chehreghani

Chalmers University of Technology, Computer Science and Engineering, Data Science

Devdatt Dubhashi

Chalmers University of Technology, Computer Science and Engineering, Data Science

Theoretical Computer Science

0304-3975 (ISSN)

Vol. 812, pp. 203–213

Subject Categories

Computational Mathematics

Control Engineering

Mathematical Analysis

DOI

10.1016/j.tcs.2019.10.030

Latest update: 2/27/2020