Bulletin of the Korean Mathematical Society (BKMS)

ISSN(Print) 1015-8634 ISSN(Online) 2234-3016

Article

Bull. Korean Math. Soc. 2018; 55(4): 1303-1315

Online first article: May 2, 2018. Printed: July 31, 2018.

https://doi.org/10.4134/BKMS.b170770

Copyright © The Korean Mathematical Society.

A stochastic variance reduction method for PCA by an exact penalty approach

Yoon Mo Jung, Jae Hwa Lee, Sangwoon Yun

Sungkyunkwan University, Sungkyunkwan University, Sungkyunkwan University

Abstract

For principal component analysis (PCA) to efficiently analyze large-scale matrices, it is crucial to compute a few singular vectors at lower computational cost and with lower memory requirements. To compute them in a fast and robust way, we propose a new stochastic method. In particular, we adopt the stochastic variance reduced gradient (SVRG) method \cite{JZ} to avoid the asymptotically slow convergence of stochastic gradient descent methods. For that purpose, we reformulate the PCA problem as an unconstrained optimization problem using a quadratic penalty. In general, the penalty parameter must be driven to infinity for the two problems to be equivalent. In this case, however, exact penalization is guaranteed by applying the analysis in \cite{WYLZ}. We establish the convergence rate of the proposed method to a stationary point, and numerical experiments illustrate the validity and efficiency of the proposed method.
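To make the idea of the abstract concrete, the following is a minimal, illustrative sketch of SVRG applied to a quadratic-penalty reformulation of leading-eigenvector PCA. The specific objective, F(x) = -(1/n) Σ_i (aᵢᵀx)² + (β/4)(‖x‖² - 1)², the function name `svrg_pca`, and all parameter values are assumptions for illustration only; they are not the exact formulation or algorithm of the paper.

```python
import numpy as np

def svrg_pca(A, beta=20.0, lr=1e-3, epochs=20, inner=None, seed=0):
    """Illustrative SVRG on a quadratic-penalty PCA objective (not the
    paper's exact method).

    Minimizes F(x) = -(1/n) * sum_i (a_i^T x)^2 + (beta/4) * (||x||^2 - 1)^2,
    whose minimizer aligns with the top right singular vector of A.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    inner = inner or n
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)

    def grad_i(x, i):
        # per-sample gradient: data term plus penalty term
        return -2.0 * (A[i] @ x) * A[i] + beta * (x @ x - 1.0) * x

    def full_grad(x):
        # full-batch gradient, used at each snapshot
        return -2.0 * A.T @ (A @ x) / n + beta * (x @ x - 1.0) * x

    for _ in range(epochs):
        x_snap = x.copy()          # snapshot point
        g_snap = full_grad(x_snap)
        for _ in range(inner):
            i = rng.integers(n)
            # variance-reduced stochastic direction
            v = grad_i(x, i) - grad_i(x_snap, i) + g_snap
            x = x - lr * v
    return x / np.linalg.norm(x)   # normalize to the unit sphere
```

The snapshot gradient `g_snap` is what distinguishes SVRG from plain SGD: the correction `grad_i(x, i) - grad_i(x_snap, i)` drives the variance of the update direction to zero near a stationary point, so a constant step size can be used.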

Keywords: principal component analysis, stochastic variance reduction, exact penalty

MSC numbers: 62H25, 90C30, 15A18
