A stochastic variance reduction method for PCA by an exact penalty approach
Bull. Korean Math. Soc. 2018 Vol. 55, No. 4, 1303-1315
Published online May 2, 2018
Printed July 31, 2018
Yoon Mo Jung, Jae Hwa Lee, Sangwoon Yun
Sungkyunkwan University
Abstract : For principal component analysis (PCA) to efficiently analyze large-scale matrices, it is crucial to find a few singular vectors at lower computational cost and under lower memory requirements. To compute them in a fast and robust way, we propose a new stochastic method. In particular, we adopt the stochastic variance reduced gradient (SVRG) method \cite{JZ} to avoid the asymptotically slow convergence of stochastic gradient descent methods. For that purpose, we reformulate the PCA problem as an unconstrained optimization problem with a quadratic penalty. In general, the penalty parameter must be driven to infinity for the two problems to be equivalent; in this case, however, exact penalization is guaranteed by applying the analysis in \cite{WYLZ}. We establish the convergence rate of the proposed method to a stationary point, and numerical experiments illustrate the validity and efficiency of the proposed method.
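The abstract does not give the paper's exact formulation, but the idea of running SVRG on a quadratically penalized PCA objective can be sketched as follows. This is an illustrative reconstruction, not the authors' algorithm: the objective f(x) = -(1/2n) Σᵢ (aᵢᵀx)² + (μ/4)(‖x‖² − 1)², the function name `svrg_pca`, and the parameter values `mu`, `eta`, `epochs` are all assumptions made for the sketch.

```python
import numpy as np

def svrg_pca(A_rows, mu=10.0, eta=1e-3, epochs=20, seed=0):
    """Illustrative SVRG sketch for the penalized PCA problem
        min_x  -(1/2n) sum_i (a_i^T x)^2 + (mu/4)(||x||^2 - 1)^2,
    where rows a_i of A_rows are the data samples.
    mu, eta, epochs are illustrative choices, not taken from the paper."""
    rng = np.random.default_rng(seed)
    n, d = A_rows.shape
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)

    def grad_i(v, i):
        # stochastic gradient of the i-th component function
        return -(A_rows[i] @ v) * A_rows[i] + mu * (v @ v - 1.0) * v

    def full_grad(v):
        # full gradient, recomputed once per epoch at the snapshot point
        return -(A_rows.T @ (A_rows @ v)) / n + mu * (v @ v - 1.0) * v

    for _ in range(epochs):
        x_tilde = x.copy()            # snapshot
        g_tilde = full_grad(x_tilde)
        for _ in range(n):
            i = rng.integers(n)
            # variance-reduced gradient estimate (the SVRG correction)
            g = grad_i(x, i) - grad_i(x_tilde, i) + g_tilde
            x = x - eta * g
    return x / np.linalg.norm(x)      # unit leading direction
```

The key SVRG ingredient is the corrected estimate grad_i(x) − grad_i(x̃) + ∇f(x̃), whose variance vanishes as the iterates approach a stationary point, avoiding the decaying step sizes that slow plain stochastic gradient descent.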
Keywords : principal component analysis, stochastic variance reduction, exact penalty
MSC numbers : 62H25, 90C30, 15A18

Copyright © Korean Mathematical Society. All Rights Reserved.