
Adaptive Linearized Alternating Direction Method of Multipliers for Non-Convex Compositely Regularized Optimization Problems (Cited: 5)

Abstract: We consider a wide range of non-convex regularized minimization problems, where the non-convex regularization term is composed with a linear function, as arises in sparse learning. Recent theoretical investigations have demonstrated their superiority over their convex counterparts. The computational challenge lies in the fact that the proximal mapping associated with non-convex regularization is not easily obtained due to the imposed linear composition. Fortunately, the problem structure allows one to introduce an auxiliary variable and reformulate it as an optimization problem with linear constraints, which can be solved using the Linearized Alternating Direction Method of Multipliers (LADMM). Despite the success of LADMM in practice, it remains unknown whether LADMM is convergent when solving such non-convex compositely regularized optimizations. In this research, we first present a detailed convergence analysis of the LADMM algorithm for solving a non-convex compositely regularized optimization problem with a large class of non-convex penalties. Furthermore, we propose an Adaptive LADMM (AdaLADMM) algorithm with a line-search criterion. Experimental results on different genres of datasets validate the efficacy of the proposed algorithm.
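For orientation, here is a minimal sketch, in generic notation, of the reformulation the abstract describes; the symbols $f$, $r$, $A$, $z$, $\beta$, $\mu$, $\gamma$, $\theta$ are illustrative assumptions, not taken from the paper. The compositely regularized problem is
\[
\min_{x} \; f(x) + r(Ax),
\]
where $f$ is a smooth loss (e.g., logistic) and $r$ is a non-convex penalty such as the capped-$\ell_1$ penalty $r(z) = \gamma \sum_i \min(|z_i|, \theta)$. Introducing the auxiliary variable $z = Ax$ turns it into a linearly constrained problem,
\[
\min_{x,\, z} \; f(x) + r(z) \quad \text{s.t.} \quad Ax - z = 0 ,
\]
whose augmented Lagrangian is
\[
\mathcal{L}_{\beta}(x, z, \lambda) = f(x) + r(z) + \langle \lambda,\, Ax - z \rangle + \tfrac{\beta}{2}\,\|Ax - z\|_2^2 .
\]
A generic LADMM template linearizes the quadratic coupling term in the $x$-step, so that only the proximal mapping of $r$ on the decoupled variable $z$ is needed:
\[
\begin{aligned}
x^{k+1} &= \arg\min_{x} \; f(x) + \big\langle A^{\top}\!\big(\lambda^{k} + \beta(Ax^{k} - z^{k})\big),\, x - x^{k} \big\rangle + \tfrac{\mu}{2}\,\|x - x^{k}\|_2^2, \\
z^{k+1} &= \arg\min_{z} \; r(z) + \tfrac{\beta}{2}\,\big\|Ax^{k+1} - z + \lambda^{k}/\beta\big\|_2^2 \;=\; \operatorname{prox}_{r/\beta}\!\big(Ax^{k+1} + \lambda^{k}/\beta\big), \\
\lambda^{k+1} &= \lambda^{k} + \beta\,\big(Ax^{k+1} - z^{k+1}\big).
\end{aligned}
\]
The AdaLADMM variant proposed in the paper additionally adapts step sizes through a line-search criterion; the updates above are only a standard LADMM template under the stated assumptions, not the paper's exact scheme.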
Source: Tsinghua Science and Technology (SCIE, EI, CAS, CSCD), 2017, No. 3, pp. 328-341 (14 pages). Chinese title: 清华大学学报(自然科学版)(英文版).
Funding: Supported by the National Natural Science Foundation of China (Nos. 61303264, 61202482, and 61202488), the Guangxi Cooperative Innovation Center of Cloud Computing and Big Data (No. YD16505), and the Distinguished Young Scientist Promotion of National University of Defense Technology.
Keywords: adaptive linearized alternating direction method of multipliers; non-convex compositely regularized optimization; capped-l1 regularized logistic regression
Related literature: co-cited references: 22; citing articles: 5; secondary citing articles: 14
