
A NEW STEPSIZE FOR THE STEEPEST DESCENT METHOD (cited by 16)

Abstract: The steepest descent method is the simplest gradient method for optimization. It is well known that exact line searches along each steepest descent direction may converge very slowly. An important result was given by Barzilai and Borwein, whose method is proved to be superlinearly convergent for convex quadratics in two-dimensional space and performs quite well for high-dimensional problems. The BB method is not monotone, so it is not easy to generalize to general nonlinear functions unless certain non-monotone techniques are applied. Therefore, it is very desirable to find stepsize formulae which enable fast convergence and possess the monotone property. Such a stepsize αk for the steepest descent method is suggested in this paper. An algorithm is proposed that uses this new stepsize in even iterations and exact line search in odd iterations. Numerical results are presented, which confirm that the new method can find the exact solution within 3 iterations for two-dimensional problems. The new method is very efficient for small-scale problems. A modified version of the new method is also presented, in which the new stepsize-selection technique is used after every two exact line searches. The modified algorithm is comparable to the Barzilai-Borwein method for large-scale problems and better for small-scale problems.
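The paper's new stepsize formula itself is not reproduced on this page. As background, the two baseline gradient methods the abstract compares against can be sketched for the convex quadratic f(x) = ½xᵀAx − bᵀx, on which the exact line-search stepsize is gᵀg/(gᵀAg) and the (first) Barzilai-Borwein stepsize is sᵀs/(sᵀy). This is a minimal NumPy sketch; function names, tolerances, and the BB initial stepsize are illustrative choices, not taken from the paper.

```python
import numpy as np

def steepest_descent_exact(A, b, x0, tol=1e-10, max_iter=5000):
    """Steepest descent with exact line search on f(x) = 0.5 x'Ax - b'x (A SPD)."""
    x = x0.astype(float)
    for k in range(max_iter):
        g = A @ x - b                        # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        alpha = (g @ g) / (g @ (A @ g))      # exact minimizer along -g
        x = x - alpha * g
    return x, k

def bb_gradient(A, b, x0, tol=1e-10, max_iter=5000):
    """Barzilai-Borwein gradient method (non-monotone) on the same quadratic."""
    x = x0.astype(float)
    g = A @ x - b
    alpha = 1.0                              # illustrative initial stepsize
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)            # BB1 stepsize s's / s'y
        x, g = x_new, g_new
    return x, k
```

On an ill-conditioned diagonal A (say diag(1, 10, 100)), the exact line-search iteration converges slowly, which is the behavior motivating the paper, while BB typically needs far fewer iterations at the cost of non-monotone function values.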
Author: Ya-xiang Yuan
Affiliation: LSEC
Source: Journal of Computational Mathematics (SCIE, EI, CSCD), 2006, Issue 2, pp. 149-156 (8 pages)
Keywords: Steepest descent, Line search, Unconstrained optimization, Convergence

References (15)

1. H. Akaike, On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method, Ann. Inst. Statist. Math. Tokyo, 11 (1959), 1-16.
2. J. Barzilai and J.M. Borwein, Two point step size gradient methods, IMA J. Numer. Anal., 8 (1988), 141-148.
3. A. Cauchy, Méthode générale pour la résolution des systèmes d'équations simultanées, Comp. Rend. Sci. Paris, 25 (1847), 46-89.
4. Y.H. Dai, Alternate step gradient method, Report AMSS-2001-041, Academy of Mathematics and Systems Sciences, Chinese Academy of Sciences, 2001.
5. Y.H. Dai, J.Y. Yuan, and Y. Yuan, Modified two-point stepsize gradient methods for unconstrained optimization, Computational Optimization and Applications, 22 (2002), 103-109.
6. Y.H. Dai and Y. Yuan, Alternate minimization gradient method, IMA Journal of Numerical Analysis, 23 (2003), 377-393.
7. Y.H. Dai and H. Zhang, An adaptive two-point stepsize gradient method, Research report, Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, 2001.
8. R. Fletcher, Practical Methods of Optimization (2nd Edition), John Wiley and Sons, Chichester, 1987.
9. R. Fletcher, On the Barzilai-Borwein method, Research Report, University of Dundee, UK, 2001.
10. G.E. Forsythe, On the asymptotic directions of the s-dimensional optimum gradient method, Numerische Mathematik, 11 (1968), 57-76.

Co-cited references: 21

Citing articles: 16

Secondary citing articles: 58
