
New stepsizes for the gradient method
Published: 2019-10-25
Venue: Room 526, Xingjian Building (行健楼)
Host: Associate Professor Jiang Bo (姜波)
Abstract: Gradient methods are widely used for solving large-scale problems. We propose a new framework that combines Cauchy steps with fixed step lengths and updates the stepsizes in a cyclic way. Four gradient algorithms with different fixed step lengths are proposed. For 2-dimensional convex quadratic minimization problems, these algorithms either terminate in finitely many iterations or converge superlinearly; for n-dimensional problems, they all converge linearly. Moreover, based on this analysis we propose new stepsizes that find the optimal solution within 5 iterations for 3-dimensional convex quadratic minimization problems. Plugging the new stepsizes into the proposed cyclic framework yields new gradient methods that guarantee finite termination for 3-dimensional problems and converge R-linearly for general n-dimensional problems. Numerical tests show the superior performance of the proposed methods over the state of the art.
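
The abstract does not spell out the proposed stepsize formulas, but the basic ingredients can be illustrated on a convex quadratic f(x) = (1/2) x^T A x - b^T x, for which the gradient is g = A x - b and the Cauchy (exact line-search) step is alpha = (g^T g) / (g^T A g). The Python sketch below cycles a Cauchy step with a fixed step length; the cycle length, the particular fixed step, and the function name cyclic_gradient_method are illustrative assumptions, not the stepsizes proposed in the talk.

```python
# A minimal sketch of a cyclic gradient method for the convex quadratic
# f(x) = 0.5 * x^T A x - b^T x, alternating an exact (Cauchy) step with a
# fixed step length. The cycle length and the particular fixed step used
# here are assumptions for illustration, not the new stepsizes of the talk.
import numpy as np

def cyclic_gradient_method(A, b, x0, cycle=2, fixed_alpha=None,
                           tol=1e-10, max_iter=10_000):
    """Minimize 0.5*x^T A x - b^T x with a cyclic stepsize rule (sketch)."""
    x = x0.astype(float).copy()
    for k in range(max_iter):
        g = A @ x - b                      # gradient of the quadratic
        if np.linalg.norm(g) < tol:
            break
        if k % cycle == 0 or fixed_alpha is None:
            alpha = (g @ g) / (g @ (A @ g))  # Cauchy step at the start of each cycle
        else:
            alpha = fixed_alpha              # fixed step length for the rest of the cycle
        x -= alpha * g
    return x, k

# Example: a 3-dimensional convex quadratic (data chosen arbitrarily).
A = np.diag([1.0, 5.0, 25.0])
b = np.array([1.0, 1.0, 1.0])
x_star, iters = cyclic_gradient_method(A, b, x0=np.zeros(3), cycle=2,
                                       fixed_alpha=1.0 / 25.0)
print(iters, np.linalg.norm(A @ x_star - b))
```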