A linearly convergent stochastic recursive gradient method for convex optimization
Posted: 2019-11-09
Venue: Room 526, Xingjian Building (行健楼)
Host: Associate Professor Jiang Bo (姜波)
Abstract: The stochastic recursive gradient algorithm (SARAH) has attracted much interest recently. It admits a simple recursive framework for updating stochastic gradient estimates. Motivated by this, in this paper we propose SARAH-I, a variant that incorporates importance sampling, and prove that the sequence of distances between the iterates and the optimal set converges linearly under both strong-convexity and non-strong-convexity conditions. We further propose using the Barzilai-Borwein (BB) method to compute step sizes for SARAH-I automatically, yielding a variant we call SARAH-I-BB, and we establish its convergence and complexity properties in several settings. Finally, numerical experiments are reported that indicate the promising performance of SARAH-I-BB.
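To make the recursive framework concrete, below is a minimal sketch of the standard SARAH update (v_t = ∇f_i(w_t) − ∇f_i(w_{t−1}) + v_{t−1}) combined with a Barzilai-Borwein step size on a least-squares problem. This follows the widely known SARAH scheme and the usual BB formula; the objective, the function name `sarah_bb`, all parameter values, and the uniform sampling are illustrative assumptions, not the authors' SARAH-I-BB (which would additionally reweight samples by importance).

```python
import numpy as np

def sarah_bb(A, b, n_outer=20, n_inner=None, eta0=0.05, seed=None):
    """Minimize (1/2n)||Aw - b||^2 with SARAH inner loops and a BB outer step.

    Illustrative sketch only: uniform sampling stands in for the
    importance sampling of SARAH-I described in the abstract.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = n_inner or n                      # inner-loop length
    w = np.zeros(d)
    eta = eta0
    w_prev, g_prev = None, None

    for _ in range(n_outer):
        # Full gradient at the outer iterate anchors the recursion.
        g = A.T @ (A @ w - b) / n
        # BB step from successive outer iterates/gradients,
        # scaled by the inner-loop length m (assumed convention).
        if w_prev is not None:
            s, y = w - w_prev, g - g_prev
            if abs(s @ y) > 1e-12:
                eta = abs(s @ s) / (m * abs(s @ y))
        w_prev, g_prev = w.copy(), g.copy()

        v, w_old = g, w.copy()
        w = w - eta * v
        for _ in range(m):
            i = rng.integers(n)           # uniform index (not importance-weighted)
            ai = A[i]
            # Recursive estimate: v_t = grad_i(w_t) - grad_i(w_{t-1}) + v_{t-1}
            v = ai * (ai @ w - b[i]) - ai * (ai @ w_old - b[i]) + v
            w_old = w.copy()
            w = w - eta * v
    return w

# Tiny usage example on a random consistent least-squares problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 10))
x_true = rng.standard_normal(10)
b = A @ x_true
x_hat = sarah_bb(A, b, seed=1)
print(np.linalg.norm(x_hat - x_true))   # should be small after convergence
```

Note the design point the abstract emphasizes: unlike SVRG, the inner iterations never return to the anchor gradient; each estimate is built recursively from the previous one, and the BB rule removes the need to hand-tune the step size eta.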