爱可可 AI Paper Recommendations (October 17) (Part 3)


5. [LG] Neograd: gradient descent with an adaptive learning rate
M F. Zimmer
Neograd is a family of gradient-descent algorithms with an adaptive learning rate. Using a formulaic estimate based on ρ, a metric of the error of the updates, it adjusts the learning rate dynamically at every training step, so there is no need for trial runs to estimate a single learning rate for the entire optimization, and the added cost is negligible. One member of the family, NeogradM, quickly reaches much lower cost-function values than other first-order algorithms, a substantial performance improvement.
Since its inception by Cauchy in 1847, the gradient descent algorithm has been without guidance as to how to efficiently set the learning rate. This paper identifies a concept, defines metrics, and introduces algorithms to provide such guidance. The result is a family of algorithms (Neograd) based on a constant ρ ansatz, where ρ is a metric based on the error of the updates. This allows one to adjust the learning rate at each step, using a formulaic estimate based on ρ. It is now no longer necessary to do trial runs beforehand to estimate a single learning rate for an entire optimization run. The additional costs to operate this metric are trivial. One member of this family of algorithms, NeogradM, can quickly reach much lower cost function values than other first order algorithms. Comparisons are made mainly between NeogradM and Adam on an array of test functions and on a neural network model for identifying hand-written digits. The results show great performance improvements with NeogradM.
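To make the mechanism concrete, here is a minimal Python sketch of a constant-ρ adaptive step, assuming ρ is measured as the relative gap between the actual cost change of an update and its first-order prediction, and that the learning rate is rescaled multiplicatively toward a target ρ. The function name neograd_like_gd, the specific ρ formula, the clipping bounds, and the default hyperparameters are illustrative assumptions, not the paper's exact algorithm.

import numpy as np

def neograd_like_gd(f, grad, theta0, rho_target=0.1, lr0=1e-3, steps=200):
    """Gradient descent whose learning rate is rescaled each step so that an
    update-error metric rho stays near rho_target (a constant-rho sketch)."""
    theta = np.asarray(theta0, dtype=float)
    lr = lr0
    for _ in range(steps):
        g = grad(theta)
        step = -lr * g
        predicted = float(g @ step)          # first-order prediction of the cost change
        actual = f(theta + step) - f(theta)  # cost change the step actually produced
        # Assumed form of rho: relative gap between actual and predicted change.
        rho = abs(actual - predicted) / (abs(predicted) + 1e-12)
        # Constant-rho ansatz: nudge lr toward the value that would give rho_target,
        # assuming the error term scales roughly linearly with lr for small steps.
        lr *= float(np.clip(rho_target / (rho + 1e-12), 0.5, 2.0))
        theta = theta + step
    return theta, lr

# Toy usage: a 2-D quadratic bowl; the learning rate settles on its own.
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: x
theta_min, lr_final = neograd_like_gd(f, grad, theta0=[3.0, -4.0])
print(theta_min, lr_final)

The clip keeps any single rescaling modest, so one noisy ρ estimate cannot blow the step size up or collapse it in a single iteration.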

