Soodabeh, Asadi and Vogel, Manfred (2020) A Learning Rate Method for Full-Batch Gradient Descent. Papers on Technical Science, 13. pp. 174-177. ISSN 2601-5773
Abstract
In this paper, we present a learning rate method for gradient descent that uses only first-order information. This method requires no manual tuning of the learning rate. We applied the method to a linear neural network built from scratch, using full-batch gradient descent, in which the gradients are computed over the whole dataset to perform one parameter update. We tested the method on a moderately sized dataset of housing information and compared the result with that of the Adam optimizer used with a sequential neural network model from Keras. The comparison shows that our method finds the minimum in far fewer epochs than Adam.
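The setup described above — full-batch gradient descent on a linear model, with a step size derived automatically from first-order information — can be sketched as follows. Note that the specific adaptive rule used here (the Barzilai-Borwein step) is an assumption for illustration; the paper's actual learning rate method may differ, and the synthetic data replaces the housing dataset.

```python
import numpy as np

# Illustrative sketch: full-batch gradient descent on a linear model,
# with a step size computed from first-order information only.
# The Barzilai-Borwein rule below stands in for the paper's method.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # synthetic stand-in for the housing data
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

def grad(w):
    # Gradient of mean squared error over the WHOLE dataset (full batch).
    return 2.0 / len(y) * X.T @ (X @ w - y)

w = np.zeros(3)
g = grad(w)
lr = 1e-3                               # only the very first step needs a default
for epoch in range(100):
    w_new = w - lr * g
    g_new = grad(w_new)
    s, d = w_new - w, g_new - g
    # Barzilai-Borwein step: derived from parameter and gradient
    # differences, so no manual learning rate tuning is needed.
    if abs(d @ d) > 1e-12:
        lr = abs(s @ d) / (d @ d)
    w, g = w_new, g_new

print(np.round(w, 2))
```

On this quadratic objective the adaptive step converges to the least-squares solution in a handful of epochs, which mirrors the paper's comparison point against Adam's per-epoch progress.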
Item Type: Article
Subjects: T Technology / applied and engineering sciences > TA Engineering (General). Civil engineering (General) / general engineering sciences
Depositing User: Zsolt Baráth
Date Deposited: 18 Aug 2022 14:03
Last Modified: 18 Aug 2022 14:03
URI: http://real.mtak.hu/id/eprint/146640