Effect of Local Dynamic Learning Rate Adaptation on Mini-Batch Gradient Descent for Training Deep Learning Models with Back Propagation Algorithm
Journal: International Journal of Science and Research (IJSR), Vol. 9, No. 4
Publication Date: 2020-04-05
Authors: Joyce K. Ndauka; Zheng Xiao Yan
Pages: 339-342
Keywords: Mini-batch gradient descent; Learning rate; Backpropagation
Abstract
Backpropagation has gained much popularity for training neural network models, including deep learning models. Despite its popularity, ordinary backpropagation suffers from a low convergence rate due to its constant learning rate. Since the error surface is not smooth, the learning rate needs to be adapted dynamically to speed up convergence. Much work has shown the benefits of employing a dynamic learning rate in the backpropagation algorithm to speed up convergence, focusing on global learning rate adaptation and on local adaptive learning rates with batch gradient descent. In this work we observe the effect of local dynamic learning rate adaptation, using an improved iRprop- algorithm with mini-batch gradient descent, on the convergence rate of backpropagation. Experiments were conducted in Python using the CIFAR-10 dataset. Results show that the proposed algorithm outperforms the ordinary backpropagation algorithm in terms of convergence speed.
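For context, the sketch below illustrates the per-weight iRprop- update rule that the abstract builds on, applied once per mini-batch gradient. It is a minimal NumPy illustration of standard iRprop- (Igel and Hüsken), not the paper's improved variant; the function name and the hyperparameter defaults (eta_plus, eta_minus, step bounds) are the commonly cited Rprop values and are assumptions here.

```python
import numpy as np

def irprop_minus_update(w, grad, prev_grad, step,
                        eta_plus=1.2, eta_minus=0.5,
                        step_min=1e-6, step_max=50.0):
    """One iRprop- update of weight array `w` from the current
    mini-batch gradient `grad`. `prev_grad` and `step` are per-weight
    state carried across mini-batches."""
    sign_change = grad * prev_grad          # >0: same sign, <0: sign flipped

    # Same gradient sign: grow the local step size to accelerate.
    step = np.where(sign_change > 0,
                    np.minimum(step * eta_plus, step_max), step)
    # Sign flip: a minimum was overshot, so shrink the local step size.
    step = np.where(sign_change < 0,
                    np.maximum(step * eta_minus, step_min), step)

    # iRprop-: zero the stored gradient after a sign flip, so that
    # weight is neither moved this step nor re-adapted next step.
    grad = np.where(sign_change < 0, 0.0, grad)

    # Weights move against the gradient sign by the local step size;
    # the gradient magnitude itself is ignored.
    w = w - np.sign(grad) * step
    return w, grad, step
```

In use, `step` would be initialized to a small constant (e.g. 0.1) and `prev_grad` to zeros, and the caller feeds the returned `grad` back in as `prev_grad` on the next mini-batch. Because each weight keeps its own step size, this is a local adaptation scheme, in contrast to a single global learning rate.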