Modified Long Short-Term Memory Recurrent Neural Network Architectures
Journal: International Journal of Science and Research (IJSR), Vol. 6, No. 11
Publication Date: 2017-11-05
Authors: Manish Rana; Shubham Mishra
Pages: 36-39
Keywords: Long Short-Term Memory; LSTM; recurrent neural network; RNN
Abstract
Long Short-Term Memory (LSTM) is a specific recurrent neural network (RNN) architecture that was designed to model temporal sequences and their long-range dependencies more accurately than conventional RNNs. In this paper, we explore LSTM RNN architectures and modify them to improve performance. LSTM RNNs are more effective than deep feed-forward neural networks (DNNs) at modeling sequential data. We change the gate computations and remove some unnecessary features of the standard LSTM architecture. The resulting architecture makes more effective use of model parameters than the other architectures considered, converges quickly, and outperforms a deep feed-forward neural network with an order of magnitude more parameters.
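The abstract refers to changing the gate computations of the standard LSTM cell. As context, the following is a minimal sketch of the *standard* LSTM step that serves as the baseline (the paper's specific modifications are not detailed here); the weight layout and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell (the baseline the paper modifies).

    Illustrative layout: W is (4H, D) input weights, U is (4H, H) recurrent
    weights, b is (4H,) biases, stacked as [input, forget, output, candidate].
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0*H:1*H])    # input gate
    f = sigmoid(z[1*H:2*H])    # forget gate
    o = sigmoid(z[2*H:3*H])    # output gate
    g = np.tanh(z[3*H:4*H])    # candidate cell state
    c = f * c_prev + i * g     # new cell state
    h = o * np.tanh(c)         # new hidden state
    return h, c

# Toy usage: D=3 input features, H=2 hidden units, 5 time steps.
rng = np.random.default_rng(0)
D, H = 3, 2
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.standard_normal((5, D)):
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)
```

Because the hidden state is an output gate times a tanh of the cell state, each component of `h` is bounded in (-1, 1), while the cell state `c` itself is unbounded and carries the long-range memory.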