
Review on Text Sequence Processing with use of different Deep Neural Network Model

Journal: International Journal of Advanced Trends in Computer Science and Engineering (IJATCSE) (Vol.8, No. 5)

Publication Date:

Authors :

Page : 2224-2230

Keywords : Bidirectional Encoder-Decoder Techniques; CNN; GRU-RNN; LSTM-RNN; Natural Language Processing (NLP); Sequence Learning; Text Processing

Source : Download | Find it from : Google Scholar

Abstract

This paper underlines the importance of Deep Neural Network models in different learning tasks for Natural Language Processing. Deep learning techniques provide strong support for handling large amounts of labeled training data. A Deep Neural Network model uses multiple processing layers to learn sequential representations of text data, achieving excellent performance in many domains. In this paper, we review significant deep learning models with an end-to-end approach for Natural Language Processing that make more accurate predictions over word sequences, and we present a comparative experimental analysis of NLP tasks at the text and word level using CNN, RNN, LSTM, GRU, and bidirectional Encoder-Decoder models. Numerous NLP tasks involve semantic processing to understand and generate complete sentences, and different deep learning models capture this sequential information in different ways. This is one reason why Deep Neural Network models have gradually become more popular for different NLP applications. We also discuss reinforcement learning as an extension of deep neural models that is frequently used in Natural Language Processing. This paper compares Deep Neural Network models across various language processing tasks and offers guidance on selecting a model for a given NLP task.
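The abstract names the GRU as one of the reviewed sequence models. As a minimal sketch of the standard GRU recurrence (update gate, reset gate, candidate state), the following self-contained NumPy example processes a toy sequence of "word" vectors; the dimensions, random weights, and inputs are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, params):
    """One GRU step: update gate z, reset gate r, candidate state h_tilde."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h_prev + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev) + bh)   # candidate state
    return (1 - z) * h_prev + z * h_tilde                # new hidden state

# Run the cell over a toy sequence of five random embedding vectors.
rng = np.random.default_rng(0)
d_in, d_h = 4, 3  # illustrative input/hidden sizes
params = tuple(
    arr
    for _ in range(3)
    for arr in (rng.standard_normal((d_h, d_in)),
                rng.standard_normal((d_h, d_h)),
                np.zeros(d_h))
)
h = np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):
    h = gru_cell(x, h, params)
print(h.shape)  # final hidden state summarizes the sequence
```

Because the new state is a convex combination of the previous state and a tanh candidate, every component of `h` stays in (-1, 1), which is the gating behavior that lets GRUs carry information across long sequences.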

Last modified: 2019-11-11 18:41:04