Please use this identifier to cite or link to this item:
http://idr.nitk.ac.in/jspui/handle/123456789/13726
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Shiva Prakash B. | |
dc.contributor.author | Sanjeev K.V. | |
dc.contributor.author | Prakash R. | |
dc.contributor.author | Chandrasekaran K. | |
dc.date.accessioned | 2020-03-31T14:15:19Z | - |
dc.date.available | 2020-03-31T14:15:19Z | - |
dc.date.issued | 2019 | |
dc.identifier.citation | Advances in Intelligent Systems and Computing, 2019, Vol.817, pp.57-66 | en_US |
dc.identifier.uri | https://doi.org/10.1007/978-981-13-1595-4_5 | |
dc.identifier.uri | http://idr.nitk.ac.in/jspui/handle/123456789/13726 | - |
dc.description.abstract | The growth of textual information and the importance of analysing its content have spurred extensive research in the field of summarization. Text summarization is the process of conveying the gist of a text in a condensed representation. The need to automate the process is at its peak with the exponential burst of information due to digitization. Text captioning falls under the branch of abstractive summarization, capturing the gist of an article in a few words. In this paper, we present an approach to text captioning using recurrent neural networks built around an encoder–decoder model. The key challenge addressed here was determining the ideal input required to produce the desired output: the model performs better when it is fed a summary rather than the original article itself. The recurrent neural network model with LSTM units has proven effective in transcribing a caption for the textual data. (An illustrative encoder–decoder sketch follows the metadata record below.) © Springer Nature Singapore Pte Ltd. 2019 | en_US |
dc.title | A survey on recurrent neural network architectures for sequential learning | en_US |
dc.type | Book Chapter | en_US |
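The abstract above describes an encoder–decoder recurrent model with LSTM units for text captioning. The chapter's actual implementation is not part of this record; the following is a minimal PyTorch sketch of that kind of architecture, with the vocabulary size, embedding and hidden dimensions, and the teacher-forcing forward pass all assumed purely for illustration.

```python
# Minimal illustrative sketch (not from the chapter): a sequence-to-sequence
# encoder-decoder with LSTM units, the kind of architecture the abstract
# describes for abstractive captioning. All dimensions are assumptions.
import torch
import torch.nn as nn

class Seq2SeqCaptioner(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encode the source text (per the abstract, a summary of the article
        # works better as input than the full article itself).
        _, state = self.encoder(self.embed(src_ids))
        # Decode conditioned on the encoder's final hidden/cell state,
        # using teacher forcing (ground-truth caption tokens as decoder input).
        dec_out, _ = self.decoder(self.embed(tgt_ids), state)
        return self.out(dec_out)  # logits over the vocabulary at each position

if __name__ == "__main__":
    model = Seq2SeqCaptioner()
    src = torch.randint(0, 10000, (2, 40))  # batch of 2 source token sequences
    tgt = torch.randint(0, 10000, (2, 8))   # batch of 2 caption token sequences
    logits = model(src, tgt)
    print(logits.shape)                     # torch.Size([2, 8, 10000])
```

At inference time the decoder would instead be run step by step, feeding back its own predicted token until an end-of-sequence symbol is produced; this greedy-decoding detail is omitted from the sketch above.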
Appears in Collections: | 3. Book Chapters |
Files in This Item:
There are no files associated with this item.
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.