TY - GEN
T1 - An Abstractive Text Summarization Using Recurrent Neural Network
AU - Debnath, Dipanwita
AU - Pakray, Partha
AU - Das, Ranjita
AU - Gelbukh, Alexander
N1 - Publisher Copyright:
© 2023, Springer Nature Switzerland AG.
PY - 2023
Y1 - 2023
N2 - With the accelerated advancement of technology and the massive volume of content surging over the Internet, efficiently abstracting information has become an arduous task. Automatic text summarization provides an effective means for the fast procurement of such information in the form of a summary through compression and refinement. Abstractive text summarization, in particular, builds an internal semantic representation of the text and uses natural language generation techniques to create summaries closer to human-generated ones. This paper uses a Long Short-Term Memory (LSTM) based Recurrent Neural Network to generate comprehensive abstractive summaries. Training an LSTM-based model requires a corpus with a significant number of parallel article and summary pairs. For this purpose, we have used several news corpora, namely the DUC 2003, DUC 2004, and Gigaword corpora, after eliminating noise and other irrelevant data. Experiments and analyses are performed on a subset of these corpora and evaluated using ROUGE. The experimental results verify the accuracy and validity of the proposed system.
AB - With the accelerated advancement of technology and the massive volume of content surging over the Internet, efficiently abstracting information has become an arduous task. Automatic text summarization provides an effective means for the fast procurement of such information in the form of a summary through compression and refinement. Abstractive text summarization, in particular, builds an internal semantic representation of the text and uses natural language generation techniques to create summaries closer to human-generated ones. This paper uses a Long Short-Term Memory (LSTM) based Recurrent Neural Network to generate comprehensive abstractive summaries. Training an LSTM-based model requires a corpus with a significant number of parallel article and summary pairs. For this purpose, we have used several news corpora, namely the DUC 2003, DUC 2004, and Gigaword corpora, after eliminating noise and other irrelevant data. Experiments and analyses are performed on a subset of these corpora and evaluated using ROUGE. The experimental results verify the accuracy and validity of the proposed system.
KW - Abstractive text summarization
KW - Long short term memory
KW - OpenNMT
KW - ROUGE
KW - Recurrent Neural Network
UR - http://www.scopus.com/inward/record.url?scp=85149916584&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-23804-8_29
DO - 10.1007/978-3-031-23804-8_29
M3 - Conference contribution
AN - SCOPUS:85149916584
SN - 9783031238031
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 364
EP - 378
BT - Computational Linguistics and Intelligent Text Processing - 19th International Conference, CICLing 2018, Revised Selected Papers
A2 - Gelbukh, Alexander
PB - Springer Science and Business Media Deutschland GmbH
T2 - 19th International Conference on Computational Linguistics and Intelligent Text Processing, CICLing 2018
Y2 - 18 March 2018 through 24 March 2018
ER -