Journal article

Sentence Graph Attention for Content-Aware Summarization

Giovanni Siragusa, Livio Robaldo

Applied Sciences, Volume: 12, Issue: 20, Start page: 10382

Swansea University Author: Livio Robaldo

  • applsci-12-10382(1).pdf

    PDF | Version of Record

    © 2022 by the authors. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.

    Download (858.45 KB)

DOI (Published version): 10.3390/app122010382

Published in: Applied Sciences
ISSN: 2076-3417
Published: MDPI AG 2022
Online Access: Check full text

URI: https://cronfa.swan.ac.uk/Record/cronfa61559
Abstract: Neural network-based encoder–decoder (ED) models are widely used for abstractive text summarization. While the encoder first reads the source document and embeds salient information, the decoder starts from this encoding to generate the summary word by word. However, the drawback of the ED model is that it treats words and sentences equally, without discerning the most relevant ones from the others. Many researchers have investigated this problem and proposed different solutions. In this paper, we define a sentence-level attention mechanism based on the well-known PageRank algorithm to find the relevant sentences, and then propagate the resulting scores into a second, word-level attention layer. We tested the proposed model on the well-known CNN/Dailymail dataset and found that it generates summaries with much higher abstractive power than state-of-the-art models, despite a slight but unavoidable decrease in ROUGE scores.
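
The abstract describes a two-level attention scheme: sentences are first scored with PageRank over a sentence graph, and those scores are then propagated into the decoder's word-level attention. The Python sketch below is only a minimal illustration of that idea, not the authors' implementation; the cosine-similarity sentence graph, the power-iteration PageRank, and the toy attention distribution are assumptions introduced here.

    # Minimal sketch (assumptions, not the paper's code): score sentences with
    # PageRank over a cosine-similarity graph, then scale word-level attention
    # by the score of the sentence each word belongs to.
    import numpy as np

    def pagerank(adjacency: np.ndarray, damping: float = 0.85, iters: int = 50) -> np.ndarray:
        """Power-iteration PageRank over a dense similarity matrix."""
        n = adjacency.shape[0]
        row_sums = adjacency.sum(axis=1, keepdims=True)
        # Rows with no outgoing weight fall back to a uniform distribution.
        transition = np.where(row_sums > 0, adjacency / np.maximum(row_sums, 1e-12), 1.0 / n)
        scores = np.full(n, 1.0 / n)
        for _ in range(iters):
            scores = (1 - damping) / n + damping * transition.T @ scores
        return scores / scores.sum()

    def sentence_scores(sentence_embeddings: np.ndarray) -> np.ndarray:
        """Build a cosine-similarity sentence graph and score it with PageRank."""
        normed = sentence_embeddings / np.linalg.norm(sentence_embeddings, axis=1, keepdims=True)
        similarity = np.clip(normed @ normed.T, 0.0, None)
        np.fill_diagonal(similarity, 0.0)  # no self-loops
        return pagerank(similarity)

    def reweight_word_attention(word_attention: np.ndarray,
                                sentence_ids: np.ndarray,
                                sent_scores: np.ndarray) -> np.ndarray:
        """Propagate sentence-level scores into the word-level attention weights."""
        reweighted = word_attention * sent_scores[sentence_ids]
        return reweighted / reweighted.sum()

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        sent_emb = rng.normal(size=(4, 16))    # 4 sentences, toy embeddings
        word_attn = rng.random(12)             # decoder attention over 12 words
        word_attn /= word_attn.sum()
        sent_ids = np.repeat(np.arange(4), 3)  # 3 words per sentence
        print(reweight_word_attention(word_attn, sent_ids, sentence_scores(sent_emb)))

How the sentence graph is built and how its scores enter the attention layer in the actual model is specified in the full paper; the sketch only shows the general mechanism of modulating a word-level attention distribution with graph-based sentence scores.
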
Keywords: summarization; knowledge graph; neural networks; pagerank; natural language processing
College: Faculty of Humanities and Social Sciences
Funders: This research received no external funding.
Issue: 20
Start Page: 10382