Global-aware Beam Search for Neural Abstractive Summarization
| Main Authors | |
|---|---|
| Format | Journal Article |
| Language | English |
| Published | 15.09.2020 |
| Subjects | |
| DOI | 10.48550/arxiv.2009.06891 |
| Summary: | This study develops a calibrated beam-based algorithm with awareness of the global attention distribution for neural abstractive summarization, aiming to alleviate the local optimality problem of the original beam search in a rigorous way. Specifically, a novel global protocol is proposed based on the attention distribution to stipulate how a globally optimal hypothesis should attend to the source. A global scoring mechanism is then developed to regulate beam search so that it generates summaries in a near-globally optimal fashion. This novel design enjoys a distinctive property: the global attention distribution can be predicted before inference, enabling step-wise improvements on the beam search through the global scoring mechanism. Extensive experiments on nine datasets show that global (attention)-aware inference significantly improves state-of-the-art summarization models even with empirical hyper-parameters. The algorithm is also shown to be robust, as it continues to generate meaningful text under corrupted attention distributions. The code and a comprehensive set of examples are available. |
|---|---|
| DOI: | 10.48550/arxiv.2009.06891 |
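The summary above is the only technical description in this record. As a rough illustration of the idea it outlines, the sketch below shows how a single beam-search step could fold a predicted global attention distribution into its scoring by penalizing hypotheses whose accumulated attention over the source drifts from the predicted one. Every name here (`global_aware_step`, `attention_penalty`, `expand_fn`, the `alpha` weight, and the L1 penalty) is an assumption made for illustration, not the scoring function from the paper.

```python
# Hypothetical sketch of attention-aware beam rescoring (not the paper's exact method).
import numpy as np

def attention_penalty(accumulated_attn, predicted_global_attn):
    """L1 distance between the attention mass a partial hypothesis has placed
    on each source token and the attention it is predicted to need globally."""
    return np.abs(accumulated_attn - predicted_global_attn).sum()

def global_aware_step(hypotheses, expand_fn, predicted_global_attn, beam_size, alpha=0.1):
    """One beam-search step with a global-attention penalty.

    hypotheses: list of (tokens, log_prob, accumulated_attn) triples.
    expand_fn:  model hook returning, for a hypothesis, candidates of the form
                (next_token, token_log_prob, step_attn_over_source).
    predicted_global_attn: attention distribution over source tokens that a
                complete summary is expected to produce (assumed to be
                predicted before inference starts, as the abstract suggests).
    alpha:      weight of the attention penalty (empirical hyper-parameter).
    """
    candidates = []
    for tokens, log_prob, acc_attn in hypotheses:
        for next_token, token_lp, step_attn in expand_fn(tokens):
            new_attn = acc_attn + step_attn                   # attention mass so far
            score = (log_prob + token_lp                      # usual beam score
                     - alpha * attention_penalty(new_attn, predicted_global_attn))
            candidates.append((score, (tokens + [next_token], log_prob + token_lp, new_attn)))
    # Keep the beam_size highest-scoring candidates for the next step.
    candidates.sort(key=lambda c: c[0], reverse=True)
    return [hyp for _, hyp in candidates[:beam_size]]
```

The intent of this sketch is that partial hypotheses whose attention pattern strays far from the predicted global distribution are down-weighted at every step, which mirrors the abstract's description of regulating beam search toward a near-globally optimal summary.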