Asymmetry Sensitive Architecture for Neural Text Matching

Bibliographic Details
Published in: Advances in Information Retrieval, Vol. 11438, pp. 62–69
Main Authors: Belkacem, Thiziri; Moreno, Jose G.; Dkaki, Taoufiq; Boughanem, Mohand
Format: Book Chapter
Language: English
Published: Switzerland: Springer International Publishing AG, 2019
Series: Lecture Notes in Computer Science
ISBN: 9783030157180; 3030157180
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-030-15719-7_8

More Information
Summary: Question-answer matching can be viewed as a puzzle where missing pieces of information are provided by the answer. To solve this puzzle, one must understand the question in order to find the correct answer. Semantic matching models rely mainly on the semantic relatedness of the input text words. We show that, beyond semantic similarities, matching models must focus on the most important words to find the correct answer. We use attention-based models to take word saliency into account and propose an asymmetric architecture that focuses on the most important words of the question or of the candidate answers. We extended several state-of-the-art models with an attention-based layer. Experimental results on two QA datasets show that our asymmetric architecture improves the performance of well-known neural matching algorithms.
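The summary describes an attention-based layer that weights words by saliency on only one side of the question-answer pair. The chapter itself does not specify the layer's equations here; the following is a minimal NumPy sketch of the general idea only, with all function names and the choice of mean-pooling on the question side being illustrative assumptions, not the authors' architecture.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector of scores.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(word_vecs, context):
    """Weight each word vector by its attention score against a context
    vector, then return the weighted sum (a saliency-aware text vector).
    Illustrative dot-product attention, not the chapter's exact layer."""
    scores = word_vecs @ context      # one relevance score per word
    weights = softmax(scores)         # word-saliency weights, sum to 1
    return weights @ word_vecs        # attention-pooled representation

def asymmetric_match(question_vecs, answer_vecs):
    """Asymmetric matching sketch: plain mean pooling on the question
    side, attention pooling on the answer side (using the question as
    context), scored by cosine similarity."""
    q = question_vecs.mean(axis=0)
    a = attention_pool(answer_vecs, q)
    return float(q @ a / (np.linalg.norm(q) * np.linalg.norm(a)))
```

The asymmetry lies in applying attention to only one input of the pair; swapping which side is attention-pooled gives the mirrored variant.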