Asymmetry Sensitive Architecture for Neural Text Matching
| Published in | Advances in Information Retrieval Vol. 11438; pp. 62 - 69 |
|---|---|
| Main Authors | , , , |
| Format | Book Chapter |
| Language | English |
| Published | Switzerland: Springer International Publishing AG, 2019 |
| Series | Lecture Notes in Computer Science |
| ISBN | 9783030157180 3030157180 |
| ISSN | 0302-9743 1611-3349 |
| DOI | 10.1007/978-3-030-15719-7_8 |
| Summary: | Question-answer matching can be viewed as a puzzle where the missing pieces of information are provided by the answer. To solve this puzzle, one must understand the question in order to identify the correct answer. Semantic-based matching models rely mainly on the semantic relatedness of the input text words. We show that, beyond semantic similarities, matching models must focus on the most important words to find the correct answer. We use attention-based models to take word saliency into account and propose an asymmetric architecture that focuses on the most important words of the question or of the candidate answers. We extended several state-of-the-art models with an attention-based layer. Experimental results, carried out on two QA datasets, show that our asymmetric architecture improves the performance of well-known neural matching algorithms. |
|---|---|
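The asymmetric idea sketched in the summary can be illustrated with a minimal example: attention weights are computed over the question words only (the "important" side), while answer words are matched without reweighting. This is a hypothetical sketch, not the authors' implementation; the saliency function here (embedding norm) is a stand-in for a learned attention layer.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def asymmetric_match_score(Q, A):
    """Score a question-answer pair, attending over question words only.

    Q: (m, d) question word embeddings; A: (n, d) answer word embeddings.
    Saliency is hypothetical: attention weights derived from each question
    word's embedding norm, standing in for a trained attention layer.
    """
    # Cosine similarity matrix between question and answer words.
    Qn = Q / np.linalg.norm(Q, axis=1, keepdims=True)
    An = A / np.linalg.norm(A, axis=1, keepdims=True)
    sim = Qn @ An.T                            # shape (m, n)
    # Attention over question words (asymmetric: answer side is unweighted).
    attn = softmax(np.linalg.norm(Q, axis=1))  # shape (m,)
    # Match each question word to its best answer word, weighted by saliency.
    return float(attn @ sim.max(axis=1))
```

Swapping the roles of `Q` and `A` gives the answer-focused variant; which side to attend over is exactly the asymmetry the architecture exploits.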