Novel multi‐domain attention for abstractive summarisation
Published in | CAAI Transactions on Intelligence Technology, Vol. 8, No. 3, pp. 796–806 |
---|---|
Main Authors | Qu, Chunxia; Lu, Ling; Wang, Aijuan; Yang, Wu; Chen, Yinong |
Format | Journal Article |
Language | English |
Published | Beijing: John Wiley & Sons, Inc (Wiley), 01.09.2023 |
Subjects | abstracting; convolutional neural nets; Datasets; Documents; Information retrieval; Library and information science; Neural networks; Readability; recurrent neural nets; Redundancy; Representations; Semantics; Sentences; text analysis; word processing; Words (language) |
Online Access | https://doaj.org/article/6bb2d0df929d41598643cee0f30624ce |
ISSN | 2468-2322 |
DOI | 10.1049/cit2.12117 |
Abstract | Existing abstractive text summarisation models consider only the word‐sequence correlations between the source document and the reference summary, so the generated summary often fails to cover the subject of the source document because of the models' narrow perspective. To address these shortcomings, a multi‐domain attention pointer (MDA‐Pointer) abstractive summarisation model is proposed in this work. First, the model uses a bidirectional long short‐term memory network to separately encode the word and sentence sequences of the source document, obtaining semantic representations at the word and sentence levels. A multi‐domain attention mechanism is then established between these semantic representations and the summary words, allowing the model to generate each summary word conditioned on both words and sentences. Next, words are either generated from the vocabulary or copied from the original word sequence through a pointer network to form the summary, and a coverage mechanism is introduced at both the word and sentence levels to reduce redundancy in the summary content. Finally, experiments are conducted on the CNN/Daily Mail dataset. The ROUGE scores of the model, both without and with the coverage mechanism, improve respectively over the baselines, verifying the effectiveness of the proposed model. |
---|---|
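The decoding step described in the abstract — word- and sentence-level attention combined into a multi‐domain context, a pointer switch mixing vocabulary generation with copying, and a coverage penalty — can be sketched as follows. This is a minimal numpy illustration; all dimensions, weight matrices, and variable names are illustrative assumptions, not the paper's actual parameterisation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

T_w, T_s, d, V = 6, 3, 4, 10      # source words, sentences, hidden size, vocab size
h_w = rng.normal(size=(T_w, d))   # word-level encoder states (BiLSTM outputs)
h_s = rng.normal(size=(T_s, d))   # sentence-level encoder states
s_t = rng.normal(size=d)          # decoder state at step t

# Attention in each "domain" (additive attention simplified to dot products).
a_w = softmax(h_w @ s_t)          # word-level attention over source tokens
a_s = softmax(h_s @ s_t)          # sentence-level attention over sentences
c_w = a_w @ h_w                   # word-level context vector
c_s = a_s @ h_s                   # sentence-level context vector

# Generation distribution over the vocabulary from state plus both contexts.
W_v = rng.normal(size=(V, 3 * d))
P_vocab = softmax(W_v @ np.concatenate([s_t, c_w, c_s]))

# Pointer switch p_gen in (0, 1): mix generating with copying source words.
p_gen = 1 / (1 + np.exp(-(s_t @ c_w)))
src_ids = np.array([1, 4, 4, 7, 2, 9])      # vocab id of each source token
P_final = p_gen * P_vocab
np.add.at(P_final, src_ids, (1 - p_gen) * a_w)  # accumulate copy probabilities

# Coverage: running sum of past attention; penalise re-attended positions.
coverage = np.zeros(T_w)
cov_loss = np.minimum(a_w, coverage).sum()  # zero at the first decoding step

assert np.isclose(P_final.sum(), 1.0)       # still a valid distribution
```

Because the copy mass `(1 - p_gen) * a_w` is scattered onto the vocabulary positions of the source tokens, `P_final` remains a proper probability distribution, which is the key property of pointer-generator mixing; the paper applies the coverage term at both the word and sentence levels, whereas this sketch shows only the word level.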
Author | Chen, Yinong; Yang, Wu; Wang, Aijuan; Lu, Ling; Qu, Chunxia |
Author_xml | – sequence: 1 givenname: Chunxia orcidid: 0000-0002-3341-810X surname: Qu fullname: Qu, Chunxia organization: College of Computer Science and Engineering Chongqing University of Technology Chongqing China – sequence: 2 givenname: Ling surname: Lu fullname: Lu, Ling organization: College of Computer Science and Engineering Chongqing University of Technology Chongqing China – sequence: 3 givenname: Aijuan surname: Wang fullname: Wang, Aijuan organization: College of Computer Science and Engineering Chongqing University of Technology Chongqing China – sequence: 4 givenname: Wu surname: Yang fullname: Yang, Wu organization: College of Computer Science and Engineering Chongqing University of Technology Chongqing China – sequence: 5 givenname: Yinong orcidid: 0000-0002-8780-3994 surname: Chen fullname: Chen, Yinong organization: School of Computing and Augmented Intelligence Arizona State University Tempe Arizona USA |
CitedBy_id | crossref_primary_10_3390_app13179771 crossref_primary_10_1080_02533839_2025_2466647 crossref_primary_10_7717_peerj_cs_1496 crossref_primary_10_1016_j_engappai_2024_108148 crossref_primary_10_1109_TBME_2023_3280987 crossref_primary_10_3390_s24248103 |
Cites_doi | 10.18653/v1/D15-1044 10.13053/cys-21-4-2855 10.18653/v1/P16-1008 10.18653/v1/K17-1045 10.18653/v1/P18-2027 10.3115/1073445.1073465 10.18653/v1/P16-1154 10.1016/j.neunet.2020.07.025 10.1007/s10831-020-09214-8 10.18653/v1/K16-1028 10.1109/ASRU.2015.7404790 10.18653/v1/P17-1099 10.18653/v1/P18-1061 10.1145/3368926.3369728 10.18653/v1/D16-1112 10.1016/j.neunet.2019.12.024 10.14569/IJACSA.2017.081052 10.18653/v1/P18-1014 10.18653/v1/N16-1012 10.18653/v1/D19-1304 10.18653/v1/2021.findings-acl.298 10.3115/v1/D14-1179 10.1609/aaai.v32i1.11987 |
ContentType | Journal Article |
Copyright | 2023. This work is published under http://creativecommons.org/licenses/by-nc-nd/4.0/ (the "License"). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
Copyright_xml | – notice: 2023. This work is published under http://creativecommons.org/licenses/by-nc-nd/4.0/ (the "License"). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
DBID | AAYXX CITATION 8FE 8FG ABUWG AFKRA ARAPS AZQEC BENPR BGLVJ CCPQU DWQXO GNUQQ HCIFZ JQ2 K7- P62 PHGZM PHGZT PIMPY PKEHL PQEST PQGLB PQQKQ PQUKI PRINS DOA |
DOI | 10.1049/cit2.12117 |
DatabaseName | CrossRef ProQuest SciTech Collection ProQuest Technology Collection ProQuest Central (Alumni) ProQuest Central UK/Ireland ProQuest Advanced Technologies & Aerospace Collection ProQuest Central Essentials ProQuest Central Technology Collection ProQuest One Community College ProQuest Central ProQuest Central Student SciTech Premium Collection ProQuest Computer Science Collection Computer Science Database ProQuest Advanced Technologies & Aerospace Collection ProQuest Central Premium ProQuest One Academic (New) Publicly Available Content Database ProQuest One Academic Middle East (New) ProQuest One Academic Eastern Edition (DO NOT USE) ProQuest One Applied & Life Sciences ProQuest One Academic ProQuest One Academic UKI Edition ProQuest Central China DOAJ Directory of Open Access Journals - NZ |
DatabaseTitle | CrossRef Publicly Available Content Database Advanced Technologies & Aerospace Collection Computer Science Database ProQuest Central Student Technology Collection ProQuest One Academic Middle East (New) ProQuest Advanced Technologies & Aerospace Collection ProQuest Central Essentials ProQuest Computer Science Collection ProQuest One Academic Eastern Edition ProQuest Central (Alumni Edition) SciTech Premium Collection ProQuest One Community College ProQuest Technology Collection ProQuest SciTech Collection ProQuest Central China ProQuest Central ProQuest One Applied & Life Sciences ProQuest One Academic UKI Edition ProQuest Central Korea ProQuest Central (New) ProQuest One Academic ProQuest One Academic (New) |
DatabaseTitleList | CrossRef Publicly Available Content Database |
Database_xml | – sequence: 1 dbid: DOA name: DOAJ Directory of Open Access Journals url: https://www.doaj.org/ sourceTypes: Open Website – sequence: 2 dbid: 8FG name: ProQuest Technology Collection url: https://search.proquest.com/technologycollection1 sourceTypes: Aggregation Database |
DeliveryMethod | fulltext_linktorsrc |
EISSN | 2468-2322 |
EndPage | 806 |
ExternalDocumentID | oai_doaj_org_article_6bb2d0df929d41598643cee0f30624ce 10_1049_cit2_12117 |
GroupedDBID | 0R~ 1OC 24P AAEDW AAHJG AAJGR AALRI AAMMB AAXUO AAYWO AAYXX ABMAC ABQXS ACCMX ACESK ACGFS ACVFH ACXQS ADBBV ADCNI ADMLS ADVLN AEFGJ AEUPX AEXQZ AFKRA AFPUW AGXDD AIDQK AIDYY AIGII AITUG AKBMS AKRWK AKYEP ALMA_UNASSIGNED_HOLDINGS ALUQN AMRAJ ARAPS ARCSS AVUZU BCNDV BENPR BGLVJ CCPQU CITATION EBS EJD FDB GROUPED_DOAJ HCIFZ IAO ICD IDLOA ITC K7- M41 M43 O9- OK1 PHGZM PHGZT PIMPY PQGLB PUEGO ROL RUI SSZ WIN 8FE 8FG ABUWG AZQEC DWQXO GNUQQ JQ2 P62 PKEHL PQEST PQQKQ PQUKI PRINS |
ID | FETCH-LOGICAL-c361t-2baeaf916fe4909061266b73020390939daa123c79e9b03f7a498a664cf180ac3 |
IEDL.DBID | DOA |
ISSN | 2468-2322 |
IngestDate | Wed Aug 27 01:22:33 EDT 2025 Sat Jul 26 00:21:02 EDT 2025 Thu Apr 24 23:12:16 EDT 2025 Wed Oct 01 06:40:19 EDT 2025 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 3 |
Language | English |
LinkModel | DirectLink |
MergedId | FETCHMERGED-LOGICAL-c361t-2baeaf916fe4909061266b73020390939daa123c79e9b03f7a498a664cf180ac3 |
Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 |
ORCID | 0000-0002-3341-810X 0000-0002-8780-3994 |
OpenAccessLink | https://doaj.org/article/6bb2d0df929d41598643cee0f30624ce |
PQID | 3091969449 |
PQPubID | 6852857 |
PageCount | 11 |
ParticipantIDs | doaj_primary_oai_doaj_org_article_6bb2d0df929d41598643cee0f30624ce proquest_journals_3091969449 crossref_primary_10_1049_cit2_12117 crossref_citationtrail_10_1049_cit2_12117 |
ProviderPackageCode | CITATION AAYXX |
PublicationCentury | 2000 |
PublicationDate | 2023-09-00 20230901 2023-09-01 |
PublicationDateYYYYMMDD | 2023-09-01 |
PublicationDate_xml | – month: 09 year: 2023 text: 2023-09-00 |
PublicationDecade | 2020 |
PublicationPlace | Beijing |
PublicationPlace_xml | – name: Beijing |
PublicationTitle | CAAI Transactions on Intelligence Technology |
PublicationYear | 2023 |
Publisher | John Wiley & Sons, Inc Wiley |
Publisher_xml | – name: John Wiley & Sons, Inc – name: Wiley |
References | e_1_2_10_23_1 e_1_2_10_24_1 e_1_2_10_21_1 e_1_2_10_22_1 Vaswani A. (e_1_2_10_18_1) 2017 Sutskever I. (e_1_2_10_13_1) 2014 Mihalcea R. (e_1_2_10_5_1) 2004 e_1_2_10_4_1 e_1_2_10_3_1 e_1_2_10_19_1 e_1_2_10_6_1 e_1_2_10_16_1 e_1_2_10_17_1 e_1_2_10_8_1 e_1_2_10_14_1 e_1_2_10_7_1 e_1_2_10_15_1 Duchi J. (e_1_2_10_31_1) 2011; 12 e_1_2_10_12_1 e_1_2_10_9_1 e_1_2_10_34_1 e_1_2_10_10_1 e_1_2_10_33_1 e_1_2_10_11_1 e_1_2_10_32_1 e_1_2_10_30_1 Vinyals O. (e_1_2_10_20_1) 2015 Liu J.Y. (e_1_2_10_2_1) 2017; 35 e_1_2_10_29_1 e_1_2_10_27_1 e_1_2_10_28_1 e_1_2_10_25_1 e_1_2_10_26_1 |
References_xml | – ident: e_1_2_10_8_1 doi: 10.18653/v1/D15-1044 – ident: e_1_2_10_10_1 doi: 10.13053/cys-21-4-2855 – ident: e_1_2_10_25_1 doi: 10.18653/v1/P16-1008 – ident: e_1_2_10_30_1 – ident: e_1_2_10_6_1 doi: 10.18653/v1/K17-1045 – ident: e_1_2_10_26_1 doi: 10.18653/v1/P18-2027 – ident: e_1_2_10_32_1 doi: 10.3115/1073445.1073465 – ident: e_1_2_10_24_1 doi: 10.18653/v1/P16-1154 – ident: e_1_2_10_28_1 doi: 10.1016/j.neunet.2020.07.025 – volume: 12 start-page: 2121 year: 2011 ident: e_1_2_10_31_1 article-title: Adaptive subgradient methods for online learning and stochastic optimization publication-title: J. Mach. Learn. Res – ident: e_1_2_10_4_1 doi: 10.1007/s10831-020-09214-8 – start-page: 3104 volume-title: Advances in Neural Information Processing Systems year: 2014 ident: e_1_2_10_13_1 – ident: e_1_2_10_7_1 doi: 10.18653/v1/K16-1028 – ident: e_1_2_10_16_1 doi: 10.1109/ASRU.2015.7404790 – ident: e_1_2_10_9_1 doi: 10.18653/v1/P17-1099 – start-page: 6000 volume-title: Proceedings of the 2017 31st International Conference on Neural Information Processing Systems year: 2017 ident: e_1_2_10_18_1 – start-page: 2692 volume-title: Proceedings of the 2015 28th International Conference on Neural Information Processing Systems year: 2015 ident: e_1_2_10_20_1 – ident: e_1_2_10_12_1 doi: 10.18653/v1/P18-1061 – ident: e_1_2_10_29_1 – ident: e_1_2_10_34_1 – ident: e_1_2_10_33_1 doi: 10.1145/3368926.3369728 – ident: e_1_2_10_15_1 doi: 10.18653/v1/D16-1112 – ident: e_1_2_10_27_1 doi: 10.1016/j.neunet.2019.12.024 – ident: e_1_2_10_3_1 doi: 10.14569/IJACSA.2017.081052 – ident: e_1_2_10_21_1 doi: 10.18653/v1/P18-1014 – ident: e_1_2_10_19_1 doi: 10.18653/v1/N16-1012 – start-page: 404 volume-title: Proceedings of the 2004 Conference on Empirical Methods in Natural Language Processing year: 2004 ident: e_1_2_10_5_1 – ident: e_1_2_10_17_1 – ident: e_1_2_10_22_1 doi: 10.18653/v1/D19-1304 – ident: e_1_2_10_23_1 doi: 10.18653/v1/2021.findings-acl.298 – volume: 35 start-page: 154 
issue: 7 year: 2017 ident: e_1_2_10_2_1 article-title: A review of automatic text summarization research in recent 70 years publication-title: Inf. Sci – ident: e_1_2_10_14_1 doi: 10.3115/v1/D14-1179 – ident: e_1_2_10_11_1 doi: 10.1609/aaai.v32i1.11987 |
SSID | ssj0001999537 |
Score | 2.293157 |
SourceID | doaj proquest crossref |
SourceType | Open Website Aggregation Database Enrichment Source Index Database |
StartPage | 796 |
SubjectTerms | abstracting convolutional neural nets Datasets Documents Information retrieval Library and information science Neural networks Readability recurrent neural nets Redundancy Representations Semantics Sentences text analysis word processing Words (language) |
Title | Novel multi‐domain attention for abstractive summarisation |
URI | https://www.proquest.com/docview/3091969449 https://doaj.org/article/6bb2d0df929d41598643cee0f30624ce |
Volume | 8 |
hasFullText | 1 |
inHoldings | 1 |
isFullTextHit | |
isPrint | |
journalDatabaseRights | – providerCode: PRVAON databaseName: DOAJ Directory of Open Access Journals customDbUrl: eissn: 2468-2322 dateEnd: 99991231 omitProxy: true ssIdentifier: ssj0001999537 issn: 2468-2322 databaseCode: DOA dateStart: 20180101 isFulltext: true titleUrlDefault: https://www.doaj.org/ providerName: Directory of Open Access Journals – providerCode: PRVEBS databaseName: Inspec with Full Text customDbUrl: eissn: 2468-2322 dateEnd: 99991231 omitProxy: false ssIdentifier: ssj0001999537 issn: 2468-2322 databaseCode: ADMLS dateStart: 20200901 isFulltext: true titleUrlDefault: https://www.ebsco.com/products/research-databases/inspec-full-text providerName: EBSCOhost – providerCode: PRVBHI databaseName: IET Digital Library Open Access customDbUrl: eissn: 2468-2322 dateEnd: 99991231 omitProxy: true ssIdentifier: ssj0001999537 issn: 2468-2322 databaseCode: IDLOA dateStart: 20170601 isFulltext: true titleUrlDefault: https://digital-library.theiet.org/content/collections providerName: Institution of Engineering and Technology – providerCode: PRVLSH databaseName: Elsevier Journals customDbUrl: mediaType: online eissn: 2468-2322 dateEnd: 99991231 omitProxy: true ssIdentifier: ssj0001999537 issn: 2468-2322 databaseCode: AKRWK dateStart: 20160101 isFulltext: true providerName: Library Specific Holdings – providerCode: PRVPQU databaseName: ProQuest Central customDbUrl: http://www.proquest.com/pqcentral?accountid=15518 eissn: 2468-2322 dateEnd: 99991231 omitProxy: true ssIdentifier: ssj0001999537 issn: 2468-2322 databaseCode: BENPR dateStart: 20170601 isFulltext: true titleUrlDefault: https://www.proquest.com/central providerName: ProQuest – providerCode: PRVWIB databaseName: KBPluse Wiley Online Library: Open Access customDbUrl: eissn: 2468-2322 dateEnd: 99991231 omitProxy: true ssIdentifier: ssj0001999537 issn: 2468-2322 databaseCode: AVUZU dateStart: 20160101 isFulltext: true titleUrlDefault: https://www.kbplus.ac.uk/kbplus7/publicExport/pkg/559 providerName: 
Wiley-Blackwell – providerCode: PRVWIB databaseName: Wiley Online Library Open Access customDbUrl: eissn: 2468-2322 dateEnd: 99991231 omitProxy: true ssIdentifier: ssj0001999537 issn: 2468-2322 databaseCode: 24P dateStart: 20170101 isFulltext: true titleUrlDefault: https://authorservices.wiley.com/open-science/open-access/browse-journals.html providerName: Wiley-Blackwell |
linkProvider | Directory of Open Access Journals |
openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Novel+multi%E2%80%90domain+attention+for+abstractive+summarisation&rft.jtitle=CAAI+Transactions+on+Intelligence+Technology&rft.au=Qu%2C+Chunxia&rft.au=Lu%2C+Ling&rft.au=Wang%2C+Aijuan&rft.au=Yang%2C+Wu&rft.date=2023-09-01&rft.issn=2468-2322&rft.eissn=2468-2322&rft.volume=8&rft.issue=3&rft.spage=796&rft.epage=806&rft_id=info:doi/10.1049%2Fcit2.12117&rft.externalDBID=n%2Fa&rft.externalDocID=10_1049_cit2_12117 |