Zero‐anaphora resolution in Korean based on deep language representation model: BERT
| Published in | ETRI Journal, Vol. 43, no. 2, pp. 299-312 |
|---|---|
| Main Authors | Kim, Youngtae; Ra, Dongyul; Lim, Soojong |
| Format | Journal Article |
| Language | English |
| Published | Electronics and Telecommunications Research Institute (ETRI), 01.04.2021 |
| Subjects | zero-anaphora resolution (ZAR); bidirectional encoder representations from transformers (BERT); attention; deep learning; language representation model; Electronics/Information and Communications Engineering |
| Online Access | https://onlinelibrary.wiley.com/doi/abs/10.4218/etrij.2019-0441 (full text) |
| ISSN | 1225-6463 |
| EISSN | 2233-7326 |
| DOI | 10.4218/etrij.2019-0441 |
| Abstract | High performance in the task of zero-anaphora resolution (ZAR) is necessary for fully understanding texts in Korean, Japanese, Chinese, and various other languages. Deep-learning-based models are being employed for building ZAR systems, owing to the success of deep learning in recent years. However, the objective of building a high-quality ZAR system is far from being achieved even with these models. To enhance current ZAR techniques, we fine-tuned a pre-trained bidirectional encoder representations from transformers (BERT) model. Notably, BERT is a general language representation model that enables systems to utilize deep bidirectional contextual information in natural language text. It extensively exploits the attention mechanism based on the sequence-transduction model Transformer. In our model, classification is performed simultaneously for all words in the input word sequence to decide whether each word can be an antecedent. We seek end-to-end learning by disallowing any use of hand-crafted or dependency-parsing features. Experimental results show that, compared with other models, our approach significantly improves ZAR performance. |
|---|---|
| KCI Citation Count | 7 |
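The abstract pins down the model's formulation concretely enough to illustrate: BERT encodes the whole input word sequence once, and a classifier scores every word simultaneously as a possible antecedent, with no hand-crafted or dependency-parsing features. Below is a minimal sketch of that formulation using the Hugging Face `transformers` API; the multilingual checkpoint, the binary label set, and the Korean example sentence are assumptions for illustration, not the authors' released code.

```python
# Sketch of a token-level antecedent classifier in the spirit of the abstract:
# BERT gives each token deep bidirectional context, and a position-wise linear
# head decides, for all tokens at once, whether each can be an antecedent.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

MODEL_NAME = "bert-base-multilingual-cased"  # assumed; the paper fine-tunes a Korean-capable BERT

class AntecedentTagger(nn.Module):
    def __init__(self, model_name: str = MODEL_NAME):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)        # pre-trained encoder, fine-tuned end to end
        self.head = nn.Linear(self.bert.config.hidden_size, 2)  # antecedent vs. non-antecedent

    def forward(self, input_ids: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
        # One encoding pass, then one classification per token: (batch, seq_len, 2).
        hidden = self.bert(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        return self.head(hidden)

tokenizer = BertTokenizerFast.from_pretrained(MODEL_NAME)
model = AntecedentTagger().eval()

# Korean example with a dropped subject: "Cheolsu ate; and (he) went to school."
batch = tokenizer(["철수는 밥을 먹었다. 그리고 학교에 갔다."], return_tensors="pt")
with torch.no_grad():
    logits = model(batch["input_ids"], batch["attention_mask"])
antecedent_decisions = logits.argmax(dim=-1)  # per-token 0/1 decision, no parsing features used
```

Training such a head with token-level cross-entropy against gold antecedent labels keeps the whole pipeline end to end, matching the learning objective the abstract states.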
| Authors | Kim, Youngtae (Yonsei University); Ra, Dongyul (Yonsei University; dyra2246@gmail.com; ORCID 0000-0003-1449-4614); Lim, Soojong (Electronics and Telecommunications Research Institute) |
|---|---|
| Cited By | 10.1038/s41598-023-41484-9; 10.4218/etrij.2023.0100; 10.14801/jkiit.2023.21.2.43; 10.1109/ACCESS.2021.3112682; 10.1186/s40494-023-01068-2 |
| Copyright | 2020 ETRI |
| Discipline | Engineering |
| Open Access | Yes |
| Peer Reviewed | Yes |
| License | http://doi.wiley.com/10.1002/tdm_license_1.1; http://onlinelibrary.wiley.com/termsAndConditions#vor |
| Notes | This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Rep. of Korea (2017R1D1A3B03031855), and by an Institute for Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT), Rep. of Korea (No. 2013-0-00131, Development of Knowledge Evolutionary WiseQA Platform Technology for Human Knowledge Augmented Services). |
| OpenAccessLink | https://doaj.org/article/4a427aa06e4946ebb80e9b82797143db |
| PageCount | 14 |