Dual Appearance-Aware Enhancement for Oriented Object Detection
Published in | IEEE Transactions on Geoscience and Remote Sensing, Vol. 62, pp. 1-14 |
Main Authors | Gong, Maoguo; Zhao, Hongyu; Wu, Yue; Tang, Zedong; Feng, Kai-Yuan; Sheng, Kai |
Format | Journal Article |
Language | English |
Published | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 2024 |
ISSN | 0196-2892 |
EISSN | 1558-0644 |
DOI | 10.1109/TGRS.2023.3344195 |
Abstract | Oriented detectors have become the mainstream of object detection in remote-sensing images since they provide more precise bounding boxes and contain less background. However, there remain several challenges that restrict the detection performance and need to be tackled. This article focuses on the following two aspects: 1) numerous tiny objects in remote-sensing images pose a challenge for the detectors pursuing high recall and accurate localization and 2) specific categories with large aspect ratios and arbitrary angles also trouble the regression of the detectors. We attempt to alleviate the above problems by constructing a weak feature extraction network (WFEN) and a dual appearance-aware (DA) loss. Specifically, WFEN is used to extract hierarchical weight vectors for multiscale feature layers by employing a lightweight convolutional module, aiming to fuse activation features distributed in different layers and provide pure features for subsequent regression and classification. DA loss is tailored to regressions of tiny and slender objects by dynamically modulating the associated loss on objects with various appearances, which consists of two auxiliary losses, termed scale-aware loss $\mathcal{L}_{S}$ and aspect-ratio-aware loss $\mathcal{L}_{A}$. These two components can contribute to each other, that is, the former provides more accurate features for detection tasks, while the latter can reciprocate the former by imposing constraints on crucial objects, and together constitute an appearance sensitivity detector (ASDet). Extensive experiments on three public datasets demonstrate that our ASDet outperforms all refine-stage detectors in terms of accuracy while maintaining the superior inference speed of single-stage counterparts. |
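The abstract describes the DA loss as dynamically modulating the regression loss according to each object's appearance, via a scale-aware term (favoring tiny objects) and an aspect-ratio-aware term (favoring slender objects). The following is a minimal illustrative sketch of that reweighting idea only; the function names, the specific modulation formulas, and the hyperparameters (`gamma_s`, `gamma_a`, `img_size`) are assumptions for illustration, not the paper's actual $\mathcal{L}_{S}$ and $\mathcal{L}_{A}$.

```python
import math

def appearance_aware_weight(w, h, img_size=1024, gamma_s=1.0, gamma_a=1.0):
    """Illustrative modulation factor combining a scale-aware term
    (up-weighting tiny objects) and an aspect-ratio-aware term
    (up-weighting slender objects). The formulas are assumptions,
    not reproduced from the paper."""
    scale = math.sqrt(w * h) / img_size            # relative object scale in (0, 1]
    aspect = max(w, h) / max(min(w, h), 1e-6)      # aspect ratio >= 1
    scale_term = (1.0 - scale) ** gamma_s          # grows as the object shrinks
    aspect_term = gamma_a * math.log(aspect)       # grows as the box elongates
    return 1.0 + scale_term + aspect_term

def modulated_reg_loss(base_loss, w, h):
    # Dynamically reweight a per-object regression loss by appearance,
    # so tiny and slender objects contribute more to the total loss.
    return appearance_aware_weight(w, h) * base_loss
```

Under this sketch, a tiny elongated box (e.g. 8x64 px) receives a larger weight than a large square box (e.g. 256x256 px), which mirrors the stated goal of concentrating the regression signal on tiny and slender objects.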
Author | Zhao, Hongyu; Sheng, Kai; Gong, Maoguo; Feng, Kai-Yuan; Wu, Yue; Tang, Zedong |
Author_xml | All authors: Key Laboratory of Collaborative Intelligence Systems, Ministry of Education, Xidian University, Xi'an, China.
1. Gong, Maoguo (gong@ieee.org; ORCID 0000-0002-0415-8556)
2. Zhao, Hongyu (dmhz@live.cn)
3. Wu, Yue (ywu@xidian.edu.cn; ORCID 0000-0002-3459-5079)
4. Tang, Zedong (omegatangzd@gmail.com)
5. Feng, Kai-Yuan (fkylwl@gmail.com; ORCID 0000-0003-4970-4175)
6. Sheng, Kai (kaisheng@xidian.edu.cn) |
CODEN | IGRSD2 |
CitedBy_id | 10.3390/rs16214065; 10.1109/TGRS.2024.3500013; 10.1109/TGRS.2024.3523305; 10.1109/TGRS.2025.3541220; 10.1109/TCSVT.2024.3444795; 10.3390/rs16234482; 10.1109/TGRS.2024.3481951; 10.1109/TGRS.2024.3486559 |
Cites_doi | 10.1109/tmm.2023.3305120 10.1145/2964284.2967274 10.1109/TGRS.2019.2930982 10.1007/978-3-030-58598-3_40 10.1109/CVPR52688.2022.00187 10.1109/CVPR.2019.00091 10.1007/978-3-030-20893-6_10 10.1109/CVPR.2016.90 10.1109/ICCV.2019.00832 10.1109/TPAMI.2021.3117983 10.48550/arXiv.1911.08287 10.1109/CVPR46437.2021.00281 10.1109/tgrs.2022.3149780 10.1109/tgrs.2023.3269642 10.1109/ICCV.2019.00840 10.1109/ICCV.2017.324 10.1609/aaai.v35i3.16336 10.1109/ICCV.2019.00929 10.1109/tgrs.2022.3231340 10.1007/978-3-030-58558-7_12 10.1109/TGRS.2020.2981203 10.1109/TIP.2022.3167307 10.1007/s11633-022-1339-y 10.1609/aaai.v36i1.19975 10.1109/TGRS.2019.2899955 10.1109/CVPR42600.2020.01122 10.1109/ICCV48922.2021.00350 10.1109/CVPR.2019.00296 10.1109/TCSVT.2022.3148392 10.1007/978-3-030-58545-7_6 10.1109/tgrs.2022.3183022 10.1109/tgrs.2022.3216215 10.1109/TPAMI.2016.2577031 10.1109/tgrs.2021.3069056 10.1109/TIP.2012.2219547 10.1109/CVPR46437.2021.00868 10.1109/TPAMI.2020.2974745 10.1109/ICCV.2019.00972 10.1109/tgrs.2021.3095186 10.1145/3503161.3548541 10.1109/CVPR.2018.00418 10.1109/tgrs.2021.3062048 10.1109/CVPR42600.2020.00978 10.1109/TMM.2018.2818020 10.1609/aaai.v35i4.16426 10.1609/aaai.v35i3.16347 10.1109/CVPR.2019.00075 10.1109/LGRS.2019.2936173 10.1109/CVPR.2017.106 |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
DOI | 10.1109/TGRS.2023.3344195 |
DatabaseName | IEEE All-Society Periodicals Package (ASPP) 2005–Present IEEE All-Society Periodicals Package (ASPP) 1998–Present IEEE Electronic Library (IEL) CrossRef Water Resources Abstracts Technology Research Database Environmental Sciences and Pollution Management ASFA: Aquatic Sciences and Fisheries Abstracts Engineering Research Database Aerospace Database Aquatic Science & Fisheries Abstracts (ASFA) 2: Ocean Technology, Policy & Non-Living Resources Civil Engineering Abstracts Aquatic Science & Fisheries Abstracts (ASFA) Professional Advanced Technologies Database with Aerospace |
DatabaseTitle | CrossRef Aerospace Database Civil Engineering Abstracts Aquatic Science & Fisheries Abstracts (ASFA) Professional Aquatic Science & Fisheries Abstracts (ASFA) 2: Ocean Technology, Policy & Non-Living Resources Technology Research Database ASFA: Aquatic Sciences and Fisheries Abstracts Engineering Research Database Advanced Technologies Database with Aerospace Water Resources Abstracts Environmental Sciences and Pollution Management |
DatabaseTitleList | Aerospace Database |
DeliveryMethod | fulltext_linktorsrc |
Discipline | Engineering Physics |
EISSN | 1558-0644 |
EndPage | 14 |
ExternalDocumentID | 10_1109_TGRS_2023_3344195 10364848 |
Genre | orig-research |
GrantInformation_xml | National Natural Science Foundation of China (grant 62036006; funder ID 10.13039/501100001809); Key-Area Research and Development Program of Guangdong Province (grant 2020B090921001) |
ISSN | 0196-2892 |
IsPeerReviewed | true |
IsScholarly | true |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
LinkModel | DirectLink |
ORCID | 0000-0002-3459-5079 0000-0002-0415-8556 0000-0003-4970-4175 |
PQID | 2911489433 |
PQPubID | 85465 |
PageCount | 14 |
ParticipantIDs | crossref_citationtrail_10_1109_TGRS_2023_3344195 ieee_primary_10364848 proquest_journals_2911489433 crossref_primary_10_1109_TGRS_2023_3344195 |
PublicationDate | 2024 |
PublicationDateYYYYMMDD | 2024-01-01 |
PublicationPlace | New York |
PublicationTitle | IEEE transactions on geoscience and remote sensing |
PublicationTitleAbbrev | TGRS |
PublicationYear | 2024 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
References | ref13 ref15 ref53 ref52 ref11 ref10 ref54 ref17 Li (ref12); 33 ref16 ref19 ref18 ref51 ref50 ref46 ref45 ref47 ref42 ref41 ref44 ref43 ref8 ref7 ref9 ref4 ref3 ref6 Tan (ref35) ref5 ref40 Yang (ref14) ref34 ref37 ref36 ref31 Yang (ref49) 2022 ref30 ref33 ref32 ref2 ref1 ref39 ref38 ref24 ref23 ref26 ref25 ref20 ref22 ref21 Yang (ref48); 34 ref28 ref27 ref29 |
References_xml | – ident: ref20 doi: 10.1109/tmm.2023.3305120 – ident: ref9 doi: 10.1145/2964284.2967274 – ident: ref43 doi: 10.1109/TGRS.2019.2930982 – volume: 34 start-page: 18381 volume-title: Proc. Adv. Neural Inf. Process. Syst. ident: ref48 article-title: Learning high-precision bounding box for rotated object detection via Kullback–Leibler divergence – ident: ref47 doi: 10.1007/978-3-030-58598-3_40 – year: 2022 ident: ref49 article-title: The KFIoU loss for rotated object detection publication-title: arXiv:2201.12558 – ident: ref51 doi: 10.1109/CVPR52688.2022.00187 – ident: ref34 doi: 10.1109/CVPR.2019.00091 – ident: ref42 doi: 10.1007/978-3-030-20893-6_10 – ident: ref38 doi: 10.1109/CVPR.2016.90 – ident: ref6 doi: 10.1109/ICCV.2019.00832 – ident: ref53 doi: 10.1109/TPAMI.2021.3117983 – ident: ref11 doi: 10.48550/arXiv.1911.08287 – ident: ref26 doi: 10.1109/CVPR46437.2021.00281 – ident: ref44 doi: 10.1109/tgrs.2022.3149780 – ident: ref8 doi: 10.1109/tgrs.2023.3269642 – ident: ref16 doi: 10.1109/ICCV.2019.00840 – ident: ref39 doi: 10.1109/ICCV.2017.324 – ident: ref46 doi: 10.1609/aaai.v35i3.16336 – ident: ref33 doi: 10.1109/ICCV.2019.00929 – ident: ref7 doi: 10.1109/tgrs.2022.3231340 – start-page: 6105 volume-title: Proc. Int. Conf. Mach. Learn. 
(ICML) ident: ref35 article-title: EfficientNet: Rethinking model scaling for convolutional neural networks – ident: ref13 doi: 10.1007/978-3-030-58558-7_12 – ident: ref18 doi: 10.1109/TGRS.2020.2981203 – ident: ref3 doi: 10.1109/TIP.2022.3167307 – ident: ref4 doi: 10.1007/s11633-022-1339-y – ident: ref19 doi: 10.1609/aaai.v36i1.19975 – ident: ref21 doi: 10.1109/TGRS.2019.2899955 – ident: ref27 doi: 10.1109/CVPR42600.2020.01122 – ident: ref31 doi: 10.1109/ICCV48922.2021.00350 – ident: ref23 doi: 10.1109/CVPR.2019.00296 – ident: ref5 doi: 10.1109/TCSVT.2022.3148392 – ident: ref40 doi: 10.1007/978-3-030-58545-7_6 – ident: ref28 doi: 10.1109/tgrs.2022.3183022 – ident: ref2 doi: 10.1109/tgrs.2022.3216215 – start-page: 11830 volume-title: Proc. Int. Conf. Mach. Learn. (ICML) ident: ref14 article-title: Rethinking rotated object detection with Gaussian Wasserstein distance loss – volume: 33 start-page: 21002 volume-title: Proc. Adv. Neural Inf. Process. Syst. ident: ref12 article-title: Generalized focal loss: Learning qualified and distributed bounding boxes for dense object detection – ident: ref41 doi: 10.1109/TPAMI.2016.2577031 – ident: ref17 doi: 10.1109/tgrs.2021.3069056 – ident: ref1 doi: 10.1109/TIP.2012.2219547 – ident: ref29 doi: 10.1109/CVPR46437.2021.00868 – ident: ref30 doi: 10.1109/TPAMI.2020.2974745 – ident: ref37 doi: 10.1109/ICCV.2019.00972 – ident: ref22 doi: 10.1109/tgrs.2021.3095186 – ident: ref54 doi: 10.1145/3503161.3548541 – ident: ref52 doi: 10.1109/CVPR.2018.00418 – ident: ref24 doi: 10.1109/tgrs.2021.3062048 – ident: ref50 doi: 10.1109/CVPR42600.2020.00978 – ident: ref15 doi: 10.1109/TMM.2018.2818020 – ident: ref25 doi: 10.1609/aaai.v35i4.16426 – ident: ref45 doi: 10.1609/aaai.v35i3.16347 – ident: ref10 doi: 10.1109/CVPR.2019.00075 – ident: ref36 doi: 10.1109/LGRS.2019.2936173 – ident: ref32 doi: 10.1109/CVPR.2017.106 |
SourceID | proquest crossref ieee |
SourceType | Aggregation Database Enrichment Source Index Database Publisher |
StartPage | 1 |
SubjectTerms | Anchor-free; appearance sensitivity detector (ASDet); Aspect ratio; Detection; Detectors; dual appearance-aware (DA) loss; Feature extraction; Localization; Object detection; Object recognition; oriented object detection; Remote sensing; remote-sensing images; Sensitivity; Sensors; Task analysis; Training; Vectors; weak feature extraction |
Title | Dual Appearance-Aware Enhancement for Oriented Object Detection |
URI | https://ieeexplore.ieee.org/document/10364848 https://www.proquest.com/docview/2911489433 |
Volume | 62 |