Learning from algorithm-generated pseudo-annotations for detecting ants in videos
Deep learning (DL) based detection models are powerful tools for large-scale analysis of dynamic biological behaviors in video data. Supervised training of a DL detection model often requires a large amount of manually labeled training data, which is time-consuming and labor-intensive to acquire. In...
| Published in | Scientific Reports Vol. 13; no. 1; article 11566; 10 pages |
|---|---|
| Main Authors | Zhang, Yizhe; Imirzian, Natalie; Kurze, Christoph; Zheng, Hao; Hughes, David P.; Chen, Danny Z. |
| Format | Journal Article |
| Language | English |
| Published | London: Nature Publishing Group UK, 18.07.2023 (Nature Publishing Group; Nature Portfolio) |
| Subjects | Algorithms; Annotations; Deep learning; Neural networks; Rainforests; Science (multidisciplinary) |
| Online Access | Get full text |
| ISSN | 2045-2322 |
| DOI | 10.1038/s41598-023-28734-6 |
| Abstract | Deep learning (DL) based detection models are powerful tools for large-scale analysis of dynamic biological behaviors in video data. Supervised training of a DL detection model often requires a large amount of manually labeled training data, which is time-consuming and labor-intensive to acquire. In this paper, we propose LFAGPA (Learn From Algorithm-Generated Pseudo-Annotations), which utilizes (noisy) annotations that are automatically generated by algorithms to train DL models for ant detection in videos. Our method consists of two main steps: (1) generate foreground objects using a (set of) state-of-the-art foreground extraction algorithm(s); (2) treat the results from step (1) as pseudo-annotations and use them to train deep neural networks for ant detection. We tackle several challenges: how to make use of automatically generated noisy annotations, how to learn from multiple annotation sources, and how to combine algorithm-generated annotations with human-labeled annotations (when available) in this learning framework. In experiments, we evaluate our method on 82 videos (20,348 image frames in total) captured under natural conditions in a tropical rainforest for dynamic ant behavior study. With no manual annotation cost, using only algorithm-generated annotations, our method achieves a decent detection performance (77% in $F_1$ score). Moreover, when using only 10% of the manual annotations, our method can train a DL model that performs as well as one trained with the full set of human annotations (81% in $F_1$ score). |
|---|---|
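Step (1) of the abstract turns raw video into foreground detections that serve as pseudo-annotations. The sketch below illustrates that idea; the MOG2 background subtractor from OpenCV and the ant-sized area thresholds are illustrative stand-ins, not the specific state-of-the-art foreground extraction algorithm(s) used in the paper.

```python
# Sketch of step (1): turn raw video into bounding-box pseudo-annotations via
# foreground extraction. MOG2 background subtraction and the area thresholds
# below are illustrative stand-ins for the state-of-the-art foreground
# extraction algorithm(s) referenced in the abstract.
import cv2

def pseudo_annotate(video_path, min_area=30, max_area=5000):
    """Yield (frame_index, [(x, y, w, h), ...]) pseudo-boxes for each frame."""
    cap = cv2.VideoCapture(video_path)
    bg_sub = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = bg_sub.apply(frame)                             # raw foreground mask
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # suppress speckle noise
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        boxes = []
        for c in contours:
            x, y, w, h = cv2.boundingRect(c)
            if min_area <= w * h <= max_area:                  # keep roughly ant-sized blobs
                boxes.append((x, y, w, h))
        yield frame_idx, boxes
        frame_idx += 1
    cap.release()
```

In a pipeline like this, the yielded boxes would be written out in the detector's annotation format and used as noisy training labels for step (2).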
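The abstract also mentions learning from multiple annotation resources. One simple way to combine pseudo-annotations from several foreground extractors is a consensus rule that keeps only boxes on which multiple sources agree; the sketch below assumes an IoU threshold of 0.5 and a two-source vote, both hypothetical parameters rather than the authors' actual fusion scheme.

```python
# Sketch of fusing pseudo-annotations produced by several foreground extraction
# algorithms. The IoU threshold and the "at least min_votes sources agree" rule
# are assumed parameters for illustration, not the paper's exact fusion scheme.

def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax2, ay2 = a[0] + a[2], a[1] + a[3]
    bx2, by2 = b[0] + b[2], b[1] + b[3]
    iw = max(0, min(ax2, bx2) - max(a[0], b[0]))
    ih = max(0, min(ay2, by2) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def consensus_boxes(box_sets, iou_thr=0.5, min_votes=2):
    """Keep a box if it is supported by at least `min_votes` sources in total."""
    kept = []
    for i, boxes in enumerate(box_sets):
        for box in boxes:
            votes = 1 + sum(  # the box's own source counts as one vote
                any(iou(box, other) >= iou_thr for other in others)
                for j, others in enumerate(box_sets) if j != i
            )
            already_kept = any(iou(box, k) >= iou_thr for k in kept)
            if votes >= min_votes and not already_kept:
                kept.append(box)
    return kept
```

For example, `consensus_boxes([boxes_from_extractor_a, boxes_from_extractor_b])` would retain the mutually supported boxes as a cleaner pseudo-label set; when a small fraction of human labels is available, one option is simply to add those labeled frames to the same training pool.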
| ArticleNumber | 11566 |
| Author | Hughes, David P.; Zhang, Yizhe; Zheng, Hao; Imirzian, Natalie; Kurze, Christoph; Chen, Danny Z. |
| Author_xml | 1. Zhang, Yizhe (zhangyizhe@njust.edu.cn) – School of Computer Science and Engineering, Nanjing University of Science and Technology; 2. Imirzian, Natalie – Department of Entomology and Department of Biology, Pennsylvania State University; Department of Bioengineering, Imperial College London; 3. Kurze, Christoph – Department of Entomology and Department of Biology, Pennsylvania State University; Institute for Zoology, University of Regensburg; 4. Zheng, Hao – Department of Computer Science and Engineering, University of Notre Dame; 5. Hughes, David P. – Department of Entomology and Department of Biology, Pennsylvania State University; 6. Chen, Danny Z. – Department of Computer Science and Engineering, University of Notre Dame |
| BackLink | https://www.ncbi.nlm.nih.gov/pubmed/37464003 (View this record in MEDLINE/PubMed) |
| Cites_doi | 10.3389/fmars.2018.00319 10.1016/j.zool.2016.03.007 10.1146/annurev-ento-010715-023711 10.1109/TIP.2015.2419084 10.1038/s41598-019-49655-3 10.1098/rstb.2017.0012 10.1242/jeb.01831 10.1038/35082745 10.1016/j.jip.2020.107506 10.2193/2006-465 10.1007/s00359-006-0116-7 10.1109/TPAMI.2012.97 10.1109/CVPR.2015.7298965 10.1609/aaai.v30i1.10141 10.1109/CVPR.2017.638 10.1109/TNNLS.2022.3152527 10.1109/ICCV.2017.322 10.1007/978-3-319-24574-4_28 10.1109/ICCV.2017.211 10.2307/j.ctvs32s3w 10.1109/CVPR.2016.91 10.1007/978-3-319-46448-0_2 10.1109/ICCV.2015.167 10.1007/978-3-319-46466-4_5 10.1109/CVPR42600.2020.00975 10.1109/CVPR.2018.00582 10.1109/ICCV.2019.00524 10.1109/TPAMI.2004.1273918 10.1007/978-3-319-46493-0_35 10.1609/aaai.v33i01.33015909 10.1080/00031305.1994.10476030 |
| ContentType | Journal Article |
| Copyright | The Author(s) 2023 2023. The Author(s). The Author(s) 2023. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
| Copyright_xml | – notice: The Author(s) 2023 – notice: 2023. The Author(s). – notice: The Author(s) 2023. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. |
| DBID | C6C AAYXX CITATION NPM 3V. 7X7 7XB 88A 88E 88I 8FE 8FH 8FI 8FJ 8FK ABUWG AEUYN AFKRA AZQEC BBNVY BENPR BHPHI CCPQU DWQXO FYUFA GHDGH GNUQQ HCIFZ K9. LK8 M0S M1P M2P M7P PHGZM PHGZT PIMPY PJZUB PKEHL PPXIY PQEST PQGLB PQQKQ PQUKI PRINS Q9U 7X8 5PM ADTOC UNPAY DOA |
| DOI | 10.1038/s41598-023-28734-6 |
| DatabaseName | Springer Nature OA Free Journals CrossRef PubMed ProQuest Central (Corporate) Health & Medical Collection ProQuest Central (purchase pre-March 2016) Biology Database (Alumni Edition) Medical Database (Alumni Edition) Science Database (Alumni Edition) ProQuest SciTech Collection ProQuest Natural Science Journals Hospital Premium Collection Hospital Premium Collection (Alumni Edition) ProQuest Central (Alumni) (purchase pre-March 2016) ProQuest Central (Alumni) ProQuest One Sustainability ProQuest Central UK/Ireland ProQuest Central Essentials ProQuest SciTech Premium Collection Natural Science Collection Biological Science Collection ProQuest Central Natural Science Collection ProQuest One ProQuest Central Korea Health Research Premium Collection Health Research Premium Collection (Alumni) ProQuest Central Student SciTech Premium Collection ProQuest Health & Medical Complete (Alumni) Biological Sciences ProQuest Health & Medical Collection Medical Database Science Database Biological Science Database ProQuest Central Premium ProQuest One Academic (New) ProQuest Publicly Available Content Database ProQuest Health & Medical Research Collection ProQuest One Academic Middle East (New) ProQuest One Health & Nursing ProQuest One Academic Eastern Edition (DO NOT USE) ProQuest One Applied & Life Sciences ProQuest One Academic ProQuest One Academic UKI Edition ProQuest Central China ProQuest Central Basic MEDLINE - Academic PubMed Central (Full Participant titles) Unpaywall for CDI: Periodical Content Unpaywall DOAJ Directory of Open Access Journals |
| DatabaseTitle | CrossRef PubMed Publicly Available Content Database ProQuest Central Student ProQuest One Academic Middle East (New) ProQuest Central Essentials ProQuest Health & Medical Complete (Alumni) ProQuest Central (Alumni Edition) SciTech Premium Collection ProQuest One Community College ProQuest One Health & Nursing ProQuest Natural Science Collection ProQuest Central China ProQuest Biology Journals (Alumni Edition) ProQuest Central ProQuest One Applied & Life Sciences ProQuest One Sustainability ProQuest Health & Medical Research Collection Health Research Premium Collection Health and Medicine Complete (Alumni Edition) Natural Science Collection ProQuest Central Korea Health & Medical Research Collection Biological Science Collection ProQuest Central (New) ProQuest Medical Library (Alumni) ProQuest Science Journals (Alumni Edition) ProQuest Biological Science Collection ProQuest Central Basic ProQuest Science Journals ProQuest One Academic Eastern Edition ProQuest Hospital Collection Health Research Premium Collection (Alumni) Biological Science Database ProQuest SciTech Collection ProQuest Hospital Collection (Alumni) ProQuest Health & Medical Complete ProQuest Medical Library ProQuest One Academic UKI Edition ProQuest One Academic ProQuest One Academic (New) ProQuest Central (Alumni) MEDLINE - Academic |
| DatabaseTitleList | MEDLINE - Academic Publicly Available Content Database PubMed CrossRef |
| Database_xml | – sequence: 1 dbid: C6C name: Springer Nature Link OA Free Journals url: http://www.springeropen.com/ sourceTypes: Publisher – sequence: 2 dbid: DOA name: DOAJ Directory of Open Access Journals url: https://www.doaj.org/ sourceTypes: Open Website – sequence: 3 dbid: NPM name: PubMed url: https://proxy.k.utb.cz/login?url=http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=PubMed sourceTypes: Index Database – sequence: 4 dbid: UNPAY name: Unpaywall url: https://proxy.k.utb.cz/login?url=https://unpaywall.org/ sourceTypes: Open Access Repository – sequence: 5 dbid: BENPR name: ProQuest Central url: http://www.proquest.com/pqcentral?accountid=15518 sourceTypes: Aggregation Database |
| DeliveryMethod | fulltext_linktorsrc |
| Discipline | Biology |
| EISSN | 2045-2322 |
| EndPage | 10 |
| ExternalDocumentID | oai_doaj_org_article_de846acb5b63467cbc31d73e71a723ec 10.1038/s41598-023-28734-6 PMC10354180 37464003 10_1038_s41598_023_28734_6 |
| Genre | Journal Article |
| GrantInformation_xml | – fundername: US National Institutes of Health grantid: R01 GM116927 – fundername: NIGMS NIH HHS grantid: R01 GM116927 – fundername: ; grantid: R01 GM116927 |
| GroupedDBID | 0R~ 3V. 4.4 53G 5VS 7X7 88A 88E 88I 8FE 8FH 8FI 8FJ AAFWJ AAJSJ AAKDD ABDBF ABUWG ACGFS ACSMW ACUHS ADBBV ADRAZ AENEX AEUYN AFKRA AJTQC ALIPV ALMA_UNASSIGNED_HOLDINGS AOIJS AZQEC BAWUL BBNVY BCNDV BENPR BHPHI BPHCQ BVXVI C6C CCPQU DIK DWQXO EBD EBLON EBS ESX FYUFA GNUQQ GROUPED_DOAJ GX1 HCIFZ HH5 HMCUK HYE KQ8 LK8 M0L M1P M2P M48 M7P M~E NAO OK1 PIMPY PQQKQ PROAC PSQYO RNT RNTTT RPM SNYQT UKHRP AASML AAYXX AFPKN CITATION PHGZM PHGZT PJZUB PPXIY PQGLB PUEGO NPM 7XB 8FK K9. PKEHL PQEST PQUKI PRINS Q9U 7X8 5PM ADTOC EJD IPNFZ RIG UNPAY |
| ID | FETCH-LOGICAL-c492t-3736adba80b44dd1d7d81ca905349d51e413ab91be8892ce0f868636ea112e893 |
| IEDL.DBID | M48 |
| ISSN | 2045-2322 |
| IngestDate | Tue Oct 14 18:11:12 EDT 2025 Sun Oct 26 04:05:02 EDT 2025 Tue Sep 30 17:12:42 EDT 2025 Fri Sep 05 13:02:26 EDT 2025 Tue Oct 07 08:05:36 EDT 2025 Wed Feb 19 02:23:14 EST 2025 Wed Oct 01 05:01:14 EDT 2025 Fri Feb 21 02:39:40 EST 2025 |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | true |
| IsScholarly | true |
| Issue | 1 |
| Language | English |
| License | 2023. The Author(s). Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. cc-by |
| LinkModel | DirectLink |
| MergedId | FETCHMERGED-LOGICAL-c492t-3736adba80b44dd1d7d81ca905349d51e413ab91be8892ce0f868636ea112e893 |
| Notes | ObjectType-Article-1 SourceType-Scholarly Journals-1 ObjectType-Feature-2 content type line 14 content type line 23 |
| OpenAccessLink | https://www.nature.com/articles/s41598-023-28734-6.pdf |
| PMID | 37464003 |
| PQID | 2838888131 |
| PQPubID | 2041939 |
| PageCount | 10 |
| ParticipantIDs | doaj_primary_oai_doaj_org_article_de846acb5b63467cbc31d73e71a723ec unpaywall_primary_10_1038_s41598_023_28734_6 pubmedcentral_primary_oai_pubmedcentral_nih_gov_10354180 proquest_miscellaneous_2839739627 proquest_journals_2838888131 pubmed_primary_37464003 crossref_primary_10_1038_s41598_023_28734_6 springer_journals_10_1038_s41598_023_28734_6 |
| ProviderPackageCode | CITATION AAYXX |
| PublicationCentury | 2000 |
| PublicationDate | 2023-07-18 |
| PublicationDateYYYYMMDD | 2023-07-18 |
| PublicationDate_xml | – month: 07 year: 2023 text: 2023-07-18 day: 18 |
| PublicationDecade | 2020 |
| PublicationPlace | London |
| PublicationPlace_xml | – name: London – name: England |
| PublicationTitle | Scientific reports |
| PublicationTitleAbbrev | Sci Rep |
| PublicationTitleAlternate | Sci Rep |
| PublicationYear | 2023 |
| Publisher | Nature Publishing Group UK Nature Publishing Group Nature Portfolio |
| Publisher_xml | – name: Nature Publishing Group UK – name: Nature Publishing Group – name: Nature Portfolio |
| References | Hölldobler, Wilson (CR8) 2009 Elias, Land, Mason, Hoy (CR5) 2006; 192 CR19 CR18 CR17 Korb, Heinze (CR11) 2016; 61 CR16 CR36 CR35 Patek, Caldwell (CR4) 2005; 208 CR34 CR33 CR10 CR32 CR31 CR30 Liu, Zhao, Yao, Qi (CR39) 2015; 24 Pukelsheim (CR40) 1994; 48 Oreifej, Li, Shah (CR38) 2012; 35 Torres, Nieukirk, Lemos, Chandler (CR6) 2018; 5 Gordon (CR12) 1999 CR29 CR28 CR9 CR27 Torney, Hopcraft, Morrison, Couzin, Levin (CR7) 2018; 373 CR26 Imirzian (CR1) 2019; 9 CR25 Kurze, Routtu, Moritz (CR14) 2016; 119 CR24 CR23 Schmid-Hempel (CR13) 1998 CR22 Thomas, Thorne (CR3) 2001; 411 CR21 CR20 CR42 CR41 Horn, Arnett, Kunz (CR2) 2008; 72 Wilfert, Brown, Doublet (CR15) 2021; 186 Bai (CR37) 2021; 34 28734_CR19 28734_CR16 28734_CR18 28734_CR17 F Pukelsheim (28734_CR40) 1994; 48 X Liu (28734_CR39) 2015; 24 L Wilfert (28734_CR15) 2021; 186 GL Thomas (28734_CR3) 2001; 411 28734_CR23 DM Gordon (28734_CR12) 1999 28734_CR22 28734_CR25 CJ Torney (28734_CR7) 2018; 373 28734_CR9 28734_CR24 28734_CR41 Y Bai (28734_CR37) 2021; 34 28734_CR21 DO Elias (28734_CR5) 2006; 192 28734_CR20 28734_CR42 LG Torres (28734_CR6) 2018; 5 28734_CR27 B Hölldobler (28734_CR8) 2009 C Kurze (28734_CR14) 2016; 119 28734_CR26 28734_CR29 28734_CR28 N Imirzian (28734_CR1) 2019; 9 J Korb (28734_CR11) 2016; 61 JW Horn (28734_CR2) 2008; 72 S Patek (28734_CR4) 2005; 208 28734_CR34 28734_CR33 28734_CR36 28734_CR35 28734_CR30 P Schmid-Hempel (28734_CR13) 1998 28734_CR10 28734_CR32 28734_CR31 O Oreifej (28734_CR38) 2012; 35 |
| References_xml | – ident: CR22 – volume: 48 start-page: 88 year: 1994 end-page: 91 ident: CR40 article-title: The three sigma rule publication-title: Am. Stat. – ident: CR18 – volume: 5 start-page: 319 year: 2018 ident: CR6 article-title: Drone up! Quantifying whale behavior from a new perspective improves observational capacity publication-title: Front. Mar. Sci. doi: 10.3389/fmars.2018.00319 – volume: 119 start-page: 290 year: 2016 end-page: 297 ident: CR14 article-title: Parasite resistance and tolerance in honeybees at the individual and social level publication-title: Zoology doi: 10.1016/j.zool.2016.03.007 – ident: CR16 – volume: 61 start-page: 297 year: 2016 end-page: 316 ident: CR11 article-title: Major hurdles for the evolution of sociality publication-title: Annu. Rev. Entomol. doi: 10.1146/annurev-ento-010715-023711 – ident: CR30 – volume: 24 start-page: 2502 year: 2015 end-page: 2514 ident: CR39 article-title: Background subtraction based on low-rank and structured sparse decomposition publication-title: IEEE Trans. Image Process. doi: 10.1109/TIP.2015.2419084 – ident: CR10 – ident: CR33 – volume: 9 start-page: 1 year: 2019 end-page: 10 ident: CR1 article-title: Automated tracking and analysis of ant trajectories shows variation in forager exploration publication-title: Sci. Rep. doi: 10.1038/s41598-019-49655-3 – volume: 34 start-page: 24392 year: 2021 end-page: 24403 ident: CR37 article-title: Understanding and improving early stopping for learning with noisy labels publication-title: Adv. Neural. Inf. Process. Syst. – volume: 373 start-page: 20170012 year: 2018 ident: CR7 article-title: From single steps to mass migration: The problem of scale in the movement ecology of the Serengeti wildebeest publication-title: Philos. Trans. R. Soc. B Biol. Sci. doi: 10.1098/rstb.2017.0012 – ident: CR35 – volume: 208 start-page: 3655 year: 2005 end-page: 3664 ident: CR4 article-title: Extreme impact and cavitation forces of a biological hammer: Strike forces of the peacock mantis shrimp publication-title: J. Exp. Biol. doi: 10.1242/jeb.01831 – ident: CR29 – volume: 411 start-page: 1013 year: 2001 ident: CR3 article-title: Night-time predation by Steller sea lions publication-title: Nature doi: 10.1038/35082745 – volume: 186 year: 2021 ident: CR15 article-title: Onehealth implications of infectious diseases of wild and managed bees publication-title: J. Invertebr. Pathol. doi: 10.1016/j.jip.2020.107506 – ident: CR25 – ident: CR27 – ident: CR42 – ident: CR23 – ident: CR21 – year: 1999 ident: CR12 publication-title: Ants at Work: How an Insect Society is Organized – ident: CR19 – year: 2009 ident: CR8 publication-title: The Superorganism: The Beauty, Elegance, and Strangeness of Insect Societies – ident: CR17 – ident: CR31 – ident: CR9 – volume: 72 start-page: 123 year: 2008 end-page: 132 ident: CR2 article-title: Behavioral responses of bats to operating wind turbines publication-title: J. Wildl. Manag. doi: 10.2193/2006-465 – ident: CR32 – ident: CR34 – year: 1998 ident: CR13 publication-title: Parasites in Social Insects – ident: CR36 – ident: CR28 – ident: CR41 – volume: 192 start-page: 785 year: 2006 end-page: 797 ident: CR5 article-title: Measuring and quantifying dynamic visual signals in jumping spiders publication-title: J. Comp. Physiol. A. 
doi: 10.1007/s00359-006-0116-7 – ident: CR26 – ident: CR24 – ident: CR20 – volume: 35 start-page: 450 year: 2012 end-page: 462 ident: CR38 article-title: Simultaneous video stabilization and moving object detection in turbulence publication-title: IEEE Trans. Pattern Anal. Mach. Intell. doi: 10.1109/TPAMI.2012.97 – ident: 28734_CR20 doi: 10.1109/CVPR.2015.7298965 – ident: 28734_CR23 doi: 10.1609/aaai.v30i1.10141 – volume: 119 start-page: 290 year: 2016 ident: 28734_CR14 publication-title: Zoology doi: 10.1016/j.zool.2016.03.007 – volume: 411 start-page: 1013 year: 2001 ident: 28734_CR3 publication-title: Nature doi: 10.1038/35082745 – ident: 28734_CR21 – volume: 5 start-page: 319 year: 2018 ident: 28734_CR6 publication-title: Front. Mar. Sci. doi: 10.3389/fmars.2018.00319 – ident: 28734_CR31 doi: 10.1109/CVPR.2017.638 – ident: 28734_CR32 doi: 10.1109/TNNLS.2022.3152527 – ident: 28734_CR17 doi: 10.1109/ICCV.2017.322 – volume: 35 start-page: 450 year: 2012 ident: 28734_CR38 publication-title: IEEE Trans. Pattern Anal. Mach. Intell. doi: 10.1109/TPAMI.2012.97 – volume: 208 start-page: 3655 year: 2005 ident: 28734_CR4 publication-title: J. Exp. Biol. doi: 10.1242/jeb.01831 – volume: 192 start-page: 785 year: 2006 ident: 28734_CR5 publication-title: J. Comp. Physiol. A. doi: 10.1007/s00359-006-0116-7 – volume: 9 start-page: 1 year: 2019 ident: 28734_CR1 publication-title: Sci. Rep. doi: 10.1038/s41598-019-49655-3 – volume-title: Parasites in Social Insects year: 1998 ident: 28734_CR13 – volume: 24 start-page: 2502 year: 2015 ident: 28734_CR39 publication-title: IEEE Trans. Image Process. doi: 10.1109/TIP.2015.2419084 – ident: 28734_CR22 doi: 10.1007/978-3-319-24574-4_28 – ident: 28734_CR33 doi: 10.1109/ICCV.2017.211 – ident: 28734_CR10 doi: 10.2307/j.ctvs32s3w – ident: 28734_CR19 doi: 10.1109/CVPR.2016.91 – ident: 28734_CR18 doi: 10.1007/978-3-319-46448-0_2 – ident: 28734_CR25 – volume-title: Ants at Work: How an Insect Society is Organized year: 1999 ident: 28734_CR12 – ident: 28734_CR9 – ident: 28734_CR26 doi: 10.1109/ICCV.2015.167 – volume: 72 start-page: 123 year: 2008 ident: 28734_CR2 publication-title: J. Wildl. Manag. doi: 10.2193/2006-465 – volume: 61 start-page: 297 year: 2016 ident: 28734_CR11 publication-title: Annu. Rev. Entomol. doi: 10.1146/annurev-ento-010715-023711 – ident: 28734_CR28 doi: 10.1007/978-3-319-46466-4_5 – ident: 28734_CR16 – volume-title: The Superorganism: The Beauty, Elegance, and Strangeness of Insect Societies year: 2009 ident: 28734_CR8 – ident: 28734_CR30 – ident: 28734_CR29 doi: 10.1109/CVPR42600.2020.00975 – ident: 28734_CR34 doi: 10.1109/CVPR.2018.00582 – volume: 373 start-page: 20170012 year: 2018 ident: 28734_CR7 publication-title: Philos. Trans. R. Soc. B Biol. Sci. doi: 10.1098/rstb.2017.0012 – ident: 28734_CR36 doi: 10.1109/ICCV.2019.00524 – volume: 34 start-page: 24392 year: 2021 ident: 28734_CR37 publication-title: Adv. Neural. Inf. Process. Syst. – volume: 186 year: 2021 ident: 28734_CR15 publication-title: J. Invertebr. Pathol. doi: 10.1016/j.jip.2020.107506 – ident: 28734_CR42 doi: 10.1109/TPAMI.2004.1273918 – ident: 28734_CR41 – ident: 28734_CR27 doi: 10.1007/978-3-319-46493-0_35 – ident: 28734_CR24 – ident: 28734_CR35 doi: 10.1609/aaai.v33i01.33015909 – volume: 48 start-page: 88 year: 1994 ident: 28734_CR40 publication-title: Am. Stat. doi: 10.1080/00031305.1994.10476030 |
| SSID | ssj0000529419 |
| Score | 2.408195 |
| Snippet | Deep learning (DL) based detection models are powerful tools for large-scale analysis of dynamic biological behaviors in video data. Supervised training of a... Abstract Deep learning (DL) based detection models are powerful tools for large-scale analysis of dynamic biological behaviors in video data. Supervised... |
| SourceID | doaj unpaywall pubmedcentral proquest pubmed crossref springer |
| SourceType | Open Website Open Access Repository Aggregation Database Index Database Publisher |
| StartPage | 11566 |
| SubjectTerms | 631/114/1305 631/114/1314 631/114/1564 631/114/2397 Algorithms Annotations Deep learning Humanities and Social Sciences multidisciplinary Neural networks Rainforests Science Science (multidisciplinary) |
| Title | Learning from algorithm-generated pseudo-annotations for detecting ants in videos |
| URI | https://link.springer.com/article/10.1038/s41598-023-28734-6 https://www.ncbi.nlm.nih.gov/pubmed/37464003 https://www.proquest.com/docview/2838888131 https://www.proquest.com/docview/2839739627 https://pubmed.ncbi.nlm.nih.gov/PMC10354180 https://www.nature.com/articles/s41598-023-28734-6.pdf https://doaj.org/article/de846acb5b63467cbc31d73e71a723ec |
| UnpaywallVersion | publishedVersion |
| Volume | 13 |
| hasFullText | 1 |
| inHoldings | 1 |
| isFullTextHit | |
| isPrint | |
| linkProvider | Scholars Portal |
| openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=Learning+from+algorithm-generated+pseudo-annotations+for+detecting+ants+in+videos&rft.jtitle=Scientific+reports&rft.au=Zhang%2C+Yizhe&rft.au=Imirzian%2C+Natalie&rft.au=Kurze%2C+Christoph&rft.au=Zheng%2C+Hao&rft.date=2023-07-18&rft.issn=2045-2322&rft.eissn=2045-2322&rft.volume=13&rft.issue=1&rft_id=info:doi/10.1038%2Fs41598-023-28734-6&rft.externalDBID=n%2Fa&rft.externalDocID=10_1038_s41598_023_28734_6 |