LostNet: A smart way for lost and find


Bibliographic Details
Published in: PLoS ONE, Vol. 19, No. 10, p. e0310998
Main Authors: Zhou, Meihua; Fung, Ivan; Yang, Li; Wan, Nan; Di, Keke; Wang, Tingting
Format: Journal Article
Language: English
Published: United States: Public Library of Science (PLoS), 30.10.2024
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0310998


More Information
Summary: The rapid population growth in urban areas has led to an increased frequency of lost and unclaimed items in public spaces such as public transportation, restaurants, and other venues. Services like Find My iPhone efficiently track lost electronic devices, but many valuable items remain unmonitored, resulting in delays in reclaiming lost and found items. This research presents a method to streamline the search process by comparing images of lost items provided by their owners with photos taken when items are registered as found. A photo matching network is proposed, integrating the transfer learning capabilities of MobileNetV2 with the Convolutional Block Attention Module (CBAM) and utilizing perceptual hashing algorithms for their simplicity and speed. A web framework based on Spring Boot supports the development of an online lost-and-found image identification system. The implementation achieves a testing accuracy of 96.8% while requiring only 0.67 GFLOPs and 3.5M training parameters, enabling recognition of images in real-world scenarios while remaining operable on standard laptops.
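The abstract names two matching techniques: a MobileNetV2 backbone combined with CBAM attention, and perceptual hashing as a cheap pre-filter. The sketch below is an illustrative reconstruction of those two ideas, not the authors' published code; the class and function names (LostNetSketch, phash_prefilter), the embedding head, and the Hamming-distance threshold are assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

class CBAM(nn.Module):
    """Convolutional Block Attention Module: channel attention then spatial attention."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels))
        self.spatial = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # channel attention from average pooling
        mx = self.mlp(x.amax(dim=(2, 3)))    # ... and from max pooling
        x = x * torch.sigmoid(avg + mx).view(b, c, 1, 1)
        s = torch.cat([x.mean(1, keepdim=True), x.amax(1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))   # spatial attention map

class LostNetSketch(nn.Module):
    """MobileNetV2 features (transfer learning) + CBAM + a small embedding head."""
    def __init__(self, embed_dim=128):
        super().__init__()
        backbone = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
        self.features = backbone.features            # pretrained, frozen for transfer learning
        for p in self.features.parameters():
            p.requires_grad = False
        self.cbam = CBAM(1280)                       # MobileNetV2 ends with 1280 channels
        self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                  nn.Linear(1280, embed_dim))

    def forward(self, x):
        return self.head(self.cbam(self.features(x)))

if __name__ == "__main__":
    model = LostNetSketch().eval()
    query, candidate = torch.randn(1, 3, 224, 224), torch.randn(1, 3, 224, 224)
    with torch.no_grad():
        sim = torch.cosine_similarity(model(query), model(candidate))
    print(f"embedding similarity: {sim.item():.3f}")
```

The perceptual-hash step could serve as a fast first pass before the CNN comparison. Here the third-party `imagehash` package is used for illustration; the paper may implement its own hashing.

```python
from PIL import Image
import imagehash

def phash_prefilter(query_path, candidate_path, max_distance=10):
    """Compare 64-bit perceptual hashes; only close pairs proceed to the CNN matcher."""
    d = imagehash.phash(Image.open(query_path)) - imagehash.phash(Image.open(candidate_path))
    return d <= max_distance  # Hamming-distance threshold is illustrative
```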
Competing Interests: The authors have declared that no competing interests exist.