Projector deep feature extraction-based garbage image classification model using underwater images

Bibliographic Details
Published in: Multimedia Tools and Applications, Vol. 83, No. 33, pp. 79437–79451
Main Authors: Demir, Kubra; Yaman, Orhan
Format: Journal Article
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01.10.2024
ISSN: 1380-7501; 1573-7721
DOI: 10.1007/s11042-024-18731-w

Summary: Marine and ocean pollution is one of the most serious environmental problems in the world. Marine plastics pose a significant threat to the marine ecosystem due to their harmful effects. After passing through various processes, plastic waste accumulates on the seafloor and fragments into very small pieces known as microplastics, which contribute to the death and extinction of aquatic life. This study assembled a hybrid underwater dataset containing 13,089 images of garbage and sea animals, each sized 300 × 300. In the proposed method, this dataset is used to develop a projector deep feature generator. Built on the ResNet101 network, the projector-based generator produces 6,000 features, from which Neighborhood Component Analysis (NCA) selects the best 1,000. The resulting feature vectors are then classified with the k-nearest neighbor (kNN) algorithm, and tenfold cross-validation is used for validation. The best accuracy on the hybrid dataset was calculated as 99.35%. The comparisons and the computed performance measures show that the proposed method is successful.
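
The three-stage pipeline described in the summary (deep features from a pretrained ResNet101, NCA-based selection of the strongest 1,000 features, kNN classification under tenfold cross-validation) can be sketched as below. This is a minimal illustration, not the authors' implementation: the library choices (PyTorch/torchvision and scikit-learn), the 2048-dimensional avgpool features standing in for the paper's 6,000-feature projector scheme, and the NCA-weight ranking (scikit-learn's NCA learns a linear transform rather than per-feature weights) are all assumptions.

```python
# Minimal sketch of the abstract's three-stage pipeline:
# deep features -> NCA-based feature selection -> kNN with 10-fold CV.
# All library choices and names here are assumptions, not the paper's code.
import numpy as np
import torch
from PIL import Image
from torchvision import models, transforms
from sklearn.neighbors import KNeighborsClassifier, NeighborhoodComponentsAnalysis
from sklearn.model_selection import cross_val_score

# 1) Pretrained ResNet101 as a fixed feature extractor. The 2048-d avgpool
#    output is a stand-in; the paper's projector generates 6,000 features.
backbone = models.resnet101(weights=models.ResNet101_Weights.IMAGENET1K_V2)
backbone.fc = torch.nn.Identity()  # drop the ImageNet classifier head
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_features(paths):
    """Return an (n_images, 2048) array of deep features."""
    feats = [backbone(preprocess(Image.open(p).convert("RGB")).unsqueeze(0))
             .squeeze(0).numpy() for p in paths]
    return np.stack(feats)

# Synthetic stand-in so stages 2-3 run end to end; in practice X would come
# from extract_features(...) over the 13,089-image dataset with labels y.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2048)).astype(np.float32)
y = rng.integers(0, 3, size=300)

# 2) NCA-based selection. scikit-learn's NCA fits a linear transform, so we
#    rank features by the column norms of that transform (an approximation
#    of per-feature NCA weights) and keep the strongest 1,000.
nca = NeighborhoodComponentsAnalysis(n_components=32, random_state=0).fit(X, y)
importance = np.linalg.norm(nca.components_, axis=0)
X_sel = X[:, np.argsort(importance)[::-1][:1000]]

# 3) kNN classifier evaluated with tenfold cross-validation.
scores = cross_val_score(KNeighborsClassifier(n_neighbors=1), X_sel, y, cv=10)
print(f"10-fold mean accuracy: {scores.mean():.4f}")
```

Swapping in the paper's actual 6,000-feature projector would only change the feature-extraction step; the selection and classification stages would remain the same.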