GreedyCenters: Satellite imagery adaptive sampling method for artificial neural networks training

Bibliographic Details
Published in: E3S Web of Conferences, Vol. 310; p. 2001
Main Author: Gvozdev, Oleg
Format: Journal Article; Conference Proceeding
Language: English
Published: Les Ulis: EDP Sciences, 01.01.2021
ISSN: 2267-1242; 2555-0403
DOI: 10.1051/e3sconf/202131002001

More Information
Summary: One of the most significant particularities of satellite imagery is the large size of the images, which exceeds by orders of magnitude the capacity of modern GPGPUs to train neural networks on them at full size. On the other hand, satellite imagery tends to be of limited availability. Moreover, the objects of interest tend to constitute a small fraction of the whole dataset. This creates a demand for sample extraction and augmentation methods specialized for satellite imagery. Yet this area remains immensely understudied, so almost all widely used methods are limited to grid-based sample extraction and to augmentation via combinations of 90-degree rotations and mirroring about the vertical or horizontal axis. This paper proposes a domain-agnostic method of sample extraction and augmentation. Adapting the method to a specific subject area amounts to providing a domain-specific way to generate a significance field for the image. In contrast to trivial greedy solutions and to more advanced stochastic optimization methods, the design of the proposed method focuses on maximizing per-step progress. This keeps its performance reasonably good even without low-level optimizations and without significant quality loss. It can be easily implemented using widely known open-source software libraries.
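The summary only sketches the idea at a high level. As a rough, hypothetical illustration of a greedy-centers scheme (iteratively picking the currently most significant pixel as the next sample-window center, then suppressing the covered area so each step makes maximal progress), a minimal NumPy sketch might look as follows. The function name, the zeroing-out suppression rule, and the square window are assumptions made for illustration, not details taken from the paper.

    import numpy as np

    def greedy_sample_centers(significance, num_samples, window):
        # Hypothetical sketch of a greedy-centers sampler; not the paper's
        # actual algorithm. 'significance' is a 2-D float array in which
        # higher values mark regions more important to cover with samples.
        field = significance.astype(np.float64).copy()
        half = window // 2
        centers = []
        for _ in range(num_samples):
            # Maximize per-step progress: take the most significant pixel left.
            r, c = np.unravel_index(np.argmax(field), field.shape)
            if field[r, c] <= 0.0:
                break  # nothing significant remains uncovered
            centers.append((int(r), int(c)))
            # Suppress the window just covered so later picks spread out
            # (one simple suppression rule among many possible ones).
            r0, r1 = max(0, r - half), min(field.shape[0], r + half + 1)
            c0, c1 = max(0, c - half), min(field.shape[1], c + half + 1)
            field[r0:r1, c0:c1] = 0.0
        return centers

    # Toy usage: a random significance field standing in for a real,
    # domain-specific one (e.g. one derived from label-mask density).
    rng = np.random.default_rng(0)
    centers = greedy_sample_centers(rng.random((512, 512)),
                                    num_samples=16, window=128)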