Underwater Image Enhancement Using a Multiscale Dense Generative Adversarial Network

Bibliographic Details
Published in: IEEE Journal of Oceanic Engineering, Vol. 45, No. 3, pp. 862-870
Main Authors: Guo, Yecai; Li, Hanyu; Zhuang, Peixian
Format: Journal Article
Language: English
Published: New York: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2020
ISSN: 0364-9059; 1558-1691
DOI: 10.1109/JOE.2019.2911447

More Information
Summary: Underwater image enhancement has received much attention in underwater vision research. However, raw underwater images easily suffer from color distortion, underexposure, and blur caused by the underwater scene. To address these problems, we propose a new multiscale dense generative adversarial network (GAN) for enhancing underwater images. A residual multiscale dense block is presented in the generator, where the multiscale structure, dense concatenation, and residual learning boost performance, render more details, and reuse previous features, respectively. The discriminator employs computationally light spectral normalization to stabilize its training. Meanwhile, a nonsaturating GAN loss function combining L1 loss and gradient loss is presented to focus on image features of the ground truth. Enhanced results on synthetic and real underwater images demonstrate the superiority of the proposed method, which outperforms both nondeep and deep learning methods in qualitative and quantitative evaluations. Furthermore, we perform an ablation study to show the contribution of each component and carry out application tests to further demonstrate the effectiveness of the proposed method.
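
The abstract describes a generator objective that combines a nonsaturating GAN loss with L1 and gradient losses. Below is a minimal PyTorch sketch of such a combined objective, written only for illustration: the weight values (lambda_l1, lambda_grad), the finite-difference gradient definition, and the function names are assumptions, not the paper's exact formulation.

```python
# Illustrative sketch of a combined generator loss: nonsaturating GAN loss
# plus L1 loss plus gradient loss. Weights and the gradient-loss definition
# are assumed for this example and may differ from the published method.
import torch
import torch.nn.functional as F


def image_gradients(img):
    """Finite-difference gradients along height and width (N, C, H, W)."""
    dy = img[:, :, 1:, :] - img[:, :, :-1, :]
    dx = img[:, :, :, 1:] - img[:, :, :, :-1]
    return dx, dy


def generator_loss(d_fake_logits, fake, real, lambda_l1=100.0, lambda_grad=10.0):
    # Nonsaturating GAN loss: push D's logits on generated images toward "real",
    # implemented as binary cross-entropy against a target of ones.
    adv = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.ones_like(d_fake_logits))
    # L1 loss keeps the enhanced image close to the ground-truth image.
    l1 = F.l1_loss(fake, real)
    # Gradient loss encourages similar edge structure (assumed L1 on gradients).
    fdx, fdy = image_gradients(fake)
    rdx, rdy = image_gradients(real)
    grad = F.l1_loss(fdx, rdx) + F.l1_loss(fdy, rdy)
    return adv + lambda_l1 * l1 + lambda_grad * grad
```

The spectral normalization mentioned for the discriminator is available in PyTorch as torch.nn.utils.spectral_norm, which can wrap each convolutional layer; whether the paper applies it to every layer is not stated in this record.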