Underwater Image Enhancement Using a Multiscale Dense Generative Adversarial Network
Published in | IEEE Journal of Oceanic Engineering, Vol. 45, No. 3, pp. 862-870 |
Main Authors | Guo, Yecai; Li, Hanyu; Zhuang, Peixian |
Format | Journal Article |
Language | English |
Published | New York: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.07.2020 |
ISSN | 0364-9059, 1558-1691 |
DOI | 10.1109/JOE.2019.2911447 |
Abstract | Underwater image enhancement has received much attention in underwater vision research. However, raw underwater images easily suffer from color distortion, underexposure, and blur caused by the underwater scene. To address these problems, we propose a new multiscale dense generative adversarial network (GAN) for enhancing underwater images. A residual multiscale dense block is presented in the generator, where the multiscale structure, dense concatenation, and residual learning boost performance, render more details, and reuse previous features, respectively. The discriminator employs computationally light spectral normalization to stabilize its training. Meanwhile, a nonsaturating GAN loss function combining $L_1$ loss and gradient loss is presented to focus on image features of the ground truth. Enhanced results on synthetic and real underwater images demonstrate the superiority of the proposed method, which outperforms both nondeep and deep learning methods in qualitative and quantitative evaluations. Furthermore, we perform an ablation study to show the contribution of each component and carry out application tests to further demonstrate the effectiveness of the proposed method. |
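The spectral normalization mentioned in the abstract constrains the discriminator's weight matrices to have spectral norm 1, which is what makes it "computationally light": one power-iteration step per update suffices. The sketch below is a generic NumPy illustration of that scheme (following the spectral normalization technique the paper cites), not the authors' code; the iteration count and initialization are placeholder choices.

```python
import numpy as np

def spectral_normalize(W, n_iters=1, u=None):
    """Estimate the largest singular value of W by power iteration,
    then rescale W so its spectral norm is (approximately) 1.

    A generic sketch of spectral normalization; n_iters=1 per training
    step is the usual computationally light setting."""
    if u is None:
        # Persistent left singular vector estimate; fixed seed for the sketch.
        u = np.random.default_rng(0).standard_normal(W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v) + 1e-12
        u = W @ v
        u /= np.linalg.norm(u) + 1e-12
    sigma = u @ W @ v  # estimated spectral norm of W
    return W / sigma, u
```

In practice `u` is carried over between training steps, so a single iteration per step tracks the slowly changing spectral norm.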
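The abstract describes the generator objective as a nonsaturating GAN loss combined with an $L_1$ term and a gradient term. A minimal NumPy sketch of such a composite loss is below; the weighting factors and the finite-difference gradient operator are illustrative assumptions, since the record does not give the paper's exact values.

```python
import numpy as np

def gradient(img):
    # Horizontal and vertical finite differences: one common choice of
    # image-gradient operator (the paper's exact operator is not given here).
    dx = img[:, 1:] - img[:, :-1]
    dy = img[1:, :] - img[:-1, :]
    return dx, dy

def generator_loss(d_fake, enhanced, ground_truth, lam_l1=1.0, lam_grad=1.0):
    """Composite generator loss: nonsaturating adversarial term plus L1
    and gradient terms, as sketched from the abstract. lam_l1 and
    lam_grad are placeholder weights, not the paper's values."""
    eps = 1e-8
    # Nonsaturating GAN loss: -log D(G(x)), with d_fake in (0, 1).
    adv = -np.mean(np.log(d_fake + eps))
    # L1 loss pulls the enhanced image toward the ground truth.
    l1 = np.mean(np.abs(enhanced - ground_truth))
    # Gradient loss matches edge structure between the two images.
    gx_e, gy_e = gradient(enhanced)
    gx_t, gy_t = gradient(ground_truth)
    grad = np.mean(np.abs(gx_e - gx_t)) + np.mean(np.abs(gy_e - gy_t))
    return adv + lam_l1 * l1 + lam_grad * grad
```

The $L_1$ and gradient terms both vanish when the enhanced image equals the ground truth, so only the adversarial term depends on the discriminator's output.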
Author | Guo, Yecai; Li, Hanyu; Zhuang, Peixian |
Author_xml | Yecai Guo (ORCID 0000-0002-4395-7553, guo-yecai@163.com); Hanyu Li (lihanyu1204@163.com); Peixian Zhuang (ORCID 0000-0002-7143-9569, zhuangpeixian0624@163.com). All authors: School of Electronic and Information Engineering, Nanjing University of Information Science and Technology, Nanjing, China |
CODEN | IJOEDY |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2020 |
DOI | 10.1109/JOE.2019.2911447 |
Discipline | Engineering Oceanography |
EISSN | 1558-1691 |
EndPage | 870 |
Genre | orig-research |
GrantInformation | Priority Academic Program Development of Jiangsu Higher Education Institutions; National Natural Science Foundation of China (Grant 61701245); Startup Foundation for Introducing Talent of Nanjing University of Information Science and Technology (Grant 2243141701030) |
ISSN | 0364-9059 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 3 |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
PageCount | 9 |
PublicationDate | 2020-07-01 (July 2020) |
PublicationPlace | New York |
PublicationTitle | IEEE journal of oceanic engineering |
PublicationTitleAbbrev | JOE |
PublicationYear | 2020 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
StartPage | 862 |
SubjectTerms | Ablation; Colour; Dense concatenation; Feature extraction; Gallium nitride; generative adversarial network (GAN); Generative adversarial networks; Generators; Ground truth; Image color analysis; Image enhancement; Image restoration; Machine learning; multiscale; residual learning; underwater image enhancement; Training; Underwater |
URI | https://ieeexplore.ieee.org/document/8730425 https://www.proquest.com/docview/2425612351 |
Volume | 45 |