Invertible Residual Blocks in Deep Learning Networks
| Published in | IEEE Transactions on Neural Networks and Learning Systems, Vol. 35, no. 7, pp. 10167 - 10173 |
|---|---|
| Main Authors | Wang, Ruhua; An, Senjian; Liu, Wanquan; Li, Ling |
| Format | Journal Article |
| Language | English |
| Published | United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.07.2024 |
| Subjects | Deep learning; Invertible neural networks; Invertible residual networks; Linear complementarity problem |
| ISSN | 2162-237X (print); 2162-2388 (electronic) |
| DOI | 10.1109/TNNLS.2023.3238397 |
| Abstract | Residual blocks have been widely used in deep learning networks. However, information may be lost in residual blocks due to the relinquishment of information in rectifier linear units (ReLUs). To address this issue, invertible residual networks have been proposed recently but are generally under strict restrictions which limit their applications. In this brief, we investigate the conditions under which a residual block is invertible. A sufficient and necessary condition is presented for the invertibility of residual blocks with one layer of ReLU inside the block. In particular, for widely used residual blocks with convolutions, we show that such residual blocks are invertible under weak conditions if the convolution is implemented with certain zero-padding methods. Inverse algorithms are also proposed, and experiments are conducted to show the effectiveness of the proposed inverse algorithms and prove the correctness of the theoretical results. |
|---|---|
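As background for the abstract above, the following sketch illustrates the kind of residual block discussed (one ReLU layer inside the block, `y = x + W·ReLU(V·x)`) and how such a block can be inverted by fixed-point iteration when the residual branch is a contraction. This is the standard contraction-based inverse from the invertible-ResNet literature, not necessarily the inverse algorithm proposed in this brief; the names `V`, `W`, `block`, and `invert_block` are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

# A residual block with one ReLU layer inside: y = x + W @ relu(V @ x).
d = 4
V = rng.standard_normal((d, d))
W = rng.standard_normal((d, d))
# Rescale W so the residual branch g(x) = W @ relu(V @ x) has Lipschitz
# constant 0.5 < 1 (relu is 1-Lipschitz), which guarantees invertibility
# and geometric convergence of the fixed-point iteration below.
W *= 0.5 / (np.linalg.norm(W, 2) * np.linalg.norm(V, 2))

def block(x):
    return x + W @ relu(V @ x)

def invert_block(y, n_iter=100):
    # Solve y = x + g(x) for x via the Banach iteration x <- y - g(x).
    x = y.copy()
    for _ in range(n_iter):
        x = y - W @ relu(V @ x)
    return x

x = rng.standard_normal(d)
y = block(x)
x_rec = invert_block(y)
print(np.allclose(x, x_rec))  # True: the block is exactly inverted
```

Note the key point of the paper's abstract: invertibility here is bought by the contraction restriction on the weights, which is exactly the kind of strict condition the brief seeks to relax for ReLU residual blocks.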
| Author | An, Senjian; Liu, Wanquan; Li, Ling; Wang, Ruhua |
| Author Details | 1. Ruhua Wang (ORCID 0000-0001-5798-8118; ruhua.wang@curtin.edu.au), School of Civil and Mechanical Engineering, Curtin University, Bentley, WA, Australia. 2. Senjian An, School of Electrical Engineering, Computing and Mathematical Sciences, Curtin University, Bentley, WA, Australia. 3. Wanquan Liu (ORCID 0000-0003-4910-353X), School of Intelligent Systems Engineering, Sun Yat-sen University, Guangzhou, China. 4. Ling Li (ORCID 0000-0001-9722-9503; l.li@curtin.edu.au), School of Electrical Engineering, Computing and Mathematical Sciences, Curtin University, Bentley, WA, Australia |
| CODEN | ITNNAL |
| Cited By (DOIs) | 10.3390/ani14030499; 10.1038/s41598-025-93718-7; 10.1007/s11760-023-02981-6; 10.3390/pr13010151 |
| ContentType | Journal Article |
| Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2024 |
| Discipline | Computer Science |
| EISSN | 2162-2388 |
| EndPage | 10173 |
| Genre | orig-research Journal Article |
| Grant Information | Australian Research Council Discovery Project DP210103631, "AI Assisted Probabilistic Structural Health Monitoring with Uncertain Data" |
| Issue | 7 |
| License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html https://doi.org/10.15223/policy-029 https://doi.org/10.15223/policy-037 |
| ORCID | 0000-0001-9722-9503 0000-0003-4910-353X 0000-0001-5798-8118 |
| PMID | 37022256 |
| PageCount | 7 |
| PublicationDate | 2024-07-01 |
| PublicationPlace | United States |
| PublicationTitle | IEEE Transactions on Neural Networks and Learning Systems |
| PublicationTitleAbbrev | TNNLS |
| PublicationTitleAlternate | IEEE Trans Neural Netw Learn Syst |
| PublicationYear | 2024 |
| Publisher | The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
| StartPage | 10167 |
| SubjectTerms | Algorithms; Deep learning; Invertible neural networks; Invertible residual networks; Linear complementarity problem; Networks; Neural networks; Rectifiers; Residual neural networks; Sufficient conditions; Transforms; Unsupervised learning |
| Title | Invertible Residual Blocks in Deep Learning Networks |
| URI | https://ieeexplore.ieee.org/document/10033418 https://www.ncbi.nlm.nih.gov/pubmed/37022256 https://www.proquest.com/docview/3078094774 https://www.proquest.com/docview/2797149043 |
| Volume | 35 |