The dropout learning algorithm
Dropout is a recently introduced algorithm for training neural networks by randomly dropping units during training to prevent their co-adaptation. A mathematical analysis of some of the static and dynamic properties of dropout is provided using Bernoulli gating variables, general enough to accommodate dropout on units or connections, and with variable rates.
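As an illustration of the Bernoulli-gating setup (a sketch written for this record, not code from the paper; names and shapes are illustrative), each unit is kept with probability p during training, and at test time the deterministic network with weights scaled by p approximates the ensemble average — exactly, in expectation, for a linear layer:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, w, p=0.5, train=True):
    """Linear layer with Bernoulli dropout on its input units.

    During training each input unit is multiplied by an i.i.d.
    Bernoulli(p) gating variable. At test time nothing is dropped;
    the weights are scaled by p instead, which for a linear layer
    matches the ensemble average exactly in expectation.
    """
    if train:
        mask = rng.binomial(1, p, size=x.shape)  # Bernoulli gates
        return (mask * x) @ w
    return p * (x @ w)

x = rng.normal(size=(200, 8))
w = rng.normal(size=(8, 1))

# Monte Carlo average over many dropout masks vs. the scaled forward pass
mc = np.mean([dropout_forward(x, w, train=True) for _ in range(2000)], axis=0)
det = dropout_forward(x, w, train=False)
print(np.max(np.abs(mc - det)))  # small Monte Carlo error
```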
| Published in | Artificial Intelligence, Vol. 210, pp. 78–122 |
|---|---|
| Main Authors | Baldi, Pierre; Sadowski, Peter |
| Format | Journal Article |
| Language | English |
| Published | Oxford: Elsevier B.V., 01.05.2014 |
| ISSN | 0004-3702 |
| EISSN | 1872-7921 |
| DOI | 10.1016/j.artint.2014.02.004 |
| Abstract | Dropout is a recently introduced algorithm for training neural networks by randomly dropping units during training to prevent their co-adaptation. A mathematical analysis of some of the static and dynamic properties of dropout is provided using Bernoulli gating variables, general enough to accommodate dropout on units or connections, and with variable rates. The framework allows a complete analysis of the ensemble averaging properties of dropout in linear networks, which is useful for understanding the non-linear case. The ensemble averaging properties of dropout in non-linear logistic networks result from three fundamental equations: (1) the approximation of the expectations of logistic functions by normalized geometric means, for which bounds and estimates are derived; (2) the algebraic equality between normalized geometric means of logistic functions and the logistic of the means, which mathematically characterizes logistic functions; and (3) the linearity of the means with respect to sums, as well as their factorization over products of independent variables. The results are also extended to other classes of transfer functions, including rectified linear functions. Approximation errors tend to cancel each other and do not accumulate. Dropout can also be connected to stochastic neurons and used to predict firing rates, and to backpropagation by viewing the backward propagation as ensemble averaging in a dropout linear network. Moreover, the convergence properties of dropout can be understood in terms of stochastic gradient descent. Finally, for the regularization properties of dropout, the expectation of the dropout gradient is the gradient of the corresponding approximation ensemble, regularized by an adaptive weight decay term with a propensity for self-consistent variance minimization and sparse representations. |
|---|---|
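Equation (2) of the abstract — the normalized weighted geometric mean (NWGM) of logistic outputs equals the logistic of the weighted mean of the inputs — can be checked numerically. This is an illustrative sketch, not code from the paper:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)
x = rng.normal(size=5)           # arbitrary pre-activations
p = rng.dirichlet(np.ones(5))    # weights p_i > 0 summing to 1

s = sigmoid(x)
G = np.prod(s ** p)              # weighted geometric mean of sigmoid(x_i)
Gc = np.prod((1.0 - s) ** p)     # same for the complements 1 - sigmoid(x_i)
nwgm = G / (G + Gc)              # normalized weighted geometric mean

# Algebraic identity characterizing logistic functions:
# NWGM(sigmoid(x_i)) == sigmoid(sum_i p_i * x_i)
print(nwgm - sigmoid(np.dot(p, x)))
```

The identity is exact, not an approximation: dividing G by G + Gc turns the product of ratios (1 − s_i)/s_i = exp(−x_i) into a single exponential of the weighted sum.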
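The regularization property in the abstract's final sentence can also be made concrete. In the single-linear-unit, squared-error setting (an illustrative choice made here; variable names are not from the paper), the expected dropout gradient decomposes into the gradient of the deterministic ensemble plus an adaptive weight-decay term p(1 − p)·x_j²·w_j:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
x = rng.normal(size=n)   # fixed input
w = rng.normal(size=n)   # weights of a single linear unit
t = 1.3                  # target
p = 0.5                  # dropout keep probability

def dropout_grad(delta):
    """Gradient of 0.5*(t - sum_i delta_i*w_i*x_i)**2 w.r.t. w."""
    err = t - np.dot(delta * w, x)
    return -err * delta * x

# Monte Carlo expectation of the dropout gradient over Bernoulli gates
mc = np.mean([dropout_grad(rng.binomial(1, p, size=n))
              for _ in range(100000)], axis=0)

# Gradient of the deterministic ensemble (weights scaled by p) ...
ens = -p * x * (t - p * np.dot(w, x))
# ... plus the adaptive weight-decay term p*(1-p)*x_j**2*w_j
decay = p * (1.0 - p) * x**2 * w
print(np.max(np.abs(mc - (ens + decay))))  # shrinks as samples grow
```

The decay term is "adaptive" in that its strength scales with the squared input x_j², so strongly driven weights are penalized more.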
| Authors | Baldi, Pierre (pfbaldici@uci.edu); Sadowski, Peter |
| CODEN | AINTBB |
| Copyright | 2014 The Authors; 2015 INIST-CNRS; 2014 Elsevier B.V. All rights reserved. |
| Discipline | Computer Science; Applied Sciences |
| PMCID | PMC3996711 |
| GrantInformation | NLM NIH HHS grants T15 LM007443 and R01 LM010235 |
| Keywords | Backpropagation; Neural networks; Geometric mean; Variance minimization; Machine learning; Ensemble; Sparse representations; Stochastic gradient descent; Regularization; Stochastic neurons; Ergodic theory; On line; Self consistency; Dynamic properties; Modeling; Adaptive method; Varying speed; Variance; Gradient descent; Backpropagation algorithm; Sparse representation; Transfer function; Learning algorithm; Logistic function; Static properties; Minimization; Neural network; Independent variable; Linearity; Artificial intelligence; Logistics |
| License | http://creativecommons.org/licenses/by-nc-nd/3.0; CC BY 4.0; cc-by-nc-nd |
| OpenAccessLink | https://escholarship.org/uc/item/7st7476x |
| PMID | 24771879 |
| PageCount | 45 |
doi: 10.1016/0893-6080(89)90014-2 – volume: 20 start-page: 579 issue: 5 year: 1990 ident: 10.1016/j.artint.2014.02.004_br0410 article-title: Sequential updating of conditional probabilities on directed graphical structures publication-title: Networks doi: 10.1002/net.3230200507 – year: 2004 ident: 10.1016/j.artint.2014.02.004_br0160 – volume: 129 start-page: 221 issue: 02 year: 1999 ident: 10.1016/j.artint.2014.02.004_br0050 article-title: Some inequalities for arithmetic and geometric means publication-title: Proc. R. Soc. Edinb., Sect. A, Math. doi: 10.1017/S0308210500021326 – year: 1998 ident: 10.1016/j.artint.2014.02.004_br0130 article-title: Online algorithms and stochastic approximations – volume: vol. 26 start-page: 351 year: 2013 ident: 10.1016/j.artint.2014.02.004_br0430 article-title: Dropout training as adaptive regularization – volume: 42 start-page: 265 issue: 1 year: 1990 ident: 10.1016/j.artint.2014.02.004_br0250 article-title: A stochastic version of the delta rule publication-title: Physica D doi: 10.1016/0167-2789(90)90081-Y – start-page: 410 year: 2013 ident: 10.1016/j.artint.2014.02.004_br0290 article-title: Learning with marginalized corrupted features – year: 1989 ident: 10.1016/j.artint.2014.02.004_br0210 – year: 2000 ident: 10.1016/j.artint.2014.02.004_br0230 – volume: 22 start-page: 436 issue: 3 year: 1992 ident: 10.1016/j.artint.2014.02.004_br0300 article-title: Noise injection into inputs in back-propagation learning publication-title: IEEE Trans. Syst. Man Cybern. doi: 10.1109/21.155944 – volume: 5 start-page: 49 year: 2002 ident: 10.1016/j.artint.2014.02.004_br0350 article-title: On the Ky Fan inequality and related inequalities i publication-title: Math. Inequal. Appl – volume: 8 start-page: 355 issue: 3–4 year: 1996 ident: 10.1016/j.artint.2014.02.004_br0370 article-title: Bootstrapping with noise: An effective regularization technique publication-title: Connect. Sci. doi: 10.1080/095400996116811 – volume: vol. 
26 start-page: 3084 year: 2013 ident: 10.1016/j.artint.2014.02.004_br0070 article-title: Adaptive dropout for training deep neural networks – volume: 7 start-page: 108 issue: 1 year: 1995 ident: 10.1016/j.artint.2014.02.004_br0120 article-title: Training with noise is equivalent to Tikhonov regularization publication-title: Neural Comput. doi: 10.1162/neco.1995.7.1.108 – year: 1965 ident: 10.1016/j.artint.2014.02.004_br0110 – volume: 8 start-page: 133 issue: 1 year: 1964 ident: 10.1016/j.artint.2014.02.004_br0280 article-title: Generalization of an inequality of Ky Fan publication-title: J. Math. Anal. Appl. doi: 10.1016/0022-247X(64)90089-7 – volume: 27 issue: 3 year: 1997 ident: 10.1016/j.artint.2014.02.004_br0040 article-title: A new refinement of the arithmetic mean geometric mean inequality publication-title: J. Math. – year: 2005 ident: 10.1016/j.artint.2014.02.004_br0330 – volume: 72 start-page: 87 issue: 1 year: 2005 ident: 10.1016/j.artint.2014.02.004_br0360 article-title: On the Ky Fan inequality and related inequalities ii publication-title: Bull. Aust. Math. Soc. doi: 10.1017/S0004972700034894 – volume: 33 issue: 4 year: 2003 ident: 10.1016/j.artint.2014.02.004_br0320 article-title: Refined arithmetic, geometric and harmonic mean inequalities publication-title: J. Math. – volume: 8 start-page: 643 issue: 3 year: 1996 ident: 10.1016/j.artint.2014.02.004_br0060 article-title: The effects of adding noise during backpropagation training on a generalization performance publication-title: Neural Comput. doi: 10.1162/neco.1996.8.3.643 – volume: 6 start-page: 837 issue: 4 year: 1994 ident: 10.1016/j.artint.2014.02.004_br0090 article-title: Learning in linear networks: a survey publication-title: IEEE Trans. Neural Netw. 
doi: 10.1109/72.392248 – volume: 5 start-page: 792 issue: 5 year: 1994 ident: 10.1016/j.artint.2014.02.004_br0340 article-title: Enhanced mlp performance and fault tolerance resulting from synaptic weight noise during training publication-title: IEEE Trans. Neural Netw. doi: 10.1109/72.317730 – volume: vol. 1 start-page: 163 year: 1988 ident: 10.1016/j.artint.2014.02.004_br0220 article-title: Bayesian numerical analysis – volume: vol. 26 start-page: 2814 year: 2013 ident: 10.1016/j.artint.2014.02.004_br0100 article-title: Understanding dropout – volume: 2 start-page: 114 issue: 1 year: 2009 ident: 10.1016/j.artint.2014.02.004_br0150 article-title: A logistic approximation to the cumulative normal distribution publication-title: J. Ind. Eng. Manag. – volume: 85 start-page: 8311 issue: 21 year: 1988 ident: 10.1016/j.artint.2014.02.004_br0190 article-title: Axonal delay lines for time measurement in the owl's brainstem publication-title: Proc. Natl. Acad. Sci. doi: 10.1073/pnas.85.21.8311 – start-page: 1096 year: 2008 ident: 10.1016/j.artint.2014.02.004_br0420 article-title: Extracting and composing robust features with denoising autoencoders – volume: 31 issue: 2 year: 2001 ident: 10.1016/j.artint.2014.02.004_br0310 article-title: Improved upper and lower bounds for the difference an-gn publication-title: J. Math. – volume: 323 start-page: 533 issue: 6088 year: 1986 ident: 10.1016/j.artint.2014.02.004_br0400 article-title: Learning representations by back-propagating errors publication-title: Nature doi: 10.1038/323533a0 – ident: 10.1016/j.artint.2014.02.004_br0270 – ident: 10.1016/j.artint.2014.02.004_br0020 – volume: vol. 
3176 start-page: 146 year: 2004 ident: 10.1016/j.artint.2014.02.004_br0140 article-title: Stochastic learning – start-page: 233 year: 1971 ident: 10.1016/j.artint.2014.02.004_br0380 article-title: A convergence theorem for non negative almost supermartingales and some applications – volume: 10 start-page: 3227 issue: 10 year: 1990 ident: 10.1016/j.artint.2014.02.004_br0180 article-title: A circuit for detection of interaural time differences in the brain stem of the barn owl publication-title: J. Neurosci. doi: 10.1523/JNEUROSCI.10-10-03227.1990 – year: 2004 ident: 10.1016/j.artint.2014.02.004_br0030 – start-page: 36 year: 1978 ident: 10.1016/j.artint.2014.02.004_br0200 article-title: A refinement of the arithmetic mean-geometric mean inequality publication-title: Proc. Am. Math. Soc. doi: 10.1090/S0002-9939-1978-0476971-2 – volume: 2 start-page: 69 issue: 1 year: 1989 ident: 10.1016/j.artint.2014.02.004_br0240 article-title: Noise modulation of synaptic weights in a biological neural network publication-title: Neural Netw. doi: 10.1016/0893-6080(89)90016-6 – year: 1997 ident: 10.1016/j.artint.2014.02.004_br0390 – volume: 3 start-page: 213 issue: 2 year: 2009 ident: 10.1016/j.artint.2014.02.004_br0010 article-title: Self improvement of the inequality between arithmetic and geometric means publication-title: J. Math. Inequal. doi: 10.7153/jmi-03-21 – volume: 24 start-page: 123 issue: 2 year: 1996 ident: 10.1016/j.artint.2014.02.004_br0170 article-title: Bagging predictors publication-title: Mach. Learn. doi: 10.1007/BF00058655 – reference: 2213141 - J Neurosci. 1990 Oct;10(10):3227-46 – reference: 18263374 - IEEE Trans Neural Netw. 1995;6(4):837-58 – reference: 3973664 - J Neurophysiol. 1985 Jan;53(1):89-109 – reference: 18267852 - IEEE Trans Neural Netw. 1994;5(5):792-802 – reference: 3186725 - Proc Natl Acad Sci U S A. 1988 Nov;85(21):8311-5 |
| SSID | ssj0003991 |
| Score | 2.574175 |
| Snippet | Dropout is a recently introduced algorithm for training neural networks by randomly dropping units during training to prevent their co-adaptation. A... |
| SourceID | unpaywall pubmedcentral proquest pubmed pascalfrancis crossref elsevier |
| SourceType | Open Access Repository Aggregation Database Index Database Enrichment Source Publisher |
| StartPage | 78 |
| SubjectTerms | Applied sciences Approximation Artificial intelligence Backpropagation Computer science; control theory; systems Connectionism. Neural networks Detection, estimation, filtering, equalization, prediction Dropouts Ensemble Exact sciences and technology Functions (mathematics) Geometric mean Information, signal and communications theory Learning and adaptive systems Logistics Machine learning Mathematical analysis Networks Neural networks Regularization Signal and communications theory Signal, noise Sparse representations Stochastic gradient descent Stochastic neurons Stochasticity Telecommunications and information theory Training Variance minimization |
| Title | The dropout learning algorithm |
| URI | https://dx.doi.org/10.1016/j.artint.2014.02.004 https://www.ncbi.nlm.nih.gov/pubmed/24771879 https://www.proquest.com/docview/1669857146 https://www.proquest.com/docview/1826587254 https://pubmed.ncbi.nlm.nih.gov/PMC3996711 https://escholarship.org/uc/item/7st7476x |
| UnpaywallVersion | submittedVersion |
| Volume | 210 |
| hasFullText | 1 |
| inHoldings | 1 |
| isFullTextHit | |
| isPrint | |
| journalDatabaseRights | – providerCode: PRVESC databaseName: Baden-Württemberg Complete Freedom Collection (Elsevier) customDbUrl: eissn: 1872-7921 dateEnd: 99991231 omitProxy: true ssIdentifier: ssj0003991 issn: 0004-3702 databaseCode: GBLVA dateStart: 20110101 isFulltext: true titleUrlDefault: https://www.sciencedirect.com providerName: Elsevier – providerCode: PRVESC databaseName: Elsevier Free Content customDbUrl: eissn: 1872-7921 dateEnd: 20211105 omitProxy: true ssIdentifier: ssj0003991 issn: 0004-3702 databaseCode: IXB dateStart: 19950101 isFulltext: true titleUrlDefault: https://www.sciencedirect.com providerName: Elsevier – providerCode: PRVESC databaseName: Elsevier SD Complete Freedom Collection customDbUrl: eissn: 1872-7921 dateEnd: 20211105 omitProxy: true ssIdentifier: ssj0003991 issn: 0004-3702 databaseCode: ACRLP dateStart: 19950101 isFulltext: true titleUrlDefault: https://www.sciencedirect.com providerName: Elsevier – providerCode: PRVESC databaseName: Elsevier SD Freedom Collection Journals [SCFCJ] customDbUrl: eissn: 1872-7921 dateEnd: 20211031 omitProxy: true ssIdentifier: ssj0003991 issn: 0004-3702 databaseCode: AIKHN dateStart: 19950101 isFulltext: true titleUrlDefault: https://www.sciencedirect.com providerName: Elsevier – providerCode: PRVESC databaseName: ScienceDirect (Elsevier) customDbUrl: eissn: 1872-7921 dateEnd: 99991231 omitProxy: true ssIdentifier: ssj0003991 issn: 0004-3702 databaseCode: .~1 dateStart: 19950101 isFulltext: true titleUrlDefault: https://www.sciencedirect.com providerName: Elsevier – providerCode: PRVLSH databaseName: Elsevier Journals customDbUrl: mediaType: online eissn: 1872-7921 dateEnd: 99991231 omitProxy: true ssIdentifier: ssj0003991 issn: 0004-3702 databaseCode: AKRWK dateStart: 19700301 isFulltext: true providerName: Library Specific Holdings |
| openUrl | ctx_ver=Z39.88-2004&ctx_enc=info%3Aofi%2Fenc%3AUTF-8&rfr_id=info%3Asid%2Fsummon.serialssolutions.com&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.genre=article&rft.atitle=The+dropout+learning+algorithm&rft.jtitle=Artificial+intelligence&rft.au=BALDI%2C+Pierre&rft.au=SADOWSKI%2C+Peter&rft.date=2014-05-01&rft.pub=Elsevier&rft.issn=0004-3702&rft.volume=210&rft.spage=78&rft.epage=122&rft_id=info:doi/10.1016%2Fj.artint.2014.02.004&rft.externalDBID=n%2Fa&rft.externalDocID=28376664 |
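The Bernoulli-gating scheme summarized in the record's snippet (units dropped independently during training, with the test-time network approximating the ensemble average by scaling activations by the retention probability) can be sketched as follows. This is a minimal illustration, not the paper's implementation; all function names are chosen here for clarity.

```python
import random

def bernoulli_mask(n, p, rng):
    """Sample n independent Bernoulli(p) gating variables, one per unit."""
    return [1 if rng.random() < p else 0 for _ in range(n)]

def dropout_forward(x, p, rng):
    """Training pass: gate each unit's activation by its Bernoulli variable."""
    mask = bernoulli_mask(len(x), p, rng)
    return [xi * m for xi, m in zip(x, mask)]

def ensemble_forward(x, p):
    """Test pass: replace each stochastic gate by its expectation p,
    approximating the average over the exponential ensemble of subnetworks."""
    return [p * xi for xi in x]
```

Each training pass samples a different subnetwork; the deterministic test pass uses the expected gate value, which is the ensemble-averaging approximation analyzed in the paper.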