Coding schemes in neural networks learning classification tasks
Neural networks possess the crucial ability to generate meaningful representations of task-dependent features. Indeed, with appropriate scaling, supervised learning in neural networks can result in strong, task-dependent feature learning. However, the nature of the emergent representations is still unclear.
Published in | Nature Communications, Vol. 16, No. 1, Article 3354, 12 pages |
Main Authors | van Meegen, Alexander; Sompolinsky, Haim |
Format | Journal Article |
Language | English |
Published | London: Nature Publishing Group UK, 09.04.2025 |
Subjects | Neural networks; Neural coding; Machine learning; Supervised learning; Bayesian analysis; Broken symmetry; Nonlinear systems; Representations; Scaling |
ISSN | 2041-1723 |
DOI | 10.1038/s41467-025-58276-6 |
Abstract | Neural networks possess the crucial ability to generate meaningful representations of task-dependent features. Indeed, with appropriate scaling, supervised learning in neural networks can result in strong, task-dependent feature learning. However, the nature of the emergent representations is still unclear. To understand the effect of learning on representations, we investigate fully-connected, wide neural networks learning classification tasks using the Bayesian framework where learning shapes the posterior distribution of the network weights. Consistent with previous findings, our analysis of the feature learning regime (also known as ‘non-lazy’ regime) shows that the networks acquire strong, data-dependent features, denoted as coding schemes, where neuronal responses to each input are dominated by its class membership. Surprisingly, the nature of the coding schemes depends crucially on the neuronal nonlinearity. In linear networks, an analog coding scheme of the task emerges; in nonlinear networks, strong spontaneous symmetry breaking leads to either redundant or sparse coding schemes. Our findings highlight how network properties such as scaling of weights and neuronal nonlinearity can profoundly influence the emergent representations.
Neural networks discover meaningful representations of the data through the process of learning. Here, the authors explore how these representations are affected by scaling the network output or modifying the activation functions. |
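The role of output scaling described in the abstract can be sketched numerically. The following is an illustrative toy example, not code from the paper: a two-layer ReLU network whose output is multiplied by a scale factor gamma. With the standard gamma = 1/sqrt(N) scaling the initial output is O(1) and weights barely need to move to fit targets (the 'lazy' regime), whereas with the 'non-lazy' mean-field scaling gamma = 1/N the initial output is a factor sqrt(N) smaller, so fitting O(1) labels forces large, data-dependent weight changes. All names here (network_output, the chosen width N) are hypothetical illustration choices.

```python
import numpy as np

def network_output(x, W, a, gamma):
    """Two-layer net f(x) = gamma * a . phi(W x), with phi = ReLU."""
    h = np.maximum(W @ x, 0.0)   # hidden-layer features
    return gamma * a @ h

rng = np.random.default_rng(0)
N, d = 1000, 10                  # hidden width and input dimension
W = rng.normal(size=(N, d))      # input-to-hidden weights
a = rng.normal(size=N)           # hidden-to-output weights
x = rng.normal(size=d)           # one random input

lazy = network_output(x, W, a, 1.0 / np.sqrt(N))   # 'lazy' scaling
rich = network_output(x, W, a, 1.0 / N)            # 'non-lazy' scaling

# Since only gamma differs, the ratio of output magnitudes is exactly
# sqrt(N): under 1/N scaling the initial output is much smaller, so the
# posterior over weights must shift substantially to fit the labels.
print(abs(lazy) / abs(rich))
```

This only illustrates the magnitude gap at initialization; the paper's actual analysis of the resulting coding schemes uses the Bayesian posterior over weights.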
ArticleNumber | 3354 |
Author | van Meegen, Alexander Sompolinsky, Haim |
Author details | van Meegen, Alexander (ORCID 0000-0003-2766-3982; alexander.vanmeegen@epfl.ch; Center for Brain Science, Harvard University); Sompolinsky, Haim (ORCID 0000-0002-0322-0629; hsompolinsky@mcb.harvard.edu; Center for Brain Science, Harvard University; Edmond and Lily Safra Center for Brain Sciences, Hebrew University) |
ContentType | Journal Article |
Copyright | The Author(s) 2025 |
Discipline | Biology |
EISSN | 2041-1723 |
EndPage | 12 |
ExternalDocumentID | oai_doaj_org_article_1dd85e43228c4dd692c39dd5ee69208d; PMC11982327; PMID 40204730; 10.1038/s41467-025-58276-6 |
Genre | Journal Article |
GrantInformation | Swartz Foundation |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 1 |
License | 2025. The Author(s). Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/. |
ORCID | 0000-0003-2766-3982 0000-0002-0322-0629 |
PMID | 40204730 |
PageCount | 12 |
PublicationDate | 2025-04-09 |
PublicationPlace | London |
PublicationTitle | Nature communications |
PublicationTitleAbbrev | Nat Commun |
PublicationTitleAlternate | Nat Commun |
PublicationYear | 2025 |
Publisher | Nature Publishing Group UK Nature Publishing Group Nature Portfolio |
StartPage | 3354 |
SubjectTerms | Bayesian analysis; Broken symmetry; Classification; Coding; Humanities and Social Sciences; Learning; Machine learning; Neural coding; Neural networks; Nonlinear systems; Nonlinearity; Representations; Scaling; Science (multidisciplinary); Supervised learning |
URI | https://link.springer.com/article/10.1038/s41467-025-58276-6 https://www.ncbi.nlm.nih.gov/pubmed/40204730 https://www.proquest.com/docview/3188186558 https://www.proquest.com/docview/3188429225 https://pubmed.ncbi.nlm.nih.gov/PMC11982327 https://doaj.org/article/1dd85e43228c4dd692c39dd5ee69208d |
Volume | 16 |