SpaRCe: Improved Learning of Reservoir Computing Systems Through Sparse Representations
"Sparse" neural networks, in which relatively few neurons or connections are active, are common in both machine learning and neuroscience. While, in machine learning, "sparsity" is related to a penalty term that leads to some connecting weights becoming small or zero, in biologic...
Saved in:
Published in | IEEE Transactions on Neural Networks and Learning Systems, Vol. 34, No. 2, pp. 824-838
Main Authors | Luca Manneschi, Andrew C. Lin, Eleni Vasilaki
Format | Journal Article |
Language | English |
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.02.2023
ISSN | 2162-237X (print); 2162-2388 (electronic)
DOI | 10.1109/TNNLS.2021.3102378 |
Abstract | "Sparse" neural networks, in which relatively few neurons or connections are active, are common in both machine learning and neuroscience. While, in machine learning, "sparsity" is related to a penalty term that leads to some connecting weights becoming small or zero, in biological brains, sparsity is often created when high spiking thresholds prevent neuronal activity. Here, we introduce sparsity into a reservoir computing network via neuron-specific learnable thresholds of activity, allowing neurons with low thresholds to contribute to decision-making but suppressing information from neurons with high thresholds. This approach, which we term "SpaRCe," optimizes the sparsity level of the reservoir without affecting the reservoir dynamics. The read-out weights and the thresholds are learned by an online gradient rule that minimizes an error function on the outputs of the network. Threshold learning occurs by the balance of two opposing forces: reducing interneuronal correlations in the reservoir by deactivating redundant neurons, while increasing the activity of neurons participating in correct decisions. We test SpaRCe on classification problems and find that threshold learning improves performance compared to standard reservoir computing. SpaRCe alleviates the problem of catastrophic forgetting, a problem most evident in standard echo state networks (ESNs) and recurrent neural networks in general, due to increasing the number of task-specialized neurons that are included in the network decisions. |
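The abstract specifies the mechanism only at a high level. The following is a minimal NumPy sketch of a SpaRCe-style network consistent with that description: a fixed echo state reservoir, a soft-threshold sparse representation x_i = sign(V_i) * max(|V_i| - theta_i, 0), and an online gradient rule that updates both the read-out weights and the per-neuron thresholds. The thresholding form, the leaky reservoir update, the softmax read-out, and all dimensions and learning rates are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the paper)
N_in, N_res, N_out = 10, 100, 3

# Fixed random reservoir: in a standard ESN only the read-out is trained,
# and SpaRCe likewise leaves the reservoir dynamics untouched.
W_in = rng.normal(0.0, 1.0, (N_res, N_in)) / np.sqrt(N_in)
W_rec = rng.normal(0.0, 1.0, (N_res, N_res))
W_rec *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_rec)))  # spectral radius 0.9

# Trainable parameters: read-out weights and per-neuron thresholds
W_out = np.zeros((N_out, N_res))
theta = np.zeros(N_res)            # learnable sparsity thresholds
eta_w, eta_theta = 1e-2, 1e-3      # learning rates (assumed values)

def run_reservoir(u_seq, alpha=0.5):
    """Leaky ESN update; returns the reservoir state after the input sequence."""
    V = np.zeros(N_res)
    for u in u_seq:
        V = (1.0 - alpha) * V + alpha * np.tanh(W_rec @ V + W_in @ u)
    return V

def sparse_code(V):
    """Thresholded representation: neurons with |V_i| <= theta_i are silenced,
    the rest keep their sign. High-threshold neurons drop out of the decision
    without altering the reservoir itself."""
    return np.sign(V) * np.maximum(np.abs(V) - theta, 0.0)

def train_step(u_seq, target_onehot):
    """One online gradient step on both the read-out weights and the thresholds."""
    global W_out, theta
    V = run_reservoir(u_seq)
    x = sparse_code(V)
    z = W_out @ x
    y = np.exp(z - z.max()); y /= y.sum()        # softmax read-out
    err = y - target_onehot                      # grad of cross-entropy w.r.t. z
    active = (np.abs(V) > theta).astype(float)   # gradient flows only through active neurons
    # dL/dtheta_i = -(W_out^T err)_i * sign(V_i) for active neurons
    grad_theta = -(W_out.T @ err) * np.sign(V) * active
    W_out -= eta_w * np.outer(err, x)
    theta -= eta_theta * grad_theta
    return y

# Usage example with random data: a 20-step input sequence, class 0 as target.
y = train_step(rng.normal(size=(20, N_in)), np.eye(N_out)[0])
```

In this sketch the threshold gradient passes only through currently active neurons, so thresholds of neurons supporting correct outputs tend to fall while those of neurons driving errors rise until the neuron is silenced, a rough analogue of the two opposing forces the abstract describes.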
Author | Manneschi, Luca; Lin, Andrew C.; Vasilaki, Eleni
Author_xml | – 1: Luca Manneschi (ORCID 0000-0002-0125-1325), Department of Computer Science, The University of Sheffield, Sheffield, U.K.
– 2: Andrew C. Lin (ORCID 0000-0001-6310-9765), Department of Biomedical Science, The University of Sheffield, Sheffield, U.K.
– 3: Eleni Vasilaki (ORCID 0000-0003-3705-7070; e.vasilaki@sheffield.ac.uk), Department of Computer Science, The University of Sheffield, Sheffield, U.K.
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/34398765 (view this record in MEDLINE/PubMed)
CODEN | ITNNAL |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2023 |
Discipline | Computer Science |
EISSN | 2162-2388 |
EndPage | 838 |
ExternalDocumentID | 34398765 10_1109_TNNLS_2021_3102378 9514399 |
Genre | orig-research Research Support, Non-U.S. Gov't Journal Article |
GrantInformation | – Google DeepMind Faculty Research Awards Program (funder ID 10.13039/100006785)
– Biotechnology and Biological Sciences Research Council, grant BB/S016031/1 (funder ID 10.13039/501100000268)
– Engineering and Physical Sciences Research Council, grants EP/P006094/1, EP/S030964/1, EP/S009647/1, EP/V006339/1 (funder ID 10.13039/501100000266)
– European Research Council, grant 639489 (funder ID 10.13039/501100000781)
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true
IsScholarly | true |
Issue | 2 |
Language | English |
License | https://creativecommons.org/licenses/by/4.0/legalcode |
ORCID | 0000-0002-0125-1325 0000-0001-6310-9765 0000-0003-3705-7070 |
OpenAccessLink | https://ieeexplore.ieee.org/document/9514399
PMID | 34398765 |
PQID | 2773454792 |
PQPubID | 85436 |
PageCount | 15 |
PublicationCentury | 2000 |
PublicationDate | 2023-02-01 |
PublicationDecade | 2020 |
PublicationPlace | United States (Piscataway)
PublicationTitle | IEEE Transactions on Neural Networks and Learning Systems
PublicationTitleAbbrev | TNNLS |
PublicationTitleAlternate | IEEE Trans Neural Netw Learn Syst |
PublicationYear | 2023 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) |
SourceID | proquest pubmed crossref ieee |
SourceType | Aggregation Database Index Database Enrichment Source Publisher |
StartPage | 824 |
SubjectTerms | Brain; Catastrophic forgetting; Computation; Computational modeling; Decision making; echo state networks (ESNs); Error functions; Firing pattern; Learning algorithms; Machine learning; Mathematical model; Nervous system; Neural networks; Neural networks, computer; Neurons; Neurons - physiology; online learning; Recurrent neural networks; reservoir computing; Reservoirs; Sparsity; Task analysis; Thresholds
Title | SpaRCe: Improved Learning of Reservoir Computing Systems Through Sparse Representations |
URI | https://ieeexplore.ieee.org/document/9514399 https://www.ncbi.nlm.nih.gov/pubmed/34398765 https://www.proquest.com/docview/2773454792 https://www.proquest.com/docview/2562239070 |
Volume | 34 |