Hybrid Self-Attention NEAT: A novel evolutionary approach to improve the NEAT algorithm
This article presents a "Hybrid Self-Attention NEAT" method to improve the original NeuroEvolution of Augmenting Topologies (NEAT) algorithm on high-dimensional inputs. Although NEAT has shown significant results in a range of challenging tasks, it cannot create a well-tuned network when input representations are high-dimensional.
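The core idea in the abstract is to use self-attention to pick out the most important parts of a high-dimensional (raw-pixel) input before it reaches the evolved network. A minimal illustrative sketch of that idea is below; the function name, patch size, and random query/key projections are assumptions for illustration (in the paper's setup the attention parameters would themselves be evolved, not randomly fixed):

```python
import numpy as np

def top_k_patches(image, patch=8, d=4, k=10, seed=0):
    """Score non-overlapping image patches with one self-attention layer
    and return the indices of the k patches that receive the most attention.
    Illustrative only: Wq/Wk are random here, not evolved."""
    rng = np.random.default_rng(seed)
    h, w = image.shape
    # Flatten each non-overlapping patch into a row of the data matrix X.
    rows = [image[i:i + patch, j:j + patch].ravel()
            for i in range(0, h - patch + 1, patch)
            for j in range(0, w - patch + 1, patch)]
    X = np.stack(rows)                        # (n_patches, patch*patch)
    Wq = rng.normal(size=(X.shape[1], d))     # query projection (assumed)
    Wk = rng.normal(size=(X.shape[1], d))     # key projection (assumed)
    A = (X @ Wq) @ (X @ Wk).T / np.sqrt(d)    # scaled attention logits
    A = np.exp(A - A.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)         # row-wise softmax
    importance = A.sum(axis=0)                # total attention each patch receives
    return np.argsort(importance)[::-1][:k]
```

The returned patch indices (or their coordinates) form a much smaller input vector for the evolved controller, which is how the parameter count can stay low even with raw-pixel observations.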
| Published in | arXiv.org |
|---|---|
| Main Authors | Khamesian, Saman; Malek, Hamed |
| Format | Paper; Journal Article |
| Language | English |
| Published | Ithaca: Cornell University Library, arXiv.org, 14.08.2022 |
| Subjects | Algorithms; Computer Science - Artificial Intelligence; Computer Science - Neural and Evolutionary Computing; Evolutionary algorithms; Topology |
| Online Access | Get full text |
| ISSN | 2331-8422 |
| DOI | 10.48550/arxiv.2112.03670 |
| Abstract | This article presents a "Hybrid Self-Attention NEAT" method to improve the original NeuroEvolution of Augmenting Topologies (NEAT) algorithm on high-dimensional inputs. Although NEAT has shown significant results in a range of challenging tasks, it cannot create a well-tuned network when input representations are high-dimensional. Our study addresses this limitation by using self-attention as an indirect encoding method to select the most important parts of the input. In addition, we improve overall performance with the help of a hybrid method to evolve the final network weights. The main conclusion is that Hybrid Self-Attention NEAT can eliminate the restriction of the original NEAT. The results indicate that, in comparison with evolutionary algorithms, our model achieves comparable scores in Atari games from raw-pixel input with a much lower number of parameters. |
|---|---|
| Author | Malek, Hamed; Khamesian, Saman |
| BackLink | https://doi.org/10.1007/s12530-023-09510-3 (view published paper; access to full text may be restricted); https://doi.org/10.48550/arXiv.2112.03670 (view paper in arXiv) |
| ContentType | Paper; Journal Article |
| Copyright | 2022. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. http://creativecommons.org/licenses/by/4.0 |
| DOI | 10.48550/arxiv.2112.03670 |
| Discipline | Physics |
| EISSN | 2331-8422 |
| ExternalDocumentID | 2112_03670 |
| Genre | Working Paper/Pre-Print |
| IsDoiOpenAccess | true |
| IsOpenAccess | true |
| IsPeerReviewed | false |
| IsScholarly | false |
| Language | English |
| OpenAccessLink | https://www.proquest.com/docview/2607938762?pq-origsite=%requestingapplication% |
| PublicationCentury | 2000 |
| PublicationDate | 20220814 |
| PublicationDateYYYYMMDD | 2022-08-14 |
| PublicationDecade | 2020 |
| PublicationPlace | Ithaca |
| PublicationTitle | arXiv.org |
| PublicationYear | 2022 |
| Publisher | Cornell University Library, arXiv.org |
| SecondaryResourceType | preprint |
| SourceID | arxiv proquest |
| SourceType | Open Access Repository Aggregation Database |
| SubjectTerms | Algorithms; Computer Science - Artificial Intelligence; Computer Science - Neural and Evolutionary Computing; Evolutionary algorithms; Topology |
| Title | Hybrid Self-Attention NEAT: A novel evolutionary approach to improve the NEAT algorithm |
| URI | https://www.proquest.com/docview/2607938762 https://arxiv.org/abs/2112.03670 |
| linkProvider | ProQuest |