Hybrid self-attention NEAT: a novel evolutionary self-attention approach to improve the NEAT algorithm in high dimensional inputs

Bibliographic Details
Published in: Evolving Systems, Vol. 15, No. 2, pp. 489–503
Main Authors: Khamesian, Saman; Malek, Hamed
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg, 01.04.2024 (Springer Nature B.V.)
ISSN: 1868-6478, 1868-6486
DOI: 10.1007/s12530-023-09510-3

Summary: This article presents a "Hybrid Self-Attention NEAT" method that improves the original NeuroEvolution of Augmenting Topologies (NEAT) algorithm on high-dimensional inputs. Although NEAT has achieved significant results on a range of challenging tasks, it cannot construct well-tuned networks when the input representation is high-dimensional. To overcome this limitation, the authors use self-attention as an indirect encoding method to select the most important parts of the input. The hyper-parameters of the self-attention module are tuned with the CMA-ES evolutionary algorithm, and an innovative method called Seesaw is introduced to evolve the NEAT and CMA-ES populations simultaneously. In addition to NEAT's evolutionary operators for updating weights, a combination method is used to reach better-fitting weights. The model is tested on a variety of Atari games. The results show that, compared to state-of-the-art evolutionary algorithms, Hybrid Self-Attention NEAT eliminates the restriction of the original NEAT and achieves comparable scores from raw pixel input while using far fewer parameters (approximately 300× fewer than HyperNEAT).
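The abstract describes self-attention as an indirect encoding that selects the most important parts of a high-dimensional input before the NEAT-evolved network sees it. The exact module in the paper is not reproduced here; the following is a minimal, hypothetical sketch of the general idea: flattened image patches are scored with a single attention head, and only the top-scoring patch indices are kept. The projection matrices `W_q` and `W_k` stand in for the parameters an evolutionary strategy such as CMA-ES would tune; their names and shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def top_k_patches(patches, W_q, W_k, k=3):
    """Score flattened image patches with one self-attention head and
    return the indices of the k most-attended patches.

    patches : (n, d) array, one flattened patch per row
    W_q, W_k : (d, d_attn) query/key projections -- in this sketch,
               the parameters an evolutionary strategy (e.g. CMA-ES)
               would tune; hypothetical, not the paper's exact module
    """
    q = patches @ W_q                       # queries, shape (n, d_attn)
    km = patches @ W_k                      # keys,    shape (n, d_attn)
    scores = q @ km.T / np.sqrt(W_q.shape[1])
    # row-wise softmax over keys
    scores = np.exp(scores - scores.max(axis=1, keepdims=True))
    scores /= scores.sum(axis=1, keepdims=True)
    importance = scores.sum(axis=0)         # total attention each patch receives
    return np.argsort(importance)[::-1][:k] # indices of the k top patches

# Toy usage: 16 random patches of 8 values each, pick the 3 most important.
rng = np.random.default_rng(0)
patches = rng.normal(size=(16, 8))
W_q = rng.normal(size=(8, 4))
W_k = rng.normal(size=(8, 4))
idx = top_k_patches(patches, W_q, W_k, k=3)
print(idx)
```

Only the selected patch indices (or their contents) would then be fed to the evolved network, which is what keeps the downstream parameter count small relative to processing the full raw input.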