Scale adaptive fitness evaluation‐based particle swarm optimisation for hyperparameter and architecture optimisation in neural networks and deep learning


Bibliographic Details
Published in CAAI Transactions on Intelligence Technology Vol. 8; no. 3; pp. 849-862
Main Authors Wang, Ye‐Qun, Li, Jian‐Yu, Chen, Chun‐Hua, Zhang, Jun, Zhan, Zhi‐Hui
Format Journal Article
Language English
Published Beijing John Wiley & Sons, Inc 01.09.2023
Wiley
ISSN 2468-6557, 2468-2322
DOI 10.1049/cit2.12106

Summary: Research into automatically searching for an optimal neural network (NN) by optimisation algorithms is a significant research topic in deep learning and artificial intelligence. However, this is still challenging due to two issues: both the hyperparameters and the architecture should be optimised, and the optimisation process is computationally expensive. To tackle these two issues, this paper focuses on solving the hyperparameter and architecture optimisation problem for the NN and proposes a novel light‐weight scale‐adaptive fitness evaluation‐based particle swarm optimisation (SAFE‐PSO) approach. Firstly, the SAFE‐PSO algorithm considers the hyperparameters and architecture together in the optimisation problem and can therefore find their optimal combination for the globally best NN. Secondly, the computational cost is reduced by using multi‐scale accuracy evaluation methods to evaluate candidates. Thirdly, a stagnation‐based switch strategy is proposed to adaptively switch between the different evaluation methods, better balancing search performance and computational cost. The SAFE‐PSO algorithm is tested on two widely used datasets: the 10‐category CIFAR10 and the 100‐category CIFAR100. The experimental results show that SAFE‐PSO is very effective and efficient: it not only finds a promising NN automatically but also finds a better NN than the compared algorithms at the same computational cost.
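The summary describes three mechanisms: joint optimisation of hyperparameters and architecture by PSO, multi-scale (cheap-to-expensive) fitness evaluation, and a stagnation-based switch between evaluation scales. The switching idea can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the toy sphere objective stands in for training a candidate network, the noise level models evaluation coarseness, and the PSO coefficients and all function names are illustrative assumptions.

```python
import random

def evaluate(position, scale):
    # Toy stand-in for multi-scale fitness evaluation: the true objective is a
    # sphere function (optimum at the origin), and coarser scales (smaller
    # values) add more noise, mimicking a cheap but rough training run.
    exact = -sum(x * x for x in position)
    return exact + random.gauss(0.0, 1.0 / scale)

def safe_pso(dim=2, swarm_size=10, iters=80, scales=(1, 4, 16), patience=5, seed=0):
    random.seed(seed)
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    level = 0  # start with the coarsest (cheapest) evaluation scale
    pbest = [p[:] for p in pos]
    pbest_f = [evaluate(p, scales[level]) for p in pos]
    g = max(range(swarm_size), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    stagnant = 0
    for _ in range(iters):
        improved = False
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.72 * vel[i][d]
                             + 1.49 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.49 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = evaluate(pos[i], scales[level])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
            if f > gbest_f:
                gbest, gbest_f = pos[i][:], f
                improved = True
        # Stagnation-based switch: after `patience` iterations with no
        # improvement of the global best, move to a finer (more accurate,
        # more expensive) evaluation scale.
        stagnant = 0 if improved else stagnant + 1
        if stagnant >= patience and level < len(scales) - 1:
            level += 1
            gbest_f = evaluate(gbest, scales[level])  # re-score at the new scale
            stagnant = 0
    return gbest, gbest_f, level
```

The design point the sketch captures is that early search tolerates noisy, cheap evaluations, and the budget for accurate evaluation is spent only once the swarm stops improving at the current scale.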