Training of the Artificial Neural Networks using States of Matter Search Algorithm

Bibliographic Details
Published in: International Journal of Intelligent Systems and Applications in Engineering, Vol. 8, No. 3, pp. 131-136
Main Author: Gulcu, Saban
Format: Journal Article
Language: English
Published: Selçuk Üniversitesi, 28.09.2020
ISSN: 2147-6799
DOI: 10.18201/ijisae.2020363532


More Information
Summary: In recent years, technology in the field of artificial intelligence has developed very rapidly, and artificial neural networks (ANNs) occupy a central place in this development. The human brain has an excellent capacity for understanding, which it achieves through its neurons. ANNs aim to solve complex problems by modeling this neuron-based perception structure of the human brain in a computer environment. A multilayer perceptron (MLP) is a class of artificial neural network that can learn from inputs and expected outputs: the weight values of the MLP are repeatedly updated according to the inputs and the expected outputs so that they converge toward optimal values. Training an MLP is therefore an optimization problem. In this study, the States of Matter Search (SMS) meta-heuristic algorithm was used to optimize the weights of an MLP; the resulting method is called SMS-MLP. In the experiments, five classification datasets (XOR, balloon, iris, breast cancer, heart) were used. The SMS-MLP algorithm was compared with six algorithms from the literature (GWO-MLP, ACO-MLP, GA-MLP, PBIL-MLP, PSO-MLP and ES-MLP). The experimental study shows that the SMS-MLP algorithm is more efficient than the other six algorithms.
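The abstract describes encoding an MLP's weights as a vector and letting a population-based metaheuristic minimize the training error. The sketch below is not the paper's SMS-MLP implementation; it is a minimal, hypothetical Python illustration of that idea on the XOR benchmark, with an assumed 2-4-1 topology, sigmoid activations, mean-squared-error fitness, and a simplified SMS-style search whose step size shrinks over iterations (loosely mimicking the gas, liquid, and solid phases).

```python
# Minimal sketch (assumed, not the paper's code): train MLP weights with a
# population-based metaheuristic by minimizing mean squared error on XOR.
import numpy as np

rng = np.random.default_rng(0)

# XOR dataset, one of the five benchmark problems mentioned in the abstract.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1                         # assumed 2-4-1 topology
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT   # total weights and biases

def decode(vec):
    """Split a flat weight vector into the MLP's weight matrices and biases."""
    i = 0
    W1 = vec[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = vec[i:i + N_HID]; i += N_HID
    W2 = vec[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = vec[i:i + N_OUT]
    return W1, b1, W2, b2

def mse(vec):
    """Fitness: mean squared error of the MLP defined by `vec` on the dataset."""
    W1, b1, W2, b2 = decode(vec)
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))          # sigmoid hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))        # sigmoid output layer
    return float(np.mean((out.ravel() - y) ** 2))

# Simplified SMS-style loop: candidates move toward the current best solution
# with a step size that shrinks over time (broad "gas" exploration early,
# fine "solid" exploitation late). Greedy replacement keeps improvements.
POP, ITERS = 30, 300
pop = rng.uniform(-1.0, 1.0, size=(POP, DIM))
fit = np.array([mse(p) for p in pop])

for t in range(ITERS):
    best = pop[fit.argmin()]
    step = (1.0 - t / ITERS) + 0.05                   # shrinking exploration radius
    for k in range(POP):
        trial = pop[k] + rng.normal(0.0, step, DIM) * (best - pop[k] + 1e-3)
        f = mse(trial)
        if f < fit[k]:
            pop[k], fit[k] = trial, f

print("best MSE on XOR:", fit.min())
```

All names, the network size, and the search hyperparameters here are illustrative assumptions; the paper's actual SMS-MLP uses the full States of Matter Search operators and evaluates the five listed datasets.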