Impact and Performance Analysis of Various Activation Functions for Classification Problems

Bibliographic Details
Published in: 2023 IEEE International Conference on Contemporary Computing and Communications (InC4), Vol. 1, pp. 1-7
Main Authors: Singh, Yuvraj; Saini, Madan; Savita
Format: Conference Proceeding
Language: English
Published: IEEE, 21.04.2023
DOI: 10.1109/InC457730.2023.10263129


More Information
Summary: The human brain serves as the inspiration for artificial neural networks in deep learning, which have a structure and operation similar to the brain's network of neurons. These networks consist of layers of connected cells that process information much as real neurons do. An activation function plays an important role in these deep-learning-based neural network architectures. This paper examines the impact and performance of various activation functions: the step function, sigmoid, tanh, softsign, softmax, ReLU and its variants. These activation functions are tested on four publicly available biomedical data sets. The paper studies the impact of each activation function and identifies the best one for binary and multi-class classification problems in deep artificial neural networks. The results show that sigmoid performs well for binary classification and softmax performs well for multi-class classification tasks. However, building a successful neural network also depends on the choice of parameter settings; in this study, all other parameters are kept unchanged.
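For reference, a minimal NumPy sketch of the activation functions the paper compares; this code is not from the paper, and the leaky-ReLU coefficient and demo logits below are illustrative assumptions rather than values from the study.

```python
import numpy as np

# Standard textbook forms of the activation functions compared in the paper.

def step(x):
    """Binary step: 1 for non-negative input, else 0."""
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    """Squashes input to (0, 1); the paper reports it works well for binary classification."""
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    """Squashes input to (-1, 1)."""
    return np.tanh(x)

def softsign(x):
    """Smooth alternative to tanh with polynomial (slower) saturation."""
    return x / (1.0 + np.abs(x))

def relu(x):
    """Rectified linear unit: max(0, x)."""
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    """One common ReLU variant; alpha = 0.01 is an assumed default, not from the paper."""
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    """Normalizes logits to a probability distribution; the paper reports it works well for multi-class classification."""
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical 3-class logit vector for illustration.
logits = np.array([2.0, 0.5, -1.0])
print(softmax(logits))   # class probabilities summing to 1
print(sigmoid(logits))   # independent per-logit probabilities
```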