Fast neural network algorithm for solving classification tasks: Batch error back-propagation algorithm


Bibliographic Details
Published in: 2013 Proceedings of IEEE Southeastcon, pp. 1-8
Main Authors: Albarakati, Noor; Kecman, Vojislav
Format: Conference Proceeding
Language: English
Published: IEEE, 01.04.2013
ISBN: 9781479900527, 1479900524
ISSN: 1091-0050
DOI: 10.1109/SECON.2013.6567409


More Information
Summary: Classification is one of several applications in the neural network (NN) world. The multilayer perceptron (MLP) is a common neural network architecture used for classification tasks. It is known for its error back-propagation (EBP) algorithm, which opened a new way for solving classification problems given a set of empirical data. In this paper, we performed experiments using three different NN structures in order to find the best performing MLP neural network for the nonlinear classification of multiclass data sets. The three MLP structures for solving classification problems having K classes are: one model/K output layer neurons, K separate models/one output layer neuron, and K joint models/one output layer neuron. The learning algorithm developed and used here is the batch EBP algorithm, which uses all the data as a single batch while updating the NN weights. Batch EBP significantly speeds up training. The use of a pseudo-inverse in calculating the output layer weights also contributes to faster training. The extensive series of experiments performed within this research showed that the best structure for solving multiclass classification problems is the K joint models/one output layer neuron structure.
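
To make the described training scheme concrete, below is a minimal NumPy sketch of batch EBP for a single-hidden-layer MLP with K output neurons (the one model/K output layer neurons structure): every weight update uses the entire data set as one batch, the hidden-layer weights are adjusted by gradient descent, and the output-layer weights are recomputed with a pseudo-inverse, as the summary describes. The function names, tanh activation, one-hot targets, and all hyperparameters are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def train_batch_ebp(X, y, n_hidden=10, n_epochs=200, lr=0.1, seed=0):
        # Batch EBP sketch: each update uses the whole data set as one
        # batch; hidden weights learn by gradient descent, output weights
        # are recomputed in closed form with a pseudo-inverse.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        K = int(y.max()) + 1
        T = np.eye(K)[y]                        # one-hot targets, shape (n, K)
        Xb = np.hstack([X, np.ones((n, 1))])    # append a bias input
        V = rng.normal(scale=0.1, size=(d + 1, n_hidden))  # hidden weights

        for _ in range(n_epochs):
            H = np.tanh(Xb @ V)                 # hidden activations, (n, n_hidden)
            W = np.linalg.pinv(H) @ T           # output weights via pseudo-inverse
            E = H @ W - T                       # output error for the whole batch
            delta_h = (E @ W.T) * (1.0 - H**2)  # back-propagate through tanh
            V -= lr * (Xb.T @ delta_h) / n      # single batch gradient step
        return V, W

    def predict(X, V, W):
        # Predicted class = output neuron with the largest activation.
        Xb = np.hstack([X, np.ones((len(X), 1))])
        return np.argmax(np.tanh(Xb @ V) @ W, axis=1)

For a data matrix X of shape (n, d) with integer labels y in {0, ..., K-1}, calling V, W = train_batch_ebp(X, y) trains the network, and predict(X, V, W) assigns each sample to the class whose output neuron fires strongest.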