Review of Deep Learning Algorithms and Architectures

Bibliographic Details
Published in: IEEE Access, Vol. 7, p. 1
Main Authors: Shrestha, Ajay; Mahmood, Ausif
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.01.2019
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2912200


More Information
Summary: Deep learning (DL) is playing an increasingly important role in our lives. It has already made a huge impact in areas such as cancer diagnosis, precision medicine, self-driving cars, predictive forecasting, and speech recognition. The painstakingly handcrafted feature extractors used in traditional learning, classification, and pattern recognition systems do not scale to large data sets. Depending on the problem complexity, deep learning can also overcome the limitations of earlier shallow networks that prevented efficient training and the abstraction of hierarchical representations of multi-dimensional training data. A deep neural network (DNN) uses multiple (deep) layers of units with highly optimized algorithms and architectures. The paper reviews several optimization methods that improve training accuracy and reduce training time, delves into the math behind the training algorithms used in recent deep networks, and describes their current shortcomings, enhancements, and implementations. The review also covers different types of deep architectures, such as deep convolutional networks, deep residual networks, recurrent neural networks, reinforcement learning, variational autoencoders, and others.
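As an illustration only (not code from the paper under review), the idea the summary describes, namely that a deep network learns hierarchical representations by gradient-based training rather than relying on handcrafted feature extractors, can be sketched as a tiny multi-layer network trained with backpropagation; all names and sizes below are arbitrary choices for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: a classic task a shallow linear model cannot solve,
# but a network with hidden layers can.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Layer sizes: 2 inputs -> 8 hidden -> 8 hidden -> 1 output.
W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 8)); b2 = np.zeros(8)
W3 = rng.normal(0, 1, (8, 1)); b3 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
losses = []
for step in range(2000):
    # Forward pass: each layer transforms the previous layer's output.
    h1 = sigmoid(X @ W1 + b1)
    h2 = sigmoid(h1 @ W2 + b2)
    out = sigmoid(h2 @ W3 + b3)

    # Mean squared error loss.
    loss = np.mean((out - y) ** 2)
    losses.append(loss)

    # Backward pass: apply the chain rule layer by layer.
    d_out = 2 * (out - y) / len(X) * out * (1 - out)
    d_h2 = (d_out @ W3.T) * h2 * (1 - h2)
    d_h1 = (d_h2 @ W2.T) * h1 * (1 - h1)

    # Gradient-descent parameter updates.
    W3 -= lr * h2.T @ d_out; b3 -= lr * d_out.sum(0)
    W2 -= lr * h1.T @ d_h2;  b2 -= lr * d_h2.sum(0)
    W1 -= lr * X.T @ d_h1;   b1 -= lr * d_h1.sum(0)

print(f"initial loss {losses[0]:.3f}, final loss {losses[-1]:.3f}")
```

The hidden layers act as learned feature extractors: nothing about XOR's structure is hand-coded, yet the loss falls as training discovers an internal representation that makes the problem separable.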
Bibliography: ObjectType-Article-1; SourceType-Scholarly Journals-1; ObjectType-Feature-2
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2912200