Analysis of Weights and Feature Patterns in Popular 2D Deep Neural Networks Models for MRI Image Classification

Bibliographic Details
Published in: Journal of Multimedia Information System, Vol. 9, No. 3, pp. 177-182
Main Authors: Khagi, Bijen; Kwon, Goo-Rak
Format: Journal Article
Language: English
Published: Korea Multimedia Society (한국멀티미디어학회), 30.09.2022
ISSN: 2383-7632
DOI: 10.33851/JMIS.2022.9.3.177


Summary: A deep neural network (DNN) includes variables whose values keep changing during training until they reach the final point of convergence. These variables are the coefficients of a polynomial-like expression that relates to the feature extraction process. In general, DNNs work in multiple 'dimensions' depending upon the number of channels and batches used for training. However, after feature extraction and before entering the SoftMax or another classifier, the features are converted from multiple N dimensions into a single vector form, where 'N' represents the number of activation channels. This usually happens in a fully connected layer (FCL), also called a dense layer. This reduced feature is the subject of our analysis. For this, we use the FCL, so the trained weights of this FCL are used for the weight-class correlation analysis. The popular DNN models selected for our study are ResNet-101, VGG-19, and GoogleNet. These models' weights are used both for fine-tuning (with all trained weights initially transferred) and for training from scratch (with no weights transferred). The comparison is then made by plotting the distribution of the features and of the final FCL weights. KCI Citation Count: 0
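The conversion the abstract describes, from N-channel activation maps to a single vector that feeds a fully connected layer, can be sketched as follows. This is a minimal illustration, not the authors' code: the shapes (2048 channels, 7x7 maps, mirroring ResNet-101's final stage), the three-class output, and the random weights are all assumptions for demonstration.

```python
import numpy as np

# Assumed shapes: 2048 activation channels of 7x7 maps, as in ResNet-101's
# last convolutional stage, and a hypothetical 3-class MRI problem.
num_channels, num_classes = 2048, 3

rng = np.random.default_rng(0)
feature_maps = rng.standard_normal((num_channels, 7, 7))  # N-dimensional features

# Global average pooling + flattening: N dimensions -> one vector of length N.
feature_vec = feature_maps.mean(axis=(1, 2))              # shape (2048,)

# The FCL holds one weight row per class; these trained rows are what the
# paper's weight-class correlation analysis examines (random here).
W = rng.standard_normal((num_classes, num_channels))
b = np.zeros(num_classes)
logits = W @ feature_vec + b                              # shape (3,)
print(feature_vec.shape, logits.shape)
```

In a real fine-tuning versus scratch-training comparison, `W` would be read out of the trained model's final dense layer rather than sampled randomly, and its rows plotted against the per-class feature distributions.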