Automatic Modulation Classification Based on CNN-Transformer Graph Neural Network

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 23, No. 16, p. 7281
Main Authors: Wang, Dong; Lin, Meiyan; Zhang, Xiaoxu; Huang, Yonghui; Zhu, Yan
Format: Journal Article
Language: English
Published: Basel: MDPI AG, 20.08.2023
ISSN: 1424-8220
DOI: 10.3390/s23167281

Summary: In recent years, neural network algorithms have demonstrated tremendous potential for modulation classification. Deep learning methods typically take raw signals, or signals converted into time–frequency images, as inputs to convolutional neural networks (CNNs) or recurrent neural networks (RNNs). However, with the advancement of graph neural networks (GNNs), a new approach has emerged: transforming time-series data into graph structures. In this study, we propose a CNN-transformer graph neural network (CTGNet) for modulation classification to uncover complex representations in signal data. First, we apply sliding-window processing to the original signals, obtaining signal subsequences and reorganizing them into a signal subsequence matrix. We then employ CTGNet, which adaptively maps the preprocessed signal matrices into graph structures, and use a graph neural network based on GraphSAGE and DMoNPool for classification. Extensive experiments demonstrated that our method outperformed advanced deep learning techniques, achieving the highest recognition accuracy. This underscores CTGNet's significant advantage in capturing key features in signal data and provides an effective solution for modulation classification tasks.
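
The pipeline described in the summary can be illustrated in two pieces. First, the sliding-window preprocessing: a raw 1-D signal is cut into overlapping subsequences that are stacked row-wise into a subsequence matrix. A minimal NumPy sketch; the window and stride values here are illustrative, not taken from the paper:

```python
import numpy as np

def to_subsequence_matrix(signal: np.ndarray, window: int, stride: int) -> np.ndarray:
    """Slice a 1-D signal into overlapping windows and stack them
    row-wise into a (num_windows, window) subsequence matrix."""
    num_windows = (len(signal) - window) // stride + 1
    return np.stack([signal[i * stride : i * stride + window]
                     for i in range(num_windows)])

# Example: a 128-sample signal with window 32 and stride 16 gives a
# 7 x 32 matrix; its rows can later serve as graph nodes.
x = np.random.randn(128)
m = to_subsequence_matrix(x, window=32, stride=16)
print(m.shape)  # (7, 32)
```

Second, the classification stage combines GraphSAGE message passing with DMoN pooling. The summary does not detail the learned CNN-transformer graph mapping, so the sketch below assumes a graph (node features plus edges) is already given; it uses PyTorch Geometric's SAGEConv and DMoNPooling (the library's name for DMoNPool), and all layer sizes and the class count are illustrative assumptions, not the paper's exact CTGNet architecture:

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import SAGEConv, DMoNPooling
from torch_geometric.utils import to_dense_batch, to_dense_adj

class SageDmonClassifier(torch.nn.Module):
    """Hypothetical GraphSAGE + DMoN pooling classifier (a sketch,
    not the paper's exact model)."""
    def __init__(self, in_dim=32, hidden=64, clusters=8, num_classes=11):
        super().__init__()
        self.conv = SAGEConv(in_dim, hidden)       # GraphSAGE neighborhood aggregation
        self.pool = DMoNPooling(hidden, clusters)  # cluster-based DMoN pooling
        self.lin = torch.nn.Linear(hidden, num_classes)

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv(x, edge_index))
        # DMoNPooling operates on dense batches:
        # [B, N, F] node features and [B, N, N] adjacency.
        x_d, mask = to_dense_batch(x, batch)
        adj = to_dense_adj(edge_index, batch=batch)
        _, x_d, _, spectral_loss, ortho_loss, cluster_loss = self.pool(x_d, adj, mask)
        logits = self.lin(x_d.mean(dim=1))         # mean readout over clusters
        aux_loss = spectral_loss + ortho_loss + cluster_loss
        return F.log_softmax(logits, dim=-1), aux_loss
```

During training, the auxiliary DMoN losses returned by the pooling layer are typically added to the classification loss so that the pooling learns a meaningful clustering of the graph nodes.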