Xu, Yuewen (2025) AMC-Transformer: Automatic Modulation Classification based on Enhanced Attention Model. INFOCOMMUNICATIONS JOURNAL, 17 (4). pp. 32-40. ISSN 2061-2079
Text: InfocomJournal_2025_4_5.pdf - Published Version (2MB)
Abstract
High-accuracy automatic modulation classification (AMC) is essential for spectrum monitoring and interference-aware access in future 6G systems [1]. We propose AMC-Transformer, which tokenizes raw I/Q sequences into fixed-length patches, augments them with learnable positional embeddings, and applies multi-layer, multi-head self-attention to capture global temporal–spatial correlations without handcrafted features or convolutions. On RadioML2018.01A, our model achieves 98.8% accuracy in the high-SNR regime (SNR of at least 10 dB), outperforming a CNN and a ResNet reimplementation by 4.44% and 1.96% in relative terms; averaged across all SNRs, it also improves upon MCformer, CNN, and ResNet baselines. Consistent gains are observed on the RadioML2016.10A dataset, further validating robustness across benchmarks. Ablations on depth, patch size, and head count provide practical guidance under different SNR regimes and compute budgets. These results demonstrate the promise of transformer-based AMC for robust recognition in complex wireless environments.
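The patch-tokenization step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the frame length of 1024 matches RadioML2018.01A, but the patch length of 16, the function name `tokenize_iq`, and the random stand-in for the learned positional embeddings are all assumptions made for the example.

```python
import numpy as np

def tokenize_iq(iq, patch_len):
    """Split a (2, N) I/Q sequence into fixed-length patch tokens.

    Each token flattens patch_len consecutive samples' I and Q
    components, giving an array of shape (num_patches, 2 * patch_len).
    """
    channels, n = iq.shape
    num_patches = n // patch_len
    iq = iq[:, :num_patches * patch_len]          # drop any remainder
    # (2, num_patches, patch_len) -> (num_patches, 2 * patch_len)
    patches = iq.reshape(channels, num_patches, patch_len)
    return patches.transpose(1, 0, 2).reshape(num_patches, channels * patch_len)

rng = np.random.default_rng(0)
iq = rng.standard_normal((2, 1024))               # one RadioML2018.01A-style frame
tokens = tokenize_iq(iq, patch_len=16)            # 64 tokens of dimension 32
pos = rng.standard_normal(tokens.shape) * 0.02    # stand-in for learned positional embeddings
x = tokens + pos                                  # sequence fed to the transformer encoder
print(x.shape)                                    # (64, 32)
```

In the model proper, `x` would then pass through multi-layer, multi-head self-attention and a classification head over the modulation classes.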
| Item Type: | Article |
|---|---|
| Uncontrolled Keywords: | Modulation Recognition, Deep Learning, Transformer, Attention Mechanism, IQ Signal |
| Subjects: | T Technology / applied and engineering sciences > T2 Technology (General) / engineering sciences in general |
| SWORD Depositor: | MTMT SWORD |
| Depositing User: | MTMT SWORD |
| Date Deposited: | 28 Jan 2026 15:21 |
| Last Modified: | 28 Jan 2026 15:21 |
| URI: | https://real.mtak.hu/id/eprint/232834 |




