Ganie, Aadil and Dadvandipour, Samad (2024) Exploring noise-induced techniques to strengthen deep learning model resilience. POLLACK PERIODICA: AN INTERNATIONAL JOURNAL FOR ENGINEERING AND INFORMATION SCIENCES, 19 (3). pp. 8-13. ISSN 1788-1994
Text: article-p8.pdf - Published Version, available under License Creative Commons Attribution (1 MB).
Abstract
In artificial intelligence, combating overfitting and enhancing model generalization are crucial. This research explores noise-induced regularization techniques, focusing on natural language processing tasks. Inspired by gradient noise and Dropout, the study investigates the interplay between controlled noise, model complexity, and overfitting prevention. Using long short-term memory and bidirectional long short-term memory architectures, it examines the impact of noise-induced regularization on robustness to noisy input data. Through extensive experimentation, the study shows that introducing controlled noise improves model generalization, especially in language understanding. This work contributes to the theoretical understanding of noise-induced regularization, advancing reliable and adaptable artificial intelligence systems for natural language processing.
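The abstract names two noise-injection mechanisms (Dropout on activations and Gaussian noise added to gradients) applied to LSTM/BiLSTM models. A minimal sketch of how the two might be combined, assuming a PyTorch bidirectional-LSTM text classifier with illustrative hyperparameters (`noise_std`, `hidden_size`, vocabulary size and the dummy batch are assumptions, not the paper's settings), is shown below.

```python
# Illustrative sketch only: Dropout plus Gaussian gradient noise on a BiLSTM
# text classifier. Architecture and hyperparameters are assumptions for
# demonstration and do not reproduce the paper's exact setup.
import torch
import torch.nn as nn


class BiLSTMClassifier(nn.Module):
    def __init__(self, vocab_size=10_000, embed_dim=128, hidden_size=256,
                 num_classes=2, dropout=0.5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_size, batch_first=True,
                            bidirectional=True)
        self.dropout = nn.Dropout(dropout)      # noise on pooled activations
        self.fc = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)
        output, _ = self.lstm(embedded)          # (batch, seq, 2 * hidden)
        pooled = output.mean(dim=1)              # mean-pool over time steps
        return self.fc(self.dropout(pooled))


def train_step(model, optimizer, batch, labels, noise_std=0.01):
    """One update with zero-mean Gaussian noise injected into the gradients."""
    model.train()
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(batch), labels)
    loss.backward()
    # Gradient-noise regularization: perturb every gradient before the update.
    with torch.no_grad():
        for param in model.parameters():
            if param.grad is not None:
                param.grad.add_(torch.randn_like(param.grad) * noise_std)
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    model = BiLSTMClassifier()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    tokens = torch.randint(0, 10_000, (8, 32))   # dummy batch: 8 sequences, 32 tokens
    labels = torch.randint(0, 2, (8,))
    print(train_step(model, optimizer, tokens, labels))
```

In this sketch, Dropout randomly zeroes pooled activations during training, while the gradient perturbation adds controlled noise to the optimization signal itself; a fixed `noise_std` is used here for simplicity, though annealed schedules are common in the gradient-noise literature.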
Item Type: | Article |
---|---|
Subjects: | T Technology / applied and technical sciences > TA Engineering (General). Civil engineering (General) / general engineering sciences |
SWORD Depositor: | MTMT SWORD |
Depositing User: | MTMT SWORD |
Date Deposited: | 14 Nov 2024 08:40 |
Last Modified: | 14 Nov 2024 08:40 |
URI: | https://real.mtak.hu/id/eprint/209565 |