Yolchuyeva, Sevinj and Németh, Géza and Gyires-Tóth, Bálint (2019) Self-Attention Networks for Intent Detection. In: Recent Advances in Natural Language Processing, Sep 2–4 2019, Varna, Bulgaria.
Full text: RANLP2019.pdf - Published Version (478kB). Restricted to registered users only until 10 January 2034.
Abstract
Self-attention networks (SAN) have shown promising performance in various Natural Language Processing (NLP) scenarios, especially in machine translation. One of the main strengths of SANs is their ability to capture long-range and multi-scale dependencies in the data. In this paper, we present a novel intent detection system based on a self-attention network and a Bi-LSTM. Our approach improves on previous solutions by using a transformer model and a deep averaging network-based universal sentence encoder. We evaluate the system on the Snips, Smart Speaker, Smart Lights, and ATIS datasets using different evaluation metrics, and compare the performance of the proposed model with an LSTM baseline on the same datasets.
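The abstract describes the architecture only at a high level. The following is a minimal sketch, not the authors' released code, of an intent classifier that combines a Bi-LSTM encoder with multi-head self-attention; all hyperparameters (embedding size, hidden size, number of heads, pooling choice) are assumptions for illustration.

```python
# Minimal sketch of a Bi-LSTM + self-attention intent classifier (assumed hyperparameters).
import torch
import torch.nn as nn


class SelfAttentionIntentClassifier(nn.Module):
    def __init__(self, vocab_size, num_intents, embed_dim=128, hidden_dim=128, num_heads=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Bi-LSTM encoder over the token embeddings.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        # Multi-head self-attention over the Bi-LSTM outputs (2 * hidden_dim features).
        self.self_attn = nn.MultiheadAttention(2 * hidden_dim, num_heads, batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_intents)

    def forward(self, token_ids):
        x = self.embedding(token_ids)          # (batch, seq_len, embed_dim)
        h, _ = self.bilstm(x)                  # (batch, seq_len, 2 * hidden_dim)
        attn_out, _ = self.self_attn(h, h, h)  # self-attention: queries = keys = values
        pooled = attn_out.mean(dim=1)          # average over time steps
        return self.classifier(pooled)         # (batch, num_intents) intent logits


# Example: score a batch of two padded utterances against 7 hypothetical intent classes.
model = SelfAttentionIntentClassifier(vocab_size=10000, num_intents=7)
logits = model(torch.randint(1, 10000, (2, 12)))
print(logits.shape)  # torch.Size([2, 7])
```

In this sketch the sentence representation is a simple mean over the attended hidden states; the paper additionally uses a universal sentence encoder, which is not reproduced here.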
| Item Type: | Conference or Workshop Item (Paper) |
|---|---|
| Subjects: | T Technology / applied and engineering sciences > T2 Technology (General) / engineering sciences in general |
| Depositing User: | Dr. Gyires-Tóth Bálint Pál |
| Date Deposited: | 25 Sep 2019 21:48 |
| Last Modified: | 25 Sep 2019 21:48 |
| URI: | http://real.mtak.hu/id/eprint/101533 |