e-ISSN:0976-5166
p-ISSN:2231-3850


INDIAN JOURNAL OF COMPUTER SCIENCE AND ENGINEERING


ABSTRACT

Title : ENHANCING LEARNING IN SPOKEN LANGUAGE UNDERSTANDING BY MITIGATING INTERNAL COVARIANT SHIFT IN LEARNING MODELS
Authors : Sheetal Jagdale, Milind Shah
Keywords : Internal covariant shift; spoken language understanding; learning models; machine learning.
Issue Date : Jan-Feb 2021
Abstract :
Spoken Language Understanding (SLU) is a component of a Spoken Dialogue System (SDS) that assists in achieving the user's goal. SLU maps a user utterance to a logical structure that can be easily understood by the computer. SLU accomplishes this task using deep learning models from machine learning, such as the Convolutional Neural Network (CNN). These machine-learning models suffer from Internal Covariant Shift (ICS). Due to ICS, the learning efficiency of the deep model is reduced, which in turn reduces the learning efficiency of SLU. ICS in SLU can be reduced by normalization techniques proposed in machine learning: batch normalization, cosine normalization, group normalization, layer normalization, and instance normalization. The work in this paper extends these techniques to a deep model in SLU to mitigate ICS and to enhance the performance of SLU. The results show improvement in SLU performance with normalization, and the speed of training is also improved. The evaluation parameters are F-score, accuracy, and detection rate. When the SLU is trained with less training data, SLU with instance normalization displayed good results. SLU with instance normalization displayed minimum variation in the learning curve. SLU with batch normalization displayed better accuracy, detection rate, and F-score for both the area and price slots. In terms of training time, SLU with cosine normalization and batch normalization required the least time.
Page(s) : 297-305
ISSN : 0976-5166
Source : Vol. 12, No.1
PDF : Download
DOI : 10.21817/indjcse/2021/v12i1/211201239
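
Illustration : The abstract describes inserting normalization layers into a CNN-based deep model for SLU to mitigate internal covariant shift. The paper does not publish its architecture, so the PyTorch sketch below is only an assumed, minimal example of the idea: a convolutional slot tagger in which batch, layer, instance, or group normalization can be swapped in after the convolution. Cosine normalization has no built-in PyTorch layer and is omitted here; the layer sizes, slot label count, and all hyperparameters are placeholders, not the authors' settings.

# Minimal sketch only: architecture, sizes, and slot labels are assumptions,
# not the configuration used in the paper.
import torch
import torch.nn as nn

class SlotFillingCNN(nn.Module):
    """A small CNN tagger for SLU slot filling with a selectable
    normalization layer to mitigate internal covariant shift."""

    def __init__(self, vocab_size, embed_dim=100, num_filters=64,
                 kernel_size=3, num_slot_labels=10, norm="batch"):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # 1-D convolution over the token sequence; padding keeps the length.
        self.conv = nn.Conv1d(embed_dim, num_filters, kernel_size,
                              padding=kernel_size // 2)
        # Normalization variants compared in the paper (except cosine
        # normalization, which has no built-in PyTorch layer).
        if norm == "batch":
            self.norm = nn.BatchNorm1d(num_filters)
        elif norm == "layer":
            # GroupNorm with one group normalizes across all channels
            # per sample (LayerNorm-style for convolutional features).
            self.norm = nn.GroupNorm(1, num_filters)
        elif norm == "instance":
            self.norm = nn.InstanceNorm1d(num_filters)
        elif norm == "group":
            self.norm = nn.GroupNorm(8, num_filters)
        else:
            self.norm = nn.Identity()
        self.relu = nn.ReLU()
        self.classifier = nn.Linear(num_filters, num_slot_labels)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        x = self.embedding(token_ids)           # (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)                   # (batch, embed_dim, seq_len)
        x = self.relu(self.norm(self.conv(x)))  # normalize convolution output
        x = x.transpose(1, 2)                   # (batch, seq_len, num_filters)
        return self.classifier(x)               # per-token slot logits

# Usage example: tag a batch of two 12-token utterances.
model = SlotFillingCNN(vocab_size=5000, norm="batch")
logits = model(torch.randint(0, 5000, (2, 12)))
print(logits.shape)  # torch.Size([2, 12, 10])

Swapping the norm argument between "batch", "layer", "instance", and "group" is one way the normalization comparison reported in the abstract could be reproduced on a slot-filling dataset.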