TR2016-074

Context-Sensitive and Role-Dependent Spoken Language Understanding using Bidirectional and Attention LSTMs


    •  Hori, C., Hori, T., Watanabe, S., Hershey, J.R., "Context-Sensitive and Role-Dependent Spoken Language Understanding using Bidirectional and Attention LSTMs", Interspeech, DOI: 10.21437/Interspeech.2016-1171, September 2016, pp. 3236-3240.
      BibTeX:

      @inproceedings{Hori2016sep,
        author = {Hori, Chiori and Hori, Takaaki and Watanabe, Shinji and Hershey, John R.},
        title = {Context-Sensitive and Role-Dependent Spoken Language Understanding using Bidirectional and Attention LSTMs},
        booktitle = {Interspeech},
        year = 2016,
        pages = {3236--3240},
        month = sep,
        doi = {10.21437/Interspeech.2016-1171},
        url = {https://www.merl.com/publications/TR2016-074}
      }
Research Areas: Artificial Intelligence, Speech & Audio

Abstract:

To understand speaker intentions accurately in a dialog, it is important to consider the context of the surrounding sequence of dialog turns. Furthermore, each speaker may play a different role in the conversation, such as agent versus client, and features related to these roles may be important to the context. In previous work, we proposed context-sensitive spoken language understanding (SLU) using role-dependent long short-term memory (LSTM) recurrent neural networks (RNNs), and showed improved performance at predicting concept tags representing the intentions of agent and client in a human-human hotel reservation task. In the present study, we use bidirectional and attention-based LSTMs to train a role-dependent, context-sensitive model that jointly represents both the local word-level context within each utterance and the left and right context within the dialog. The different roles of client and agent are modeled by switching between role-dependent layers. We evaluated label accuracies in the hotel reservation task using a variety of models, including logistic regression, RNNs, LSTMs, and the proposed bidirectional and attention-based LSTMs. The bidirectional and attention-based LSTMs yield significantly better performance in this task.
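
The architecture described in the abstract can be illustrated with a short sketch. Below is a minimal PyTorch example, not the authors' implementation: a bidirectional LSTM encodes word-level context within each utterance, a simple learned attention summarizes the utterance, and dialog context is carried by separate agent and client LSTM layers that are switched according to the speaker role. All module names, hyperparameters, and the exact attention form are assumptions made for illustration; the paper also models left and right dialog context bidirectionally, whereas this sketch keeps a single left-to-right dialog LSTM for brevity.

    # Minimal sketch of a role-dependent, context-sensitive SLU tagger,
    # assuming PyTorch. Layer names, sizes, and the attention form are
    # illustrative assumptions, not the authors' exact architecture.
    import torch
    import torch.nn as nn

    class RoleDependentSLU(nn.Module):
        def __init__(self, vocab_size, n_tags, emb_dim=64, hid_dim=128):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            # Bidirectional LSTM: left and right word-level context
            # within each utterance.
            self.word_lstm = nn.LSTM(emb_dim, hid_dim,
                                     bidirectional=True, batch_first=True)
            # Simple learned attention over word states (illustrative).
            self.attn = nn.Linear(2 * hid_dim, 1)
            # One dialog-level LSTM per speaker role; the model switches
            # between them turn by turn.
            self.role_lstms = nn.ModuleDict({
                "agent": nn.LSTMCell(2 * hid_dim, hid_dim),
                "client": nn.LSTMCell(2 * hid_dim, hid_dim),
            })
            self.classifier = nn.Linear(3 * hid_dim, n_tags)

        def forward(self, dialog):
            # dialog: list of (role, word_ids) turns, word_ids of shape (1, T).
            h = torch.zeros(1, self.role_lstms["agent"].hidden_size)
            c = torch.zeros_like(h)
            tag_scores = []
            for role, words in dialog:
                states, _ = self.word_lstm(self.embed(words))      # (1, T, 2H)
                # Attention-weighted utterance summary.
                weights = torch.softmax(self.attn(states), dim=1)  # (1, T, 1)
                utt_vec = (weights * states).sum(dim=1)            # (1, 2H)
                # Update the dialog context with the LSTM layer that
                # matches the current speaker's role.
                h, c = self.role_lstms[role](utt_vec, (h, c))
                # Tag each word from its local plus dialog-level context.
                ctx = h.unsqueeze(1).expand(-1, states.size(1), -1)
                tag_scores.append(self.classifier(
                    torch.cat([states, ctx], dim=-1)))             # (1, T, n_tags)
            return tag_scores

A hypothetical usage, with toy vocabulary and tag-set sizes:

    model = RoleDependentSLU(vocab_size=1000, n_tags=20)
    dialog = [("agent", torch.randint(0, 1000, (1, 7))),
              ("client", torch.randint(0, 1000, (1, 5)))]
    scores = model(dialog)  # one (1, T, n_tags) tensor per turn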