Models, code, and papers for "Doyeon Kong":

Self-Attention-Based Message-Relevant Response Generation for Neural Conversation Model

May 23, 2018
Jonggu Kim, Doyeon Kong, Jong-Hyeok Lee

Using a sequence-to-sequence framework, many neural conversation models for chit-chat succeed in generating natural responses. Nevertheless, these models tend to give generic responses that are not specific to the given message, and this remains a challenge. To alleviate this tendency, we propose a method that promotes message-relevant and diverse responses in neural conversation models by using self-attention, which is time-efficient as well as effective. Furthermore, we investigate why and how self-attention is effective through a detailed comparison with standard dialogue generation. Experimental results show that the proposed method improves on standard dialogue generation across various evaluation metrics.
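The paper's exact architecture is not reproduced here, but the core operation it builds on, scaled dot-product self-attention over a message's encoder states, can be sketched as follows (a minimal NumPy illustration under assumed shapes, not the authors' implementation):

```python
import numpy as np

def self_attention(states):
    """Scaled dot-product self-attention where queries, keys, and
    values are all the same sequence of hidden states.

    states: array of shape (seq_len, hidden_dim), e.g. encoder
    outputs for each token of the input message.
    Returns (attended, weights): the re-weighted states and the
    (seq_len, seq_len) attention matrix.
    """
    d = states.shape[-1]
    # Similarity of every position with every other position,
    # scaled by sqrt(d) to keep softmax gradients well-behaved.
    scores = states @ states.T / np.sqrt(d)
    # Row-wise softmax: each row is a distribution over positions.
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output state is a weighted mix of all input states.
    attended = weights @ states
    return attended, weights

# Toy message of 5 tokens with 8-dimensional hidden states.
hidden = np.random.randn(5, 8)
out, attn = self_attention(hidden)
```

The attended states emphasize the message positions most relevant to each token, which is the signal the proposed method feeds into response generation to make replies message-specific.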

* 8 pages 