Poster
Dialog-to-Action: Conversational Question Answering Over a Large-Scale Knowledge Base
Daya Guo · Duyu Tang · Nan Duan · Ming Zhou · Jian Yin
Room 210 #93
Keywords: Natural Language Processing
We present an approach that maps utterances in a conversation to logical forms, which are then executed against a large-scale knowledge base. To handle the ellipsis phenomena that pervade conversation, we introduce dialog memory management, which maintains historical entities, predicates, and logical forms for use when inferring the logical form of the current utterance. Dialog memory management is embodied in a generative model in which a logical form is generated in a top-down manner following a small and flexible grammar. We learn the model from denotations, without explicit annotation of logical forms, and evaluate it on a large-scale dataset consisting of 200K dialogs over 12.8M entities. Results verify the benefits of modeling dialog memory and show that our semantic parsing-based approach outperforms a memory-network-based encoder-decoder model by a large margin.
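As a rough illustration of the mechanism described above, the Python sketch below pairs a dialog memory (storing historical entities, predicates, and logical-form fragments) with a toy top-down, grammar-guided decoder that falls back to the memory when the current utterance omits an argument. The grammar, class and function names, and the greedy decoding rule are illustrative assumptions, not the authors' implementation, which scores these choices with a learned neural model.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical sketch of the two ideas in the abstract: (1) a dialog memory
# holding historical entities, predicates, and logical-form fragments, and
# (2) top-down generation of a logical form under a small grammar, where
# instantiation steps may copy from the memory to resolve ellipsis.

# Toy grammar: one production. find(e, p) = objects linked to entity e via p.
GRAMMAR: Dict[str, List[List[str]]] = {
    "<start>": [["find", "<entity>", "<predicate>"]],
}

@dataclass
class DialogMemory:
    entities: List[str] = field(default_factory=list)
    predicates: List[str] = field(default_factory=list)
    fragments: List[List[str]] = field(default_factory=list)

    def update(self, entities, predicates, actions):
        # After a turn is answered, remember what it used for later reuse.
        self.entities.extend(entities)
        self.predicates.extend(predicates)
        self.fragments.append(actions)

def decode(cur_entities: List[str], cur_predicates: List[str],
           memory: DialogMemory) -> List[str]:
    """Greedy stand-in for a neural decoder: expand nonterminals top-down,
    preferring arguments from the current utterance, else from memory."""
    actions: List[str] = []
    stack = ["<start>"]
    while stack:
        symbol = stack.pop()
        if symbol in GRAMMAR:
            production = GRAMMAR[symbol][0]  # a model would score the choices
            stack.extend(reversed(production))
        elif symbol == "<entity>":
            actions.append(cur_entities[0] if cur_entities else memory.entities[-1])
        elif symbol == "<predicate>":
            actions.append(cur_predicates[0] if cur_predicates else memory.predicates[-1])
        else:
            actions.append(symbol)  # terminal token, e.g. "find"
    return actions

# Turn 1: "Where was Obama born?" -> find(Obama, place_of_birth)
memory = DialogMemory()
lf1 = decode(["Obama"], ["place_of_birth"], memory)
memory.update(["Obama"], ["place_of_birth"], lf1)
# Turn 2, elliptical: "And where did he study?" -> entity copied from memory
lf2 = decode([], ["educated_at"], memory)
print(lf1, lf2)  # ['find', 'Obama', 'place_of_birth'] ['find', 'Obama', 'educated_at']
```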