What is it about?
Question generation is the task of generating questions from a text passage that can be answered using information available in the passage. Existing models for question generation are trained to predict words from a large, predefined vocabulary. However, a large vocabulary increases memory usage and training and inference times, and a predefined vocabulary may not include context-specific words from the input passage. In this paper, we propose a neural question generation framework that generates semantically accurate, context-specific questions using a small vocabulary. We break the question generation task into two subtasks: generating the skeletal structure of a question using common words from the vocabulary, and pointing to rare words in the input passage to complete the question.
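The two-subtask idea can be illustrated with a minimal sketch. This is not the paper's actual neural model; it is a toy example (with a made-up `COMMON_VOCAB` and placeholder scheme) showing how rare passage words could be masked with placeholders, a question skeleton generated over a small vocabulary, and the placeholders then resolved by pointing back into the passage.

```python
# Toy illustration of the masking-and-pointing idea, not the paper's model.

COMMON_VOCAB = {"who", "wrote", "the", "what", "is", "of", "a", "novel", "in"}

def mask_rare_words(passage_tokens, vocab):
    """Replace out-of-vocabulary tokens with indexed placeholders."""
    masked, pointers = [], {}
    for tok in passage_tokens:
        if tok.lower() in vocab:
            masked.append(tok)
        else:
            placeholder = f"<RARE_{len(pointers)}>"
            pointers[placeholder] = tok  # remember the original rare word
            masked.append(placeholder)
    return masked, pointers

def fill_skeleton(skeleton_tokens, pointers):
    """'Point' each placeholder in the skeleton back to its rare word."""
    return [pointers.get(tok, tok) for tok in skeleton_tokens]

passage = "Tolstoy wrote the novel Anna Karenina".split()
masked, pointers = mask_rare_words(passage, COMMON_VOCAB)
# A skeletal question a small-vocabulary decoder might produce:
skeleton = ["who", "wrote", "the", "novel", "<RARE_1>", "<RARE_2>", "?"]
question = " ".join(fill_skeleton(skeleton, pointers))
# question → "who wrote the novel Anna Karenina ?"
```

Because the decoder only ever predicts common words and placeholder tokens, its output vocabulary stays small, while the pointing step recovers passage-specific words such as names.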
Read the Original
This page is a summary of: Vocabulary-constrained Question Generation with Rare Word Masking and Dual Attention, January 2021, ACM (Association for Computing Machinery),
DOI: 10.1145/3430984.3431074.