What is it about?
Prompt-tuning and instruction-tuning of language models have achieved strong results on few-shot Natural Language Processing (NLP) tasks such as Relation Extraction (RE), which involves identifying relationships between entities within a sentence. However, the effectiveness of these methods depends heavily on how the prompts are designed. A compelling question is whether incorporating external knowledge can enhance a language model's understanding of NLP tasks. In this paper, we introduce wiki-based prompt construction, which leverages Wikidata as a source of information to craft more informative prompts for both prompt-tuning and instruction-tuning of language models in RE. Our experiments show that wiki-based prompts improve the performance of state-of-the-art language models on RE, highlighting the value of external knowledge for this task.
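To make the idea concrete, here is a minimal sketch of what wiki-based prompt construction could look like in Python. The paper's exact prompt template and retrieval pipeline are not reproduced here; the helper function and the prompt wording below are illustrative assumptions, not the authors' implementation. The sketch uses Wikidata's public wbsearchentities API to fetch a short description for each entity and splice it into an RE prompt.

```python
import requests

WIKIDATA_API = "https://www.wikidata.org/w/api.php"

def wikidata_description(entity: str, lang: str = "en") -> str:
    """Fetch a short Wikidata description for an entity mention."""
    params = {
        "action": "wbsearchentities",
        "search": entity,
        "language": lang,
        "format": "json",
        "limit": 1,
    }
    hits = requests.get(WIKIDATA_API, params=params, timeout=10).json().get("search", [])
    return hits[0].get("description", "") if hits else ""

def build_wiki_prompt(sentence: str, head: str, tail: str) -> str:
    """Compose an RE prompt enriched with Wikidata descriptions.

    The template below is a hypothetical example, not the paper's template.
    """
    head_desc = wikidata_description(head) or "unknown entity"
    tail_desc = wikidata_description(tail) or "unknown entity"
    return (
        f"Sentence: {sentence}\n"
        f"Entity 1: {head} ({head_desc})\n"
        f"Entity 2: {tail} ({tail_desc})\n"
        f"What is the relation between {head} and {tail}?"
    )

if __name__ == "__main__":
    print(build_wiki_prompt(
        "Ada Lovelace collaborated with Charles Babbage on the Analytical Engine.",
        "Ada Lovelace",
        "Charles Babbage",
    ))
```

The intuition is that the appended descriptions give the model background knowledge about the entities (e.g., that one is a person and the other an organization), which can steer it toward plausible relation types even when few labeled examples are available.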
Why is it important?
How does integrating external knowledge sources influence the generalization capabilities of language models in relation extraction, particularly in mitigating memorization in favor of generalization? And how can informative prompt construction improve the accuracy and efficiency of relation extraction across few-shot scenarios?
Read the Original
This page is a summary of: Wiki-based Prompts for Enhancing Relation Extraction using Language Models, April 2024, ACM (Association for Computing Machinery). DOI: 10.1145/3605098.3635949.