The Recognizing Textual Entailment Challenges: Datasets and Methodologies

Luisa Bentivogli; Ido Dagan; Bernardo Magnini
2017-01-01

Abstract

While semantic inference has always been a major focus in Computational Linguistics, the topic has benefited from renewed attention in the field thanks to the Recognizing Textual Entailment (RTE) framework, first launched in 2004, which has provided an operational definition of entailment based on human judgements over portions of text. On top of this definition, a task has been designed which includes both guidelines for dataset annotation and evaluation metrics for assessing systems' performance. This chapter presents the successful experience of creating Textual Entailment datasets. We show how, over the years, RTE datasets have been developed in several variants, not only to address complex phenomena underlying entailment, but also to demonstrate the potential application of entailment inference in concrete scenarios, including summarization, knowledge base population, answer validation for question answering, and student answer assessment.
2017
ISBN: 978-94-024-0879-9; 978-94-024-0881-2

Use this identifier to cite or link to this document: https://hdl.handle.net/11582/312803