Year | Dataset | Generation Method | Source of Corpus | Context Style | Characteristics
---- | ------- | ----------------- | ---------------- | ------------- | ---------------
2013 | MCTest (MC160) | Crowd-sourcing | Factoid stories | Paragraph | Open-domain
2013 | MCTest (MC500) | Crowd-sourcing | Factoid stories | Paragraph | Open-domain
2015 | CNN | Automated | News | Document | Complex Reasoning
2015 | Daily Mail | Automated | News | Document | Complex Reasoning
2015 | CuratedTREC | Crowd-sourcing | Factoid stories | Paragraph | Open-domain
2015 | WikiQA | Crowd-sourcing | Wikipedia | Paragraph | Unanswerable Questions
2016 | Facebook CBT | Automated | Factoid stories | Paragraph | Complex Reasoning
2016 | Google MC-AFP | Automated | Gigaword corpus | Paragraph | Complex Reasoning
2016 | LAMBADA | Crowd-sourcing | Book Corpus | Paragraph | Complex Reasoning
2016 | NewsQA | Crowd-sourcing | News | Document | Complex Reasoning
2016 | SQuAD 1.1 | Crowd-sourcing | Wikipedia | Paragraph | Complex Reasoning
2016 | Who-did-What | Automated | News | Document | Complex Reasoning; Paraphrased
2016 | WikiMovies | Automated | Movies | Document | Domain-specific; Commonsense
2016 | BookTest | Automated | Factoid stories | Paragraph | Large-scale
2016 | WikiReading | Automated | Wikipedia | Document | Large-scale; Commonsense
2016 | MovieQA | Crowd-sourcing | Movies | Paragraph with images and videos | Multi-modal MRC
2016 | MS MARCO | Automated | Bing | Paragraph | Unanswerable Questions; Multi-hop
2017 | RACE | Expert | English exams | Document | Complex Reasoning
2017 | TriviaQA (Wiki) | Automated | Bing | Paragraph | Complex Reasoning
2017 | TriviaQA (Web) | Automated | Bing | Paragraph | Complex Reasoning
2017 | NarrativeQA | Crowd-sourcing | Movies | Document | Complex Reasoning; Multi-hop
2017 | SciQ | Crowd-sourcing | School science curricula | Paragraph | Domain-specific
2017 | Quasar-S | Crowd-sourcing | Stack Overflow | Paragraph | Open-domain
2017 | Quasar-T | Crowd-sourcing | Stack Overflow | Paragraph | Open-domain
2017 | SearchQA | Crowd-sourcing | J! Archive and Google | Paragraph & URL | Open-domain
2017 | QAngaroo (MedHop) | Crowd-sourcing | Scientific papers | Paragraph | Multi-hop
2017 | QAngaroo (WikiHop) | Crowd-sourcing | Wikipedia | Paragraph | Multi-hop
2017 | COMICS | Automated | Comics | Paragraph with images | Multi-modal MRC
2017 | TQA | Expert | School science curricula | Paragraph with images | Multi-modal MRC
2018 | CLOTH | Expert | English exams | Document | Complex Reasoning
2018 | ProPara | Crowd-sourcing | Process paragraphs | Paragraph | Complex Reasoning
2018 | ARC (Challenge Set) | Expert | School science curricula | Paragraph | Complex Reasoning; Commonsense
2018 | ARC (Easy Set) | Expert | School science curricula | Paragraph | Complex Reasoning; Commonsense
2018 | CoQA | Crowd-sourcing | Jeopardy | Paragraph | Conversational; Unanswerable Questions
2018 | QuAC | Crowd-sourcing | Wikipedia | Document | Conversational; Unanswerable Questions
2018 | CliCR | Automated | BMJ Case Reports | Paragraph | Domain-specific
2018 | PaperQA (Yining et al., 2018) | Crowd-sourcing | Scientific papers | Paragraph | Domain-specific
2018 | PaperQA-L (Park et al., 2018) | Automated | Scientific papers | Paragraph | Domain-specific
2018 | PaperQA-T (Park et al., 2018) | Automated | Scientific papers | Paragraph | Domain-specific
2018 | ReviewQA | Crowd-sourcing | Hotel reviews | Paragraph | Domain-specific
2018 | SciTail | Crowd-sourcing | School science curricula | Paragraph | Domain-specific
2018 | MultiRC | Crowd-sourcing | News and other web pages | Multi-sentence | Multi-hop
2018 | HotpotQA (Distractor) | Crowd-sourcing | Wikipedia | Multi-paragraph | Multi-hop; Complex Reasoning
2018 | HotpotQA (Fullwiki) | Crowd-sourcing | Wikipedia | Multi-paragraph | Multi-hop; Complex Reasoning
2018 | RecipeQA | Automated | Recipes | Paragraph with images | Multi-modal MRC
2018 | MCScript | Crowd-sourcing | Narrative texts | Paragraph | Commonsense (world knowledge)
2018 | OpenBookQA | Crowd-sourcing | School science curricula | Paragraph | Commonsense
2018 | ReCoRD | Crowd-sourcing | News | Paragraph | Commonsense
2018 | DuoRC (Paraphrase) | Crowd-sourcing | Movies | Paragraph | Paraphrased; Commonsense; Complex Reasoning; Unanswerable Questions
2018 | DuoRC (Self) | Crowd-sourcing | Movies | Paragraph | Paraphrased; Commonsense; Complex Reasoning; Unanswerable Questions
2018 | SQuAD 2.0 | Crowd-sourcing | Wikipedia | Paragraph | Unanswerable Questions
2019 | DROP | Crowd-sourcing | Wikipedia | Paragraph | Complex Reasoning
2019 | ShARC | Crowd-sourcing | Government websites | Paragraph | Conversational
2019 | DREAM | Crowd-sourcing | English exams | Dialogues | Conversational; Commonsense
2019 | CommonsenseQA | Crowd-sourcing | Narrative texts | Paragraph | Commonsense
2019 | Natural Questions (Long) | Crowd-sourcing | Wikipedia | Paragraph | Unanswerable Questions
2019 | Natural Questions (Short) | Crowd-sourcing | Wikipedia | Paragraph | Unanswerable Questions