Fixed XNLI prompts entailment direction (#52)
* fixed subject-verb agreement in XNLI prompts

* fixed NLI direction: P entails H, not vice versa

We could prompt either "does S1 contradict S2" or "does S2 contradict S1", since contradiction is symmetric. But for entailment we can only prompt "does S1 entail S2" (does the premise entail the hypothesis), not the other way around; see the illustrative sketch below.

* standardized line breaks and Oxford commas

To exactly match Figure G7 in the GPT-3 paper.
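
Below is a quick illustrative sketch of the entailment-direction point in the second bullet. The sentence pair is invented (not taken from XNLI) and the snippet is not part of this commit; it only shows why the question has to run from Sentence 1 (premise) to Sentence 2 (hypothesis).

    # Invented example; entailment is asymmetric while contradiction is symmetric.
    premise = "Two dogs are chasing a ball in the park."   # Sentence 1 (P)
    hypothesis = "Some animals are playing outdoors."      # Sentence 2 (H)

    new_question = "Does Sentence 1 entail Sentence 2? Yes or No?"  # corrected: does P entail H?
    old_question = "Does Sentence 2 entail Sentence 1? Yes or No?"  # old: does H entail P?

    print(f"Sentence 1: {premise}\nSentence 2: {hypothesis}")
    print("Corrected question:", new_question)  # gold answer: Yes (P entails H)
    print("Old question:      ", old_question)  # gold answer: No (H does not entail P),
                                                # even though the gold XNLI label is entailment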
awebson committed Jun 2, 2021
1 parent 88c8937 commit 98668ae
Showing 1 changed file with 12 additions and 12 deletions.
24 changes: 12 additions & 12 deletions templates/xnli/en/templates.yaml
@@ -3,38 +3,38 @@ subset: en
templates:
4e122d26-7e79-49c0-961b-cf8ee134759e: !Template
id: 4e122d26-7e79-49c0-961b-cf8ee134759e
jinja: "Sentence 1: {{premise}}\nSentence 2: {{hypothesis}}\n\nDoes Sentence 2\
\ contradicts Sentence 1? Yes, No or Neutral? |||\n{% if label == 0 %} \nNo\n\
{% elif label == 1 %}\nNeutral\n{% else %}\nYes\n{% endif %}"
jinja: "Sentence 1: {{premise}}\nSentence 2: {{hypothesis}}\nQuestion: Does Sentence\
\ 1 contradict Sentence 2? Yes, No, or Neutral? |||\n{% if label == 0 %} \n\
No\n{% elif label == 1 %}\nNeutral\n{% else %}\nYes\n{% endif %}"
name: Concatenation contraposition
reference: Concatenation contraposition
c62a3048-018e-4d93-bc46-645f3f763ee6: !Template
id: c62a3048-018e-4d93-bc46-645f3f763ee6
jinja: "Sentence 1: {{premise}}\n\nSentence 2: {{hypothesis}}\n\nDoes Sentence\
\ 2 contradicts Sentence 1? Yes or No? |||\n{% if label == 2 %} \nYes\n{% else\
jinja: "Sentence 1: {{premise}}\nSentence 2: {{hypothesis}}\nQuestion: Does Sentence\
\ 1 contradict Sentence 2? Yes or No? |||\n{% if label == 2 %} \nYes\n{% else\
\ %}\nNo\n{% endif %}"
- name: Label binarization contraposition
+ name: Label binarization contraposition
reference: Inspired from https://arxiv.org/pdf/1902.01007.pdf Section 4 - Implementation
and evaluation
d9e13133-267e-46c4-afad-c2379dcc5272: !Template
id: d9e13133-267e-46c4-afad-c2379dcc5272
jinja: "{{premise}}\nQuestion: {{hypothesis}}\nTrue, False or Neither? ||| \n\
jinja: "{{premise}}\nQuestion: {{hypothesis}} True, False, or Neither? ||| \n\
{% if label == 0 %} \nTrue\n{% elif label == 1 %}\nNeither\n{% else %}\nFalse\n\
{% endif %}"
name: ANLI GPT3
reference: ANLI prompt format from Table G7 in the GPT3 paper
dd4276e6-aebd-44a3-b3cf-baf8a4c237f0: !Template
id: dd4276e6-aebd-44a3-b3cf-baf8a4c237f0
jinja: "Sentence 1: {{premise}}\n\nSentence 2: {{hypothesis}}\n\nDoes Sentence\
\ 2 entails Sentence 1? Yes or No? |||\n{% if label == 0 %} \nYes\n{% else %}\n\
jinja: "Sentence 1: {{premise}}\nSentence 2: {{hypothesis}}\nQuestion: Does Sentence\
\ 1 entail Sentence 2? Yes or No? |||\n{% if label == 0 %} \nYes\n{% else %}\n\
No\n{% endif %}"
name: Label binarization
reference: Grouping "neutral" and "contradiction" as a single label following
https://arxiv.org/pdf/1902.01007.pdf Section 4 - Implementation and evaluation
e174f56a-b0af-4937-b6ae-1897cac26eba: !Template
id: e174f56a-b0af-4937-b6ae-1897cac26eba
jinja: "Sentence 1: {{premise}}\nSentence 2: {{hypothesis}}\n\nDoes Sentence 2\
\ entails Sentence 1? Yes, No or Neutral? |||\n{% if label == 0 %} \nYes\n{%\
\ elif label == 1 %}\nNeutral\n{% else %}\nNo\n{% endif %}"
jinja: "Sentence 1: {{premise}}\nSentence 2: {{hypothesis}}\nQuestion: Does Sentence\
\ 1 entail Sentence 2? Yes, No, or Neutral? |||\n{% if label == 0 %} \nYes\n\
{% elif label == 1 %}\nNeutral\n{% else %}\nNo\n{% endif %}"
name: Concatenation
reference: Concatenation of premise and hypothesis
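
For reference, a minimal sketch (not part of this commit) of how the corrected "Label binarization" template above renders with jinja2. It uses the label mapping implied by the templates themselves (0 = entailment, 1 = neutral, 2 = contradiction) and an invented example pair; the "|||" marker splits the rendered string into the prompt and the expected target.

    from jinja2 import Template

    # Corrected "Label binarization" template from the diff above.
    template = Template(
        "Sentence 1: {{premise}}\nSentence 2: {{hypothesis}}\n"
        "Question: Does Sentence 1 entail Sentence 2? Yes or No? |||\n"
        "{% if label == 0 %} \nYes\n{% else %}\nNo\n{% endif %}"
    )

    example = {
        "premise": "Two dogs are chasing a ball in the park.",  # invented
        "hypothesis": "Some animals are playing outdoors.",     # invented
        "label": 0,  # 0 = entailment, 1 = neutral, 2 = contradiction
    }

    prompt, target = template.render(**example).split("|||")
    print(prompt.strip())  # ends with "Does Sentence 1 entail Sentence 2? Yes or No?"
    print(target.strip())  # "Yes"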
