
Conversation

clefourrier (Member) commented Feb 27, 2024

Fixes #66
We don't actually need to call tokenizer.apply_chat_template on the possible choices, since the correct logic is already handled in get_examples_with_chat_template, which appends a generation prompt before the choice tokens.
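For context, a minimal sketch of the idea (assumptions: the signature of `get_examples_with_chat_template` and the helper `build_choice_prompts` below are illustrative, not the actual lighteval code): the chat template is applied to the context only, with a generation prompt appended, and each choice is then concatenated as a plain string.

```python
# Minimal sketch, not the real implementation: only the context is templated.
def get_examples_with_chat_template(tokenizer, context: str) -> str:
    # add_generation_prompt=True ends the string with the assistant prompt,
    # so a choice continuation can follow directly.
    return tokenizer.apply_chat_template(
        [{"role": "user", "content": context}],
        tokenize=False,
        add_generation_prompt=True,
    )


def build_choice_prompts(tokenizer, context: str, choices: list[str]) -> list[str]:
    templated_context = get_examples_with_chat_template(tokenizer, context)
    # Choices are appended as plain strings. Running apply_chat_template on each
    # choice as well would (with the default tokenize=True) return a list of
    # token ids, which is what triggered the
    # 'can only concatenate str (not "list") to str' TypeError in #66.
    return [templated_context + choice for choice in choices]
```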

clefourrier (Member, Author) commented Feb 27, 2024
Can confirm it works well

lewtun (Member) commented Feb 27, 2024

I've tested it as well and confirm it works - shall we merge?

clefourrier requested a review from NathanHB on February 27, 2024 at 15:40
clefourrier (Member, Author) commented Feb 27, 2024
Not merging without maintainer approval :) (in this case, @NathanHB, who's off on Tuesdays ^^)

clefourrier merged commit 4907499 into main on Feb 27, 2024
hynky1999 pushed a commit that referenced this pull request on May 22, 2025
NathanHB pushed a commit that referenced this pull request on Sep 19, 2025

Development

Successfully merging this pull request may close these issues.

Cannot evaluate chat model on TruthfulQA (TypeError: can only concatenate str (not "list") to str)

4 participants