fix t5 tokenizer and prompt token failures #1966
```diff
@@ -75,6 +75,9 @@ def update_request_cache_with_output(request_cache: OrderedDict,
     if "prompt_tokens_details" not in request_cache[
             request_id] and request_output.prompt_logprobs:
         request_cache[request_id]["prompt_tokens_details"] = []
+    if not isinstance(request_output.prompt_token_ids, list):
+        ## lmi-dist does not return prompt_token_ids for t5
+        request_output.prompt_token_ids = []
     for index, prompt_token_id in enumerate(
             request_output.prompt_token_ids):
         prompt_token = Token(
```

Review comment (on the `prompt_logprobs` check): Curious, request_output should not have prompt_logprobs as well, right?

Reply: from what I can read,
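The fix above guards against lmi-dist returning a non-list (e.g. `None`) for `prompt_token_ids` with t5 models, which would otherwise make the `enumerate` call raise a `TypeError`. A minimal sketch of that guard, using a hypothetical stand-in for the request output object (the class and function names here are illustrative, not from the PR):

```python
class FakeRequestOutput:
    """Illustrative stand-in for an engine's request output object."""

    def __init__(self, prompt_token_ids):
        self.prompt_token_ids = prompt_token_ids


def collect_prompt_token_ids(request_output):
    # lmi-dist does not return prompt_token_ids for t5, so the value may
    # not be a list; without this guard, enumerate(None) raises TypeError.
    if not isinstance(request_output.prompt_token_ids, list):
        request_output.prompt_token_ids = []
    return [token_id for _, token_id in
            enumerate(request_output.prompt_token_ids)]


print(collect_prompt_token_ids(FakeRequestOutput(None)))       # []
print(collect_prompt_token_ids(FakeRequestOutput([1, 2, 3])))  # [1, 2, 3]
```

With the guard in place, a missing `prompt_token_ids` simply yields an empty token list instead of crashing the request loop.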
Review comment: QQ: I see we are populating model_config. Where do we populate "model_type"?

Reply: model_config here is the Hugging Face config.json, which has model_type populated.

Review comment: Oh, misread it. model_type is read from model_config. Cool.
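As the thread notes, `model_type` comes from the model's Hugging Face config.json, which is loaded into `model_config`. A small sketch of reading that field; the JSON contents below are an illustrative example, not taken from the PR:

```python
import json

# Illustrative config.json contents for a t5 model; real files carry
# many more fields, but "model_type" is the one discussed above.
config_json = '{"model_type": "t5", "vocab_size": 32128}'

model_config = json.loads(config_json)
print(model_config["model_type"])  # t5
```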