Fix batched output in decoder-only models
jncraton committed Jan 3, 2024
1 parent ab7959b commit 3eecebb
Showing 1 changed file with 1 addition and 1 deletion.
languagemodels/inference.py
@@ -170,7 +170,7 @@ def generate(
         beam_size=1,
         include_prompt_in_result=False,
     )
-    outputs_ids = results[0].sequences_ids[0]
+    outputs_ids = [r.sequences_ids[0] for r in results]

     return [tokenizer.decode(i, skip_special_tokens=True).lstrip() for i in outputs_ids]
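The effect of the change can be sketched with hypothetical mock result objects (the real `results` come from the model's generation call, whose exact API this sketch does not reproduce): indexing `results[0]` decodes only the first prompt's tokens, while the comprehension keeps one best sequence per batch item.

```python
from types import SimpleNamespace

# Hypothetical stand-ins for generation results: each result holds
# candidate sequences, and sequences_ids[0] is the best candidate.
results = [
    SimpleNamespace(sequences_ids=[[1, 2, 3]]),
    SimpleNamespace(sequences_ids=[[4, 5, 6]]),
]

# Before the fix: only the first batch item's tokens survive.
old = results[0].sequences_ids[0]

# After the fix: one token sequence per batch item.
new = [r.sequences_ids[0] for r in results]

print(old)  # [1, 2, 3]
print(new)  # [[1, 2, 3], [4, 5, 6]]
```

With the old indexing, decoding `old` in a loop would iterate over individual token IDs rather than over per-prompt sequences, so batched calls returned garbled or truncated output.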
