Problem
When using the LlamaModel.generate() method, the resulting text is corrupted when the model generates emojis in its response. Instead of the correct emojis, I get question-mark replacement blocks.
It seems emoji characters are split across multiple tokens, so buffering the partial byte sequences might be necessary?
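If the binding exposes each token's text as raw UTF-8 bytes, the buffering idea can be sketched with Python's incremental decoder. The token byte chunks below are illustrative (not taken from a real tokenizer run); the point is that a 4-byte emoji split across two tokens decodes cleanly only when incomplete sequences are held back until the rest arrives:

```python
import codecs

# Illustrative token-byte stream: the emoji U+1F44D is four UTF-8 bytes
# (F0 9F 91 8D), split here across two token chunks.
token_byte_chunks = [b"Looks good \xf0\x9f", b"\x91\x8d", b"!"]

# Decoding each chunk independently corrupts the emoji, because the
# first chunk ends mid-sequence:
naive = "".join(chunk.decode("utf-8", errors="replace")
                for chunk in token_byte_chunks)

# An incremental decoder buffers the incomplete multi-byte sequence
# until the remaining continuation bytes arrive:
decoder = codecs.getincrementaldecoder("utf-8")()
buffered = "".join(decoder.decode(chunk) for chunk in token_byte_chunks)
buffered += decoder.decode(b"", final=True)  # flush any trailing bytes

print(buffered)  # prints "Looks good 👍!"
print(naive)     # contains U+FFFD replacement characters instead
```

The same pattern applies in any language: accumulate the undecodable tail bytes and prepend them to the next token's bytes before decoding again.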
Similar Issue
Here's a related issue with a comment from a user who figured out a fix for the C# binding library.