[Bug]: Asking to pad but the tokenizer does not have a padding token. Please select a token to use as `pad_token` #8603
Comments
Using unk_token, but it is not set yet. The same issue occurs with Qwen2 inference:
That doesn't work: with llama3 not even unk_token can be set, so the code change here can be ignored for now. Fundamentally, we need to fix setting both pad_token and unk_token at the same time; I suspect other tokens may also fail to set.
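As a sketch of the usual workaround for this class of error, the missing pad_token can be backfilled from an existing special token (typically eos_token) before padding is requested. The snippet below uses a hypothetical `SimpleTokenizer` stub rather than the real Llama-3 tokenizer, so it only illustrates the fallback logic, not a verified fix for this repository:

```python
# Stand-in stub for a tokenizer that ships without a pad_token,
# mirroring the situation reported for Meta-Llama-3-8B-Instruct.
class SimpleTokenizer:
    def __init__(self, eos_token):
        self.eos_token = eos_token
        self.pad_token = None  # not set, which triggers the padding error


def ensure_pad_token(tokenizer):
    """If no pad_token is set, reuse eos_token as the padding token.

    This is the common workaround when a checkpoint defines no dedicated
    padding token; whether it applies here depends on the tokenizer
    actually allowing the attribute to be assigned (the comments above
    suggest it currently does not for llama3).
    """
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token
    return tokenizer


tok = ensure_pad_token(SimpleTokenizer(eos_token="<|end_of_text|>"))
print(tok.pad_token)  # the eos token is now reused for padding
```

If the tokenizer rejects the assignment, as described in the comment above, the fix has to land in the tokenizer's own special-token handling instead.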
This issue is stale because it has been open for 60 days with no activity.
This issue was closed because it has been inactive for 14 days since being marked as stale.
Software environment
Duplicate issue
Error description
!pip install tiktoken
!python predictor.py --model_name_or_path meta-llama/Meta-Llama-3-8B-Instruct --dtype=float16