Does the official example's output match expectations? #24
Comments
I've run into a similar problem: most of the time the output doesn't match expectations.
Could you provide script parameters that make the model behave properly?
Same problem here.
Inference code:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("/home/user/models/pre/baichuan-7b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("/home/user/models/pre/baichuan-7b", device_map="auto", trust_remote_code=True)
inputs = tokenizer('登鹳雀楼->王之涣\n夜雨寄北->', return_tensors='pt')
inputs = inputs.to('cuda:0')
pred = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(pred.cpu()[0], skip_special_tokens=True))
```
@yucc-leon @expresschen Don't end the prompt with a `\n`.
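A minimal sketch of the fix suggested above: the prompt uses the thread's few-shot "title->author" format, and any trailing newline after the final `->` is stripped so the model completes the author name instead of starting a new line. (The `rstrip` call is an illustration, not code from the original thread.)

```python
# Few-shot prompt in the "诗名->作者" format used in the inference code.
# A trailing "\n" after the final "->" nudges the model to begin a new
# line rather than complete the author name, so remove it before encoding.
prompt = '登鹳雀楼->王之涣\n夜雨寄北->\n'
prompt = prompt.rstrip('\n')  # strips only trailing newlines; the internal "\n" separating examples is kept
```

The cleaned string is what should be passed to `tokenizer(..., return_tensors='pt')` before calling `model.generate`.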
The first line is the input.