
[BUG] Bug in the chat function #100

Closed · 1 task done
huajianmao opened this issue Aug 7, 2023 · 5 comments
Labels: bug (Something isn't working)

Comments

@huajianmao

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

After loading the model, calling chat(..., stream=True) first and then chat(..., stream=False) raises an error:

AttributeError: 'GenerationConfig' object has no attribute 'do_stream'

This is probably because the streaming path patches self.__class__.generate (apparently swapping in the stream sampler), and that class-level assignment persists into later non-stream calls.

Expected Behavior

The two calls should behave independently of each other.
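The leak described above can be reproduced with a minimal toy class (not the actual Qwen code): assigning to `self.__class__.generate` patches the class, so the change survives across calls and instances.

```python
# Minimal sketch (toy class, not the actual Qwen code) of how patching a
# method on self.__class__ during a streaming call leaks into later calls.
class Model:
    def generate(self):
        return "normal sample"

    def chat(self, stream=False):
        if stream:
            # Patching the CLASS attribute: every instance, and every later
            # call, now sees the streaming sampler.
            self.__class__.generate = lambda self: "stream sample"
        return self.generate()

m = Model()
print(m.chat(stream=True))   # stream sample
print(m.chat(stream=False))  # stream sample  <- the patch leaked
```

Patching the instance instead (`self.generate = ...`) or restoring the original method after streaming would avoid the leak.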

Steps To Reproduce

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from transformers.generation import GenerationConfig

model_id = "Qwen/Qwen-7B-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="cuda", trust_remote_code=True).cuda().eval()
model.generation_config = GenerationConfig.from_pretrained(model_id, trust_remote_code=True)

query = "Hi"
history = None

model.chat(tokenizer, query, history, stream=True)  # <--------- OK
model.chat(tokenizer, query, history)               # <--------- It will fail
```

### Environment

```Markdown
- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA (`python -c 'import torch; print(torch.version.cuda)'`):
```

Anything else?

No response

@huajianmao (Author)

Also, stop_words_ids has no effect in stream mode.

@HongyuJiang

I'm hitting the same problem.

@huajianmao (Author)

I'd suggest adding automated test cases to improve the quality of the model code.

@fyabc (Contributor) commented Aug 7, 2023

@huajianmao Hi, this issue has been fixed in the latest HuggingFace repo. Please update to the latest version and try again.
When calling, replace model.chat(..., stream=True) with model.chat_stream(...); the former is deprecated.

@JianxinMa (Member)

> @huajianmao Hi, this issue has been fixed in the latest HuggingFace repo. Please update to the latest version and try again. When calling, replace model.chat(..., stream=True) with model.chat_stream(...); the former is deprecated.

Wouldn't it be better to keep backward compatibility, i.e. make model.chat(..., stream=True) equivalent to model.chat_stream(...)?
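The suggested compatibility layer could look like the following sketch (class and method bodies are hypothetical stand-ins, not the real Qwen implementation): chat(..., stream=True) simply delegates to chat_stream(...), so old callers keep working.

```python
# Hypothetical compatibility shim (names and bodies assumed, not the real
# Qwen code): route the deprecated chat(..., stream=True) to chat_stream(...).
class QwenLike:
    def chat_stream(self, tokenizer, query, history=None):
        # Dummy generator standing in for token-by-token decoding.
        for piece in ["Hello", ", ", "world"]:
            yield piece

    def chat(self, tokenizer, query, history=None, stream=False):
        if stream:
            # Deprecated streaming path, kept working for backward compatibility.
            return self.chat_stream(tokenizer, query, history)
        # Non-stream path: consume the stream and return the full reply.
        return "".join(self.chat_stream(tokenizer, query, history))

m = QwenLike()
assert m.chat(None, "hi") == "Hello, world"
assert "".join(m.chat(None, "hi", stream=True)) == "Hello, world"
```

Because both paths share one decoding routine, neither needs to patch class attributes, which also sidesteps the original bug.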

@fyabc fyabc added the bug Something isn't working label Aug 8, 2023