[QA] Question about the InternLM2 model's chat_template #700
Comments
You can use the tokenizer's `apply_chat_template`:

```python
chat = [
    {"role": "user", "content": "Hello! What's your name?"},
    {"role": "assistant", "content": "My name is InternLM2!"},
    {"role": "user", "content": "Nice to meet you InternLM2!"},
]
# convert the chat history to a string for generation
chat_str = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
print(chat_str)
```
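For readers without the model downloaded, here is a minimal self-contained sketch of what `apply_chat_template(..., add_generation_prompt=True)` produces for this history. The special tokens `<|im_start|>` / `<|im_end|>` are an assumption based on the ChatML-style template commonly shipped with InternLM2; check the model's `tokenizer_config.json` for the authoritative template.

```python
# Hypothetical re-implementation of the chat template for illustration only.
# The <|im_start|>/<|im_end|> markers are assumed, not taken from the source.
def apply_chat_template(chat, add_generation_prompt=True):
    parts = []
    for msg in chat:
        # each turn is wrapped as <|im_start|>{role}\n{content}<|im_end|>
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    if add_generation_prompt:
        # open an assistant turn so the model continues from here
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

chat = [
    {"role": "user", "content": "Hello! What's your name?"},
    {"role": "assistant", "content": "My name is InternLM2!"},
    {"role": "user", "content": "Nice to meet you InternLM2!"},
]
print(apply_chat_template(chat))
```

The resulting string can then be tokenized normally and passed to `model.generate`.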
We will update the usage in the examples later.
This issue is marked as stale because it has been marked as invalid or awaiting response for 7 days without any further response. It will be closed in 7 days if the stale label is not removed or if there is no further response.
How can I try out the function call feature?
This issue is marked as stale because it has been marked as invalid or awaiting response for 7 days without any further response. It will be closed in 7 days if the stale label is not removed or if there is no further response.

This issue is closed because it has been stale for 7 days. Please open a new issue if you have similar issues or you have any new updates now.
Describe the issue

The model series ships with a `chat_template`, but the latest `modeling.py` in the model files does not use it. The official examples call the model through Transformers and run dialogue via `chat()`, which makes adding function call support cumbersome. It would be better if the inputs could be assembled from an OpenAI-style chat history.