
When LoRA fine-tuning ChatGLM, why is there nothing related to attention_mask in the data_collator function? #8

Closed
zzy347964399 opened this issue Sep 12, 2023 · 1 comment

Comments

@zzy347964399

No description provided.

@zzy347964399
Author

GLM checks the input internally and adds them itself, so the attention_mask and position_ids don't need to be passed to the trainer. See mymusise/ChatGLM-Tuning#256
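For context, a minimal sketch of what such a collator can look like under that assumption (function name, `pad_token_id` value, and field names are illustrative, not taken from this repo): it only pads `input_ids` and `labels`, leaving `attention_mask` and `position_ids` for the model to construct from `input_ids`.

```python
import torch

def data_collator(features, pad_token_id=3):  # pad_token_id is an assumed value, check your tokenizer
    """Pad input_ids and labels to the batch max length; omit attention_mask / position_ids."""
    max_len = max(len(f["input_ids"]) for f in features)
    input_ids, labels = [], []
    for f in features:
        pad = max_len - len(f["input_ids"])
        input_ids.append(list(f["input_ids"]) + [pad_token_id] * pad)
        labels.append(list(f["labels"]) + [-100] * pad)  # -100 is ignored by the cross-entropy loss
    return {
        "input_ids": torch.tensor(input_ids, dtype=torch.long),
        "labels": torch.tensor(labels, dtype=torch.long),
    }
```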
