Check bf16 model in torch engine #1270
Conversation
model_config = ModelConfig.from_hf_config(config,
                                          model_path=model_path)
if model_config.dtype == torch.bfloat16:
    assert torch.cuda.is_bf16_supported(), (
In this case, how about emitting a warning and falling back to fp16?
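A minimal sketch of what such a warn-and-fallback could look like (not code from this PR; the logger name is illustrative, and model_config is assumed to be the ModelConfig instance from the snippet above):

import logging
import torch

logger = logging.getLogger('lmdeploy')  # illustrative logger name

if model_config.dtype == torch.bfloat16 and not torch.cuda.is_bf16_supported():
    # warn and silently downgrade to fp16; numerical accuracy may degrade
    logger.warning('Device does not support bf16; falling back to fp16. '
                   'Numerical accuracy may degrade.')
    model_config.dtype = torch.float16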
Users may not notice a warning. An assertion is more conspicuous, and it also emphasizes that falling back would cause precision issues.
Is there a way to fall back to fp16 inside the engine?
lmdeploy/lmdeploy/pytorch/engine/model_agent.py
Lines 493 to 495 in 9c3069f
hf_model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch_dtype,
lmdeploy/lmdeploy/pytorch/engine/model_agent.py
Lines 710 to 712 in 9c3069f
model = AutoModelForCausalLM.from_config(
    config,
    torch_dtype=torch_dtype,
It is possible, but there is no way to guarantee the correctness of results computed that way, and users can easily ignore the warning.
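For reference, an engine-side fallback at the call sites above might look like the sketch below (illustrative only, not part of this PR; model_path and torch_dtype are assumed to be defined as in the quoted model_agent.py snippets):

import torch
from transformers import AutoModelForCausalLM

# torch_dtype comes from the model config; override it when the device
# cannot run bf16 (sketch only, the precision loss is not handled here)
if torch_dtype == torch.bfloat16 and not torch.cuda.is_bf16_supported():
    torch_dtype = torch.float16

hf_model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch_dtype)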
Fair enough. Let's wait and see the community feedback first.
LGTM
Check bf16 support on V100.