
fix TorchEngine stuck when benchmarking with tp>1 #942

Merged
9 commits merged on Jan 22, 2024

Conversation

grimoire
Collaborator

Pool workers are daemonic, and creating a subprocess inside a daemonic process is not allowed.

@AllentDan
Collaborator

On my side it still gets stuck, same as before. @HIT-cwh could you try it and check whether it works?

@AllentDan
Collaborator

Please resolve the merge conflicts.

@lvhan028
Collaborator

Please fix the merge conflict.

Collaborator

@AllentDan left a comment


LGTM

@lvhan028 lvhan028 requested a review from irexyc January 15, 2024 13:08
@lvhan028 lvhan028 removed the request for review from irexyc January 15, 2024 13:19
@irexyc
Collaborator

irexyc commented Jan 17, 2024

https://github.com/InternLM/lmdeploy/blob/main/lmdeploy/pytorch/config.py#L8C1-L16

There seems to be a problem here: torch_dtype is an attribute of GenerationConfig and defaults to None, so the default value declared here has no effect.
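irexyc's point can be illustrated with a minimal sketch (the class and attribute names below are hypothetical stand-ins, not the actual lmdeploy code): a dataclass field default is never used when the caller always passes an explicit value, and copying an attribute that itself defaults to None means None is passed explicitly.

```python
from dataclasses import dataclass


@dataclass
class ModelConfig:  # hypothetical stand-in for the config class
    torch_dtype: str = "float16"  # this default never applies below


class GenerationConfig:  # hypothetical stand-in
    torch_dtype = None  # attribute defaults to None


# The None from GenerationConfig overrides the dataclass default,
# because it is passed explicitly rather than omitted:
gen_cfg = GenerationConfig()
cfg = ModelConfig(torch_dtype=gen_cfg.torch_dtype)
print(cfg.torch_dtype)  # None, not "float16"
```

A common remedy is to treat None as "unset" and fall back to the default after construction, rather than relying on the field default.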

@lvhan028 changed the title from "fix benchmark tp" to "fix TorchEngine stuck when benchmarking with tp>1" Jan 22, 2024
@lvhan028 lvhan028 merged commit 9ff13ba into InternLM:main Jan 22, 2024
4 of 6 checks passed
4 participants