BUG: Fix the launch bug of OmniLMM 12B. #1241

Merged: 1 commit merged into xorbitsai:main from fix/OmniLMM on Apr 3, 2024

Conversation
Conversation

@hainaweiben (Contributor) commented:

ValueError: [address=0.0.0.0:41917, pid=2321173] Unrecognized configuration class <class 'xinference.thirdparty.omnilmm.model.omnilmm.OmniLMMConfig'> for this kind of AutoModel: AutoModel.
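For context, this is the error Hugging Face transformers raises when a custom configuration class is not registered with the AutoModel factory, so the factory cannot map OmniLMMConfig to a concrete model class. The sketch below shows the two usual ways around it; the model class name OmniLMMForCausalLM and the placeholder model path are assumptions based on the error message and module path, not necessarily the exact change made in this PR.

```python
# Sketch only: why AutoModel raises "Unrecognized configuration class ..."
# and the two common remedies. OmniLMMForCausalLM is an assumed class name;
# the real fix in this PR may differ.
from transformers import AutoModel

from xinference.thirdparty.omnilmm.model.omnilmm import (  # module path from the traceback
    OmniLMMConfig,
    OmniLMMForCausalLM,  # assumed model class name
)

# Option 1: register the custom (config, model) pair so AutoModel can resolve it.
AutoModel.register(OmniLMMConfig, OmniLMMForCausalLM)

# Option 2: bypass AutoModel and load the concrete model class directly.
model = OmniLMMForCausalLM.from_pretrained("/path/to/OmniLMM-12B")  # placeholder path
```

Either approach avoids routing the load through the generic AutoModel lookup table, which only knows about configuration classes that transformers ships with or that have been explicitly registered.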

@XprobeBot added the bug (Something isn't working) label on Apr 3, 2024
@XprobeBot added this to the v0.10.1 milestone on Apr 3, 2024
@qinxuye (Contributor) left a comment:

LGTM

@qinxuye merged commit f7cae0c into xorbitsai:main on Apr 3, 2024
4 of 12 checks passed
@hainaweiben deleted the fix/OmniLMM branch on April 3, 2024 at 07:27
Labels: bug (Something isn't working)
Projects: None yet
Linked issues: None yet
3 participants