[Model] Add support for GPT-J #226
Conversation
@AndreSlavescu Awesome! Thanks for your contribution. Is this PR ready for review? Otherwise, please ping me when you are ready. Thanks again!
Can you merge this change so we can test it out with our fine-tuned GPT-J model? 8-)
@AndreSlavescu What's going on with the PR? If you are not able to continue it, no worries, I can take it. Please let us know if you have any questions.
@WoosukKwon Hi, sorry for the delayed reply; I had a busy schedule this past week. I won't have much time to continue this coming week, so please continue on it if you'd like.
Is it just waiting for review, or does it require additional work? Is it expected to be working? (If so, I can use it now.)
@ri938 This PR is not ready yet. I'll take this over and finish the PR soon.
@zhuohan123 This PR is ready for review. Please take a look at it.
LGTM! Left some minor comments.
@silvacarl2 @ri938 We've just merged this PR. Please install vLLM from source and try it out!
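For anyone following along, a from-source install would look roughly like the following sketch. The repository URL and example-script path are assumptions based on the vLLM project layout, not details stated in this thread:

```shell
# Clone the vLLM repository and install it from source (builds the CUDA kernels).
git clone https://github.com/vllm-project/vllm.git
cd vllm
pip install -e .

# Run the bundled offline inference example mentioned below in the thread.
python examples/offline_inference.py
```

An editable (`-e`) install is convenient here since the GPT-J support was just merged and had not yet shipped in a released wheel.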
Cool, will do!!
Got this error: python offline_inference.py
Same with GPT-Neo: python offline_inference.py
@silvacarl2 Could you check again that you installed the latest vLLM from source? BTW, GPT-Neo is not supported yet.
NP, trying out others.
Installed vLLM from source, and I encountered this problem:
Co-authored-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>
Reference to issue #198.