Added LoRAX example for multi-lora LLM inference #2883
Conversation
Thanks for adding this example @tgaddair! It looks very promising! I am testing out the example right now. Left several nits. Let's get it in as soon as possible. : )
Thanks for the review @Michaelvll, addressed your comments.
Thanks for adding the example @tgaddair and the quick fix! I just tested it out and it works like magic.
Updated the example LoRAs to all use Mistral-7B as a base. Let me know if there's anything else you'd like to address before merging!
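With all example LoRAs sharing Mistral-7B as a base, a single LoRAX server can route each request to a different adapter. As a rough sketch of what a client-side call looks like: LoRAX exposes a text-generation-inference-style `/generate` endpoint that accepts an `adapter_id` in the request parameters; the adapter name and prompt below are placeholders, not part of this PR.

```python
import json

def build_lorax_request(prompt, adapter_id=None, max_new_tokens=64):
    """Build the JSON body for a LoRAX /generate call.

    Omitting adapter_id queries the shared base model (e.g. Mistral-7B);
    setting it routes the request through that LoRA adapter instead.
    """
    parameters = {"max_new_tokens": max_new_tokens}
    if adapter_id is not None:
        parameters["adapter_id"] = adapter_id
    return {"inputs": prompt, "parameters": parameters}

# Two requests against the same server: one to the base model, one
# routed through a (hypothetical) Mistral-7B LoRA adapter.
base_body = build_lorax_request("What is SkyPilot?")
lora_body = build_lorax_request(
    "What is SkyPilot?", adapter_id="my-org/my-mistral-lora"
)
print(json.dumps(lora_body, indent=2))
```

The body would then be POSTed to the server's `/generate` endpoint with any HTTP client; only the `adapter_id` field changes between adapters, so one deployment serves many fine-tunes.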
* Added LoRAX example for multi-lora LLM inference
* Width
* Move to llm
* Updated README
* Fix warn
* Docker link
* Fix ip
* Use latest
* Addressed comments
* Added example LoRAs
* Examples
* Fixed examples
* Added example
No description provided.