Adding together.ai support #280
Conversation
name: modelName,
emoji: "🤝",
model,
base_model: "together/" + model,
It should be just "together"
Together_Undi95_ReMM_SLERP_L2_13B = "Undi95/ReMM-SLERP-L2-13B",
Together_Undi95_Toppy_M_7B = "Undi95/Toppy-M-7B",
Together_WizardLM_WizardLM_v1_2_13B = "WizardLM/WizardLM-13B-V1.2",
Together_upstage_Upstage_SOLAR_Instruct_v1_11B = "upstage/SOLAR-10.7B-Instruct-v1.0",
Can we add "together/" in front of all of these, and just extract out the last part in call_together? That way we don't clutter the namespace. Thanks!
(I realize I should've thought harder about this problem when structuring this backend; it is on my to-do list to restructure it, but it is so central that it is not so easy anymore.)
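A minimal sketch of the suggested convention (the helper name and the `"together/"` prefix handling here are illustrative assumptions, not the actual call_together code): models are stored namespaced as `together/<org>/<model>` in the enum, and the prefix is stripped before calling the Together API.

```typescript
// Hypothetical helper illustrating the suggested convention.
const TOGETHER_PREFIX = "together/";

function toTogetherApiModel(namespacedModel: string): string {
  // Only strip the leading "together/"; the remainder
  // (e.g. "Undi95/ReMM-SLERP-L2-13B") is the id Together expects.
  return namespacedModel.startsWith(TOGETHER_PREFIX)
    ? namespacedModel.slice(TOGETHER_PREFIX.length)
    : namespacedModel;
}
```

For example, `toTogetherApiModel("together/Undi95/ReMM-SLERP-L2-13B")` would yield `"Undi95/ReMM-SLERP-L2-13B"`, while ids without the prefix pass through unchanged.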
"Undi95/ReMM-SLERP-L2-13B",
"Undi95/Toppy-M-7B",
"WizardLM/WizardLM-13B-V1.2",
"upstage/SOLAR-10.7B-Instruct-v1.0",
You'll have to add "together/" to all of these, too...
Looks largely good! Just make the few changes and I will push when I return from my conference this week.
I may get to this if I have a moment; let me know if you are in the midst of working on it, though.
This is the main PR blocking the updates to the Amazon Bedrock models (other PRs). I intend to merge all of these together in
I'll do it
* feat(bedrock_llama3): added support for Llama3 (#270)
  - added also Claude 3 Opus to the list of models
  - replaced hardcoded model Id strings with refs to NativeLLM enum
* chore: bump @mirai73/bedrock-fm library (#277)
  - the new version adds source code to facilitate debugging
  Co-authored-by: ianarawjo <fatso784@gmail.com>
* Adding together.ai support (#280)
  Co-authored-by: ianarawjo <fatso784@gmail.com>
* Add Together.ai and update Bedrock models

Co-authored-by: Massimiliano Angelino <angmas@amazon.com>
Co-authored-by: Can Bal <canbal@users.noreply.github.com>
Apologies. I didn't get any notifications for your comments, so it was only after I decided to check myself that I saw your feedback. Thanks for cleaning and merging.
With this change I'm adding support for all the chat completion models that together.ai supports. Together.ai has lots of models, and a flat list didn't fit in a single submenu on my screen. I looked into making it scroll, but the mantine-contextmenu library doesn't seem to easily support styling submenus. Since many of the models are trained by other companies, it made sense to me to add yet another nested menu under "Together" and group the models by their prefix. Luckily, mantine-contextmenu supports arbitrarily deep nested submenus. To support that, I had to refactor a few things. Here's how it looks.

I just discovered the project today, and I'm not highly familiar with TypeScript either. My styling might be off, and my code might not be in the best places. Feel free to edit, or give me feedback so I can make changes accordingly.
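The grouping described above can be sketched roughly like this (an illustrative helper, not the actual ChainForge code): Together model ids are split on their organization prefix, and each group becomes one nested submenu.

```typescript
// Illustrative sketch: group Together model ids by their org prefix
// (the part before the first "/"), one group per nested submenu.
const togetherModels: string[] = [
  "Undi95/ReMM-SLERP-L2-13B",
  "Undi95/Toppy-M-7B",
  "WizardLM/WizardLM-13B-V1.2",
  "upstage/SOLAR-10.7B-Instruct-v1.0",
];

function groupByOrg(models: string[]): Map<string, string[]> {
  const groups = new Map<string, string[]>();
  for (const m of models) {
    const org = m.split("/")[0]; // e.g. "Undi95"
    if (!groups.has(org)) groups.set(org, []);
    groups.get(org)!.push(m);
  }
  return groups;
}
```

With the four models above this yields three groups ("Undi95", "WizardLM", "upstage"), which is what keeps each submenu short enough to fit on screen.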
fixes #261