feat: integrate Llama 3 (8B, 70B), Mistral AI, and Gemma (7B, 9B) models served by Groq #531
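For context, Groq serves these models through an OpenAI-compatible API. Below is a minimal sketch of calling one of them directly with the standard `openai` client; the model id and environment variable name are illustrative and not taken from this PR.

```python
# Illustrative sketch only: the model id and env var name are assumptions.
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["GROQ_API_KEY"],
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible endpoint
)

response = client.chat.completions.create(
    model="llama3-8b-8192",  # one of the Groq-served models this PR integrates
    messages=[{"role": "user", "content": "Say hello from CAMEL."}],
)
print(response.choices[0].message.content)
```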
Conversation
@Wendong-Fan Hi, could you please help me add
LGTM! Left a question
Thanks for @Appointat's contribution! Overall it looks great; I left some comments.
@camel-ai/camel-maintainers @Wendong-Fan Hi, this PR is finally fixed. Could you please review it? Thanks.
Thanks for @Appointat's update; I left some comments.
…token counting improvements
@Wendong-Fan I have seen an error in pytest.
It means the model cannot download the tokenizer from Hugging Face; we need to add
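The failure described above typically comes from `AutoTokenizer.from_pretrained` hitting a gated or rate-limited Hugging Face repo in CI without credentials. A minimal sketch of passing an access token, assuming an `HF_TOKEN` secret (the model id and secret name are illustrative):

```python
# Illustrative sketch: the model id and HF_TOKEN secret name are assumptions.
import os

from transformers import AutoTokenizer

# Gated repos (e.g. Llama 3) reject anonymous downloads, which is what
# makes the tokenizer fetch fail in pytest/CI when no token is provided.
tokenizer = AutoTokenizer.from_pretrained(
    "meta-llama/Meta-Llama-3-8B-Instruct",
    token=os.environ.get("HF_TOKEN"),
)
```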
Token counter access issue fixed.
Thank you, I will fix the conflicts.
Hey @Appointat, I made some updates in 407b44e; please review the change. I used OpenAITokenCounter as the default token counter since it's easier for users to set up, even though it's not accurate for open-source models. We now also allow users to switch the token counter when initializing the model. Let me know WDYT~
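A rough sketch of what the described default-plus-override behavior could look like from the user's side; the enum values and keyword name below are assumptions based on this discussion, not the exact released API.

```python
# Sketch based on the PR discussion: the exact enum names and the
# token_counter keyword are assumptions and may differ in the release.
from camel.models import ModelFactory
from camel.types import ModelPlatformType, ModelType
from camel.utils import OpenAITokenCounter

# Default behavior: an OpenAI-style counter approximates token usage
# even for open-source models served by Groq.
model = ModelFactory.create(
    model_platform=ModelPlatformType.GROQ,   # assumed platform enum value
    model_type=ModelType.GROQ_LLAMA_3_8B,    # assumed model enum value
    # Optional override: switch the counter when initializing the model.
    token_counter=OpenAITokenCounter(ModelType.GPT_3_5_TURBO),
)
```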
@Wendong-Fan I have checked the code, but I cannot review it. I think it is OK to be merged. Thank you.
Description
Describe your changes in detail.
Motivation and Context
Types of changes
What types of changes does your code introduce? Put an `x` in all the boxes that apply: