Added llama2-13b-chat-v1 on bedrock #13322
Conversation
Looks like you beat me to it, exactly the same changes I'd staged! Looks good to me, +1! @3coins @hwchase17

+1
@sungeuns
Thanks for adding Llama2 support. LGTM! 🚀
This was fast, I made the exact same changes :) Looks good.

Open question: shouldn't this technically (also) be in `BedrockChat`? Update: Added a PR to do this ourselves (@bockaerts)
Yes, it should be.

Hi there, when can we get a new release with this change? I have been using a manually updated version of LangChain for now, but I'll feel more comfortable once we get the official update. Thanks!

@baskaryan
@@ -80,7 +81,7 @@ def prepare_input(
     input_body = {**model_kwargs}
     if provider == "anthropic":
         input_body["prompt"] = _human_assistant_format(prompt)
-    elif provider == "ai21" or provider == "cohere":
+    elif provider == "ai21" or provider == "cohere" or provider == "meta":
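For context, the surrounding dispatch can be sketched as a small self-contained function. This is a simplified sketch inferred from the diff above, not the actual LangChain implementation (the real method also handles streaming and other providers); the point is that `meta` joins the providers whose body uses a plain `prompt` key.

```python
import json

def prepare_input(provider: str, prompt: str, model_kwargs: dict) -> str:
    """Build the JSON request body for a Bedrock model invocation (sketch)."""
    input_body = {**model_kwargs}
    if provider == "anthropic":
        # Anthropic models expect the Human/Assistant conversation framing.
        input_body["prompt"] = "\n\nHuman: " + prompt + "\n\nAssistant:"
    elif provider in ["ai21", "cohere", "meta"]:
        # ai21, cohere, and (with this PR) meta all take the raw prompt.
        input_body["prompt"] = prompt
    else:
        # Amazon Titan models use "inputText" rather than "prompt".
        input_body["inputText"] = prompt
    return json.dumps(input_body)
```

The returned JSON string is what gets passed as the `body` of a Bedrock `InvokeModel` call.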
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Suggesting a minor update to simplify the check. Accepting this should also kick in the CI.
-    elif provider == "ai21" or provider == "cohere" or provider == "meta":
+    elif provider in ["ai21", "cohere", "meta"]:
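The two forms are equivalent for every input; the membership test just reads better and stays one line as the provider list grows. A quick illustration (the helper name is ours, for demonstration only):

```python
PLAIN_PROMPT_PROVIDERS = ["ai21", "cohere", "meta"]

def is_plain_prompt_provider(provider: str) -> bool:
    # Equivalent to `provider == "ai21" or provider == "cohere" or provider == "meta"`
    return provider in PLAIN_PROMPT_PROVIDERS

# The chained-or form and the membership form agree for every provider.
for p in ["ai21", "cohere", "meta", "anthropic", "amazon"]:
    chained = p == "ai21" or p == "cohere" or p == "meta"
    assert chained == is_plain_prompt_provider(p)
```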
Hi 👋 We are working with Llama2 on Bedrock, and would like to add it to LangChain. We saw a [pull request](#13322) adding it to the `llm.Bedrock` class, but since it concerns a chat model, we would like to add it to `BedrockChat` as well.

- **Description:** Add support for Llama2 to `BedrockChat` in `chat_models`
- **Issue:** [#13316](#13316)
- **Dependencies:** None
- **Twitter handle:** `@SimonBockaert @WouterDurnez`

Co-authored-by: wouter.durnez <wouter.durnez@showpad.com>
Co-authored-by: Simon Bockaert <simon.bockaert@showpad.com>
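Supporting the chat variant mostly means converting a message list into Llama 2's instruction format before the body is built. A hypothetical sketch of that conversion: the `[INST]`/`<<SYS>>` tags are Llama 2's published prompt convention, but the function name and the `(role, content)` tuples here are our own illustration, not LangChain's actual `BedrockChat` API.

```python
def messages_to_llama2_prompt(messages: list[tuple[str, str]]) -> str:
    """Hypothetical helper: fold (role, content) messages into a Llama 2 prompt."""
    system = ""
    parts = []
    for role, content in messages:
        if role == "system":
            # Llama 2 embeds the system prompt inside the first [INST] block.
            system = f"<<SYS>>\n{content}\n<</SYS>>\n\n"
        elif role == "user":
            parts.append(f"[INST] {system}{content} [/INST]")
            system = ""  # only the first instruction carries the system prompt
        elif role == "assistant":
            parts.append(content)
    return " ".join(parts)
```

For example, `messages_to_llama2_prompt([("system", "Be terse."), ("user", "Hi")])` yields a single `[INST] ... [/INST]` block with the system prompt folded in.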
I believe all functionality was covered in #13403, so I'm closing this PR. Let me know if I'm missing something!
`llama2-13b-chat-v1` for Bedrock client

Please make sure your PR is passing linting and testing before submitting. Run `make format`, `make lint` and `make test` to check this locally. See the contribution guidelines for more information on how to write/run tests, lint, etc: https://github.com/langchain-ai/langchain/blob/master/.github/CONTRIBUTING.md

If you're adding a new integration, please include an example notebook showing its use in the `docs/extras` directory. If no one reviews your PR within a few days, please @-mention one of @baskaryan, @eyurtsev, @hwchase17.