
Add support for Bedrock Llama 2 13b model (meta.llama2-13b-chat-v1) #13316

Closed
mirodrr opened this issue Nov 14, 2023 · 4 comments
Labels
- 🔌: aws — Primarily related to Amazon Web Services (AWS) integrations
- 🤖:bug — Related to a bug, vulnerability, unexpected error with an existing feature
- Ɑ: models — Related to LLMs or chat model modules

Comments


mirodrr commented Nov 14, 2023

System Info

I get the following error just by adding these model parameters to existing code that works with other models:

"Malformed input request: 2 schema violations found, please reformat your input and try again."

```python
import boto3
from langchain.llms import Bedrock

model_name = "meta.llama2-13b-chat-v1"
model_kwargs = {
    "max_gen_len": 512,
    "temperature": 0.2,
    "top_p": 0.9,
}
bedrock_boto = boto3.client("bedrock-runtime", "us-east-1")
bedrock_llm = Bedrock(model_id=model_name, client=bedrock_boto, model_kwargs=model_kwargs)
bedrock_llm("Hello!")
```

Who can help?

@hwchase17
@agola11

Information

  • The official example notebooks/scripts
  • My own modified scripts

Related Components

  • LLMs/Chat Models
  • Embedding Models
  • Prompts / Prompt Templates / Prompt Selectors
  • Output Parsers
  • Document Loaders
  • Vector Stores / Retrievers
  • Memory
  • Agents / Agent Executors
  • Tools / Toolkits
  • Chains
  • Callbacks/Tracing
  • Async

Reproduction

  1. Attempt to call the new Llama 2 Bedrock model like so:

```python
import boto3
from langchain.llms import Bedrock

model_name = "meta.llama2-13b-chat-v1"
model_kwargs = {
    "max_gen_len": 512,
    "temperature": 0.2,
    "top_p": 0.9,
}
bedrock_boto = boto3.client("bedrock-runtime", "us-east-1")
bedrock_llm = Bedrock(model_id=model_name, client=bedrock_boto, model_kwargs=model_kwargs)
bedrock_llm("Hello!")
```

Expected behavior

The `Bedrock` class would work successfully, as it does for other Bedrock models.
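
Editor's note: the Llama 2 models on Bedrock expect the input text under a `prompt` key (rather than the `inputText` key used by Amazon Titan models), which is why the request above fails schema validation. A minimal sketch of a well-formed request body follows; the actual `invoke_model` call is commented out because it requires AWS credentials:

```python
import json

# Body shape the meta.llama2-13b-chat-v1 endpoint expects: the input text
# goes under "prompt", alongside the generation parameters.
body = json.dumps({
    "prompt": "Hello!",
    "max_gen_len": 512,
    "temperature": 0.2,
    "top_p": 0.9,
})

# With credentials configured, this would be sent as:
# import boto3
# client = boto3.client("bedrock-runtime", "us-east-1")
# response = client.invoke_model(modelId="meta.llama2-13b-chat-v1", body=body)
```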


dosubot bot commented Nov 14, 2023

🤖

Hello @mirodrr!

Nice to meet you. I'm Dosu, an automated assistant. I'm here to help you with your issues, answer any questions, and guide you on becoming a contributor to the LangChain repository while we wait for a human maintainer.

I see you're having an issue with the Bedrock Llama 2 13b model. I'm currently reviewing your issue and will provide a more comprehensive answer shortly. Hang tight!

@dosubot dosubot bot added the Ɑ: models, 🤖:bug, and 🔌: aws labels Nov 14, 2023

dosubot bot commented Nov 14, 2023

🤖

Based on the information you've provided, it seems like you're encountering a "Malformed input request" error when trying to use the "meta.llama2-13b-chat-v1" model with the Bedrock class in LangChain. This error typically occurs when the input parameters provided to the model do not match the parameters that the model expects.

From the context provided, there are two similar issues that have been resolved in the past:

  1. Issue: Amazon Bedrock Cohere Command - Malformed input request: 2 schema violations found, please reformat your input and try again. - The user found that the issue was due to the LangChain library being outdated. Therefore, updating the LangChain library to the latest version resolved the issue.

  2. Inference parameters for Bedrock titan models not working - This issue was closed by a pull request: Fixes model arguments for amazon models on June 08, 2023.

Based on these previous issues, I would recommend checking the version of LangChain you're currently using. If it's not the latest version (v0.0.335 as of November 14, 2023), updating to the latest version might resolve your issue.

Additionally, it's important to ensure that the parameters you're passing to the "meta.llama2-13b-chat-v1" model match the parameters that the model expects. According to the LangChain codebase, the "meta.llama2-13b-chat-v1" model accepts the following parameters:

  • messages: A list of BaseMessage objects.
  • stop: An optional list of strings.
  • run_manager: An optional CallbackManagerForLLMRun or AsyncCallbackManagerForLLMRun object.
  • stream: An optional boolean.
  • **kwargs: Any additional keyword arguments.

Please ensure that your input parameters match these expected parameters. If you're still encountering issues after checking these things, please provide more information about the exact error message you're seeing and any additional context about your setup that might be relevant.


@CarloNicolini

Same error here with langchain 0.0.336.

The Dosu bot's answer is confusing and wrong. The fix is simpler, and there is a concrete reason you get the two schema violations: if you look at the body of `Bedrock._prepare_input_and_invoke`, the request payloads for providers such as 'cohere' and 'anthropic' are constructed case by case, so the 'meta' provider needs to be added to every place in the code that branches on the provider name. In practice, for 'meta' the input key `inputText` must be renamed to `prompt` in the request body.
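
Editor's note: the change described above can be sketched roughly like this. This is a simplified illustration, not the actual LangChain source; `prepare_input` is a made-up name standing in for the provider dispatch inside `Bedrock._prepare_input_and_invoke`:

```python
def prepare_input(provider: str, prompt: str, model_kwargs: dict) -> dict:
    """Build the provider-specific request body (illustrative sketch)."""
    body = dict(model_kwargs)
    if provider == "anthropic":
        # Anthropic models expect a Human/Assistant-formatted prompt.
        body["prompt"] = f"\n\nHuman: {prompt}\n\nAssistant:"
    elif provider in ("cohere", "meta"):
        # The fix: Llama 2 ("meta") takes the input under "prompt",
        # like Cohere, rather than Titan's "inputText".
        body["prompt"] = prompt
    else:
        # Amazon Titan models place the input under "inputText".
        body["inputText"] = prompt
    return body
```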

baskaryan pushed a commit that referenced this issue Nov 20, 2023
Hi 👋 We are working with Llama2 on Bedrock, and would like to add it to
Langchain. We saw a [pull
request](#13322) to add it
to the `llm.Bedrock` class, but since it concerns a chat model, we would
like to add it to `BedrockChat` as well.

- **Description:** Add support for Llama2 to `BedrockChat` in
`chat_models`
- **Issue:** the issue # it fixes (if applicable)
[#13316](#13316)
  - **Dependencies:** any dependencies required for this change `None`
  - **Tag maintainer:** /
  - **Twitter handle:** `@SimonBockaert @WouterDurnez`

---------

Co-authored-by: wouter.durnez <wouter.durnez@showpad.com>
Co-authored-by: Simon Bockaert <simon.bockaert@showpad.com>
pprados pushed a commit to pprados/langchain that referenced this issue Nov 20, 2023
amiaxys pushed a commit to Haoming-jpg/team-skill-issue-langchain that referenced this issue Nov 23, 2023

dosubot bot commented Feb 15, 2024

Hi, @mirodrr,

I'm helping the LangChain team manage their backlog and am marking this issue as stale. The "Malformed input request" error you reported received detailed responses from me and CarloNicolini suggesting potential resolutions related to an outdated LangChain library or incorrect input parameters.

Could you please confirm if this issue is still relevant to the latest version of the LangChain repository? If it is, kindly let the LangChain team know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days. Thank you!

@dosubot dosubot bot added the stale label (issue has not had recent activity or appears to be solved) Feb 15, 2024
@dosubot dosubot bot closed this as not planned (won't fix, can't repro, duplicate, stale) Feb 22, 2024
@dosubot dosubot bot removed the stale label Feb 22, 2024
2 participants