Error : Stop sequence key name for {meta or mistral or any other mode} is not supported with AWS Bedrock #20053
Bedrock Meta models don't currently support stop sequences. Mistral models have a parameter-mapping issue addressed in the PR above. If you want to use any of the three Mistral models for an agent, you can use the kwargs settings to define the stop sequences at the client level, and then modify the agent_executor object to remove any stop sequences it would otherwise pass to the LLM class, until the PR is merged. Roughly like so:
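A minimal sketch of that idea, using plain dictionaries to stand in for the LangChain objects (the real attribute names on `agent_executor` and the Bedrock LLM class may differ, so treat the names below as illustrative):

```python
# Sketch of the workaround: define stop sequences at the client level via
# model kwargs, then strip any stop-sequence argument the agent executor
# would otherwise forward to the LLM call. Plain dicts stand in for the
# real LangChain objects; names here are illustrative, not exact.

# Client-level model kwargs, with stops under the key Bedrock's Mistral
# endpoint actually accepts ("stop").
model_kwargs = {"max_tokens": 512, "stop": ["\nObservation:"]}

def strip_stop(call_kwargs: dict) -> dict:
    """Drop stop-sequence arguments before they reach the LLM call."""
    return {k: v for k, v in call_kwargs.items()
            if k not in ("stop", "stop_sequences")}

# What an agent executor might try to pass through to the LLM:
agent_call = {"prompt": "...", "stop": ["\nObservation:"], "temperature": 0.1}
cleaned = strip_stop(agent_call)
print(cleaned)  # -> {'prompt': '...', 'temperature': 0.1}
```

The point is only to keep the wrapper from sending a key Bedrock rejects; the stop behaviour itself is supplied once, at the client level.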
Thanks for the PR, I'm surprised this wasn't tested before the Bedrock wrapper was released.
@jonathancaevans Thanks. I did try this before, but it doesn't work either, especially when I am using an agent_executor.
I tried other services, such as OpenAI models and Gemini models via Vertex, and none of them have this issue. Hopefully that PR merge will help.
Fairly simple workaround for now until the PR is merged:
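A sketch of such a workaround, based on the `provider_stop_sequence_key_name_map` attribute mentioned later in this thread (the map contents below mirror the shape of that mapping in the LangChain Bedrock code but are illustrative, and whether an instance-level override takes effect may depend on your LangChain version):

```python
# Sketch: remap the stop-sequence key the Bedrock wrapper uses for
# Mistral from the incorrect "stop_sequences" to "stop". The dict
# stands in for LangChain's provider_stop_sequence_key_name_map;
# entries are illustrative.
provider_stop_sequence_key_name_map = {
    "anthropic": "stop_sequences",
    "mistral": "stop_sequences",  # buggy value the PR above fixes
}

# Workaround: override the Mistral entry before invoking the model.
provider_stop_sequence_key_name_map["mistral"] = "stop"

def build_body(provider: str, prompt: str, stops: list) -> dict:
    """Build a request body using the provider's stop-sequence key."""
    key = provider_stop_sequence_key_name_map[provider]
    return {"prompt": prompt, key: stops}

print(build_body("mistral", "Hello", ["###"]))
# -> {'prompt': 'Hello', 'stop': ['###']}
```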
- **Description:** Change Bedrock's Mistral stop sequence key mapping to "stop" rather than "stop_sequences", which is the correct key per the [Bedrock docs link](https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-mistral.html): `{ "prompt": string, "max_tokens": int, "stop": [string], "temperature": float, "top_p": float, "top_k": int }`
- **Issue:** #20053
- **Dependencies:** N/A
- **Twitter handle:** N/A
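As a quick sanity check, the schema quoted from the Bedrock docs above translates into a request body like this (plain Python, no AWS call; the `invoke_model` step is deliberately omitted):

```python
import json

# Request body for Bedrock's Mistral models per the schema quoted from
# the docs above: stop sequences go under "stop", not "stop_sequences".
body = {
    "prompt": "<s>[INST] Say hello [/INST]",
    "max_tokens": 200,
    "stop": ["</s>"],
    "temperature": 0.5,
    "top_p": 0.9,
    "top_k": 50,
}
payload = json.dumps(body)
# This payload is what a boto3 bedrock-runtime invoke_model call would
# take as its body= argument; the call itself is not shown here.
print(payload)
```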
What do we have to change for 'meta' here? I'm getting the same error for `meta.llama3-8b-instruct-v1:0`.
Meta is not in the key map. Maybe you can add stop sequences as kwargs, but you'll have to check what boto3 supports with Meta models.
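For reference, a sketch of a Bedrock Llama request body (fields per my reading of the AWS docs; worth verifying against the current Bedrock documentation for your model ID). Notably it has no stop-sequence field at all, which matches the earlier comment that Bedrock Meta models don't support stop sequences:

```python
# Sketch of a Bedrock Meta/Llama request body. Field names are taken
# from my reading of the AWS Bedrock docs and should be verified there.
# There is no stop-sequence key, which is why the LangChain wrapper
# cannot map one for Meta models.
llama_body = {
    "prompt": "Say hello",
    "max_gen_len": 256,
    "temperature": 0.5,
    "top_p": 0.9,
}
assert "stop" not in llama_body and "stop_sequences" not in llama_body
print(sorted(llama_body))
```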
I added `llm.provider_stop_sequence_key_name_map = {...}`
`llm.provider_stop_sequence_key_name_map = {...}` is not the solution for the initial bug reported.
Checked other resources
Example Code
I am trying to use AWS Bedrock models such as Llama / Mistral with LangChain libraries such as SQLDatabaseToolkit.
Error Message and Stack Trace (if applicable)
This errors out with the following.
Description
I have tried the same code with
OpenAI
/Ollama Mistral/Llama
as well as Google GenAI models, and they don't seem to show this error. This seems like something with the way the Bedrock library works in LangChain, or with the Bedrock service. Is there a workaround I can use to get this to work?
System Info
System Information
Package Information
Packages not installed (Not Necessarily a Problem)
The following packages were not found: