Describe the bug
Calling run with the Mixtral model mistral.mixtral-8x7b-instruct-v0:1 fails with the following error message: Could not inference Amazon Bedrock model mistral.mixtral-8x7b-instruct-v0:1 due: An error occurred (ValidationException) when calling the InvokeModel operation: Validation Error.
The same code runs successfully with the Mistral model mistral.mistral-7b-instruct-v0:2.
To Reproduce
AmazonBedrockChatGenerator(
    model="mistral.mixtral-8x7b-instruct-v0:1",  # fails
    # model="mistral.mistral-7b-instruct-v0:2",  # passes
).run(prompt)
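For context, Bedrock's Mistral-family models (both Mistral 7B and Mixtral 8x7B) expect an InvokeModel request body with a "prompt" field wrapped in [INST] tags. The sketch below is a hypothetical helper (build_mistral_body is not part of the Haystack integration) showing that payload shape, assuming the format described in the AWS Bedrock documentation for Mistral AI models:

```python
import json


def build_mistral_body(prompt: str, max_tokens: int = 512) -> str:
    """Build the JSON request body Bedrock expects for Mistral-family models.

    Illustrative only: the real Haystack integration constructs this payload
    internally via its model adapter.
    """
    return json.dumps({
        # Mistral/Mixtral on Bedrock require the instruction-tag wrapper.
        "prompt": f"<s>[INST] {prompt} [/INST]",
        "max_tokens": max_tokens,
    })
```

If the integration's model adapter does not recognize the mixtral model ID and falls back to a different payload shape, Bedrock would reject the request with exactly this kind of ValidationException.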
Describe your environment (please complete the following information):
OS: Ubuntu 22.04
Haystack version: 2.1.2
Integration version: 0.7.1
PS: I have access to both models, so it's not an access issue.
@anakin87 Actually, this is a different issue. I'm able to run inference for the Mistral model via an HF access token.
The issue here is that while the Mistral model works fine, the Mixtral model gives the validation error mentioned above.