.Net: Bug: AWS Bedrock Connector - Cross-region inference Not Supported #10738
Comments
I'm currently facing the same issue for Python. Here is my configuration and the error:

bedrock_runtime_client = boto3.client(service_name='bedrock-runtime', config=my_config)
bedrock_client = boto3.client("bedrock", config=my_config)

sk_client = BedrockChatCompletion(
    model_id='us.anthropic.claude-3-7-sonnet-20250219-v1:0',
    runtime_client=bedrock_runtime_client,
    client=bedrock_client,
)

# Configure execution settings
settings = BedrockChatPromptExecutionSettings(
    temperature=0.7,
    max_tokens=1000,
)

File "/Users/leonardopinheiro/git/autogen/python/packages/autogen-ext/src/autogen_ext/models/semantic_kernel/_sk_chat_completion_adapter.py", line 565, in create_stream
async for streaming_messages in self._sk_client.get_streaming_chat_message_contents(
File "/Users/leonardopinheiro/git/autogen/python/.venv/lib/python3.11/site-packages/semantic_kernel/connectors/ai/chat_completion_client_base.py", line 261, in get_streaming_chat_message_contents
async for streaming_chat_message_contents in self._inner_get_streaming_chat_message_contents(
File "/Users/leonardopinheiro/git/autogen/python/.venv/lib/python3.11/site-packages/semantic_kernel/utils/telemetry/model_diagnostics/decorators.py", line 164, in wrapper_decorator
async for streaming_chat_message_contents in completion_func(*args, **kwargs):
File "/Users/leonardopinheiro/git/autogen/python/.venv/lib/python3.11/site-packages/semantic_kernel/connectors/ai/bedrock/services/bedrock_chat_completion.py", line 134, in _inner_get_streaming_chat_message_contents
model_info = await self.get_foundation_model_info(self.ai_model_id)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/leonardopinheiro/git/autogen/python/.venv/lib/python3.11/site-packages/semantic_kernel/connectors/ai/bedrock/services/bedrock_base.py", line 24, in get_foundation_model_info
response = await run_in_executor(
^^^^^^^^^^^^^^^^^^^^^^
File "/Users/leonardopinheiro/git/autogen/python/.venv/lib/python3.11/site-packages/semantic_kernel/connectors/ai/bedrock/services/model_provider/utils.py", line 26, in run_in_executor
return await asyncio.get_event_loop().run_in_executor(executor, partial(func, *args, **kwargs))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/leonardopinheiro/.pyenv/versions/3.11.8/lib/python3.11/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/leonardopinheiro/git/autogen/python/.venv/lib/python3.11/site-packages/botocore/client.py", line 569, in _api_call
return self._make_api_call(operation_name, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/leonardopinheiro/git/autogen/python/.venv/lib/python3.11/site-packages/botocore/client.py", line 1023, in _make_api_call
raise error_class(parsed_response, operation_name)
botocore.errorfactory.ValidationException: An error occurred (ValidationException) when calling the GetFoundationModel operation: The provided model identifier is invalid.
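The root cause appears to be that the connector looks up model metadata with GetFoundationModel, which only accepts foundation model IDs or ARNs, not inference profile IDs. Here is a minimal boto3 sketch of the distinction (my own illustration, assuming us-east-1 and default credentials; this is not the connector's actual code):

import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

# Fails with ValidationException: "us.anthropic..." is an inference
# profile ID, and GetFoundationModel only accepts foundation model IDs/ARNs.
# bedrock.get_foundation_model(
#     modelIdentifier="us.anthropic.claude-3-7-sonnet-20250219-v1:0"
# )

# Succeeds once the regional "us." prefix is stripped:
info = bedrock.get_foundation_model(
    modelIdentifier="anthropic.claude-3-7-sonnet-20250219-v1:0"
)
print(info["modelDetails"]["modelId"])

Stripping the regional prefix for the metadata lookup, while still invoking with the full profile ID, would be one way to resolve this.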
Describe the bug
When using the cross-region inference profile ID, I receive this error: An error occurred while initializing the BedrockChatCompletionService: Unsupported model provider: us
When using a plain model ID rather than a cross-region inference profile ID, I get this message from AWS: Invocation of model ID anthropic.claude-3-5-sonnet-20241022-v2:0 with on-demand throughput isn't supported. Retry your request with the ID or ARN of an inference profile that contains this model.
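For context, the cross-region inference profile ID itself is accepted by the Bedrock runtime. The following boto3 (Python) sketch invokes the same model directly through the Converse API (an illustration of mine, assuming us-east-1, default credentials, and granted model access):

import boto3

runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Invoking via the cross-region inference profile ID works fine here.
response = runtime.converse(
    modelId="us.anthropic.claude-3-5-sonnet-20241022-v2:0",
    messages=[{"role": "user", "content": [{"text": "Hello"}]}],
)
print(response["output"]["message"]["content"][0]["text"])

This suggests the failure is in the connector's provider detection, which appears to take the first dot-separated token of the model ID ("us" here) as the provider name.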
To Reproduce
Steps to reproduce the behavior:
kernelBuilder.AddBedrockChatCompletionService("us.anthropic.claude-3-5-sonnet-20241022-v2:0");
Expected behavior
I would expect the cross-region inference profile ID to work with Semantic Kernel, since on-demand throughput is not supported for this model.
Platform
Using this with C#, .NET 8, Blazor Server