
.Net: Bug: AWS Bedrock Connector - Cross-region inference Not Supported #10738

Closed
@HallianTech-ChiefEngineer

Description

Describe the bug
When using a cross-region inference profile ID, I receive this error: An error occurred while initializing the BedrockChatCompletionService: Unsupported model provider: us

When using a plain model ID instead of a cross-region inference profile ID, I get this message from AWS: Invocation of model ID anthropic.claude-3-5-sonnet-20241022-v2:0 with on-demand throughput isn’t supported. Retry your request with the ID or ARN of an inference profile that contains this model.

To Reproduce
Steps to reproduce the behavior:
kernelBuilder.AddBedrockChatCompletionService("us.anthropic.claude-3-5-sonnet-20241022-v2:0");
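For context, a fuller repro sketch (the surrounding setup code is illustrative; the connector call is the one from the step above, against Microsoft.SemanticKernel.Connectors.Amazon 1.38.0-alpha):

```csharp
// .NET 8 console app referencing Microsoft.SemanticKernel and
// Microsoft.SemanticKernel.Connectors.Amazon 1.38.0-alpha.
// AWS credentials and region are resolved from the default SDK chain.
using Microsoft.SemanticKernel;

var kernelBuilder = Kernel.CreateBuilder();

// Passing the cross-region inference profile ID (note the "us." prefix)
// throws: "Unsupported model provider: us" — the connector appears to
// treat the segment before the first '.' as the model provider.
kernelBuilder.AddBedrockChatCompletionService(
    "us.anthropic.claude-3-5-sonnet-20241022-v2:0");

var kernel = kernelBuilder.Build();
```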

Expected behavior
I would expect the cross-region inference profile ID to work with Semantic Kernel, since on-demand invocation of this model is not supported by AWS.
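For comparison (not part of the original report): calling Bedrock directly through the AWS SDK's Converse API is documented to accept a cross-region inference profile ID as the model ID, which suggests the restriction is in the connector's model-ID parsing rather than in Bedrock itself. A minimal sketch, assuming AWSSDK.BedrockRuntime and default credentials:

```csharp
// Direct AWS SDK call with the cross-region inference profile ID,
// bypassing the Semantic Kernel connector.
using Amazon.BedrockRuntime;
using Amazon.BedrockRuntime.Model;

var client = new AmazonBedrockRuntimeClient();

var response = await client.ConverseAsync(new ConverseRequest
{
    // The same inference profile ID the connector rejects.
    ModelId = "us.anthropic.claude-3-5-sonnet-20241022-v2:0",
    Messages = new List<Message>
    {
        new Message
        {
            Role = ConversationRole.User,
            Content = new List<ContentBlock>
            {
                new ContentBlock { Text = "Hello" }
            }
        }
    }
});

Console.WriteLine(response.Output.Message.Content[0].Text);
```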


Platform

  • Language: C#
  • Source: Microsoft.SemanticKernel.Connectors.Amazon 1.38.0-alpha
  • AI model: us.anthropic.claude-3-5-sonnet-20241022-v2:0
  • IDE: Visual Studio
  • OS: Windows

Additional context
Using this with C#, .NET 8, Blazor Server

Labels

.NET (Issue or Pull requests regarding .NET code) · bug (Something isn't working) · stale (Issue is stale because it has been open for a while and has no activity)
