AWS Bedrock OpenAI-Compatible Provider #3410

@skamenan7

Description

🚀 Describe the new functionality needed

The current AWS Bedrock provider makes direct boto3 API calls, which complicates credential management and bypasses the existing OpenAI-compatible infrastructure. We should refactor the Bedrock provider to use Bedrock's OpenAI-compatible endpoint instead.

  • Refactor BedrockInferenceAdapter to inherit from LiteLLMOpenAIMixin (as the Groq provider does)
  • Replace the boto3 dependency with litellm
  • Update the configuration to take a simple API key and region instead of AWS credentials
  • Target Bedrock's OpenAI-compatible endpoint: https://bedrock-runtime.{region}.amazonaws.com/openai/v1
  • Add unit tests for the new implementation
  • Update integration tests to remove Bedrock from OpenAI compatibility exclusions
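As a rough sketch of what the refactor implies, the snippet below shows a simplified config (API key plus region) and how the adapter might assemble a request for litellm's OpenAI-compatible client. The class and field names here are illustrative assumptions, not the final schema; only the endpoint URL shape comes from this issue.

```python
from dataclasses import dataclass


@dataclass
class BedrockOpenAIConfig:
    """Illustrative simplified config: an API key plus a region (field names are assumptions)."""
    api_key: str
    region: str = "us-east-1"

    @property
    def openai_base_url(self) -> str:
        # Bedrock's OpenAI-compatible endpoint, as described in this issue
        return f"https://bedrock-runtime.{self.region}.amazonaws.com/openai/v1"


def build_completion_kwargs(config: BedrockOpenAIConfig, model: str, messages: list) -> dict:
    """Assemble the arguments the adapter could pass to litellm.completion().

    The "openai/" model prefix tells litellm to route the request through its
    generic OpenAI-compatible client at the given api_base.
    """
    return {
        "model": f"openai/{model}",
        "api_base": config.openai_base_url,
        "api_key": config.api_key,
        "messages": messages,
    }


# The adapter would then call, e.g.:
#   litellm.completion(**build_completion_kwargs(config, model_id, messages))
```

This is the same delegation pattern the Groq provider uses via LiteLLMOpenAIMixin: the mixin owns the OpenAI-compatible request path, and the Bedrock adapter only supplies its base URL and credentials.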

💡 Why is this needed? What if we don't build it?

  • Simplified credential management: Users can authenticate with an API token instead of managing AWS access keys
  • Consistency: Follows the same pattern as other OpenAI-compatible providers (Groq, etc.)
  • Reduced complexity: Eliminates custom boto3 integration code in favor of proven OpenAI compatibility layer

If we don't build it:

  • Bedrock remains inconsistent with other providers' authentication patterns
  • More complex credential setup for users

Other thoughts

  • Follows established patterns already proven with the Groq provider
  • Leverages existing LiteLLMOpenAIMixin infrastructure
  • Breaking change: Existing configurations will need to be updated from AWS credentials to API key format
  • Dependency change: Switches from boto3 to litellm (but litellm is already used by other providers)
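To make the breaking change concrete, a migration might look like the following. Both snippets are illustrative only: the provider type and field names are assumptions about the current and future schemas, not confirmed values.

```yaml
# Before (illustrative): AWS credential fields
provider_type: remote::bedrock
config:
  aws_access_key_id: ${env.AWS_ACCESS_KEY_ID}
  aws_secret_access_key: ${env.AWS_SECRET_ACCESS_KEY}
  region_name: us-east-1

# After (illustrative): simple API key + region
provider_type: remote::bedrock
config:
  api_key: ${env.BEDROCK_API_KEY}
  region: us-east-1
```

Whatever the final field names, release notes should call out that old AWS-credential configs will stop working and show the one-time edit users need to make.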
