Add AWS Bedrock provider support #872

@moncapitaine

Overview

Add support for AWS Bedrock as an AI provider, giving users access to foundation models from multiple vendors (Anthropic, Meta, Cohere, Amazon) through a single AWS-native platform.

Why AWS Bedrock?

  • Multi-provider platform: Access Anthropic Claude, Meta Llama, Cohere, Amazon Titan, etc.
  • AWS-native: Seamless integration with AWS services (S3, Lambda, SageMaker)
  • Enterprise AWS customers: Organizations already using AWS infrastructure
  • Regional compliance: AWS regional deployments for data residency
  • Unified billing: Single AWS bill for all models

Capabilities

  • Chat Completion: Claude (Anthropic), Llama (Meta), Command (Cohere), Titan (Amazon)
  • Embedding: Titan Embeddings, Cohere Embed
  • Vision: Claude 3 models (via Bedrock)
  • ⚠️ Function Calling: Model-dependent (Claude supports it)

Implementation Checklist

Backend

  • Create AWS Bedrock provider client in ai-service-client package
  • Implement AWS SDK v3 authentication (IAM, credentials)
  • Implement chat completion (InvokeModel API)
  • Implement streaming support (InvokeModelWithResponseStream)
  • Implement embedding generation
  • Handle model-specific request/response formats
  • Add model capability detection
  • Add to provider cache system
  • Add connection testing endpoint
  • Support cross-region model access
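One of the trickier backend items is "Handle model-specific request/response formats": each vendor behind Bedrock expects a different `InvokeModel` body. The sketch below shows one way to dispatch on the model ID's vendor prefix. The JSON shapes follow the vendors' documented Bedrock formats, but treat them as assumptions to re-verify against the current AWS docs; the function and option names are placeholders, not existing code in this repo.

```typescript
type ChatOptions = { maxTokens?: number; temperature?: number };

// Build a vendor-specific JSON body for the Bedrock InvokeModel API,
// dispatching on the "<vendor>." prefix of the model ID.
function buildRequestBody(
  modelId: string,
  prompt: string,
  opts: ChatOptions = {},
): string {
  const vendor = modelId.split(".")[0]; // "anthropic.claude-..." -> "anthropic"
  const maxTokens = opts.maxTokens ?? 512;

  switch (vendor) {
    case "anthropic":
      // Claude models on Bedrock use the Messages API format.
      return JSON.stringify({
        anthropic_version: "bedrock-2023-05-31",
        max_tokens: maxTokens,
        messages: [{ role: "user", content: prompt }],
      });
    case "meta":
      // Llama models take a raw prompt string.
      return JSON.stringify({
        prompt,
        max_gen_len: maxTokens,
        temperature: opts.temperature ?? 0.7,
      });
    case "cohere":
      return JSON.stringify({
        message: prompt,
        max_tokens: maxTokens,
      });
    case "amazon":
      // Titan text models wrap generation settings in textGenerationConfig.
      return JSON.stringify({
        inputText: prompt,
        textGenerationConfig: { maxTokenCount: maxTokens },
      });
    default:
      throw new Error(`Unsupported Bedrock model vendor: ${vendor}`);
  }
}
```

Response parsing needs the mirror-image dispatch (e.g. Claude returns `content` blocks, Titan returns `results[].outputText`), so it likely belongs in the same vendor-keyed module.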

Database

  • Add aws_bedrock to provider enum (if needed)
  • Update provider configuration schema
  • Support AWS-specific fields:
    • region (AWS region)
    • accessKeyId (optional, can use IAM role)
    • secretAccessKey (optional)
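The AWS-specific fields above have a pairing constraint worth validating up front: static credentials only make sense as a pair, and with neither present the AWS SDK falls back to its default credential chain (IAM role, environment variables, etc.). A minimal sketch, with placeholder type and function names:

```typescript
interface BedrockProviderConfig {
  region: string;               // e.g. "us-east-1"
  accessKeyId?: string;         // optional: omit both keys to use an IAM role
  secretAccessKey?: string;
}

// Return a list of configuration errors (empty list = valid).
function validateBedrockConfig(cfg: BedrockProviderConfig): string[] {
  const errors: string[] = [];
  // Simplified region check; covers common forms like us-east-1, eu-west-1.
  if (!/^[a-z]{2}-[a-z]+-\d$/.test(cfg.region)) {
    errors.push(`invalid AWS region: ${cfg.region}`);
  }
  // Static credentials must come as a pair.
  if (Boolean(cfg.accessKeyId) !== Boolean(cfg.secretAccessKey)) {
    errors.push("accessKeyId and secretAccessKey must be provided together");
  }
  return errors;
}
```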

GraphQL API

  • Add AWS Bedrock provider to AiServiceProvider mutations
  • Model discovery from Bedrock ListFoundationModels API
  • Support region selection
  • Handle IAM role vs access key authentication
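For model discovery, the GraphQL layer will need to filter what `ListFoundationModels` returns down to models the app can actually use. The summary shape below mimics fields that API returns (`modelId`, `providerName`, `outputModalities`, `inferenceTypesSupported`); the filter criteria are an assumption about what this integration should surface:

```typescript
interface FoundationModelSummary {
  modelId: string;
  providerName: string;
  outputModalities: string[];        // e.g. ["TEXT"], ["EMBEDDING"]
  inferenceTypesSupported: string[]; // e.g. ["ON_DEMAND"]
}

// Keep only text-output models usable with on-demand inference.
function selectChatModels(
  models: FoundationModelSummary[],
): FoundationModelSummary[] {
  return models.filter(
    (m) =>
      m.outputModalities.includes("TEXT") &&
      m.inferenceTypesSupported.includes("ON_DEMAND"),
  );
}
```

The same pattern with `"EMBEDDING"` in place of `"TEXT"` would drive the embedding-model picker.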

Frontend

  • Add AWS Bedrock provider UI in /admin/ai-services
  • Configuration form:
    • AWS region
    • Access credentials (or IAM role)
    • Model selection
  • Connection testing
  • Model selection filtered by Bedrock
  • Display model providers (Anthropic, Meta, etc.)

Documentation

  • Update /docs/admin/ai-models with AWS Bedrock
  • Update /docs/admin/ai-services with Bedrock configuration
  • Add AWS Bedrock setup guide
  • Document IAM role setup
  • Document regional model availability
  • Explain multi-provider access

Testing

  • Unit tests for Bedrock client
  • Integration tests for chat completion (multiple models)
  • Integration tests for embeddings
  • E2E tests for provider configuration
  • Test IAM authentication
  • Test cross-region access

API Details

SDK: AWS SDK for JavaScript v3 (@aws-sdk/client-bedrock-runtime)
Authentication: IAM roles, access keys, or STS temporary credentials
API: InvokeModel, InvokeModelWithResponseStream, ListFoundationModels
Region-based: Models available in specific regions (us-east-1, eu-west-1, etc.)

Model ID Examples:

  • anthropic.claude-3-5-sonnet-20241022-v2:0
  • meta.llama3-2-90b-instruct-v1:0
  • cohere.command-r-plus-v1:0
  • amazon.titan-embed-text-v2:0
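The IDs above follow a `<vendor>.<model-name>:<version>` pattern, which a small parser can exploit for vendor dispatch and UI grouping. The pattern is inferred from these examples only (cross-region inference profiles such as IDs with a region prefix would need extra handling):

```typescript
interface ParsedModelId {
  vendor: string;   // "anthropic", "meta", "cohere", "amazon"
  name: string;     // e.g. "claude-3-5-sonnet-20241022-v2"
  version: string;  // e.g. "0"
}

// Split "<vendor>.<model-name>:<version>" into its parts.
function parseModelId(modelId: string): ParsedModelId {
  const match = /^([a-z0-9-]+)\.(.+):(\d+)$/.exec(modelId);
  if (!match) throw new Error(`Unrecognized Bedrock model id: ${modelId}`);
  return { vendor: match[1], name: match[2], version: match[3] };
}
```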

Challenges

  1. Complex authentication: IAM roles, cross-account access
  2. Model-specific formats: Different request/response formats per provider
  3. Regional availability: Not all models in all regions
  4. AWS SDK dependency: Large SDK, adds complexity
  5. Cost tracking: Harder to attribute costs per model
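For challenge 5, one workable approach is to attribute usage to the vendor prefix of each model ID and roll costs up per provider. A sketch with placeholder rates (the numbers below are illustrative, not real Bedrock pricing):

```typescript
// Illustrative per-1K-token rates in USD; real rates vary by model
// and region and must come from AWS pricing data.
const PLACEHOLDER_USD_PER_1K_TOKENS: Record<string, number> = {
  anthropic: 0.003,
  meta: 0.002,
  cohere: 0.0025,
  amazon: 0.001,
};

// Estimate cost for a request by the model's vendor prefix.
function estimateCost(modelId: string, tokens: number): number {
  const vendor = modelId.split(".")[0];
  const rate = PLACEHOLDER_USD_PER_1K_TOKENS[vendor];
  if (rate === undefined) throw new Error(`No rate for vendor: ${vendor}`);
  return (tokens / 1000) * rate;
}
```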

Related

  • Part of multi-provider support initiative
  • Enables AWS-native deployments
  • Provides unified access to multiple model providers

Priority

P3-low - Valuable for AWS-heavy organizations, but adds significant complexity. Most users can access Anthropic/Meta/Cohere directly.

Labels

enhancement (New feature or request)