Labels
enhancement (New feature or request)
Description
Overview
Add support for AWS Bedrock as an AI provider, enabling users to access multiple foundation models (Anthropic, Meta, Cohere, Amazon) through a unified AWS-native platform.
Why AWS Bedrock?
- Multi-provider platform: Access Anthropic Claude, Meta Llama, Cohere, Amazon Titan, etc.
- AWS-native: Seamless integration with AWS services (S3, Lambda, SageMaker)
- Enterprise AWS customers: Organizations already using AWS infrastructure
- Regional compliance: AWS regional deployments for data residency
- Unified billing: Single AWS bill for all models
Capabilities
- ✅ Chat Completion: Claude (Anthropic), Llama (Meta), Command (Cohere), Titan (Amazon)
- ✅ Embedding: Titan Embeddings, Cohere Embed
- ✅ Vision: Claude 3 models (via Bedrock)
- ⚠️ Function Calling: Model-dependent (Claude supports it)
Implementation Checklist
Backend
- Create AWS Bedrock provider client in the `ai-service-client` package
- Implement AWS SDK v3 authentication (IAM roles, access keys)
- Implement chat completion (InvokeModel API)
- Implement streaming support (InvokeModelWithResponseStream)
- Implement embedding generation
- Handle model-specific request/response formats
- Add model capability detection
- Add to provider cache system
- Add connection testing endpoint
- Support cross-region model access
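The "model-specific request/response formats" item is likely the trickiest part of the backend work: each model family behind Bedrock's InvokeModel API expects a different JSON body. A minimal sketch of a request builder, covering three families (the payload shapes follow the Bedrock model-parameter docs but should be re-verified per model version; Cohere is omitted here):

```typescript
// Build the InvokeModel request body for a given Bedrock model ID.
// Each model family (anthropic.*, meta.*, amazon.*) expects a different
// JSON shape; field names below should be re-checked against the current
// Bedrock model-parameter documentation.
function buildRequestBody(modelId: string, prompt: string, maxTokens = 512): string {
  const family = modelId.split(".")[0]; // "anthropic", "meta", "amazon", ...
  switch (family) {
    case "anthropic":
      // Claude models use a Messages-API-style body on Bedrock.
      return JSON.stringify({
        anthropic_version: "bedrock-2023-05-31",
        max_tokens: maxTokens,
        messages: [{ role: "user", content: prompt }],
      });
    case "meta":
      // Llama models take a flat prompt string.
      return JSON.stringify({ prompt, max_gen_len: maxTokens });
    case "amazon":
      // Titan text models wrap generation settings in textGenerationConfig.
      return JSON.stringify({
        inputText: prompt,
        textGenerationConfig: { maxTokenCount: maxTokens },
      });
    default:
      throw new Error(`Unsupported model family: ${family}`);
  }
}

// Example: the Anthropic family gets the Messages-API-style body.
const claudeBody = buildRequestBody("anthropic.claude-3-5-sonnet-20241022-v2:0", "Hello");
```

Response parsing needs the same per-family dispatch, since the output field (`content`, `generation`, `results`, ...) also differs by vendor.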
Database
- Add `aws_bedrock` to the provider enum (if needed)
- Update provider configuration schema
- Support AWS-specific fields:
  - `region` (AWS region)
  - `accessKeyId` (optional, can use IAM role)
  - `secretAccessKey` (optional)
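The optional-credentials fields imply a validation rule worth encoding: static keys must come as a pair, and omitting both means "use the ambient IAM role". A sketch of the config shape and check (field names are the issue's proposed schema, not a final design):

```typescript
// Provider configuration matching the AWS-specific fields proposed above.
interface BedrockProviderConfig {
  region: string;           // e.g. "us-east-1"
  accessKeyId?: string;     // omit to fall back to the IAM role
  secretAccessKey?: string; // required iff accessKeyId is set
}

// Enforce the IAM-role-vs-access-key rule: a region is always required,
// and static credentials are all-or-nothing.
function validateConfig(cfg: BedrockProviderConfig): string[] {
  const errors: string[] = [];
  if (!cfg.region) errors.push("region is required");
  if (Boolean(cfg.accessKeyId) !== Boolean(cfg.secretAccessKey)) {
    errors.push("accessKeyId and secretAccessKey must be provided together");
  }
  return errors;
}
```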
GraphQL API
- Add AWS Bedrock provider to `AiServiceProvider` mutations
- Model discovery from Bedrock ListFoundationModels API
- Support region selection
- Handle IAM role vs access key authentication
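Model discovery can filter the ListFoundationModels response down to what the provider actually supports. A sketch of that filter over the summary fields the API returns (the interface below is a hand-written subset; verify field names against the SDK's generated types):

```typescript
// Subset of the FoundationModelSummary fields returned by
// ListFoundationModels that this filter needs (check the SDK types).
interface FoundationModelSummary {
  modelId: string;
  providerName: string;             // "Anthropic", "Meta", "Amazon", ...
  outputModalities: string[];       // e.g. ["TEXT"] or ["EMBEDDING"]
  responseStreamingSupported?: boolean;
}

// Keep only text-generation models, optionally requiring streaming
// support, and return the IDs for the provider's model picker.
function chatModels(models: FoundationModelSummary[], requireStreaming = false): string[] {
  return models
    .filter((m) => m.outputModalities.includes("TEXT"))
    .filter((m) => !requireStreaming || m.responseStreamingSupported === true)
    .map((m) => m.modelId);
}
```

The same summaries carry `providerName`, which the frontend can reuse to group models by upstream vendor.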
Frontend
- Add AWS Bedrock provider UI in `/admin/ai-services`
- Configuration form:
  - AWS region
  - Access credentials (or IAM role)
  - Model selection
  - Connection testing
- Model list filtered to Bedrock-available models
- Display model providers (Anthropic, Meta, etc.)
Documentation
- Update `/docs/admin/ai-models` with AWS Bedrock
- Update `/docs/admin/ai-services` with Bedrock configuration
- Add AWS Bedrock setup guide
- Document IAM role setup
- Document regional model availability
- Explain multi-provider access
Testing
- Unit tests for Bedrock client
- Integration tests for chat completion (multiple models)
- Integration tests for embeddings
- E2E tests for provider configuration
- Test IAM authentication
- Test cross-region access
API Details
SDK: AWS SDK for JavaScript v3 (`@aws-sdk/client-bedrock-runtime` for invocation; `@aws-sdk/client-bedrock` for ListFoundationModels)
Authentication: IAM roles, access keys, or STS temporary credentials
API: InvokeModel, InvokeModelWithResponseStream (runtime); ListFoundationModels (control plane)
Region-based: Models are available in specific regions (us-east-1, eu-west-1, etc.)
Model ID Examples:
- `anthropic.claude-3-5-sonnet-20241022-v2:0`
- `meta.llama3-2-90b-instruct-v1:0`
- `cohere.command-r-plus-v1:0`
- `amazon.titan-embed-text-v2:0`
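These IDs follow a `<vendor>.<model>-<version>:<revision>` convention, and cross-region inference profiles prepend a geo prefix (e.g. `us.`). A small parser that extracts the vendor for grouping in the UI (the geo-prefix list is an assumption covering the common cases, not exhaustive):

```typescript
// Geo prefixes used by cross-region inference profile IDs, e.g.
// "us.anthropic.claude-...". This set is an assumption; extend as needed.
const GEO_PREFIXES = new Set(["us", "eu", "apac"]);

// Split a Bedrock model ID into upstream vendor and model part, so the
// admin UI can group models by provider (Anthropic, Meta, ...).
function parseModelId(modelId: string): { vendor: string; model: string } {
  let parts = modelId.split(".");
  if (parts.length > 2 && GEO_PREFIXES.has(parts[0])) parts = parts.slice(1);
  if (parts.length < 2) throw new Error(`Unexpected model ID: ${modelId}`);
  return { vendor: parts[0], model: parts.slice(1).join(".") };
}
```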
Challenges
- Complex authentication: IAM roles, cross-account access
- Model-specific formats: Different request/response formats per provider
- Regional availability: Not all models in all regions
- AWS SDK dependency: Large SDK, adds complexity
- Cost tracking: Harder to attribute costs per model
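The cost-tracking challenge can be mitigated in-app: since AWS bills Bedrock as a single line item, the provider client can accumulate token counts per model ID itself. A sketch (Bedrock reports token counts in invocation response metadata; the exact field/header names should be taken from the current API docs):

```typescript
// Per-model usage accumulator: lets the app attribute token consumption
// per Bedrock model even though AWS billing does not break it out.
type Usage = { inputTokens: number; outputTokens: number };

class UsageTracker {
  private byModel = new Map<string, Usage>();

  // Call after each invocation with the token counts taken from the
  // Bedrock response metadata.
  record(modelId: string, inputTokens: number, outputTokens: number): void {
    const u = this.byModel.get(modelId) ?? { inputTokens: 0, outputTokens: 0 };
    u.inputTokens += inputTokens;
    u.outputTokens += outputTokens;
    this.byModel.set(modelId, u);
  }

  report(): Record<string, Usage> {
    const out: Record<string, Usage> = {};
    this.byModel.forEach((u, id) => { out[id] = { ...u }; });
    return out;
  }
}
```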
Resources
Related
- Part of multi-provider support initiative
- Enables AWS-native deployments
- Provides unified access to multiple model providers
Priority
P3-low - Valuable for AWS-heavy organizations, but adds significant complexity. Most users can access Anthropic/Meta/Cohere directly.