[FEATURE] dotAI: LangChain4J integration — Phase 1 (OpenAI) #35183

@ihoffmann-dot

Description

Problem Statement

dotCMS currently supports only standalone OpenAI for AI capabilities (text generation, embeddings, image generation). The implementation is tightly coupled to OpenAI's API contract: hardcoded HTTP calls, OpenAI-specific parameters, and model validation via /v1/models. This makes it impossible to support alternative providers without rewriting the core request/response flow.

Enterprise customers in regulated industries cannot adopt dotAI because their compliance frameworks require AI providers that operate within approved cloud infrastructure (Azure, AWS, GCP) and provide contractual data privacy guarantees.

Solution

Integrate LangChain4J as an abstraction layer between dotCMS and AI providers, replacing the direct OpenAI HTTP client. Phase 1 establishes the architecture using OpenAI as the first provider, setting the foundation for Azure OpenAI, AWS Bedrock, and Google Vertex AI in subsequent phases.

A new providerConfig JSON secret replaces the individual AI secrets with a single structured configuration object that supports per-section (chat, embeddings, image) provider and model settings.
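A shape like the following could satisfy that description. The field names, section keys, and model values here are illustrative assumptions, not the final schema:

```json
{
  "chat": {
    "provider": "openai",
    "model": "gpt-4o"
  },
  "embeddings": {
    "provider": "openai",
    "model": "text-embedding-3-small"
  },
  "image": {
    "provider": "openai",
    "model": "dall-e-3"
  }
}
```

Keeping provider and model per section lets a site mix models (or, in later phases, providers) for chat, embeddings, and image generation independently.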

Acceptance Criteria

  • Existing OpenAI customers can configure dotAI using the new providerConfig JSON format.
  • Chat completions, embeddings, and image generation all route through the LangChain4J abstraction layer.
  • LangChain4jModelFactory is the single place with provider-specific builder logic — adding a new provider requires changes only there.
  • ProviderConfig supports configuration fields for future providers (Azure OpenAI, AWS Bedrock, Google Vertex AI).
  • No breaking changes to existing CompletionsAPI, EmbeddingsAPI, or ImageAPI interfaces.
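The factory criterion above can be sketched as a single dispatch point. This is a hypothetical illustration, not the actual dotCMS implementation: class and record names are assumptions, and the real factory would return LangChain4J model objects (e.g. a chat model built via the OpenAI builder) rather than strings.

```java
// Hypothetical sketch of LangChain4jModelFactory as the single place with
// provider-specific builder logic. A string stands in for the built model
// so the sketch stays self-contained.
public class LangChain4jModelFactory {

    // Minimal stand-in for one section of the providerConfig secret.
    public record ProviderConfig(String provider, String model) {}

    // Adding Azure OpenAI, AWS Bedrock, or Google Vertex AI in later phases
    // means adding one case here; callers never change.
    public static String buildChatModel(ProviderConfig config) {
        return switch (config.provider()) {
            case "openai" -> "OpenAiChatModel[" + config.model() + "]";
            default -> throw new IllegalArgumentException(
                    "Unsupported provider: " + config.provider());
        };
    }
}
```

Concentrating the switch in one class is what keeps CompletionsAPI, EmbeddingsAPI, and ImageAPI unaware of which provider is configured.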

dotCMS Version

main

Status

In Review