Description
To provide users with greater flexibility and control over cost, performance, and capabilities, Ghost Code should be extended to support multiple Large Language Model (LLM) providers. This enhancement will allow users to select their preferred provider (e.g., OpenAI, Google, Anthropic) and model directly within the settings.ghostdev configuration file.
Proposed Functionality
Configuration Update: The llm section within the settings.ghostdev file will be the designated area for provider selection. The schema should be standardized to accept a provider name and the corresponding model and apiKey.
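For illustration, assuming settings.ghostdev is a JSON file, the llm section might look like the sketch below. The provider and model values are examples only, and the `${...}` environment-variable placeholder is an illustrative convention, not a committed feature:

```json
{
  "llm": {
    "provider": "anthropic",
    "model": "claude-3-5-sonnet",
    "apiKey": "${ANTHROPIC_API_KEY}"
  }
}
```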
Modular LLM Service:
- Refactor the existing LLM integration into a modular service layer.
- Implement separate clients or adapters for each supported LLM provider (e.g., OpenAIClient, GoogleClient, AnthropicClient).
- Use a factory or strategy pattern to instantiate the correct client at runtime based on the provider value in settings.ghostdev.
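The factory described above could be sketched as follows. All names here (LLMClient, BaseClient, createClient, the stubbed complete method) are hypothetical, not existing Ghost Code APIs; real adapters would call each provider's HTTP API instead of returning a stub string:

```typescript
// Shared interface every provider adapter implements.
interface LLMClient {
  readonly provider: string;
  complete(prompt: string): Promise<string>;
}

// Settings shape matching the llm section of settings.ghostdev.
interface LlmSettings { provider: string; model: string; apiKey: string; }

// Base adapter with a stubbed completion; a real implementation would
// call the provider's API here using this.model and this.apiKey.
abstract class BaseClient implements LLMClient {
  abstract readonly provider: string;
  constructor(readonly model: string, readonly apiKey: string) {}
  async complete(prompt: string): Promise<string> {
    return `[${this.provider}/${this.model}] ${prompt}`;
  }
}

class OpenAIClient extends BaseClient { readonly provider = "openai"; }
class GoogleClient extends BaseClient { readonly provider = "google"; }
class AnthropicClient extends BaseClient { readonly provider = "anthropic"; }

// Factory: picks the adapter based on the configured provider value.
function createClient(cfg: LlmSettings): LLMClient {
  switch (cfg.provider) {
    case "openai": return new OpenAIClient(cfg.model, cfg.apiKey);
    case "google": return new GoogleClient(cfg.model, cfg.apiKey);
    case "anthropic": return new AnthropicClient(cfg.model, cfg.apiKey);
    default:
      throw new Error(`Unsupported LLM provider: ${cfg.provider}`);
  }
}
```

Keeping the switch in one place means adding a new provider only requires a new adapter class and one new case, leaving callers untouched.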
Dynamic Client Selection:
- On startup, Ghost Code will parse the settings.ghostdev file to determine the user's selected LLM provider.
- The application will then route all subsequent AI-powered tasks through the corresponding client, using the specified model and API key.
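The startup validation and fallback behavior could be sketched as below. This is an assumption-laden sketch: resolveLlmConfig, SUPPORTED, and DEFAULT_LLM are illustrative names, and the choice of default provider/model is a placeholder, not a decision made in this issue:

```typescript
interface LlmConfig { provider: string; model: string; apiKey: string; }

// Hypothetical supported-provider list and fallback default.
const SUPPORTED = new Set(["openai", "google", "anthropic"]);
const DEFAULT_LLM: LlmConfig = { provider: "openai", model: "gpt-4o", apiKey: "" };

// Validates the parsed settings.ghostdev contents; on a missing or
// invalid llm section, logs a warning and falls back to the default.
function resolveLlmConfig(raw: unknown): LlmConfig {
  const cfg = (raw as { llm?: Partial<LlmConfig> })?.llm;
  if (!cfg || !cfg.provider || !SUPPORTED.has(cfg.provider) || !cfg.model) {
    console.warn("Invalid or missing llm configuration; using default provider.");
    return DEFAULT_LLM;
  }
  return { provider: cfg.provider, model: cfg.model, apiKey: cfg.apiKey ?? "" };
}
```

The resolved config would then be handed to the client factory once at startup, so every later AI task goes through the same client instance.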
Acceptance Criteria
- The system can be successfully configured to use different LLM providers (e.g., OpenAI, Google, Anthropic) via the settings.ghostdev file.
- The application correctly initializes the appropriate LLM client based on the configuration.
- All AI features (code generation, analysis, etc.) function correctly using the user-selected provider.
- If the llm configuration is invalid or missing, the system falls back to a default provider or logs a clear, user-friendly error message.
- Documentation is updated to list all supported providers and provide clear instructions on how to configure them.