This project demonstrates how to configure and call two different Large Language Models (LLMs) in the same Java application using Spring AI. Specifically, it shows how to integrate OpenAI and Anthropic models within a Spring Boot application.
This Spring Boot application showcases the following features:
- Integration with Spring AI
- Configuration of both OpenAI and Anthropic chat models
- Separate REST endpoints for each LLM
- Usage of ChatClient for interacting with the models
Before you begin, ensure you have the following installed:
- Java Development Kit (JDK) 23 or later
- Apache Maven
- An OpenAI API key
- An Anthropic API key
The project consists of the following key components:
- Application.java: The main Spring Boot application class
- OpenAiChatController.java: Controller for OpenAI interactions
- AnthropicChatController.java: Controller for Anthropic interactions
- pom.xml: Maven configuration file
- Clone the repository:
git clone https://github.com/yourusername/multiple-llms.git
cd multiple-llms
- Disable the ChatClient.Builder autoconfiguration, since this project defines its own ChatClient beans, by setting the property:
spring.ai.chat.client.enabled=false
- Set up your API keys:
Create an application.properties file in the src/main/resources directory and add your API keys:
spring.ai.openai.api-key=your_openai_api_key
spring.ai.anthropic.api-key=your_anthropic_api_key
- Build the project:
mvn clean install
- Run the application:
mvn spring-boot:run
Once the application is running, you can interact with the LLMs using the following endpoints:
- OpenAI:
http://localhost:8080/openai
- Anthropic:
http://localhost:8080/claude
Each endpoint will return an interesting fact about the respective company.
In Application.java, we define two ChatClient beans, one for each LLM:
@Bean
public ChatClient openAIChatClient(OpenAiChatModel chatModel) {
return ChatClient.create(chatModel);
}
@Bean
public ChatClient anthropicChatClient(AnthropicChatModel chatModel) {
return ChatClient.create(chatModel);
}
Each controller is injected with the appropriate ChatClient using the @Qualifier annotation:
@RestController
public class OpenAiChatController {
private final ChatClient chatClient;
public OpenAiChatController(@Qualifier("openAIChatClient") ChatClient chatClient) {
this.chatClient = chatClient;
}
@GetMapping("/openai")
public String home() {
return chatClient.prompt()
.user("Tell me an interesting fact about OpenAI")
.call()
.content();
}
}
The Anthropic controller follows a similar pattern.
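For completeness, a minimal sketch of that controller, mirroring the OpenAI one above (the exact prompt wording is an assumption; the /claude path matches the endpoint listed earlier):

```java
@RestController
public class AnthropicChatController {

    private final ChatClient chatClient;

    // Inject the Anthropic-backed ChatClient bean by its bean name
    public AnthropicChatController(@Qualifier("anthropicChatClient") ChatClient chatClient) {
        this.chatClient = chatClient;
    }

    @GetMapping("/claude")
    public String home() {
        return chatClient.prompt()
                .user("Tell me an interesting fact about Anthropic")
                .call()
                .content();
    }
}
```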
This project uses the following key dependencies:
- Spring Boot Starter Web
- Spring AI OpenAI Spring Boot Starter
- Spring AI Anthropic Spring Boot Starter
The complete list of dependencies can be found in the pom.xml file.
To customize the prompts or add more functionality:
- Modify the controller methods to accept user input.
- Adjust the prompts in the chatClient.user() calls.
- Add new endpoints for different types of interactions with the LLMs.
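As one illustration of the first two points, the OpenAI controller method could be changed to accept the prompt as a request parameter; the endpoint path, parameter name, and default value below are hypothetical:

```java
// Hypothetical variant of the OpenAI endpoint that takes user input
@GetMapping("/openai/ask")
public String ask(@RequestParam(defaultValue = "Tell me an interesting fact about OpenAI") String message) {
    return chatClient.prompt()
            .user(message)       // forward the caller's message to the model
            .call()
            .content();
}
```

A request such as GET /openai/ask?message=... would then send the supplied text to the model instead of the hard-coded prompt.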
If you encounter any issues:
- Ensure your API keys are correctly set in the application.properties file.
- Check that you're using a compatible Java version (23 or later).
- Verify that all dependencies are correctly resolved by Maven.
Contributions are welcome! Please feel free to submit a Pull Request.
This project is open source and available under the MIT License.
Happy coding with Spring AI and multiple LLMs!