@@ -67,13 +67,13 @@ This is because in OpenAI there is no `Deployment Name`, only a `Model Name`.

NOTE: The property `spring.ai.azure.openai.chat.options.model` has been renamed to `spring.ai.azure.openai.chat.options.deployment-name`.

-NOTE: If you decide to connect to `OpenAI` instead of `Azure OpeanAI`, by setting the `spring.ai.azure.openai.openai-api-key=<Your OpenAI Key>` property,
+NOTE: If you decide to connect to `OpenAI` instead of `Azure OpenAI`, by setting the `spring.ai.azure.openai.openai-api-key=<Your OpenAI Key>` property,
then the `spring.ai.azure.openai.chat.options.deployment-name` is treated as an link:https://platform.openai.com/docs/models[OpenAI model] name.

==== Access the OpenAI Model

You can configure the client to use directly `OpenAI` instead of the `Azure OpenAI` deployed models.
-For this you need to set the `spring.ai.azure.openai.openai-api-key=<Your OpenAI Key>` instead of `spring.ai.azure.openai.api-key=<Yur Azure OpenAi Key>`.
+For this you need to set the `spring.ai.azure.openai.openai-api-key=<Your OpenAI Key>` instead of `spring.ai.azure.openai.api-key=<Your Azure OpenAi Key>`.
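
A minimal `application.properties` sketch for the direct-OpenAI case could look like this (the model name `gpt-4o` is only an illustrative value):

[source,properties]
----
# Sketch: talk to the OpenAI service directly instead of an Azure OpenAI deployment.
spring.ai.azure.openai.openai-api-key=<Your OpenAI Key>
# With the OpenAI key set, the deployment name is treated as an OpenAI model name.
spring.ai.azure.openai.chat.options.deployment-name=gpt-4o
----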

=== Add Repositories and BOM

@@ -229,7 +229,7 @@ CohereChatBedrockApi cohereChatApi = new CohereChatBedrockApi(
Duration.ofMillis(1000L));

var request = CohereChatRequest
.builder("What is the capital of Bulgaria and what is the size? What it the national anthem?")
.builder("What is the capital of Bulgaria and what is the size? What is the national anthem?")
.withStream(false)
.withTemperature(0.5f)
.withTopP(0.8f)
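
The end of the builder chain is cut off in this diff; assuming it is completed with `.build()`, the request could then be sent synchronously through the same low-level client, roughly as in this sketch:

[source,java]
----
// Sketch only: assumes the CohereChatRequest builder above is finished with .build().
CohereChatResponse response = cohereChatApi.chatCompletion(request);
System.out.println(response);
----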
@@ -1,6 +1,6 @@
= Anthropic Function Calling

-TIP: Starting of Jul 1th, 2024, streaming function calling and Tool use is supporetd.
+TIP: Starting of Jul 1st, 2024, streaming function calling and Tool use is supported.

You can register custom Java functions with the `AnthropicChatModel` and have the Anthropic models intelligently choose to output a JSON object containing arguments to call one or many of the registered functions.
This allows you to connect the LLM capabilities with external tools and APIs.
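
A common Spring AI registration style is to expose the function as a plain `java.util.function.Function` bean; the sketch below uses illustrative names and is not the exact code from these docs:

[source,java]
----
import java.util.function.Function;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Description;

@Configuration
public class WeatherToolConfiguration {

	public record WeatherRequest(String location) {}

	public record WeatherResponse(double temperatureCelsius) {}

	@Bean
	@Description("Get the current weather for a given location") // helps the model decide when to call the function
	public Function<WeatherRequest, WeatherResponse> currentWeather() {
		return request -> new WeatherResponse(22.0); // hard-coded value, for illustration only
	}
}
----

Registering a function only makes it available; as noted elsewhere in these docs, it still has to be enabled in the individual prompt request's options before the model will consider calling it.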
@@ -40,7 +40,7 @@ To support the response of the chatbot, we will register our own function that t

When the response to the prompt to the model needs to answer a question such as `"What’s the weather like in Boston?"` the AI model will invoke the client providing the location value as an argument to be passed to the function. This RPC-like data is passed as JSON.

-Our function can some SaaS based weather service API and returns the weather response back to the model to complete the conversation. In this example we will use a simple implementation named `MockWeatherService` that hard codes the temperature for various locations.
+Our function can have some SaaS based weather service API and returns the weather response back to the model to complete the conversation. In this example we will use a simple implementation named `MockWeatherService` that hard codes the temperature for various locations.

The following `MockWeatherService.java` represents the weather service API:
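
The implementation itself is collapsed in this diff view; purely for orientation, a hard-coded mock along these lines would match the description above (an illustrative sketch, not necessarily the repository's exact code):

[source,java]
----
import java.util.function.Function;

// Illustrative: returns a fixed temperature regardless of the requested location.
public class MockWeatherService implements Function<MockWeatherService.Request, MockWeatherService.Response> {

	public record Request(String location) {}

	public record Response(double temperatureCelsius) {}

	@Override
	public Response apply(Request request) {
		return new Response(30.0);
	}
}
----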

@@ -154,7 +154,7 @@ ChatResponse response = chatModel.call(new Prompt(List.of(userMessage),
logger.info("Response: {}", response);
----

-// NOTE: You can can have multiple functions registered in your `ChatModel` but only those enabled in the prompt request will be considered for the function calling.
+// NOTE: You can have multiple functions registered in your `ChatModel` but only those enabled in the prompt request will be considered for the function calling.

The above user question will trigger 3 calls to the `CurrentWeather` function (one for each city), and the final response will be something like this:

@@ -130,7 +130,7 @@ Below is a simple code example extracted from https://github.com/spring-projects
----
byte[] data = new ClassPathResource("/vertex-test.png").getContentAsByteArray();

-var userMessage = new UserMessage("Explain what do you see o this picture?",
+var userMessage = new UserMessage("Explain what do you see on this picture?",
List.of(new Media(MimeTypeUtils.IMAGE_PNG, data)));

ChatResponse response = chatModel.call(new Prompt(List.of(userMessage)));