@@ -145,13 +145,13 @@ ChatResponse response = chatModel.call(
new Prompt(
"Generate the names of 5 famous pirates.",
OpenAiChatOptions.builder()
.withModel("deepseek-chat")
.withTemperature(0.4)
.model("deepseek-chat")
.temperature(0.4)
.build()
));
----

-TIP: In addition to the model specific https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatOptions.java[OpenAiChatOptions] you can use a portable https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions] instance, created with the https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptionsBuilder.java[ChatOptionsBuilder#builder()].
+TIP: In addition to the model specific https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatOptions.java[OpenAiChatOptions] you can use a portable https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions] instance, created with the https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions#builder()].
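For context, a minimal sketch of the portable-options usage this TIP refers to, assuming the generic `ChatOptions.builder()` exposes `model` and `temperature` setters in the same fluent style as the renamed provider-specific methods:

[source,java]
----
// Provider-neutral options (sketch; setter names assumed to mirror the new fluent style).
ChatOptions portableOptions = ChatOptions.builder()
    .model("deepseek-chat")
    .temperature(0.4)
    .build();

ChatResponse response = chatModel.call(
    new Prompt("Generate the names of 5 famous pirates.", portableOptions));
----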

== Function Calling

@@ -150,13 +150,13 @@ ChatResponse response = chatModel.call(
new Prompt(
"Generate the names of 5 famous pirates.",
OpenAiChatOptions.builder()
.withModel("mixtral-8x7b-32768")
.withTemperature(0.4)
.model("mixtral-8x7b-32768")
.temperature(0.4)
.build()
));
----

-TIP: In addition to the model specific https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatOptions.java[OpenAiChatOptions] you can use a portable https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions] instance, created with the https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptionsBuilder.java[ChatOptionsBuilder#builder()].
+TIP: In addition to the model specific https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatOptions.java[OpenAiChatOptions] you can use a portable https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions] instance, created with the https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions#builder()].

== Function Calling

@@ -303,9 +303,9 @@ Next, create a `OpenAiChatModel` and use it for text generations:
----
var openAiApi = new OpenAiApi("https://api.groq.com/openai", System.getenv("GROQ_API_KEY"));
var openAiChatOptions = OpenAiChatOptions.builder()
.withModel("llama3-70b-8192")
.withTemperature(0.4)
.withMaxTokens(200)
.model("llama3-70b-8192")
.temperature(0.4)
.maxTokens(200)
.build();
var chatModel = new OpenAiChatModel(this.openAiApi, this.openAiChatOptions);
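
// Usage sketch (assumed continuation): once constructed as above, the Groq-backed
// model is called like any other Spring AI ChatModel.
ChatResponse response = this.chatModel.call(
    new Prompt("Generate the names of 5 famous pirates."));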

@@ -121,13 +121,13 @@ ChatResponse response = chatModel.call(
new Prompt(
"Generate the names of 5 famous pirates.",
OpenAiChatOptions.builder()
.withModel("mixtral-8x7b-32768")
.withTemperature(0.4)
.model("mixtral-8x7b-32768")
.temperature(0.4)
.build()
));
----

-TIP: In addition to the model specific https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatOptions.java[OpenAiChatOptions] you can use a portable https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions] instance, created with the https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptionsBuilder.java[ChatOptionsBuilder#builder()].
+TIP: In addition to the model specific https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatOptions.java[OpenAiChatOptions] you can use a portable https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions] instance, created with the https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions#builder()].

== Function Calling

@@ -309,8 +309,8 @@ String jsonSchema = """

Prompt prompt = new Prompt("how can I solve 8x + 7 = -23",
OllamaOptions.builder()
-.withModel(OllamaModel.LLAMA3_2.getName())
-.withFormat(new ObjectMapper().readValue(jsonSchema, Map.class))
+.model(OllamaModel.LLAMA3_2.getName())
+.format(new ObjectMapper().readValue(jsonSchema, Map.class))
.build());

ChatResponse response = this.ollamaChatModel.call(this.prompt);
@@ -340,8 +340,8 @@ var outputConverter = new BeanOutputConverter<>(MathReasoning.class);

Prompt prompt = new Prompt("how can I solve 8x + 7 = -23",
OllamaOptions.builder()
-.withModel(OllamaModel.LLAMA3_2.getName())
-.withFormat(outputConverter.getJsonSchemaMap())
+.model(OllamaModel.LLAMA3_2.getName())
+.format(outputConverter.getJsonSchemaMap())
.build());

ChatResponse response = this.ollamaChatModel.call(this.prompt);
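
// Conversion sketch (assumption): map the model's JSON reply onto the MathReasoning
// record with the same converter that supplied the schema; the accessor chain is
// assumed to match the ChatResponse API used elsewhere in these docs.
String content = this.response.getResult().getOutput().getContent();
MathReasoning mathReasoning = this.outputConverter.convert(content);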
@@ -466,10 +466,14 @@ Next, create an `OllamaChatModel` instance and use it to send requests for text
----
var ollamaApi = new OllamaApi();

-var chatModel = new OllamaChatModel(this.ollamaApi,
-OllamaOptions.create()
-.model(OllamaOptions.DEFAULT_MODEL)
-.temperature(0.9));
+var chatModel = OllamaChatModel.builder()
+.ollamaApi(ollamaApi)
+.defaultOptions(
+OllamaOptions.builder()
+.model(OllamaModel.MISTRAL)
+.temperature(0.9)
+.build())
+.build();

ChatResponse response = this.chatModel.call(
new Prompt("Generate the names of 5 famous pirates."));
@@ -508,7 +512,7 @@ var request = ChatRequest.builder("orca-mini")
.content("What is the capital of Bulgaria and what is the size? "
+ "What is the national anthem?")
.build()))
-.options(OllamaOptions.create().temperature(0.9))
+.options(OllamaOptions.builder().temperature(0.9).build())
.build();

ChatResponse response = this.ollamaApi.chat(this.request);
@@ -519,7 +523,7 @@ var request2 = ChatRequest.builder("orca-mini")
.messages(List.of(Message.builder(Role.USER)
.content("What is the capital of Bulgaria and what is the size? " + "What is the national anthem?")
.build()))
-.options(OllamaOptions.create().temperature(0.9).toMap())
+.options(OllamaOptions.builder().temperature(0.9).build().toMap())
.build();

Flux<ChatResponse> streamingResponse = this.ollamaApi.streamingChat(this.request2);
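
// Consumption sketch (assumption): block for the streamed chunks and join their message
// content; the message()/content() accessors are assumed from the low-level OllamaApi records.
List<ChatResponse> chunks = this.streamingResponse.collectList().block();
String answer = chunks.stream()
    .map(chunk -> chunk.message().content())
    .collect(Collectors.joining());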
@@ -147,7 +147,7 @@ ChatResponse response = chatModel.call(
));
----

-TIP: In addition to the model specific https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatOptions.java[OpenAiChatOptions] you can use a portable https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions] instance, created with the https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java#L97[ChatOptions#builder()].
+TIP: In addition to the model specific https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatOptions.java[OpenAiChatOptions] you can use a portable https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions] instance, created with the https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/prompt/ChatOptions.java[ChatOptions#builder()].


== Function Calling