---
sidebar_position: 1
---
Let's get started with Ollama4j.
Ollama is an advanced AI tool that allows users to easily set up and run large language models locally (in CPU and GPU modes). With Ollama, users can leverage powerful language models such as Llama 2 and even customize and create their own models.
Ollama4j was built for the simple purpose of integrating Ollama with Java applications.
```mermaid
flowchart LR
    o4j[Ollama4j]
    o[Ollama Server]
    o4j -->|Communicates with| o;
    m[Models]
    p[Your Java Project]
    subgraph Your Java Environment
        direction TB
        p -->|Uses| o4j
    end
    subgraph Ollama Setup
        direction TB
        o -->|Manages| m
    end
```
- Ollama
- Oracle JDK or OpenJDK 11 or above
- Maven
The easiest way to get started with the Ollama server is with Docker. If you prefer to run the Ollama server directly instead, download the distribution for your platform and follow its installation instructions.
In CPU mode:

```shell
docker run -it -v ~/ollama:/root/.ollama -p 11434:11434 ollama/ollama
```

In GPU mode:

```shell
docker run -it --gpus=all -v ~/ollama:/root/.ollama -p 11434:11434 ollama/ollama
```
You can type this command into Command Prompt, PowerShell, Terminal, or the integrated terminal of your code editor.
The command runs the Ollama server locally at http://localhost:11434/.
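To confirm the server is reachable before wiring it into your Java project, you can query its root endpoint. This is a sketch based on Ollama's HTTP API; a healthy server replies with a short plain-text status message.

```shell
# Query the root endpoint of the locally running Ollama server.
# A healthy server responds with: Ollama is running
curl http://localhost:11434/
```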
Get started by creating a new Maven project on your favorite IDE.
Add the dependency to your project's `pom.xml`:
```xml
<dependency>
    <groupId>io.github.amithkoujalgi</groupId>
    <artifactId>ollama4j</artifactId>
    <version>1.0.27</version>
</dependency>
```
Find the latest version of the library here.
You might want to include an implementation of the SLF4J logger in your `pom.xml` file. For example, use the `slf4j-jdk14` implementation:
```xml
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-jdk14</artifactId>
    <version>2.0.9</version> <!-- Replace with the appropriate version -->
</dependency>
```
or the `logback-classic` implementation:
```xml
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.3.11</version> <!-- Replace with the appropriate version -->
</dependency>
```
or any other suitable implementation.
Create a new Java class in your project and add this code.
```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;

public class OllamaAPITest {

    public static void main(String[] args) {
        // Points at the local Ollama server started earlier
        String host = "http://localhost:11434/";

        OllamaAPI ollamaAPI = new OllamaAPI(host);
        // Log requests and responses for easier debugging
        ollamaAPI.setVerbose(true);

        // ping() returns true if the server responds
        boolean isOllamaServerReachable = ollamaAPI.ping();
        System.out.println("Is Ollama server alive: " + isOllamaServerReachable);
    }
}
```
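Once the ping succeeds, a natural next step is to list the models available on the server. The sketch below assumes the `listModels()` method and the `Model` type (with a `getName()` accessor) from the same ollama4j library version; check the API docs for the version you are using.

```java
import io.github.amithkoujalgi.ollama4j.core.OllamaAPI;
import io.github.amithkoujalgi.ollama4j.core.models.Model;

import java.util.List;

public class OllamaListModelsTest {

    public static void main(String[] args) throws Exception {
        OllamaAPI ollamaAPI = new OllamaAPI("http://localhost:11434/");

        // Assumed API: listModels() fetches the models hosted by the server
        List<Model> models = ollamaAPI.listModels();

        // Print the name of each locally available model
        models.forEach(model -> System.out.println(model.getName()));
    }
}
```

If the list is empty, no models have been pulled onto the server yet.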