Player service is a Spring Boot-based microservice that serves the contents of `Player.csv` through a REST API. It also integrates with Ollama via the Ollama4j library.
- Serve the contents of `Player.csv` through a REST API
- In-memory H2 database
- Integrate with Ollama using the Ollama4j library
- `GET /v1/players` - returns the list of all players.
- `GET /v1/players/{playerId}` - returns a single player by `playerId`.
- `GET /v1/chat/list-models` - returns the list of Ollama models available in the Gitpod workspace.
- `POST /v1/chat/` - endpoint to chat with the available Ollama model.
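Once the service is running locally on port 8080, the player endpoints can be smoke-tested with curl (the `<playerId>` below is a placeholder; real IDs come from `Player.csv`):

```shell
# List all players (assumes the service is running on localhost:8080)
curl --location 'http://localhost:8080/v1/players'

# Fetch a single player; replace <playerId> with an ID from Player.csv
curl --location 'http://localhost:8080/v1/players/<playerId>'
```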
The database connected to this service is an in-memory H2 database.
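For reference, a typical Spring Boot configuration for an in-memory H2 database looks like the sketch below. The property values (database name, console setting) are illustrative assumptions, not taken from this repository:

```
# application.properties sketch for an in-memory H2 database (assumed values)
spring.datasource.url=jdbc:h2:mem:playerdb
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=
spring.jpa.hibernate.ddl-auto=update
# Optional: enables the H2 web console at /h2-console
spring.h2.console.enabled=true
```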
- `/collection` folder contains sample requests for the player service.
- `/src`: Source code
  - `/config`: Configuration for the Ollama4j library
  - `/controller`: API controllers
  - `/model`: Data models
  - `/repository`: Data repositories
  - `/service`: Service layer implementations
- `/player-service-model` folder contains a dummy AI model for `Player.csv` data.
- Java 17
- Maven
- Spring Boot 3.3.4 (with Spring Web MVC, Spring Data JPA)
- H2 Database
- Docker
- Ollama4j
Before you begin, ensure you have the following installed on your development machine:
- Java 17
  - Verify installation: `java --version`
- Maven
  - Download and install from maven.apache.org
  - Verify installation: `mvn --version`
- Docker
  - Download and install from docker.com
  - Verify installation: `docker --version`
- Git
  - Download and install from git-scm.com
  - Verify installation: `git --version`
- Fork the repository:
  - Visit the GitHub repository
  - Click the "Fork" button in the top-right corner to create your own copy of the repository
- Clone your forked repository:

  ```shell
  git clone https://github.com/your-username/player-service-java.git
  cd player-service-java
  ```
- Install dependencies:

  ```shell
  mvn clean install -DskipTests
  ```
- Start the server:

  ```shell
  mvn spring-boot:run
  ```
- Open your browser and visit `http://localhost:8080` or `http://127.0.0.1:8080/` to test the application.
- Run tests:

  ```shell
  mvn clean install
  ```
- Pull and run the Ollama Docker image, then download the `tinyllama` model:

  ```shell
  docker pull ollama/ollama
  docker run -it -v ~/ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
  docker exec -it ollama ollama run tinyllama
  ```
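Before calling the model, it can help to confirm that `tinyllama` has finished downloading inside the container. This assumes the container was started with the name `ollama` as above:

```shell
# List the models available inside the running Ollama container;
# "tinyllama" should appear once the download has completed
docker exec ollama ollama list
```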
- Test the API server:

  ```shell
  curl -v --location 'http://localhost:11434/api/generate' \
    --header 'Content-Type: application/json' \
    --data '{"model": "tinyllama", "prompt": "why is the sky blue?", "stream": false}'
  ```
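The command above talks to Ollama directly. To go through the player service's own `POST /v1/chat/` endpoint instead, a request might look like the sketch below; the JSON field names (`model`, `message`) are hypothetical, since the actual request body depends on the controller's data model, which is not shown here:

```shell
# Hypothetical request to the player service's chat endpoint;
# the real JSON fields are defined by the service's data model
curl --location 'http://localhost:8080/v1/chat/' \
  --header 'Content-Type: application/json' \
  --data '{"model": "tinyllama", "message": "Which players are in the dataset?"}'
```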