Configuration and minimal code for a containerized web-app
- Language: Java
- Framework: Spring (AI, Security, Kafka, GraphQL, MVC, Data, Compose)
- Infrastructure: PostgreSQL (database) and Apache Kafka (streaming platform)
- No service layer (HTTP requests are delegated directly to the persistence layer or to other clients)
- Automated end-to-end tests implemented as health checks of the containers via a GitHub Action
- Expected result:
Container java-spring-server-spring-app-1 Healthy
- Expected result:
Container db Starting
Container zookeeper Starting
Container db Started
Container zookeeper Started
Container kafka Starting
Container kafka Started
Container db Waiting
Container kafka Waiting
Container db Healthy
Container kafka Healthy
Container java-spring-server-spring-app-1 Starting
Container java-spring-server-spring-app-1 Started
Container java-spring-server-spring-app-1 Waiting
Container java-spring-server-spring-app-1 Healthy
Container java-spring-server-spring-app-client-1 Starting
Container java-spring-server-spring-app-client-1 Started
compose started
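The "no service layer" design above can be sketched as a controller that hands requests straight to the persistence layer. This is an illustrative sketch, not the app's actual code: the class and method names are hypothetical, and a plain interface stands in for a Spring Data repository.

```java
import java.util.List;

// A minimal Book payload, mirroring the fields used later in this guide.
record Book(String name, String publisher, String isbn,
            String language, List<String> authors) {}

// Stand-in for a Spring Data repository (e.g. CrudRepository<Book, Long>).
interface BookRepository {
    Book save(Book book);
    List<Book> findAll();
}

// In the real app this would be a @RestController; with no service layer,
// each handler is a one-line delegation to the repository.
class BookController {
    private final BookRepository repository;

    BookController(BookRepository repository) {
        this.repository = repository;
    }

    // POST /api/book
    Book createBook(Book book) {
        return repository.save(book);
    }

    // GET /api/books
    List<Book> getBooks() {
        return repository.findAll();
    }
}
```

The trade-off: less boilerplate and indirection for CRUD-style endpoints, at the cost of having no natural place for cross-cutting business logic later.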
- Complete containerization (infrastructure and Spring-app)
- Relevant docker-compose file: docker-compose.yml
- Containerization of the infrastructure + local execution of the Spring-app
- Relevant docker-compose file: docker-compose-infrastructure.yml
- Java 21 (for Java 17, change the setting in the build script)
- Apache Maven 3.6.0
- Launch 5 terminals (CLI):
- For the containerized infrastructure
- For the Java-Spring application
- For a client of the OpenAI API (CURL commands)
- For a client of the application's own API (CURL commands)
- For an independent Kafka subscriber
- Set
spring.kafka.bootstrap-servers=localhost:9092
in application.properties.
- Start Docker Desktop.
- Start the infrastructure (PostgreSQL, ZooKeeper and Kafka):
docker-compose -f ./infrastructure/docker-compose-infrastructure.yml up
- Verify that the database started correctly. The log should show:
- ...
listening on IPv4 address "0.0.0.0", port 5432
- ...
database system is ready to accept connections
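Waiting for those log lines by hand can be automated with a small retry loop, which is essentially what the compose health checks do. A sketch with the probe left abstract (in practice it could attempt a JDBC connection or run `pg_isready`); the class name is hypothetical:

```java
import java.util.function.BooleanSupplier;

// Poll a readiness probe until it succeeds or the attempts are exhausted.
class ReadinessProbe {
    static boolean waitUntilReady(BooleanSupplier probe, int maxAttempts,
                                  long delayMillis) {
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            if (probe.getAsBoolean()) {
                return true;          // e.g. the DB accepts connections
            }
            try {
                Thread.sleep(delayMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;         // treat interruption as "not ready"
            }
        }
        return false;                 // still not ready after maxAttempts
    }
}
```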
- Build and start the Spring project:
mvn clean package
java -jar ./target/demo-0.0.1-SNAPSHOT.jar
- Launch 5 terminals (CLI):
- For the containerized infrastructure
- For the Java-Spring application
- For a client of the OpenAI API (CURL commands)
- For a client of the application's own API (CURL commands)
- For an independent Kafka subscriber
- Start Docker Desktop.
- Start the complete stack (infrastructure and Spring app):
docker-compose -f ./app-client/docker-compose.yml up
- Get an OpenAI API key from the OpenAI platform.
- Use this API key to set the corresponding property in application.properties.
- Have the OpenAI API (ChatGPT) generate responses for your prompts:
curl -X POST "localhost:8080/api/ai/chat" -H "X-API-KEY: API-KEY-RAW" -d "Where is the capital of South Africa"
- Possible errors
- 401 Unauthorized (invalid or no API-key)
- 429 Too Many Requests (valid API-key with insufficient credit)
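The two error cases above come from different places: 401 is produced by the app's own X-API-KEY check before the request ever reaches OpenAI, while 429 is OpenAI's answer to a valid key without credit. As pure logic (a sketch; in the real app the key check is wired into Spring Security, and the names here are hypothetical):

```java
// Map the authentication/credit situation to the HTTP status the client sees.
class ApiKeyCheck {
    static int statusFor(String providedKey, String expectedKey, boolean hasCredit) {
        if (providedKey == null || !providedKey.equals(expectedKey)) {
            return 401;   // Unauthorized: invalid or missing API key
        }
        if (!hasCredit) {
            return 429;   // Too Many Requests: valid key, insufficient credit
        }
        return 200;       // OK: the chat response is returned
    }
}
```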
- Use the CLI of the db container to query the database content directly:
psql -U compose-postgres -c '\x' -c 'SELECT * FROM book;'
- Use the API to create new database entries or fetch existing entries:
curl -X GET -H "X-API-KEY: API-KEY-RAW" localhost:8080/api/books
curl -X POST "localhost:8080/api/book" -H "X-API-KEY: API-KEY-RAW" -H "Content-Type: application/json" -d "{\"name\":\"name0\", \"publisher\":\"publisher0\", \"isbn\":\"isbn0\", \"language\":\"language0\", \"authors\":[\"author0a\", \"author0b\"]}"
- Replace name0 and other values with appropriate ones.
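The escaped JSON body of the POST above is easy to get wrong by hand. A sketch of building it programmatically with the standard library only (the Spring app itself serializes with Jackson; this just shows the expected shape):

```java
import java.util.List;
import java.util.stream.Collectors;

// Build the JSON body of the POST /api/book request.
class BookJson {
    static String toJson(String name, String publisher, String isbn,
                         String language, List<String> authors) {
        String authorsJson = authors.stream()
                .map(a -> "\"" + a + "\"")
                .collect(Collectors.joining(", "));
        return String.format(
                "{\"name\":\"%s\", \"publisher\":\"%s\", \"isbn\":\"%s\", "
              + "\"language\":\"%s\", \"authors\":[%s]}",
                name, publisher, isbn, language, authorsJson);
    }
}
```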
- Start an independent (CLI) subscriber to the published topic and monitor the logs:
docker exec -it broker /bin/sh
/bin/kafka-console-consumer --topic book --from-beginning --bootstrap-server localhost:9092
- Monitor the published messages in the Spring logs (triggered by the
POST /book
endpoint):
... com.example.demo.kafka.KafkaPublisher : Sending value=Book(id=null, name=name0, publisher=publisher0, isbn=isbn0, language=language0, authors=[author0a, author0b]) ...
... com.example.demo.kafka.KafkaPublisher : Sending value=Book(id=null, name=name0, publisher=publisher0, isbn=isbn0, language=language0, authors=[author0a, author0b]) was completed.
... com.example.demo.kafka.KafkaPublisher : Sending value=Book(id=..., name=name0, publisher=publisher0, isbn=isbn0, language=language0, authors=[author0a, author0b]) ...
... com.example.demo.kafka.KafkaPublisher : Sending value=Book(id=..., name=name0, publisher=publisher0, isbn=isbn0, language=language0, authors=[author0a, author0b]) was completed.
... com.example.demo.kafka.KafkaSubscriber : Received value={"id":...,"name":"name0","publisher":"publisher0","isbn":"isbn0","language":"language0","authors":["author0a","author0b"]}.
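The paired "Sending value=..." / "... was completed." lines reflect that sending to Kafka is asynchronous: the first line is logged immediately, the second from a completion callback on the future the send returns. A sketch of that pattern, with a plain CompletableFuture standing in for the actual Kafka send (the real KafkaPublisher presumably uses KafkaTemplate):

```java
import java.util.concurrent.CompletableFuture;
import java.util.function.Consumer;

// Log immediately on send, and again from the asynchronous completion callback.
class PublisherSketch {
    static void send(String value, CompletableFuture<Void> sendResult,
                     Consumer<String> log) {
        log.accept("Sending value=" + value + " ...");
        sendResult.whenComplete((ok, err) -> {
            if (err == null) {
                log.accept("Sending value=" + value + " was completed.");
            } else {
                log.accept("Sending value=" + value + " failed: " + err);
            }
        });
    }
}
```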
- Use the embedded GraphQL GUI client by opening
http://localhost:8080/graphiql
in a web browser, then write your queries in the text box in the upper-left corner. Execute a query by pressing the purple "Play" button:
- Mutation
mutation CreateBook { createBook( name: "name1" publisher: "publisher1" isbn: "isbn1" language: "language1" authors: [ "author1a" "author1b" ] ) { id authors } }
- Query
query GetBooksByLanguage { getBooksByLanguage(language: "language1") { id } }
- Alternatively (instead of the GUI client), query the GraphQL server directly:
curl -X POST "localhost:8080/graphql" -H "Content-Type: application/json" -d "{\"query\": \"query GetBooksByLanguage {getBooksByLanguage(language: \\\"language1\\\") {id}}\"}"
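The triple-escaped quotes in that curl command exist because the GraphQL document is carried inside a JSON object of the shape `{"query": "..."}`. A sketch of constructing that wrapper (class name hypothetical; a real client would use a JSON library):

```java
// Wrap a GraphQL document in the {"query": "..."} JSON envelope,
// escaping the characters that would break the embedded string.
class GraphQlPayload {
    static String wrap(String graphqlDocument) {
        String escaped = graphqlDocument
                .replace("\\", "\\\\")   // escape backslashes first
                .replace("\"", "\\\"");  // then embedded quotes
        return "{\"query\": \"" + escaped + "\"}";
    }
}
```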
- Stop all containers.
- Run the command
docker system prune --volumes --force
- Exit Docker Desktop.
- Run the command
wsl --shutdown