After the SQL marathon we just ran, it's time to assume we know everything about data modeling and dive into how to use a database with Spring Boot.
In the last sample project, we used JDBC to access the database. Several APIs are built on top of that to ease things for enterprise development. For instance, a DataSource provides connections without explicit knowledge of connection details, and JEE technologies have sophisticated ways to acquire and release those connections.
Spring Boot, however, is meant to make things easier, so you don't have to deal with any of those complexities. Here's a sample datasource configuration:
spring.datasource.url=jdbc:h2:./visits
spring.datasource.username=enterprise
A very simple configuration using H2, which we used before. All available data options can be checked here.
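With just those two properties, Spring Boot auto-configures a pooled DataSource (HikariCP by default) and, when spring-jdbc is on the classpath, a JdbcTemplate built on top of it. As a rough sketch of what that buys us, and not part of the sample project, a hypothetical controller could already hit the database:

package sample.project14

import org.springframework.jdbc.core.JdbcTemplate
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RestController

// Hypothetical controller, just to show the auto-configured DataSource in action
@RestController
class PingController(val jdbc: JdbcTemplate) {

    // H2 accepts a SELECT without a FROM clause, so this simply proves
    // we can borrow a connection from the pool and run a statement
    @GetMapping("ping-db")
    fun pingDb(): String? = jdbc.queryForObject("select 'pong'", String::class.java)
}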
On top of a datasource, several other cool things are built, like query mappers and ORMs. Enterprise Java has a specification for ORM called JPA; it's very popular due to the productivity it delivers, and Spring Boot offers even easier access to it.
The Java Persistence API is another abstraction layer, so you don't need deep knowledge of the specific database being consumed. You may ask why that matters, since everyone uses PostgreSQL, but it was tricky in the past due to the wide number of SQL dialects.
The key idea behind any ORM is to map entities between the database and the application. Usually the application has structs or classes while the database has tables and views. But in the end they are mostly the same information.
An Entity is a class decorated with the @Entity annotation. It means that the fields in that class, more or less, map to columns in a table with a similar name. Here's a sample from the sample project:
package sample.project14

import jakarta.persistence.*
import java.time.LocalDateTime

@Entity
data class Visit(
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    var id: Long? = null,
    // LocalDateTime maps to a TIMESTAMP column on its own; @Temporal is only
    // meant for the legacy java.util.Date and java.util.Calendar types
    var created: LocalDateTime? = LocalDateTime.now(),
)
This entity expects a table called visit containing two columns: id and created. It also expects the database to handle primary key generation and the created column to be of type timestamp. Besides that, there is a lot that JPA hides by default, so we don't need to think about proper queries all the time.
That approach, of course, has some criticism but we won't dive into it.
By the current JEE standards, the proper way to handle database access is using an EntityManager. It queries your mapped entity classes:
package sample.project14

import jakarta.persistence.EntityManager
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RequestMapping
import org.springframework.web.bind.annotation.RestController

@RestController
@RequestMapping("visit")
class Controller(val em: EntityManager) {

    // JPQL queries the mapped entity, so "Visit" here is the class, not the table
    @GetMapping("list")
    fun list(): List<Visit> = em.createQuery("select v from Visit v", Visit::class.java).resultList
    // ...
}
No need to know SQL or its dialects here; instead we use a query language called JPQL.
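JPQL queries entities and their properties instead of tables and columns, and it supports named parameters. As an illustrative sketch, not taken from the sample project, filtering visits by the created field through the EntityManager could look like this:

package sample.project14

import jakarta.persistence.EntityManager
import java.time.LocalDateTime

// Illustrative helper: a JPQL query with a named parameter,
// returning only the visits created after a given moment
fun visitsAfter(em: EntityManager, moment: LocalDateTime): List<Visit> =
    em.createQuery("select v from Visit v where v.created > :moment", Visit::class.java)
        .setParameter("moment", moment)
        .resultList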
Then Spring makes things even easier with repositories. With a repository, common CRUD operations and more are available for free. You just need to extend one interface, nothing more:
package sample.project14
import org.springframework.data.jpa.repository.JpaRepository
interface VisitRepository : JpaRepository<Visit, Long?>
Then you can save and count (and more) like this, with zero handwritten SQL:
package sample.project14

import jakarta.persistence.EntityManager
import jakarta.transaction.Transactional
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RequestMapping
import org.springframework.web.bind.annotation.RestController

@RestController
@RequestMapping("visit")
class Controller(
    val em: EntityManager,
    val repo: VisitRepository
) {
    // ...
    @GetMapping
    @Transactional
    fun count(): Long {
        repo.save(Visit())
        return repo.count()
    }
    // ...
}
Basic listing can be achieved with findAll, which is already there, just like save and count. Needless to say, avoid findAll on huge tables.
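Just to illustrate (this endpoint is not in the sample project), such a listing would be one more small method inside the Controller we already have:

    // Hypothetical addition to the Controller above: list every visit,
    // no JPQL needed. Fine for our tiny table, dangerous on huge ones
    @GetMapping("repo-list")
    fun repoList(): List<Visit> = repo.findAll()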
Repositories even have a special feature that lets us declare new methods in the interface following a naming convention, and the desired query is derived from the method name. For example, findByCreatedBetween takes two LocalDateTime arguments and searches visits by the created field:
package sample.project14

import org.springframework.data.jpa.repository.JpaRepository
import java.time.LocalDateTime

interface VisitRepository : JpaRepository<Visit, Long?> {
    fun findByCreatedBetween(createdAfter: LocalDateTime, createdBefore: LocalDateTime): List<Visit>
}
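As a usage sketch (this controller and its endpoint are made up for illustration, they are not in the sample project), the derived method can back an endpoint that filters visits by a date range:

package sample.project14

import java.time.LocalDateTime
import org.springframework.format.annotation.DateTimeFormat
import org.springframework.web.bind.annotation.GetMapping
import org.springframework.web.bind.annotation.RequestMapping
import org.springframework.web.bind.annotation.RequestParam
import org.springframework.web.bind.annotation.RestController

// Hypothetical controller calling the derived query, e.g.
// GET /visit-report/between?start=2024-12-01T00:00:00&end=2024-12-31T23:59:59
@RestController
@RequestMapping("visit-report")
class VisitReportController(val repo: VisitRepository) {

    @GetMapping("between")
    fun between(
        @RequestParam @DateTimeFormat(iso = DateTimeFormat.ISO.DATE_TIME) start: LocalDateTime,
        @RequestParam @DateTimeFormat(iso = DateTimeFormat.ISO.DATE_TIME) end: LocalDateTime,
    ): List<Visit> = repo.findByCreatedBetween(start, end)
}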
Another very important feature, elegantly provided by repositories, is pagination. All we need is a Pageable: the built-in findAll accepts one, and custom query methods can also receive a Pageable and return either a List or a Page:
package sample.project14

import org.springframework.data.domain.Page
import org.springframework.data.domain.PageRequest
import org.springframework.data.domain.Sort
import org.springframework.web.bind.annotation.RequestParam
// ...

@RestController
@RequestMapping("visit")
class Controller(
    val em: EntityManager,
    val repo: VisitRepository,
) {
    // ...
    @GetMapping("repo-list-paged")
    fun repoListPaged(
        @RequestParam(defaultValue = "0") page: Int,
        @RequestParam(defaultValue = "10") size: Int,
    ): Page<Visit> = repo.findAll(
        // page number, page size and ordering, newest visits first
        PageRequest.of(
            page, size,
            Sort.Direction.DESC, "id"
        )
    )
}
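A request like GET /visit/repo-list-paged?page=0&size=10 then returns the first ten visits, newest first, and the Page is serialized with the content plus paging metadata such as totalElements and totalPages, which is exactly what a client needs to render pagination controls.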
Finally, let's talk about migrations.
Any actively used system is constantly evolving, and so is the infrastructure behind it. That means changes in our code, our database, our user interfaces and so on.
Database changes have a history of being hard to track, but the migrations pattern defines a good way to both track those changes and establish a source of truth for them.
There are several migration systems out there; Liquibase is one of the best, and it happens to be supported by Spring Boot out of the box. All you have to do is add liquibase-core as a dependency and make sure a Liquibase root changelog is present, either in the default location or at the path configured in your application.properties:
spring.application.name=project-014-spring-with-database
spring.datasource.url=jdbc:h2:./visits
spring.datasource.username=enterprise
#spring.liquibase.change-log=classpath:/db/changelog/db.changelog-master.yaml
spring.liquibase.change-log=classpath:/changelog-master.yml
With this configuration, we need to create a file called changelog-master.yml and place it in the same folder (probably resources) as the application.properties file:
databaseChangeLog:
  - includeAll:
      path: migrations
      relativeToChangelogFile: true
And this file expects a folder called migrations. There we place our migration files, in any format supported by Liquibase:
-- liquibase formatted sql
-- changeset sombriks:2024-12-03-create-table-visit.sql
create table visit(
id identity primary key,
created timestamp
);
A migration/changelog file can contain one or more changesets, and each changeset may or may not declare a rollback.
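In the formatted SQL flavor, a rollback is just another special comment inside the changeset. Here's an illustrative example, not one of the sample project's migrations:

-- liquibase formatted sql
-- changeset sombriks:2024-12-04-add-column-description.sql
alter table visit add column description varchar(255);
-- rollback alter table visit drop column description;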
Now all our database changes can be put in migrations and tracked alongside changes in the codebase.
That's it: Spring/Kotlin applications are ready for serious business. Remember to check the sample project for this chapter.
Next we will discuss more tests.