Blaze-Persistence is a rich Criteria API for JPA providers.

What is it?

Blaze-Persistence is a rich Criteria API for JPA providers that aims to be better than all the other Criteria APIs available. It provides a fluent API for building queries and removes common restrictions encountered when working with JPA directly. It offers rich pagination support and also supports keyset pagination.

The Entity-View module can be used to create views for JPA entities. You can roughly imagine that an entity view is to an entity what an RDBMS view is to a table.

The JPA-Criteria module implements the Criteria API of JPA but is backed by the Blaze-Persistence Core API so you can get a query builder out of your CriteriaQuery objects.

With Spring Data or DeltaSpike Data integrations you can make use of Blaze-Persistence easily in your existing repositories.


Features

Blaze-Persistence is not only a Criteria API that allows you to build queries more easily; it also comes with many features that are normally not supported by JPA providers.

Here is a rough overview of the new features that Blaze-Persistence introduces on top of the JPA model:

  • Use CTEs and recursive CTEs
  • Use modification CTEs aka DML in CTEs
  • Make use of the RETURNING clause from DML statements
  • Use the VALUES clause for reporting queries and soon make use of table generating functions
  • Create queries that use SET operations like UNION, EXCEPT and INTERSECT
  • Manage entity collections via DML statements to avoid reading them in memory
  • Define functions similar to Hibernate's SQLFunction in a JPA provider agnostic way
  • Use many built-in functions like GROUP_CONCAT, date extraction, date arithmetic and many more
  • Easy pagination and simple API to make use of keyset pagination

In addition to that, Blaze-Persistence also works around some JPA provider issues in a transparent way.

How to use it?

Blaze-Persistence is split up into different modules. We recommend that you define a version property in your parent pom that you can use for all artifacts. Modules are all released in one batch so you can safely increment just that property.
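For example, such a version property might look like this (a sketch; the property name is our suggestion, and the version shown matches the archetypes below — use the latest release in practice):

```xml
<properties>
    <!-- Single property shared by all Blaze-Persistence artifacts -->
    <blaze-persistence.version>1.5.0-Alpha4</blaze-persistence.version>
</properties>

<!-- Then reference it from every Blaze-Persistence dependency -->
<dependency>
    <groupId>com.blazebit</groupId>
    <artifactId>blaze-persistence-core-api</artifactId>
    <version>${blaze-persistence.version}</version>
</dependency>
```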


Alternatively you can also use our BOM in the dependencyManagement section.
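A minimal sketch of the BOM import (the BOM artifactId shown here is an assumption — check the documentation for the exact coordinates):

```xml
<dependencyManagement>
    <dependencies>
        <!-- Importing the BOM pins all Blaze-Persistence module versions -->
        <dependency>
            <groupId>com.blazebit</groupId>
            <artifactId>blaze-persistence-bom</artifactId>
            <version>1.5.0-Alpha4</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```

With the BOM imported, individual Blaze-Persistence dependencies can omit their version element.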



If you want a sample application with everything setup where you can poke around and try out things, just go with our archetypes!

Core-only archetype:

mvn archetype:generate "-DarchetypeGroupId=com.blazebit" "-DarchetypeArtifactId=blaze-persistence-archetype-core-sample" "-DarchetypeVersion=1.5.0-Alpha4"

Entity view archetype:

mvn archetype:generate "-DarchetypeGroupId=com.blazebit" "-DarchetypeArtifactId=blaze-persistence-archetype-entity-view-sample" "-DarchetypeVersion=1.5.0-Alpha4"

Spring-Data archetype:

mvn archetype:generate "-DarchetypeGroupId=com.blazebit" "-DarchetypeArtifactId=blaze-persistence-archetype-spring-data-sample" "-DarchetypeVersion=1.5.0-Alpha4"

Spring-Boot archetype:

mvn archetype:generate "-DarchetypeGroupId=com.blazebit" "-DarchetypeArtifactId=blaze-persistence-archetype-spring-boot-sample" "-DarchetypeVersion=1.5.0-Alpha4"

DeltaSpike Data archetype:

mvn archetype:generate "-DarchetypeGroupId=com.blazebit" "-DarchetypeArtifactId=blaze-persistence-archetype-deltaspike-data-sample" "-DarchetypeVersion=1.5.0-Alpha4"

Java EE archetype:

mvn archetype:generate "-DarchetypeGroupId=com.blazebit" "-DarchetypeArtifactId=blaze-persistence-archetype-java-ee-sample" "-DarchetypeVersion=1.5.0-Alpha4"

Supported Java runtimes

All projects are built for Java 7 except those whose dependencies already require Java 8, e.g. Hibernate 5.2, Spring Data 2.0 etc. So you are going to need a JDK 8 for building the project.

We also support building the project with JDK 9 and try to keep up with newer versions. If you want to run your application on a Java 9 JVM you need to handle the fact that JDK 9+ doesn't export the JAXB and JTA APIs anymore. In fact, JDK 11 will even remove the modules so the command line flags to add modules to the classpath won't work.

Since libraries like Hibernate and others require these APIs you need to make them available. The easiest way to get these APIs back on the classpath is to package them along with your application. This will also work when running on Java 8. We suggest you add the following dependencies.

    <!-- In a managed environment like Java EE, use 'provided'. Otherwise use 'compile' -->
    <!-- Versions shown are examples; pick ones matching your environment -->
    <dependency>
        <groupId>javax.xml.bind</groupId>
        <artifactId>jaxb-api</artifactId>
        <version>2.3.1</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>javax.transaction</groupId>
        <artifactId>javax.transaction-api</artifactId>
        <version>1.3</version>
        <scope>compile</scope>
    </dependency>
    <dependency>
        <groupId>javax.activation</groupId>
        <artifactId>javax.activation-api</artifactId>
        <version>1.2.0</version>
        <scope>compile</scope>
    </dependency>

The javax.transaction and javax.activation dependencies are especially relevant for the JPA metamodel generation.

Supported environments/libraries

The bare minimum is JPA 2.0. If you want to use the JPA Criteria API module, you will also have to add the JPA 2 compatibility module. Generally, we support the usage in Java EE 6+ or Spring 4+ applications.
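As a sketch, the JPA 2 compatibility module can be added like this (the artifactId follows the project's naming scheme but should be verified against the documentation for your version):

```xml
<!-- Only needed when running the JPA Criteria API module on JPA 2.0 -->
<dependency>
    <groupId>com.blazebit</groupId>
    <artifactId>blaze-persistence-jpa-criteria-jpa-2-compatibility</artifactId>
    <version>${blaze-persistence.version}</version>
</dependency>
```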

See the following table for an overview of supported versions.

Module Minimum version Supported versions
Hibernate integration Hibernate 4.2 4.2, 4.3, 5.0, 5.1, 5.2, 5.3, 5.4 (not all features are available in older versions)
EclipseLink integration EclipseLink 2.6 2.6 (Probably 2.4 and 2.5 work as well, but only tested against 2.6)
DataNucleus integration DataNucleus 4.1 4.1, 5.0
OpenJPA integration N/A (Currently not usable. OpenJPA doesn't seem to be actively developed anymore and no users asked for support yet)
Entity View CDI integration CDI 1.0 1.0, 1.1, 1.2, 2.0
Entity View Spring integration Spring 4.3 4.3, 5.0, 5.1, 5.2
DeltaSpike Data integration DeltaSpike 1.7 1.7, 1.8, 1.9
Spring Data integration Spring Data 1.11 1.11, 2.0, 2.1, 2.2, 2.3
Spring Data WebMvc integration Spring Data 1.11, Spring WebMvc 4.3 Spring Data 1.11 - 2.3, Spring WebMvc 4.3 - 5.2
Spring Data WebFlux integration Spring Data 2.0, Spring WebFlux 5.0 Spring Data 2.0 - 2.3, Spring WebFlux 5.0 - 5.2
Spring HATEOAS WebMvc integration Spring Data 2.2, Spring WebMvc 5.2 Spring Data 2.3, Spring WebMvc 5.2, Spring HATEOAS 1.0+
Jackson integration 2.8.11 2.8.11+
GraphQL integration 5.2 5.2+
JAX-RS integration Any JAX-RS version Any JAX-RS version
Quarkus integration 1.4.2 1.4+

Manual setup

For compiling you will only need API artifacts and for the runtime you need impl and integration artifacts.

See the core documentation for the necessary dependencies needed to setup Blaze-Persistence. If you want to use entity views, the entity view documentation contains a similar setup section describing the necessary dependencies.
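For orientation, a typical core setup with the Hibernate 5.2 integration looks roughly like this (a sketch assuming a Hibernate 5.2 runtime; see the core documentation for the authoritative list):

```xml
<!-- Compile-time API -->
<dependency>
    <groupId>com.blazebit</groupId>
    <artifactId>blaze-persistence-core-api</artifactId>
    <version>${blaze-persistence.version}</version>
    <scope>compile</scope>
</dependency>
<!-- Runtime implementation -->
<dependency>
    <groupId>com.blazebit</groupId>
    <artifactId>blaze-persistence-core-impl</artifactId>
    <version>${blaze-persistence.version}</version>
    <scope>runtime</scope>
</dependency>
<!-- JPA provider integration, matching your Hibernate version -->
<dependency>
    <groupId>com.blazebit</groupId>
    <artifactId>blaze-persistence-integration-hibernate-5.2</artifactId>
    <version>${blaze-persistence.version}</version>
    <scope>runtime</scope>
</dependency>
```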


Documentation

The current documentation is a reference manual, split into a reference for the core module and one for the entity-view module. At some point we might introduce topical documentation, but for now you can find articles on the Blazebit Blog.

Core quick-start

First you need to create a CriteriaBuilderFactory, which is the entry point to the Core API.

CriteriaBuilderConfiguration config = Criteria.getDefault();
// optionally, perform dynamic configuration
CriteriaBuilderFactory cbf = config.createCriteriaBuilderFactory(entityManagerFactory);

NOTE: The CriteriaBuilderFactory should have the same scope as your EntityManagerFactory as it is bound to it.

For demonstration purposes, we will use the following simple entity model.

@Entity
public class Cat {
    @Id
    @GeneratedValue
    private Integer id;
    private String name;
    @ManyToOne(fetch = FetchType.LAZY)
    private Cat father;
    @ManyToOne(fetch = FetchType.LAZY)
    private Cat mother;
    @OneToMany
    private Set<Cat> kittens;
    // Getters and setters omitted for brevity
}

If you want to select all cats and fetch their kittens as well as their father you do the following.

cbf.create(em, Cat.class).fetch("kittens.father").getResultList();

This will create quite a query behind the scenes:

SELECT cat FROM Cat cat LEFT JOIN FETCH cat.kittens kittens_1 LEFT JOIN FETCH kittens_1.father father_1

An additional bonus is that the paths and generally every expression you write will get checked against the metamodel so you can spot typos very early.

JPA Criteria API quick-start

Blaze-Persistence provides an implementation of the JPA Criteria API which allows you to mostly code against the standard JPA Criteria API while still being able to use the advanced features Blaze-Persistence provides.

All you need is a CriteriaBuilderFactory and when constructing the actual query, an EntityManager.

// This is a subclass of the JPA CriteriaBuilder interface
BlazeCriteriaBuilder cb = BlazeCriteria.get(criteriaBuilderFactory);
// A subclass of the JPA CriteriaQuery interface
BlazeCriteriaQuery<Cat> query = cb.createQuery(Cat.class);

// Do your JPA Criteria query logic with cb and query
Root<Cat> root = query.from(Cat.class);
query.where(cb.equal(root.get("name"), "Felix"));

// Finally, transform the BlazeCriteriaQuery to the Blaze-Persistence Core CriteriaBuilder
CriteriaBuilder<Cat> builder = query.createCriteriaBuilder(entityManager);
// From here on, you can use all the power of the Blaze-Persistence Core API

// And finally fetch the result
List<Cat> resultList = builder.getResultList();

This will create a query that looks just about what you would expect:

SELECT cat FROM Cat cat WHERE cat.name = :param_0

This alone is not very spectacular. The interesting part is that you can use the Blaze-Persistence CriteriaBuilder then to do your advanced SQL things or consume your result as entity views as explained in the next part.

Entity-view usage

Every project has some kind of DTOs, and implementing these properly isn't easy. Based on the model from the quick-start above, we will show how entity views come to the rescue!

To make use of entity views, you will need an EntityViewManager with entity view classes registered. In a CDI environment you can inject an EntityViewConfiguration that has all discoverable entity view classes registered, but in a plain Java application you will have to register the classes yourself like this:

EntityViewConfiguration config = EntityViews.createDefaultConfiguration();
EntityViewManager evm = config.createEntityViewManager(criteriaBuilderFactory);

NOTE: The EntityViewManager should have the same scope as your EntityManagerFactory and CriteriaBuilderFactory as it is bound to them.

An entity view itself is a simple interface or abstract class describing the structure of the projection that you want. It is very similar to defining an entity class with the difference that it is based on the entity model instead of the DBMS model.

@EntityView(Cat.class)
public interface CatView {
    public Integer getId();

    @Mapping("CONCAT(mother.name, 's kitty ', name)")
    public String getCuteName();

    public SimpleCatView getFather();
}

@EntityView(Cat.class)
public interface SimpleCatView {
    public Integer getId();

    public String getName();
}

The CatView has a property cuteName which is computed by the JPQL expression CONCAT(mother.name, 's kitty ', name), and a subview for father. Note that although not required in this particular case, every entity view for an entity type should have an id mapping if possible. Entity views without an id mapping will by default have equals and hashCode implementations that consider all attributes, whereas with an id mapping, only the id is considered. The SimpleCatView is the projection which is used for the father relation and only consists of the id and the name of the Cat.

You just created two DTO interfaces that contain projection information. Now the interesting part is that entity views can be applied on any query, so you can define a base query and then create the projection like this:

CriteriaBuilder<Cat> cb = criteriaBuilderFactory.create(entityManager, Cat.class);
CriteriaBuilder<CatView> catViewBuilder = evm.applySetting(EntityViewSetting.create(CatView.class), cb);
List<CatView> catViews = catViewBuilder.getResultList();

Behind the scenes, this will execute the following optimized query and transparently build your entity view objects from the results.

SELECT
    cat.id,
    CONCAT(mother_1.name, 's kitty ', cat.name),
    father_1.id,
    father_1.name
FROM Cat cat
LEFT JOIN cat.father father_1
LEFT JOIN cat.mother mother_1
WHERE father_1 IS NULL
   OR father_1.name LIKE :param_0

See the left joins created for relations used in the projection? These are implicit joins which are by default what we call "model-aware". If you specified that a relation is optional = false, we would generate an inner join instead. This is different from how JPQL path expressions are normally interpreted, but in case of projections like in entity views, this is just what you would expect! You can always override the join type of implicit joins with joinDefault if you like.

Questions or issues

Drop by on our Slack channel and ask questions any time, create an issue on GitHub, or ask on Stack Overflow.

Setup local development

Here are some notes about setting up a local environment for testing.

Setup general build environment

Although Blaze-Persistence still supports running on Java 7, the build must be run with at least JDK 8. When doing a release, at least JDK 9 is required as we need to build some Multi-Release (MR) JARs. Since we try to support the latest JDK versions as well, developers that want to build the project with JDK 11+ must define a system property for a release build.

The system property jdk8.home should be set to the path to a Java 7 or 8 installation that contains either jre/lib/rt.jar or jre/lib/classes.jar. This property is necessary when using JDK 11+ because sun.misc.Unsafe.defineClass was removed.

Building the website and documentation

You have to install GraphViz and make it available in your PATH.

After that, it's easiest to just invoke the website serve script in the repository root, which builds the documentation and website and starts an embedded server serving at port 8820.

Checkstyle in IntelliJ

  1. Build the whole thing with mvn clean install once to have the checkstyle-rules jar in your M2 repository
  2. Install the CheckStyle-IDEA Plugin
  3. After a restart, go to Settings > Other Settings > Checkstyle
  4. Add a Third party check that points to the checkstyle-rules.jar of your M2 repository
  5. Add a configuration file named Blaze-Persistence Checkstyle rules pointing to checkstyle-rules/src/main/resources/blaze-persistence/checkstyle-config.xml
  6. Use target/checkstyle.cache for the property checkstyle.cache.file

Now you should be able to select Blaze-Persistence Checkstyle rules in the dropdown of the CheckStyle window. Click on Check project and Checkstyle will run once for the whole project; after that it should work incrementally.

Testing a JPA provider and DBMS combination

By default, a Maven build (mvn clean install) will test against H2 and Hibernate 5.2, but you can activate different profiles to test other combinations. To test a specific combination, you need to activate at least 4 profiles:

  • One of the JPA provider profiles
    • hibernate-5.4
    • hibernate-5.3
    • hibernate-5.2
    • hibernate-5.1
    • hibernate-5.0
    • hibernate-4.3
    • hibernate
    • eclipselink
    • datanucleus-5.1
    • datanucleus-5
    • datanucleus-4
    • openjpa
  • A DBMS profile
    • h2
    • postgresql
    • mysql
    • mysql8
    • oracle
    • db2
    • mssql
    • firebird
    • sqllite
  • A Spring data profile
    • spring-data-2.3.x
    • spring-data-2.2.x
    • spring-data-2.1.x
    • spring-data-2.0.x
    • spring-data-1.11.x
  • A DeltaSpike profile
    • deltaspike-1.7
    • deltaspike-1.8
    • deltaspike-1.9

The default DBMS connection settings are defined via Maven properties, so you can override them in a build by passing the properties as system properties.

  • jdbc.url
  • jdbc.user
  • jdbc.password
  • jdbc.driver

The values are defined in e.g. core/testsuite/pom.xml in the respective DBMS profiles.
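Putting it together, a hypothetical invocation testing Hibernate 5.4 against a local PostgreSQL might look like this (profile names taken from the lists above; the connection values are placeholders for your environment):

```shell
mvn clean install \
  -P "hibernate-5.4,postgresql,spring-data-2.3.x,deltaspike-1.9" \
  -Djdbc.url=jdbc:postgresql://localhost:5432/test \
  -Djdbc.user=blaze \
  -Djdbc.password=blaze \
  -Djdbc.driver=org.postgresql.Driver
```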

Switching JPA provider profiles in IntelliJ

When switching between Hibernate and other JPA provider profiles, IntelliJ does not unmark the basic or hibernate source directories in core/testsuite. If you encounter errors like "duplicate class file found" or the like, make sure that

  • With a Hibernate profile you unmark the core/testsuite/src/main/basic directory as source root
  • With a non-Hibernate profile you unmark the core/testsuite/src/main/hibernate and core/testsuite/src/test/hibernate directory as source root

Unmarking as source root can be done by right clicking on the source directory, going to the submenu Mark directory as and finally clicking Unmark as Sources Root.

Using DataNucleus profiles in IntelliJ

DataNucleus requires bytecode enhancement to work properly, which requires an extra step to be able to do testing within IntelliJ. Usually when switching the JPA provider profile, it is recommended to trigger a Rebuild Project action in IntelliJ to avoid strange errors caused by previous bytecode enhancement runs. After that, the entities in the core/testsuite project have to be enhanced. This is done through a Maven command.

  • DataNucleus 4: mvn -P "datanucleus-4,h2,deltaspike-1.8,spring-data-2.0.x,eclipselink" -pl core/testsuite,integration/spring-data/testsuite datanucleus:enhance
  • DataNucleus 5: mvn -P "datanucleus-5,h2,deltaspike-1.8,spring-data-2.0.x,eclipselink" -pl core/testsuite,integration/spring-data/testsuite datanucleus:enhance
  • DataNucleus 5.1: mvn -P "datanucleus-5.1,h2,deltaspike-1.8,spring-data-2.0.x,eclipselink" -pl core/testsuite,integration/spring-data/testsuite datanucleus:enhance

After doing that, you should be able to execute any test in IntelliJ.

Note that if you make changes to an entity class or add a new entity class you might need to redo the rebuild and enhancement.


Firebird

When installing the 3.x version, you also need a 3.x JDBC driver. Additionally, you should add the following to the firebird.conf:

WireCrypt = Enabled

After creating the DB with create database 'localhost:test' user 'sysdba' password 'sysdba';, you can connect via JDBC with jdbc:firebirdsql:localhost:test?charSet=utf-8.


Oracle

When setting up Oracle locally, keep in mind that when you connect to it, you have to set NLS_SORT to BINARY. Since the JDBC driver derives values from the locale settings of the JVM, you should set the default locale settings to en_US. In IntelliJ, when defining the Oracle database, go to the Advanced tab and specify the JVM option -Duser.language=en.

When using the Oracle docker container via oracle you might want to specify the following properties when executing tests -Djdbc.url=jdbc:oracle:thin:@ -Djdbc.user=SYSTEM -Djdbc.password=oracle

JDBC Driver

You have to install the JDBC driver manually. If you install Oracle XE locally, you can take it from $ORACLE_HOME/jdbc; otherwise download it from the Oracle website. Copy the jar to $M2_HOME/com/oracle/ojdbc14/ and you should be good to go.

If you use the docker container, extract the jdbc driver from the container via docker cp oracle:/u01/app/oracle/product/11.2.0/xe/jdbc/lib/ojdbc6.jar ojdbc.jar

mvn -q install:install-file -Dfile=ojdbc.jar -DgroupId=com.oracle -DartifactId=ojdbc14 -Dversion= -Dpackaging=jar -DgeneratePom=true

Install Oracle locally

Download Oracle XE from the Oracle website. During installation use the password "oracle", which is also the default password for the docker image.


DB2

When using the DB2 docker container via db2, you might want to specify the following properties when executing tests: -Djdbc.url=jdbc:db2:// -Djdbc.user=db2inst1 -Djdbc.password=db2inst1-pwd

JDBC Driver

You have to install the JDBC driver manually. If you install DB2 Express locally, you can take it from $DB2_HOME/sqllib/java; otherwise download it from the IBM website.

When using the docker container, you can extract the JDBC driver from the container and install it via the following commands.

mvn -q install:install-file -Dfile=db2jcc4.jar -DartifactId=db2jcc4 -Dversion=9.7 -Dpackaging=jar -DgeneratePom=true

mvn -q install:install-file -Dfile=db2jcc_license_cu.jar -DartifactId=db2jcc_license_cu -Dversion=9.7 -Dpackaging=jar -DgeneratePom=true

SQL Server

When using the SQL Server docker container via mssql, you might want to specify the following properties when executing tests: -Djdbc.url=jdbc:sqlserver://

JDBC Driver

Since the JDBC driver is officially available on Maven Central, you don't have to install it separately.

GraalVM for native images with Quarkus

The general setup required for building native images with GraalVM is described in the Quarkus documentation on building native executables.

  • Install GraalVM 20.0.0, make sure you install the native-image tool, and set the GRAALVM_HOME environment variable
  • Install required packages for a C development environment

For example, run the following maven build to execute native image tests for H2:

mvn -pl examples/quarkus/testsuite/native/h2 -am integration-test -Ph2 -Pnative

Under Windows, make sure you run Maven builds that use native image from the VS2017 native tools command line.

Website deployment

You can use the deployment scripts to deploy to the target environment, but you need to configure the following servers in ~/.m2/settings.xml.

Id: User/Password: user/****

Id: User/Password: user/****


License

This distribution, as a whole, is licensed under the terms of the Apache License, Version 2.0 (see LICENSE.txt).


Project Site:
