
Publication tool for publishing XML XSDs, SKOS thesauri, RDFS/OWL ontologies and SHACL validation rules.

Best practices and guidelines can be found in the accompanying documentation.


The application is a simple Dropwizard application with an embedded RDF4J triple store.

Using HTTP content negotiation, the application will either show human-friendly HTML or send a machine-readable RDF representation (JSON-LD, N-Triples or Turtle).
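As a sketch, the representation can be selected with an Accept header. The port and vocabulary path below are assumptions; adjust them to the configured server address:

```shell
# Ask for Turtle instead of HTML (URL and port are placeholders)
curl -H "Accept: text/turtle" http://localhost:8080/vocab/example

# JSON-LD variant of the same resource
curl -H "Accept: application/ld+json" http://localhost:8080/vocab/example
```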


Configuration is done using a YAML file, an example is included in the source.
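A minimal sketch of such a YAML file, assuming the directory properties are top-level keys; only importDir, downloadDir and luceneDir appear elsewhere in this README, the other key name and all paths are made up — check the bundled example for the real structure:

```yaml
# Hypothetical configuration sketch; key names other than
# importDir, downloadDir and luceneDir are assumptions.
importDir: /var/vocab/import      # SKOS files to be imported
downloadDir: /var/vocab/download  # generated dump files
luceneDir: /var/vocab/lucene      # Lucene full-text index
rdf4jDir: /var/vocab/rdf4j        # RDF4J triple store data
```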


Both the Lucene index and the RDF4J triple store need separate writable directories. In addition, there must be a directory for importing the thesauri, and another one for the generated dump files in several formats.

The directories must be created before starting the application; their exact locations can be configured in the YAML file.
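For example, the four directories could be created up front (the paths are placeholders and should match whatever the YAML file configures):

```shell
# Create writable directories for the triple store, the Lucene index,
# the import drop-off and the generated dumps (example paths).
mkdir -p data/rdf4j data/lucene data/import data/download
```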

Admin tasks

Publishing / updating thesauri

One can import a new thesaurus using a valid SKOS file in N-Triples or Turtle format. Note that the skos:ConceptScheme must have at least an English dcterms:title and dcterms:description, as these are used by the HTML viewer.
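For instance, a minimal skos:ConceptScheme satisfying this requirement could look like this in Turtle (the URI and literal values are made up):

```turtle
@prefix skos:    <http://www.w3.org/2004/02/skos/core#> .
@prefix dcterms: <http://purl.org/dc/terms/> .

<http://example.com/thesaurus> a skos:ConceptScheme ;
    dcterms:title "Example thesaurus"@en ;
    dcterms:description "A small example thesaurus."@en .
```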

Importing and updating a thesaurus is done by putting the SKOS file into the import directory (importDir) and POSTing to the import URL. The name parameter must contain the file name of the SKOS file to be imported.

curl -X POST --data "name=newthes.nt" http://localhost:8081/tasks/vocab-import
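Putting both steps together (the import directory path is a placeholder; port 8081 is the Dropwizard admin port used above):

```shell
# 1. Copy the SKOS file into the configured import directory (placeholder path)
cp newthes.nt /var/vocab/import/

# 2. Trigger the import task on the Dropwizard admin port
curl -X POST --data "name=newthes.nt" http://localhost:8081/tasks/vocab-import
```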

The file name will be used to construct the RDF4J context (named graph).

The import task will add some VoID statistics and metadata, and generate dump files in several formats (JSON-LD, N-Triples and Turtle) in the downloadDir.

Re-indexing Lucene

Run the lucene-reindex task using HTTP POST, e.g.

curl -X POST http://localhost:8081/tasks/lucene-reindex

The index is stored in the luceneDir.