The web application for GraphDB APIs
- Clone the project or check it out from version control.
- Open a terminal, navigate to the project's root directory, and run npm run install:ci to install all required dependencies; the api project is built automatically (see the example below).
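For reference, a typical first-time setup could look like the following; the repository URL is an assumption based on the project's GitHub Pages address and may differ for your checkout:

```bash
# Assumed repository location; adjust if you check the project out from elsewhere.
git clone https://github.com/Ontotext-AD/graphdb-workbench.git
cd graphdb-workbench

# Install all required dependencies; the api project is built automatically.
npm run install:ci
```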
For more details, see the developers guide: https://ontotext-ad.github.io/graphdb-workbench/developers-guide
The workbench can be run in development mode by executing npm run start. This starts each of the child applications
in watch mode and proxies requests to a GraphDB instance running on localhost:7200 (the default).
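For example, assuming a GraphDB instance is already running locally on the default port:

```bash
# Run the workbench in development mode from the project root.
# Each child application is started in watch mode, and API requests
# are proxied to the GraphDB instance at http://localhost:7200.
npm run start
```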
This project enforces consistent code quality using ESLint, configured per microfrontend/package.
- Each microfrontend has its own eslint.config.js or eslint.config.cjs.
- Lint rules are automatically enforced on file save and commit.
- The legacy-workbench package uses an older ESLint setup but is still linted on commit and during local development.
You can have ESLint running in your editor:
- IntelliJ: enable it via Settings > Languages & Frameworks > JavaScript > Code Quality Tools > ESLint
Microfrontends that contain .scss files also have stylelint.config.cjs configurations, which are applied when files are committed.
Automatic style linting can also be set up in the editor:
- IntelliJ: Settings > Languages & Frameworks > Style Sheets > Stylelint, then click Enable. Only one microfrontend's configuration file can be chosen per project in IntelliJ; currently, there is no root stylelint config specifically for this purpose.
Before each commit, ESLint runs on staged files via Husky and lint-staged.
This project uses Husky and lint-staged to automatically lint only staged files during a commit.
Pre-commit behavior:
- Lints changed files using each package's local ESLint config
- Blocks the commit if any lint errors remain
To troubleshoot:
- Look for lint error messages in your terminal
- Run npm run lint-staged --debug to see which files are affected
- Run npm run lint manually to check all files
The workbench is regularly published as a package in the NPM registry.
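If you only need the pre-built bundle, it can be pulled from the registry directly; the package name below is assumed to match the project name and may differ:

```bash
# Hypothetical example of consuming the published package.
npm install graphdb-workbench
```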
For ease of use in local development with a locally running GraphDB at localhost:7200, there is also a
Docker Compose setup that can be built and started with docker-compose up --build. It requires a .env file
in the root directory of the project in which the HOST_IP environment variable must be specified,
e.g. HOST_IP=10.131.2.176. This is needed to proxy requests to the locally running GraphDB.
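A minimal sketch of that setup, assuming GraphDB is already running on the host at localhost:7200:

```bash
# Create the required .env file with the host machine's IP so the containers
# can proxy requests to the GraphDB instance running on the host.
echo "HOST_IP=10.131.2.176" > .env   # replace with your actual host IP

# Build and start the workbench containers.
docker-compose up --build
```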
GraphDB exposes a configuration parameter, -Dgraphdb.workbench.home, for overriding the bundled workbench.
This makes it easy to point GraphDB to the dist/ folder of the workbench after it has been bundled
with npm run build.
Note: Misconfiguring the parameter will result in GraphDB responding with HTTP 404.
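For example, with a locally installed (non-Docker) GraphDB distribution; the location of the GraphDB startup script is an assumption and depends on your installation:

```bash
# Bundle the workbench; the output lands in dist/.
npm run build

# Start GraphDB and point it at the freshly built workbench
# (assumes the distribution's bin/graphdb startup script is on your PATH).
graphdb -Dgraphdb.workbench.home=/path/to/graphdb-workbench/dist
```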
The Docker distribution of GraphDB can also be configured to serve a custom workbench; the only difference is that the workbench must be mounted into the container, for example:
docker run -d \
    -p 7200:7200 \
    -v /graphdb-workbench/dist:/workbench \
    docker-registry.ontotext.com/graphdb-free:9.0.0 \
    -Dgraphdb.workbench.home=/workbench
Note: Instead of mounting the workbench, you can build a custom Docker image that uses the GraphDB image as a base and copies the custom workbench into it.
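A hedged sketch of that approach, reusing the image tag and paths from the example above (the exact Dockerfile layout is an assumption):

```bash
# Hypothetical Dockerfile that bakes the custom workbench into a GraphDB-based image.
cat > Dockerfile <<'EOF'
FROM docker-registry.ontotext.com/graphdb-free:9.0.0
COPY dist/ /workbench/
EOF

docker build -t graphdb-custom-workbench .
docker run -d -p 7200:7200 graphdb-custom-workbench -Dgraphdb.workbench.home=/workbench
```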
The CI pipeline is managed using a Jenkinsfile, which defines the stages for building, testing, and analyzing the project.
The CI pipeline uses a Docker Compose utility setup to ensure independence from the Node.js version installed on the Jenkins agent.
Below is an overview of the workflow defined in the Jenkinsfile:
| Stage | Purpose | Tools/Commands |
|---|---|---|
| Build Info | Logs environment details for traceability | |
| Install | Installs dependencies and builds packages | npm run install:ci, docker-compose |
| Build | Builds the project | npm run build, docker-compose |
| Lint | Performs lint checks | npm run lint, docker-compose |
| Test | Runs unit tests | npm run test, docker-compose |
| Cypress Test | Executes end-to-end tests | npm run cy:run |
| Sonar | Performs static code analysis to ensure code quality | npm run sonar |
Note: If any stage fails, the pipeline is marked as failed, but proceeds to the next stage.
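The same npm scripts can be run locally to approximate the pipeline stages, without the Docker Compose wrapper that CI uses (the Sonar stage is usually left to CI because it needs access to the analysis server):

```bash
# Roughly mirrors the Jenkins stages from the table above.
npm run install:ci
npm run build
npm run lint
npm run test
npm run cy:run
```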