This is an enhanced FAIR Data Point: its search capability is extended by expanding the search keywords with associated terms extracted from an ontology, and a ranking algorithm is applied to order the result list.
This work is supported by the SURF-DCC pilot "Enhancing FAIR Data Point's Search Capability as a FAIR Service".
FAIR Data Point (FDP) is a REST API for creating, storing, and serving FAIR metadata. This FDP implementation also presents a Web-based graphical user interface (GUI). The metadata contents are generated semi-automatically according to the FAIR Data Point software specification document.
More information about FDP, including how to deploy and use it, can be found in the FDP Deployment and REST API Usage Documentation.
- FAIR Data Point Client
- FAIR Data Point E2E Tests
- FAIR Data Point Documentation
- OpenRefine Metadata Extension
The FAIR Data Point API comes with embedded OpenAPI documentation using Swagger, where the details of the API calls can be found; it also allows trying out API calls directly. To access the FDP Swagger documentation, visit localhost:8080/swagger-ui.html in a web browser (for a local deployment) or https://your.domain.tld/swagger-ui.html for your own deployment (e.g. app.fairdatapoint.org/swagger-ui.html). More detailed descriptions and examples of these API calls are available in the Deployment and Usage instructions.
- Java (JDK 17)
- MongoDB (4.2)
- Maven (3.2.5 or higher)
- Docker (19.03.0-ce or higher) - for building Docker image only
To run the application, a running MongoDB instance is required. To use the standard connection (mongodb://localhost:27017/fdp), simply instruct Spring Boot to use the development profile, then run:
$ mvn spring-boot:run -Dspring-boot.run.profiles=development
Alternatively, create an application.yml file in the project root, configure the MongoDB address there (a minimal sketch is shown below), and then run:
$ mvn spring-boot:run
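A minimal sketch of such an application.yml, assuming the standard Spring Boot property for the MongoDB connection string (adjust host, port, and database name to your setup):

# Sketch only: uses the standard Spring Boot property spring.data.mongodb.uri;
# adjust the connection string to match your MongoDB deployment.
spring:
  data:
    mongodb:
      uri: mongodb://localhost:27017/fdp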
Run from the root of the project:
$ mvn test
Run from the root of the project:
$ mvn package
Run from the root of the project (requires building the jar file using mvn package as shown above):
$ docker build -t fairdatapoint:local .
If you do not have Java and Maven locally, you can build the Docker image using Docker (instead of using a locally built jar file):
$ docker build -f Dockerfile.build -t fairdatapoint:local .
Once you have built a Docker image, follow the instructions at https://fairdatapoint.readthedocs.io/. Keep in mind that in this case your Docker image is named fairdatapoint:local.
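As a rough local smoke-test sketch (not a full deployment; it assumes the server listens on port 8080 inside the container and can reach a running MongoDB instance), the locally built image could be started with:

# Quick local test of the locally built image; see the documentation above for a full deployment
$ docker run --rm -p 8080:8080 fairdatapoint:local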
Add the following section to your application.yml:
search:
  ontologyUrls:
    - <url to owl file, to be indexed>
  associationRelevanceThreshold: <a decimal number; the higher the value, the shallower the search>
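For illustration only (the ontology URL and threshold value below are placeholders, not recommendations), a filled-in section could look like this:

# Illustrative values only; point ontologyUrls at the OWL file(s) you want indexed
search:
  ontologyUrls:
    - https://example.org/ontologies/example-ontology.owl
  associationRelevanceThreshold: 0.5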
At startup, when parsing a new ontology, the amount of memory needed can grow quickly; for example, parsing Thesaurus.owl peaks at around 12 GB of RAM. Keep this in mind when running the server.
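If the default heap is too small for the ontology you index, one option (a sketch, assuming you run the server via the Spring Boot Maven plugin) is to raise the JVM heap limit explicitly:

# Raise the JVM heap limit for ontology indexing at startup (value is illustrative)
$ mvn spring-boot:run -Dspring-boot.run.profiles=development -Dspring-boot.run.jvmArguments="-Xmx16g"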
Most of the GET requests are publicly accessible. In contrast, POST, PUT, DELETE, and PATCH requests are mainly secured. We use JWT Tokens and Bearer Token Authentication. The token can be retrieved using the /tokens endpoint, to which you send a username and password. For details, visit the OpenAPI documentation.
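A hedged example of this flow using curl (the request body field names below are an assumption; check the OpenAPI documentation for the authoritative schema):

# 1. Request a JWT token with the credentials of a default user (field names assumed)
$ curl -X POST http://localhost:8080/tokens \
    -H "Content-Type: application/json" \
    -d '{"email": "albert.einstein@example.com", "password": "password"}'

# 2. Use the returned token as a Bearer token on secured requests
$ curl -X DELETE http://localhost:8080/<some-secured-resource> \
    -H "Authorization: Bearer <token>"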
Default users

- ADMIN:
  - Username: albert.einstein@example.com
  - Password: password
- USER:
  - Username: nikola.tesla@example.com
  - Password: password
We maintain a CHANGELOG. You should also take a look at our Contributing guidelines and Code of Conduct.
The following paper can be cited as a reference paper for the FAIR Data Point:
@article{10.1162/dint_a_00160,
author = {Bonino da Silva Santos, Luiz Olavo and Burger, Kees and Kaliyaperumal, Rajaram and Wilkinson, Mark D.},
title = "{FAIR Data Point: A FAIR-oriented approach for metadata publication}",
journal = {Data Intelligence},
pages = {1-21},
year = {2022},
month = {08},
issn = {2641-435X},
doi = {10.1162/dint_a_00160},
url = {https://doi.org/10.1162/dint\_a\_00160},
eprint = {https://direct.mit.edu/dint/article-pdf/doi/10.1162/dint\_a\_00160/2038268/dint\_a\_00160.pdf}}
This project is licensed under the MIT License - see the LICENSE file for more details.