This Python library can be used from the command line to automatically generate descriptive metadata about an RDF knowledge graph (classes, instances, relations between classes).
It runs SPARQL queries against the SPARQL endpoint provided by the user to produce HCLS dataset statistics for the dataset, in RDF Turtle format.
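For illustration, HCLS dataset descriptions build on the VoID vocabulary, so the generated Turtle might look roughly like this (the dataset URI and counts below are made up for the example):

```turtle
@prefix void: <http://rdfs.org/ns/void#> .
@prefix dct:  <http://purl.org/dc/terms/> .

<https://w3id.org/d2s/distribution/default>
    a void:Dataset ;
    dct:description "Statistics generated for the dataset" ;
    void:triples 1234 ;
    void:distinctSubjects 200 ;
    void:properties 12 .
```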
- Python 3.6 or higher, with pip
- Docker (optional)
Instructions to install the package:
Install directly from GitHub to try it:
pip3 install git+https://github.com/MaastrichtU-IDS/fair-metadata.git
Or install from the source code for development. Using -e
means that changes to the source code automatically update the locally installed package.
pip3 install -e .
Working examples of how to run the package:
Check the commands available:
fair-metadata
fair-metadata analyze --help
Generate descriptive metadata (about types and relations) for a SPARQL endpoint:
fair-metadata analyze https://graphdb.dumontierlab.com/repositories/test-vincent -o metadata.ttl
Create a complete metadata description for your dataset; you will be asked a few questions (such as the homepage, license, and references for this dataset):
fair-metadata create -o dataset_metadata.ttl
You can also import and use this library in Python:
from fair_metadata.generate_metadata import generate_hcls_from_sparql
sparql_endpoint = 'https://graphdb.dumontierlab.com/repositories/test-vincent'
dataset_uri = 'https://w3id.org/d2s/distribution/default'
g = generate_hcls_from_sparql(sparql_endpoint, dataset_uri)
This repository uses GitHub Actions to:
- Automatically run the tests at each push to the main branch
- Upload the test coverage to SonarCloud (requires setting the SONAR_TOKEN secret)
- Publish the package to PyPI when a release is created (N.B.: the version of the package needs to be increased in setup.py beforehand)
To publish to PyPI, you will need to provide your login credentials using secrets in the repository settings: PYPI_USERNAME and PYPI_PASSWORD.
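For reference, a publish workflow using these secrets might look like the following minimal sketch (the actual workflow file under .github/workflows may differ; the file name and steps here are assumptions):

```yaml
# .github/workflows/publish.yml (hypothetical sketch)
name: Publish to PyPI
on:
  release:
    types: [created]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
      - name: Build and publish
        env:
          TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }}
          TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }}
        run: |
          pip install setuptools wheel twine
          python setup.py sdist bdist_wheel
          twine upload dist/*
```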
Install PyTest:
pip3 install -U pytest
Run the tests:
pytest
Run a specific test in a file, and display print() output:
pytest tests/test_fair_metadata.py::test_create_dataset_metadata -s
Build the image:
docker build -t fair-metadata .
Run a container:
docker run -it --rm -v $(pwd):/root fair-metadata create -o dataset_metadata.ttl