A running example can be viewed here: https://epoz.org/shmarql
A SPARQL endpoint explorer that allows you to "bounce" the subject, predicate, or object around a public SPARQL endpoint so that you can explore the shape of the data. This is useful if you encounter a new dataset that you do not know and would like to quickly get a feel for what the data looks like.
You can test/run a local copy, if you have Docker installed, by doing:
docker run --rm -p 8000:8000 -it ghcr.io/epoz/shmarql:latest
This will pull the latest version from the GitHub package registry, delete the container again after it exits, and expose it on port 8000. You can then view it in your browser at: http://localhost:8000/
SHMARQL also has a built-in triplestore which you can use to share your RDF data over a SPARQL interface. To use it, you need to specify the path from which to load the data files at startup, using the DATA_LOAD_PATHS environment variable.
This also means that the path in which the data is stored needs to be "visible" to the Docker container, for example via a mounted volume.
Here is an example where you have some .ttl files stored in a directory named ./databases:
docker run --rm -p 8000:8000 -it -v $(pwd)/databases:/data -e DATA_LOAD_PATHS=/data ghcr.io/epoz/shmarql:latest
This will load all .ttl files found in the specified directory and make them available under a /sparql endpoint, e.g. http://localhost:8000/sparql
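Once the endpoint is running, you can also query it programmatically via the standard SPARQL 1.1 Protocol, where the query is passed as a URL-encoded `query` parameter. A minimal sketch in Python (standard library only; the query shown is just an illustrative example):

```python
from urllib.parse import urlencode

# A simple query listing the first ten triples in the store.
query = "SELECT * WHERE { ?s ?p ?o } LIMIT 10"

# SPARQL 1.1 Protocol: a GET request with the query passed as a
# URL-encoded "query" parameter.
url = "http://localhost:8000/sparql?" + urlencode({"query": query})
print(url)

# The resulting URL can be fetched with any HTTP client, e.g.:
#   curl -H "Accept: application/sparql-results+json" "<url>"
```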
Full-text search can be enabled by specifying a path at which to store the search index, using the FTS_FILEPATH environment variable:
docker run --rm -it -p 8000:8000 -e DEBUG=1 -e FTS_FILEPATH=/fts -e DATA_LOAD_PATHS=https://yogaontology.org/ontology.ttl ghcr.io/epoz/shmarql:latest
You can then do full-text queries using the magic predicate <http://shmarql.com/fts>, for example:
SELECT * WHERE {
?sub <http://shmarql.com/fts> "Sa*" .
}
Under development is RDF2vec support. This can be enabled by adding the path at which to store the embeddings, like so:
docker run --rm -it -p 8000:8000 -e DEBUG=1 -e RDF2VEC_FILEPATH=/vec -e DATA_LOAD_PATHS=https://yogaontology.org/ontology.ttl ghcr.io/epoz/shmarql:latest
Then you can query for similar entities by using the magic predicate <http://shmarql.com/vec>, similar to the full-text query above.
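Such a query might look as follows. This is only a sketch: the entity URI is a hypothetical placeholder, and the exact triple pattern expected by the vec predicate is an assumption, modelled on the full-text example above.

```sparql
SELECT * WHERE {
  ?sub <http://shmarql.com/vec> <http://example.com/some_entity> .
}
```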
☣️ This is a new experimental feature, and needs more extensive testing.
If you would like to run and modify the code in this repo, there is a Dockerfile which includes the necessary versions of the required libraries.
First, build the Docker image like so:
docker build -t shmarql .
Now you can run a local copy with:
docker run -it --rm -p 8000:8000 shmarql