
Documentation requests #129

Open
rpgoldman opened this issue Aug 17, 2021 · 2 comments

Comments

@rpgoldman (Contributor)

@balhoff

  1. It would be helpful if you could add a little more documentation of the configuration file. I'm not entirely sure what the name key is for. How does this relate to the ontology identifier (if at all)? Also, what are the acceptable values for the reasoner key? Looking at https://github.com/phenoscape/owlery/blob/master/src/main/scala/org/phenoscape/owlery/Owlery.scala I can see that they are

    • "structural"
    • "elk"
    • "hermit"
    • "jfact"
    • "whelk"

    Any suggestions about how to choose among them? I'm somewhat familiar with HermiT and FaCT++ (I assume jfact is JFact, the Java port of FaCT++), but the rest are new to me.

    Also, if we have multiple ontologies to use together, do we make a single ontology that imports the others, and load that as a named context? That relates to the question about name: does a name name a full context? The fact that it is associated with a reasoner suggests that the answer is "yes".

  2. Would it be possible to augment the Docker hub page with an example docker run command? I think I can figure one out (and if I do, I will post it here), but I'm not at all confident. A config file example for use with the docker container would also be helpful (presumably it would put all the ontologies in the /srv directory, and mount that from the host).

  3. Would it be possible to explain a bit more about the following?

    It answers DL queries. For other queries, like label annotations, it is meant to be used in conjunction with a separate triplestore sparql endpoint.

    Am I right in thinking this means something like "if you want to know all the Individuals that satisfy an OWL class definition, Owlery will do that, but if you want to then query the properties of those Individuals, you will need to do that in an ancillary triplestore"? Is that the right interpretation?

    If so, is the expectation that Owlery users would somehow integrate such a triplestore into the server (presumably as a different endpoint)? And would we want to do something like run a DL reasoner on our ontology, save the results, and load the augmented set of triples into the other triplestore? I have to admit I don't know what a federated SPARQL query is, but I will do some research.
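To make question 3 concrete, here is the sort of federated query I have in mind. This is my own sketch: the Owlery endpoint path (/kbs/demo/sparql), the KB name demo, and the class IRI are made-up assumptions on my part, not taken from any docs.

```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?ind ?label
WHERE {
  # Reasoning part: ask Owlery for everything inferred to belong to the class
  # (hypothetical endpoint path for a KB named "demo")
  SERVICE <http://localhost:8080/kbs/demo/sparql> {
    ?ind a <http://example.org/SomeDefinedClass> .
  }
  # Annotation part: answered locally by the plain triplestore
  ?ind rdfs:label ?label .
}
```

If that is how it is supposed to fit together, the triplestore is the one you actually point clients at, and it delegates the reasoning subpattern to Owlery via SERVICE.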

Thanks!

@rpgoldman (Contributor, Author)

A quick follow-up: I was able to get Owlery to work, or at least to come up and expose its API -- I haven't had time to experiment with it much yet.

If I understand correctly, if one gives a local directory as the value of the location configuration key for a KB, then Owlery will load all the ontology files in that directory (so one should ensure that no non-ontology files are present). Otherwise, one can give a URL pointing to a single OWL file. I'm not sure how one would load only a single local file; perhaps a file:/// URL? At any rate, it's not very important, because it is easy to make a directory containing only that one file.
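For what it's worth, here is roughly what my working configuration boils down to. A sketch only: the structure is HOCON, and the kb keys (name, location, reasoner) match what I saw in Owlery.scala, but the host/port keys and the file:/// line are my guesses, not verified documentation.

```hocon
owlery {
  host = localhost
  port = 8080
  kbs = [
    {
      name = demo              # the KB name that appears in the API paths
      location = "/srv/owl"    # a directory: every ontology file inside is loaded
      reasoner = whelk         # one of: structural, elk, hermit, jfact, whelk
    }
    # For a single local file, perhaps (untested guess):
    # { name = single, location = "file:///srv/owl/one.owl", reasoner = elk }
  ]
}
```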

Another question: does the Owlery server serve up its own OpenAPI/Swagger spec? I have been using the one that GitHub points at, but I thought it was normal for a server to serve up its own API spec as well. If the Owlery server does serve one, I haven't found the right place to look for it.

@rpgoldman (Contributor, Author)

As far as an example Docker command is concerned, here is what I have used (it's in a Makefile, which accounts for the odd context):

server: $(turtle_files)
	# collect the ontology files into one directory
	mkdir -p $(makeFileDir)server-files
	cp $(turtle_files) $(makeFileDir)server-files
	# mount the ontologies at /srv/owl and the Owlery config at /srv/conf
	docker run --rm -p 8080:8080 -v $(makeFileDir):/srv/owl -v $(makeFileDir)owlery-conf:/srv/conf phenoscape/owlery:latest

With this, I can just do make server to bring my server up. But I think the docker run command could be pulled out of the above and put in the docs on DockerHub.
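Pulling the command out of the recipe, a standalone version might look like the following. The host paths are examples from my setup, so adjust them; the echo is there so the command can be inspected without Docker installed (drop it to actually run):

```shell
# Host directories to mount into the container (example paths, not canonical):
ONT_DIR="$PWD/server-files"   # ontology files, mounted at /srv/owl
CONF_DIR="$PWD/owlery-conf"   # Owlery config, mounted at /srv/conf

# Print the docker invocation; remove "echo" to run it for real.
echo docker run --rm \
  -p 8080:8080 \
  -v "$ONT_DIR":/srv/owl \
  -v "$CONF_DIR":/srv/conf \
  phenoscape/owlery:latest
```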
