In order to merge the "sesame native" and "stardog remote" branches, we propose defining a protocol for the drafter API which can be implemented for different backends. Initially it will be specific to the functionality required by the drafter server; however, drafter client already defines various protocols for drafter functionality, so the eventual goal is to merge these into a single drafter.protocols library which both will implement as required.
Drafter is currently tied quite heavily to the sesame repository API, so another goal of the modular backend is to move this explicit dependency into the sesame-specific backend(s).
There are two main parts to the API - the various SPARQL endpoints (raw, live, draft, state) and the 'job' functionality for creating, populating and publishing draft graphs.
Job API
Below are the proposed functions to add to the job API protocol.
create-managed-graph
create-managed-graph(uri: URI, metadata: Map[String, String] = Map.empty): Unit
Defines a new managed graph in the state graph. This function already exists in the drafter client.
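As a rough illustration of what a backend implementation of create-managed-graph might issue against the state graph, the following sketch builds a SPARQL INSERT DATA update. The state-graph URI, the ManagedGraph class, and the metadata predicate namespace are all hypothetical placeholders, not drafter's actual vocabulary:

```scala
// Sketch only: the state-graph URI and vocabulary terms below are
// hypothetical, not drafter's actual state-graph schema.
object CreateManagedGraph {
  val StateGraph = "http://example.org/graphs/state" // hypothetical

  // Builds the update a backend might run to record a new managed graph.
  // No literal escaping is done; this is illustration, not production code.
  def insertStatement(uri: String, metadata: Map[String, String]): String = {
    val metaTriples = metadata
      .map { case (k, v) => s"""<$uri> <http://example.org/meta/$k> "$v" .""" }
      .mkString("\n    ")
    s"""INSERT DATA {
  GRAPH <$StateGraph> {
    <$uri> a <http://example.org/ManagedGraph> .
    $metaTriples
  }
}"""
  }
}
```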
create-draft-graph
create-draft-graph(live-uri: URI, metadata: Map[String, String] = Map.empty): URI
This function exists in the drafter client; however, the drafter server version expects live-uri to already exist as a managed graph, while the client documentation does not specify this restriction.
migrate-graphs-to-live
def migrate-graphs-to-live(draftGraphs: Seq[URI]): Job
Migrates a collection of draft graphs to live. Note that the implementations of this function in the sesame native and stardog-remote branches are quite different.
delete-graph
def deleteGraph(graphUri: URI, contentsOnly: Boolean = false): Job
Deletes a managed graph and optionally its entry in the state graph. This is called drop-graph in the drafter client protocols but it is not yet implemented there.
append-data-to-graph-job
append-data-to-graph-job(graphUri: URI, statements: Seq[Statement], metadata: Map[String, String] = Map.empty): Job
Appends the given data and optional metadata to a managed graph. Called append-data! in drafter client.
update-metadata
updateMetadata(graphs: Seq[URI], metadata: Map[String, String]): Unit
Updates the metadata associated with a collection of managed graphs. Called assoc-metadata in drafter client although that only allows a single graph to be specified.
delete-metadata
delete-metadata(graphs: Seq[URI], metaKeys: Seq[String]): Unit
Deletes the named metadata keys associated with a collection of managed graphs. Called dissoc-metadata in drafter client although that only allows a single graph to be specified.
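Taken together, the job API above could be collected into a single backend protocol. The sketch below is one possible Scala shape, following the signatures given in this document; Job and Statement are placeholder types standing in for whatever drafter and the sesame API actually provide:

```scala
import java.net.URI

// Placeholder types; the real definitions come from drafter / the backend.
trait Job
trait Statement

// One possible shape for the modular backend protocol, collecting the
// job API functions proposed above under Scala method-name conventions.
trait DrafterBackend {
  def createManagedGraph(uri: URI, metadata: Map[String, String] = Map.empty): Unit
  def createDraftGraph(liveUri: URI, metadata: Map[String, String] = Map.empty): URI
  def migrateGraphsToLive(draftGraphs: Seq[URI]): Job
  def deleteGraph(graphUri: URI, contentsOnly: Boolean = false): Job
  def appendDataToGraphJob(graphUri: URI, statements: Seq[Statement],
                           metadata: Map[String, String] = Map.empty): Job
  def updateMetadata(graphs: Seq[URI], metadata: Map[String, String]): Unit
  def deleteMetadata(graphs: Seq[URI], metaKeys: Seq[String]): Unit
}
```

A sesame-specific and a stardog-specific implementation would then each extend this trait, keeping the sesame repository dependency out of the shared protocol.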
SPARQL API
The SPARQL API should be separated from the Sesame Repository API.
update-restricted
update-restricted(query: String, restrictions: Set[URI] = Set.empty): Unit
Submit an update statement with a given graph restriction.
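One way a backend might apply such a restriction (an assumption, not necessarily how drafter rewrites updates) is to prepend USING clauses so that the update's WHERE pattern only sees the allowed graphs:

```scala
// Sketch: apply a graph restriction to a SPARQL update by inserting
// USING clauses before the WHERE keyword. Assumes an update of the
// form "DELETE ... INSERT ... WHERE ..."; illustration only.
def restrictUpdate(update: String, restrictions: Set[String]): String =
  if (restrictions.isEmpty) update
  else {
    // Sorted for a deterministic clause order.
    val using = restrictions.toSeq.sorted.map(u => s"USING <$u>").mkString("\n")
    update.replaceFirst("(?i)\\bWHERE\\b", using + "\nWHERE")
  }
```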
Querying
The query API currently makes reference to various concrete sesame repository types to do content negotiation. Initially this dependency could be broken by providing three functions in the query API:
where QueryType indicates whether the query is an ASK, CONSTRUCT or SELECT query.
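The QueryType mentioned above could be sketched as a small ADT; the classification helper here is a hypothetical illustration (a real implementation would use the SPARQL parser rather than keyword sniffing, since prefixes and comments can precede the query form):

```scala
// Sketch of a QueryType distinguishing the three SPARQL query forms
// the query API needs to dispatch on for content negotiation.
sealed trait QueryType
case object Ask extends QueryType
case object Construct extends QueryType
case object Select extends QueryType

// Naive classification by leading keyword; ignores PREFIX declarations
// and comments for brevity. A real backend would use its parser.
def queryType(query: String): Option[QueryType] = {
  val q = query.trim.toUpperCase
  if (q.startsWith("ASK")) Some(Ask)
  else if (q.startsWith("CONSTRUCT")) Some(Construct)
  else if (q.startsWith("SELECT")) Some(Select)
  else None
}
```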