JS RDF store with SPARQL support



rdfstore-js is a pure JavaScript implementation of an RDF graph store with support for the SPARQL query and data manipulation language.

var rdfstore = require('rdfstore');

rdfstore.create(function(store) {
  store.execute('LOAD <http://dbpedia.org/resource/Tim_Berners-Lee> INTO GRAPH <http://example.org/people>', function() {

    store.setPrefix('dbp', 'http://dbpedia.org/resource/');
    store.node(store.rdf.resolve('dbp:Tim_Berners-Lee'), "http://example.org/people", function(success, graph) {

      var peopleGraph = graph.filter(store.rdf.filters.type(store.rdf.resolve("foaf:Person")));
      store.execute('PREFIX rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#>\
                     PREFIX foaf: <http://xmlns.com/foaf/0.1/>\
                     PREFIX : <http://example.org/>\
                     SELECT ?s FROM NAMED :people { GRAPH ?g { ?s rdf:type foaf:Person } }',
                     function(success, results) {

                       console.log(peopleGraph.toArray()[0].subject.valueOf() === results[0].s.value);
                     });
    });
  });
});



rdfstore-js can be executed in a web browser or can be included as a library in a node.js application. It can also be executed as a stand-alone SPARQL end-point accepting SPARQL RDF Protocol HTTP requests. Go to the bottom of this page to find some application examples using the library.

The current implementation is far from complete but it already passes all the test cases for the SPARQL 1.0 query language and supports data manipulation operations from the SPARQL 1.1/Update version of the language.

Some other features included in the library are the following:

  • SPARQL 1.0 support
  • SPARQL 1.1/Update support
  • Partial SPARQL 1.1 query support
  • JSON-LD parser
  • Turtle/N3 parser
  • W3C RDF Interfaces API
  • RDF graph events API
  • Partial support for property paths in queries
  • Custom filter functions
  • Parallel execution where WebWorkers are available
  • Persistent storage using HTML5 LocalStorage in the browser version
  • Persistent storage using MongoDB in the Node.js version
  • Node.js HTTP server implementing the SPARQL Protocol for RDF recommendation


Documentation for the store can be found here.

SPARQL support

rdfstore-js currently supports SPARQL 1.0 and most of SPARQL 1.1/Update. Only some parts of the SPARQL 1.1 query language have been implemented so far.

This is a list of the different kinds of queries currently implemented:

  • SELECT queries
  • UNION, OPTIONAL clauses
  • NAMED GRAPH identifiers
  • ORDER BY clauses
  • SPARQL 1.0 filters and builtin functions
  • variable aliases
  • variable aggregation: MAX, MIN, COUNT, AVG, SUM functions
  • GROUP BY clauses
  • DISTINCT query modifier
  • CONSTRUCT queries
  • ASK queries
  • INSERT DATA queries
  • DELETE DATA queries
  • DELETE WHERE queries
  • LOAD queries
  • CREATE GRAPH clauses

These are supported components in property path expressions:

  • Sequence: elt1/elt2/elt3
  • Zero or more occurrences: elt*
  • One or more occurrences: elt+
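Conceptually, a one-or-more path like `elt+` computes the transitive closure of the predicate over the graph. The following plain-JavaScript sketch (independent of the store, over an in-memory triple array) illustrates that evaluation; the function and data names are illustrative only:

```javascript
// Evaluate a one-or-more property path (elt+) over an in-memory
// list of triples by computing the transitive closure of the predicate.
function oneOrMore(triples, predicate, start) {
  var reached = {};          // nodes reachable via one or more steps
  var frontier = [start];
  while (frontier.length > 0) {
    var node = frontier.pop();
    triples.forEach(function (t) {
      if (t.subject === node && t.predicate === predicate && !reached[t.object]) {
        reached[t.object] = true;  // first time we reach this node
        frontier.push(t.object);   // keep expanding from it
      }
    });
  }
  return Object.keys(reached);
}

// Example graph: a -> b -> c via ex:next
var triples = [
  { subject: 'a', predicate: 'ex:next', object: 'b' },
  { subject: 'b', predicate: 'ex:next', object: 'c' }
];

console.log(oneOrMore(triples, 'ex:next', 'a')); // ['b', 'c'] (order may vary)
```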


To use the library in a Node.js application, install the package with the NPM package manager:

$ npm install rdfstore

The library can be used as a persistent RDF store using MongoDB as the backend. An instance of MongoDB must be running in order to use this version of the store.

It is also possible to use rdfstore-js in a web application executed in a browser. A minified version of the library in a single JavaScript file can be linked from an HTML document; a minified and gzipped version is also available. Both versions have been compiled using Google's Closure JavaScript compiler. The persistent versions can be found here (min).


The library can be built for different environments using the included Ruby script and configuration file. The build script requires the JSON 1.5 Ruby gem.

To build the library for node.js execute the following command from the root directory of the project:

$ ./make.rb nodejs

To build the library for the browser configuration, execute the following command:

$ ./make.rb browser

To build the library for the browser, including support for persistent storage execute this command:

$ ./make.rb browser_persistent

The output of each configuration will be created in the dist subdirectory at the root path of the project.


To execute the whole test suite of the library, including the DAWG test cases for SPARQL 1.0 and the test cases for SPARQL 1.1 implemented at the moment, the build script can be used:

$ ./make.rb tests

The tests depend on the nodeunit Node.js library, which must be installed in order to run them.

You can also run the tests on the minimized version of the library with the command:

$ ./make.rb test_min

Additionally, there are some smoke tests for both browser versions that can be found in the 'browsertests' directory. These tests are also available online at these addresses:

Stand-alone SPARQL end-point

The Node.js version of the store can be used as a stand-alone SPARQL end-point. In the library distribution there is an executable script that can be used in UNIX platforms to invoke the store as an application.

$ npm install rdfstore
$ ./node_modules/rdfstore/bin/rdfstorejs webserver --store-name test --store-engine mongodb

The previous shell command starts the execution of an instance of the store that uses a persistent instance of MongoDB as the backend and can accept HTTP SPARQL protocol requests.

The rdfstorejs script can also be used for some administrative tasks. For example, it can be used to load RDF data into a graph in the store:

$ ./bin/rdfstorejs load http://dbpedia.org/resource/Tim_Berners-Lee http://test.com/graph1 --store-name test --store-engine mongodb

When dealing with a remote resource, the store will automatically perform content negotiation. When loading data from a local file or standard input, the media type must be passed to the store with the --media-type flag.

The previous command loads the graph for a DBpedia article into the store graph passed as the second argument. This graph can be retrieved with an HTTP request according to the SPARQL RDF Protocol:

$ ./node_modules/rdfstore/bin/rdfstorejs webserver --store-name test --store-engine mongodb &
$ curl -v -d "default-graph-uri=http://test.com/graph1" --data-urlencode "query=select * { ?s ?p ?o } limit 3" -H "Accept: application/rdf+xml" http://localhost:8080/sparql

* About to connect() to localhost port 8080 (#0)
*   Trying ::1... Connection refused
*   Trying fe80::1... Connection refused
*   Trying 127.0.0.1... connected
* Connected to localhost (127.0.0.1) port 8080 (#0)
> POST /sparql HTTP/1.1
> User-Agent: curl/7.19.7 (universal-apple-darwin10.0) libcurl/7.19.7 OpenSSL/0.9.8l zlib/1.2.3
> Host: localhost:8080
> Accept: application/rdf+xml
> Content-Length: 104
> Content-Type: application/x-www-form-urlencoded
< HTTP/1.1 200 OK
< Content-Type: application/sparql-results+xml
< Access-Control-Allow-Origin: *
< Access-Control-Allow-Methods: POST, GET, OPTIONS
< Access-Control-Allow-Headers: Content-Type, Depth, User-Agent, X-File-Size, X-Requested-With, If-Modified-Since, X-File-Name, Cache-Control
< Connection: keep-alive
< Transfer-Encoding: chunked
* Connection #0 to host localhost left intact
* Closing connection #0
<?xml version="1.0" encoding="UTF-8"?><sparql xmlns="http://www.w3.org/2005/sparql-results#"><head><variable name="s"/>...</sparql>
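The body of such a request is just form-encoded key/value pairs, so the curl invocation above can be reproduced from plain JavaScript by URL-encoding the parameters. A minimal sketch (the helper name is hypothetical; the endpoint and graph URI are the ones assumed in the example above):

```javascript
// Build the form-encoded body of a SPARQL Protocol POST request,
// equivalent to the curl invocation shown above.
function sparqlRequestBody(query, defaultGraphUris) {
  var params = ['query=' + encodeURIComponent(query)];
  (defaultGraphUris || []).forEach(function (uri) {
    params.push('default-graph-uri=' + encodeURIComponent(uri));
  });
  return params.join('&');
}

var body = sparqlRequestBody('select * { ?s ?p ?o } limit 3',
                             ['http://test.com/graph1']);
// POST this body to http://localhost:8080/sparql with
// Content-Type: application/x-www-form-urlencoded and the desired Accept header.
console.log(body);
```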

The store supports these formats in the response to CONSTRUCT SPARQL queries: RDF/XML, Turtle and JSON-LD. When responding to SELECT and ASK queries, results can be retrieved in the normative XML serialization, but they can also be retrieved as JSON by passing an application/ld+json media type in the HTTP Accept header.

Data can be removed from an instance of the store using a persistent backend with the clear command:

$./bin/rdfstorejs clear --store-name test --store-engine mongodb

Several aspects of the server execution can be configured by passing arguments to the rdfstorejs script. A list of these flags, as well as the available commands, can be obtained by invoking the script without arguments:


Usage: rdfstorejs Command [Args] [Options]

* webserver: starts the HTTP frontend for the store
* load URI|stdin [dstGraphURI]: load the graph pointed by the URI argument into the store. The graph will be loaded in the 'dstGraphURI' graph or the default graph if none specified
* clear: removes all data from the store
-p: server port [8080]
--webserver-port: server port [8080]
-prot: protocol to use http | https [http]
--webserver-protocol: protocol to use http | https [http]
--webserver-path: Path where the SPARQL endpoint will be accessible [/sparql]
--webserver-ssl-key: Path to the SSL private key file [./ssl/privatekye.pem]
--webserver-ssl-cert: Path to the SSL certificate file [./ssl/certificate.pem]
-cors: Should the server accept CORS requests [true]
--webserver-cors-enabled: Should the server accept CORS requests [true]
--store-tree-order: BTree index tree order used in the in memory backend [15]
--store-engine: What backend should the store use: 'memory' and 'mongodb' are possible values [memory]
--store-name: Name to be used to store the quad data in the persistent backend [rdfstore_js]
--store-overwrite: If set to 'true' previous data in the persistent storage will be removed at startup [false]
--store-mongo-domain: If store-engine is set to 'mongodb', location of the MongoDB server [localhost]
--store-mongo-port: If store-engine is set to 'mongodb', port where the MongoDB server is running [27017]
-mime: When loading a local RDF file or loading from input stream, media type of the data to load [application/rdf+xml]
--media-type: When loading a local RDF file or loading from input stream, media type of the data to load [application/rdf+xml]


The following table shows the execution times obtained running the LUBM benchmark in different browsers. The data has been generated using the LUBM data generator for a single university. The text of some queries has been adapted, since the store does not support inference yet. The text of all the queries can be found here. All the queries have been executed on a desktop system running OS X 10.6, with the exception of the Internet Explorer tests, which were executed in a virtualized Windows 7 image.

The amount of data loaded is 100545 triples, around 11MB of data. Times are measured in seconds.

| Query    | Chrome 16 | Safari 5 | Firefox Aurora 11 | Internet Explorer 9 |
|----------|-----------|----------|-------------------|---------------------|
| query 0  | 0.552     | 1.176    | 0.834             | 0.771               |
| query 1  | 0.005     | 0.033    | 0.043             | 0.016               |
| query 2  | 0.018     | 0.149    | 0.046             | 0.111               |
| query 3  | 0.005     | 0.022    | 0.026             | 0.023               |
| query 4  | 0.155     | 0.502    | 0.311             | 0.603               |
| query 5  | 0.043     | 0.091    | 0.109             | 0.131               |
| query 6  | 0.023     | 0.039    | 0.045             | 0.057               |
| query 7  | 0.324     | 0.573    | 0.73              | 1.678               |
| query 8  | 0.828     | 1.581    | 1.789             | 2.548               |
| query 10 | 0.008     | 0.022    | 0.024             | 0.027               |
| query 11 | 0.001     | 0.003    | 0.006             | 0.003               |
| query 12 | 0.003     | 0.007    | 0.011             | 0.006               |
| query 13 | 0.042     | 0.103    | 0.098             | 0.119               |
| query 14 | 0.009     | 0.028    | 0.024             | 0.035               |

The following list shows the insertion time of the 100K triples into the store:

  • Chrome 16: 9.559 secs
  • Safari 5: 6.661 secs
  • Firefox Aurora 11: 16.523 secs
  • Internet Explorer 9: 17.042 secs.
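From the figures above, the bulk insertion throughput follows directly, e.g. 100545 / 9.559 ≈ 10500 triples per second in Chrome 16. A small script deriving the rate for each browser:

```javascript
// Derive insertion throughput (triples/second) from the benchmark figures above.
var tripleCount = 100545;
var insertionTimes = {
  'Chrome 16': 9.559,
  'Safari 5': 6.661,
  'Firefox Aurora 11': 16.523,
  'Internet Explorer 9': 17.042
};

Object.keys(insertionTimes).forEach(function (browser) {
  var rate = Math.round(tripleCount / insertionTimes[browser]);
  console.log(browser + ': ~' + rate + ' triples/sec');
});
```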


This is a small overview of the rdfstore-js API.

###Store creation

//nodejs only
var rdfstore = require('rdfstore');

// in the browser the rdfstore object
// is already defined

// alt 1
rdfstore.create(function(store) {
  // the new store is ready
});

// alt 2
var store = rdfstore.create();

// alt 3
new rdfstore.Store(function(store) {
  // the new store is ready
});

// alt 4
store = new rdfstore.Store();

###Persistent store creation (Browser)

In order to use persistent storage in the browser, an option named 'persistent' must be passed with value 'true' in the options for the store. An additional 'overwrite' flag indicates whether existing data for this store should be dropped or read back. Optionally, a name for the store can also be passed as an argument. This name can be used to manipulate several persistent stores in the same browser.

At the moment, webworkers cannot be used with the persistent version of the store.

new rdfstore.Store({persistent:true, name:'myappstore', overwrite:true}, function(store){
  // Passing overwrite:true in the options will make the store drop all previous data.
  // Several stores can be used by providing different names for them.
});

###Persistent store creation (Node.js)

The Node.js version of the library uses MongoDB as the persistent backend and the Node.js MongoDB driver to establish a connection between the store engine and the backend. The options 'persistent' and 'engine' (with value 'mongodb') must be passed as parameters. The 'overwrite' parameter can also be used to clean the data stored in the persistent storage. Configuration of the MongoDB instance to be used can be passed using the parameters 'mongoDomain' and 'mongoPort'. Finally, the parameter 'mongoOptions' can be used to pass configuration options to the Node.js MongoDB driver (check the driver documentation for more information).

new rdfstore.Store({persistent:true,
                    name:'myappstore', // quads in MongoDB will be stored in a DB named myappstore
                    overwrite:true,    // delete all the data already present in the MongoDB server
                    mongoDomain:'dbserver', // location of the MongoDB instance, localhost by default
                    mongoPort:27017 // port where the MongoDB server is running, 27017 by default
                   }, function(store){
  // the new store is ready
});

###Query execution

// simple query execution
store.execute("SELECT * { ?s ?p ?o }", function(success, results){
  if(success) {
    // process results
    if(results[0].s.token === 'uri') {
      // the subject binding is a URI
    }
  }
});

// execution with an explicit default and named graph

var defaultGraph = [{'token':'uri', 'value': graph1}, {'token':'uri', 'value': graph2}];
var namedGraphs  = [{'token':'uri', 'value': graph3}, {'token':'uri', 'value': graph4}];

store.executeWithEnvironment("SELECT * { ?s ?p ?o }", defaultGraph,
  namedGraphs, function(success, results) {
  if(success) {
    // process results
  }
});

###Construct queries and the RDF Interfaces API

var query = "CONSTRUCT { <http://example.org/people/Alice> ?p ?o } \
             WHERE { <http://example.org/people/Alice> ?p ?o  }";

store.execute(query, function(success, graph){
  if(graph.some(store.rdf.filters.p(store.rdf.resolve('foaf:name')))) {
    var nameTriples = graph.match(null,
                                  store.rdf.createNamedNode(store.rdf.resolve('foaf:name')),
                                  null);

    nameTriples.forEach(function(triple) {
      // process each foaf:name triple
    });
  }
});

###Loading remote graphs

rdfstore-js will try to retrieve remote RDF resources across the network when a 'LOAD' SPARQL query is executed. The Node.js build of the library uses regular TCP sockets and performs proper content negotiation. It will also follow a limited number of redirections. The browser build tries to perform an AJAX request to retrieve the resource using the correct HTTP headers. Nevertheless, this implementation is subject to the limitations of the same-origin policy implemented in current browsers, which prevents cross-domain requests. Redirections, even within the same domain, may also fail because the browser removes the 'Accept' HTTP header of the original request. rdfstore-js relies on the jQuery JavaScript library to perform cross-browser AJAX requests; this library must be linked in order to execute 'LOAD' requests in the browser.

store.execute('LOAD <http://dbpedialite.org/titles/Lisp_%28programming_language%29>\
               INTO GRAPH <lisp>', function(success){
  if(success) {
    var query = 'PREFIX foaf:<http://xmlns.com/foaf/0.1/> SELECT ?o \
                 FROM NAMED <lisp> { GRAPH <lisp> { ?s foaf:page ?o} }';
    store.execute(query, function(success, results) {
      // process results
    });
  }
});

###High level interface

The following interface is a convenience API for working with JavaScript code instead of SPARQL query strings. It is built on top of the RDF Interfaces W3C API.

/* retrieving a whole graph as a JS Interface API graph object */

store.graph(graphUri, function(graph){
  // process graph
});

/* exporting a graph to N3 (this function is not part of the W3C API) */
store.graph(graphUri, function(graph){
  var serialized = graph.toNT();
});

/* retrieving a single node in the graph as a JS Interface API graph object */

store.node(subjectUri, function(graph) {
  // process node
});
store.node(subjectUri, graphUri, function(graph) {
  // process node
});

/* inserting a JS Interface API graph object into the store */

// inserted in the default graph
store.insert(graph, function(success) {});

// inserted in graphUri
store.insert(graph, graphUri, function(success) {});

/* deleting a JS Interface API graph object from the store */

// deleted from the default graph
store.delete(graph, function(success){});

// deleted from graphUri
store.delete(graph, graphUri, function(success){});

/* clearing a graph */

// clears the default graph
store.clear(function(success){});

// clears a named graph
store.clear(graphUri, function(success){});

/* Parsing and loading a graph */

// loading local data
store.load("text/turtle", turtleString, function(success, results) {});

// loading remote data
store.load('remote', remoteGraphUri, function(success, results) {});

/* Registering a parser for a new media type */

// The parser object must implement a 'parse' function
// accepting the data to parse and a callback function.

store.registerParser("application/rdf+xml", rdfXmlParser);

###RDF Interface API

The store object includes an 'rdf' object implementing an RDF environment as described in the RDF Interfaces 1.0 W3C working draft. This object can be used to access the full RDF Interfaces 1.0 API.

var graph = store.rdf.createGraph();
graph.addAction(store.rdf.createAction(store.rdf.filters.p(store.rdf.resolve("foaf:name")),
                                 function(triple){ var name = triple.object.valueOf();
                                                   name = name.slice(0,1).toUpperCase()
                                                          + name.slice(1, name.length);
                                                   triple.object = store.rdf.createNamedNode(name);
                                                   return triple;}));

store.rdf.setPrefix("ex", "http://example.org/people/");
graph.add(store.rdf.createTriple( store.rdf.createNamedNode(store.rdf.resolve("ex:Alice")),
                                  store.rdf.createNamedNode(store.rdf.resolve("foaf:name")),
                                  store.rdf.createLiteral("alice") ));

var triples = graph.match(null, store.rdf.createNamedNode(store.rdf.resolve("foaf:name")), null).toArray();

console.log("worked? "+(triples[0].object.valueOf() === 'Alice'));
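The prefix resolution performed by setPrefix/resolve amounts to simple CURIE expansion. A plain-JavaScript sketch (hypothetical names, not the store's implementation):

```javascript
// Minimal CURIE (prefix:local) expansion, mimicking what
// setPrefix/resolve do in the RDF environment.
function PrefixMap() { this.map = {}; }
PrefixMap.prototype.setPrefix = function (prefix, iri) {
  this.map[prefix] = iri;
};
PrefixMap.prototype.resolve = function (curie) {
  var idx = curie.indexOf(':');
  var prefix = curie.slice(0, idx);
  var local = curie.slice(idx + 1);
  // Unknown prefixes are returned unchanged
  // (a full implementation would signal an error).
  return this.map[prefix] ? this.map[prefix] + local : curie;
};

var prefixes = new PrefixMap();
prefixes.setPrefix('ex', 'http://example.org/people/');
console.log(prefixes.resolve('ex:Alice')); // http://example.org/people/Alice
```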

###Default Prefixes

Default RDF namespaces can be specified using the registerDefaultNamespace function. These namespaces will be included automatically in all queries. If the same namespace is specified by the client in the query string, the new prefix will shadow the default one. A collection of common namespaces like rdf, rdfs, foaf, etc. can be registered automatically using the registerDefaultProfileNamespaces function.

new Store.Store({name:'test', overwrite:true}, function(store){
    store.registerDefaultProfileNamespaces();

    store.execute('INSERT DATA {  <http://example/person1> <http://xmlns.com/foaf/0.1/name> "Celia" }', function(result, msg){

       store.execute('SELECT * { ?s foaf:name ?name }', function(success,results) {
           test.ok(success === true);
           test.ok(results.length === 1);
           test.ok(results[0].name.value === "Celia");
       });
    });
});
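The shadowing behaviour described above can be pictured as prepending PREFIX declarations only when the query does not already declare them. A plain-JS sketch of that idea (illustrative only, not the store's actual code):

```javascript
// Prepend default PREFIX declarations to a query, letting
// prefixes declared in the query itself shadow the defaults.
function applyDefaultPrefixes(query, defaults) {
  var declarations = '';
  Object.keys(defaults).forEach(function (prefix) {
    // Skip defaults the query already declares.
    if (query.indexOf('PREFIX ' + prefix + ':') === -1) {
      declarations += 'PREFIX ' + prefix + ': <' + defaults[prefix] + '> ';
    }
  });
  return declarations + query;
}

var expanded = applyDefaultPrefixes(
  'SELECT * { ?s foaf:name ?name }',
  { foaf: 'http://xmlns.com/foaf/0.1/' }
);
console.log(expanded);
// PREFIX foaf: <http://xmlns.com/foaf/0.1/> SELECT * { ?s foaf:name ?name }
```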

###JSON-LD Support

rdfstore-js implements parsers for Turtle and JSON-LD. The specification of JSON-LD is still an ongoing effort. You may expect to find some inconsistencies between this implementation and the actual specification.

var jsonld = {
  "@context": {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "xsd": "http://www.w3.org/2001/XMLSchema#",
    "name": "http://xmlns.com/foaf/0.1/name",
    "age": {"@id": "http://xmlns.com/foaf/0.1/age", "@type": "xsd:integer" },
    "homepage": {"@id": "http://xmlns.com/foaf/0.1/homepage", "@type": "xsd:anyURI" },
    "ex": "http://example.org/people/"
  },
  "@id": "ex:john_smith",
  "name": "John Smith",
  "age": "41",
  "homepage": "http://example.org/home/"
};

store.setPrefix("ex", "http://example.org/people/");

store.load("application/ld+json", jsonld, "ex:test", function(success, results) {
  store.node("ex:john_smith", "ex:test", function(success, graph) {
    // process graph here
  });
});
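Conceptually, the @context above just maps JSON keys to IRIs (and optionally coerces values to typed literals). A minimal sketch of that term expansion, independent of the actual parser:

```javascript
// Expand a JSON-LD term to an IRI using a context object
// like the one in the example above.
function expandTerm(context, term) {
  var def = context[term];
  if (typeof def === 'string') return def;   // plain term -> IRI
  if (def && def['@id']) return def['@id'];  // typed term -> its @id
  return term;                               // not defined in the context
}

var context = {
  'name': 'http://xmlns.com/foaf/0.1/name',
  'age': { '@id': 'http://xmlns.com/foaf/0.1/age', '@type': 'xsd:integer' }
};

console.log(expandTerm(context, 'name')); // http://xmlns.com/foaf/0.1/name
console.log(expandTerm(context, 'age'));  // http://xmlns.com/foaf/0.1/age
```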

###Events API

rdfstore-js implements an experimental events API that allows clients to observe changes in the RDF graph and receive notifications when parts of the graph change. There are two main event functions. subscribe sets up a callback function that will be invoked each time triples matching a certain pattern, passed as an argument, are added or removed. startObservingNode sets up a callback that will be invoked with the modified version of a node each time triples are added to or removed from that node.

var cb = function(event, triples){
  // it will receive a notification when a triple matching
  // the pattern s:http://example/book, p:*, o:*, g:*
  // is inserted or removed.
  if(event === 'added') {
    console.log(triples.length+" triples have been added");
  } else if(event === 'deleted') {
    console.log(triples.length+" triples have been deleted");
  }
};

store.subscribe("http://example/book", null, null, null, cb);

// .. do something;

// stop receiving notifications
store.unsubscribe(cb);

The main difference between both methods is that subscribe receives the triples that have changed, whereas startObservingNode always receives the whole node with its updated triples. startObservingNode receives the node as an RDF Interfaces graph object.

var cb = function(node){
  // it will receive the updated version of the node each
  // time it is modified.
  // If the node does not exist, the graph received will
  // not contain triples.
  console.log("The node has now "+node.toArray().length+" triples");
};

// if only two arguments are passed, the default graph will be used.
// A graph URI can be passed as an optional second argument.
store.startObservingNode("http://example/book", cb);

// .. do something;

// stop receiving notifications
store.stopObservingNode(cb);

In the same way, the startObservingQuery and stopObservingQuery functions make it possible to set up callbacks for whole SPARQL queries. The store will try to be smart and avoid unnecessary re-evaluations of these queries after quad insertions/deletions. Nevertheless, overly broad queries must be used carefully with the events API.
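The pattern matching underlying subscribe can be sketched as checking each changed triple against an s/p/o pattern where null acts as a wildcard. The following is illustrative only, not the store's implementation:

```javascript
// Check a triple against an s/p/o pattern (null = wildcard).
function matches(pattern, triple) {
  return (pattern.s === null || pattern.s === triple.subject) &&
         (pattern.p === null || pattern.p === triple.predicate) &&
         (pattern.o === null || pattern.o === triple.object);
}

// Dispatch change notifications to callbacks whose pattern
// matches at least one of the changed triples.
function notify(subscriptions, event, triples) {
  subscriptions.forEach(function (sub) {
    var matching = triples.filter(function (t) { return matches(sub.pattern, t); });
    if (matching.length > 0) sub.callback(event, matching);
  });
}

var received = [];
var subscriptions = [{
  pattern: { s: 'http://example/book', p: null, o: null },
  callback: function (event, triples) {
    received.push(event + ':' + triples.length);
  }
}];

notify(subscriptions, 'added', [
  { subject: 'http://example/book', predicate: 'ex:title', object: 'SPARQL' },
  { subject: 'http://example/other', predicate: 'ex:title', object: 'X' }
]);
console.log(received); // ['added:1'] — only the matching triple is delivered
```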

###Custom Filter Functions

Custom filter functions can be registered in the store using the registerCustomFunction function. This function receives two arguments: the name of the custom function and the associated implementation. These functions will be available in a SPARQL query using the prefix custom. The function implementation receives two arguments: an object linking to the store's query filters engine, and a list with the actual arguments. Arguments will consist of literal or URI objects. Results from the function must also be literal or URI objects.

The query filters engine provides auxiliary functions: effectiveTypeValue transforms literals into JavaScript types, effectiveBooleanValue computes boolean values, ebvTrue and ebvFalse build boolean literal objects, and ebvError returns an error. Documentation and source code for the QueryFilters object in the 'js-query-engine' module can be consulted for information about additional helper functions.
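The role of effectiveTypeValue can be illustrated with a tiny converter from literal objects of the shape described above ({token, type, value}) to native JavaScript values. This is a sketch of the idea, not the engine's code:

```javascript
// Convert a SPARQL literal object into a native JavaScript value,
// the kind of conversion effectiveTypeValue performs.
var XSD = 'http://www.w3.org/2001/XMLSchema#';

function toJsValue(literal) {
  if (literal.token !== 'literal') return literal.value; // URIs pass through
  switch (literal.type) {
    case XSD + 'integer': return parseInt(literal.value, 10);
    case XSD + 'double':  return parseFloat(literal.value);
    case XSD + 'boolean': return literal.value === 'true';
    default:              return literal.value;           // plain literal
  }
}

console.log(toJsValue({ token: 'literal', type: XSD + 'integer', value: '5' }) + 1); // 6
```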

The following test shows a simple example of how custom functions can be invoked:

new Store.Store({name:'test', overwrite:true}, function(store) {
    store.load(
        'text/turtle',
        '@prefix test: <http://test.com/> .\
         test:A test:prop 5.\
         test:B test:prop 4.\
         test:C test:prop 1.\
         test:D test:prop 3.',
        function(success) {

            var invoked = false;
            store.registerCustomFunction('my_addition_check', function(engine, args) {
                // equivalent to var v1 = parseInt(args[0].value), v2 = parseInt(args[1].value);
                var v1 = engine.effectiveTypeValue(args[0]);
                var v2 = engine.effectiveTypeValue(args[1]);

                // equivalent to return {token: 'literal', type:"http://www.w3.org/2001/XMLSchema#boolean", value:(v1+v2<5)};
                return engine.ebvBoolean(v1+v2<5);
            });

            store.execute(
                'PREFIX test: <http://test.com/> \
                 SELECT * { ?x test:prop ?v1 .\
                            ?y test:prop ?v2 .\
                            filter(custom:my_addition_check(?v1,?v2)) }',
                function(success, results) {
                    test.ok(results.length === 3);
                    for(var i=0; i<results.length; i++) {
                        test.ok(parseInt(results[i].v1.value) + parseInt(results[i].v2.value) < 5);
                    }
                });
        });
});

RDFStore includes experimental support for web workers since version 0.4.0, in both the browser and Node.js versions. The store can be initialized in a new thread using the connect method of the Store object.

The library will try to create a worker and will return a connection object providing the same interface as the store object. If the creation of the worker fails, because web workers support is not enabled in the platform/browser, a regular store object will be returned instead. Since both objects implement the same interface, client code can be written without taking into consideration the actual implementation behind the store interface.

Store.connect("/js/rdfstore_min.js", {}, function(success, store) {
    if(success) {
      // store is a connection to the worker
      console.log(store.isWebWorkerConnection === true);
    } else {
      // connection was not possible. A store object has been returned instead
    }
    store.execute('INSERT DATA {  <http://example/book3> <http://example.com/vocab#title> <http://test.com/example> }', function(result, msg){
        store.execute('SELECT * { ?s ?p ?o }', function(success, results) {
          console.log(results.length === 1);
        });
    });
});
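The fallback behaviour (a worker-backed connection when workers are available, a regular store otherwise) can be sketched with plain feature detection. This is illustrative only; the real connect method also handles script loading and message passing:

```javascript
// Decide between a worker-backed store and a regular in-thread store,
// mirroring the fallback behaviour described above.
function chooseStoreFactory(workerFactory, localFactory) {
  if (typeof Worker !== 'undefined') {
    // Web workers available: proxy the store interface through a worker.
    return { isWebWorkerConnection: true, create: workerFactory };
  }
  // Web workers unavailable: fall back to a regular store.
  return { isWebWorkerConnection: false, create: localFactory };
}

var choice = chooseStoreFactory(
  function () { /* spawn worker and proxy calls */ },
  function () { /* create store in this thread */ }
);
console.log('worker connection? ' + choice.isWebWorkerConnection);
```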

The connect function receives three arguments: the URL/path where the script of the store is located, so the web workers layer can load it; a hash with the arguments for the Store.create function that will be used in the actual creation of the store object; and a callback that will be invoked with a success notification and the store implementation. In the Node.js version it is not required to provide the path to the store script; the location of the store module is provided by default.

At the moment, the usability of this feature is limited to those browsers where the web workers framework is enabled. It has been tested with the current version of Chrome and with Firefox Aurora 8.0a2. Support in the Node.js version is provided by the webworkers module; web workers are simulated in the Node.js version as forked child processes executed in the background. A version of Node >= 0.6.1 is required.

Web worker threads execute in the browser in a very restrictive environment for security reasons. Web workers, for example, cannot access the local storage API. As a consequence, workers cannot be used with the persistent version of the store. These restrictions are not present in the Node.js version.

##Standalone RDF-JS Interface API

It is now also possible to use the RDF JS Interface API as a standalone module, without the rest of the store.

The code is distributed for Node.js as the 'rdf_js_interface' module. It can be installed directly from NPM:

$ npm install rdf_js_interface

After installing the module it can be required in the code of a Node.js application:

var RDFJSInterface = require('rdf_js_interface');

var graph = new RDFJSInterface.Graph();

// assuming 'rdf' is an RDF environment object from the module
// exposing the RDF Interfaces factory methods
graph.add(rdf.createTriple( rdf.createBlankNode(),
                            rdf.createNamedNode("rdf:type"),
                            rdf.createNamedNode("http://test.com/MyClass") ));

The module has also been compiled for the browser. The original and minimised versions can be found here:

The module declares a new property 'RDFJSInterface' in the 'window' object pointing to the API object:

var graph = new RDFJSInterface.Graph();

// assuming 'rdf' is an RDF environment object from the module
// exposing the RDF Interfaces factory methods
graph.add(rdf.createTriple( rdf.createBlankNode(),
                            rdf.createNamedNode("rdf:type"),
                            rdf.createNamedNode("http://test.com/MyClass") ));

##Related libraries and examples

There are some other libraries we have developed that can be used with rdfstore-js to make it easier to build JS applications using RDF and linked data:

  • SemanticKO, an extension for Knockout.js that makes it possible to establish bidirectional bindings between the DOM tree and the RDF graph. It also includes some other utilities for building single-page JS applications on top of the RDF graph stored by rdfstore-js.
  • JSON-LD Macros, a library for describing transformations of JSON APIs into JSON-LD so they can be imported into rdfstore-js.

We have also built some demo applications used to test the store:

  • Geek Talk, a web client aggregating information about a GitHub project from different data APIs like Twitter, HackerNews or StackOverflow (github).
  • social.rdf, a personal linked data server collecting one user's information from different social web sites (github).
  • social.rdf vis, an example of how to use rdfstore-js with a data visualization library like d3.js.
  • SemanticKO examples, a collection of interactive examples of SemanticKO.
  • JSON-LD Macros example, a small interactive example of how to use JSON-LD Macros to build data transformations.
  • Node.js WebID demo.

##Reusable modules

rdfstore-js is built from a collection of general purpose modules. Some of these modules can be easily extracted from the library and used on their own.

This is a listing of the main modules that can be re-used:

  • src/js-trees: in-memory and persistent tree data structures: binary trees, red-black trees, b-trees, etc.
  • src/js-sparql-parser: a SPARQL parser and a Turtle parser built using the PEG.js parsing expression grammar library.
  • src/js-trees/src/utils: a continuation passing style inspired library for different code flow constructions
  • src/js-communication/src/json_ld_parser: JSON-LD parser implementation.
  • src/js-query-engine/src/rdf_js_interface: JavaScript Interface 1.0 API implementation.


rdfstore-js is still at the beginning of its development. If you take a look at the library and find a way to improve it, please ping us. We'll be very grateful for any bug report or pull request.


Antonio Garrote, email:antoniogarrote@gmail.com, twitter:@antoniogarrote.

This code includes a modified version of the JSON-LD parser built by Digital Bazaar (see LICENSE file at https://github.com/digitalbazaar/jsonld.js/blob/master/LICENSE). It also includes a modified version of N3 parser developed as part of the rdflib/Tabulator project (https://github.com/linkeddata/rdflib.js) under the MIT license.


Christian Langanke


Licensed under the GNU Lesser General Public License Version 3 (LGPLV3), copyright Antonio Garrote 2011-2012.