REST API

cnoelle edited this page Apr 8, 2018 · 9 revisions

Setup

The REST endpoint for FendoDB is https://localhost:8443/rest/fendodb, if running on the local machine with default settings; otherwise adapt host and port. When running with OGEMA, the credentials of a machine user must be passed with the request, for instance by appending the user and pw parameters. The example rundirs come with a default user rest with password rest, which we will use throughout this page. Note that the user must be assigned the permission to access the FendoDB endpoint; for statically configured users this can be achieved by editing the respective entry in the file config/ogema.roles:

allow { [machine "testuser"] (org.smartrplace.logging.fendodb.permissions.FendoDbPermission "*" "*") } "testuser"

followed by a clean start. The default rest user already has the required permission.

Note that the example rundirs come with self-signed TLS certificates, hence we need to disable certificate checks when accessing the REST endpoint via HTTPS. In the curl examples below, the -k parameter takes care of this.

Formatting

The REST interface can provide JSON, XML or CSV data. Simply adapt the Accept and/or Content-Type headers accordingly; the default is CSV if the respective header is missing. Supported media types:

  • json: application/json
  • xml: application/xml
  • csv: text/csv

Time conventions: timestamps may be passed either as long values (milliseconds since 1970-01-01T00:00:00Z), or formatted according to the pattern yyyy-MM-dd('T'HH(:mm(:ss))), where the brackets denote optional parts. Hence, for instance '2018-04-05', '2018-04-05T12', '2018-04-05T12:23:00' and '1522886400000' are all admissible. For the time being, the time zone cannot be specified; all date-times are interpreted in UTC time.
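The equivalence between the formatted and the long representation can be checked with a few lines of Python (illustration only, not part of FendoDB):

```python
from datetime import datetime, timezone

# '2018-04-05' interpreted in UTC, as the REST interface does,
# converted to milliseconds since 1970-01-01T00:00:00Z.
dt = datetime(2018, 4, 5, tzinfo=timezone.utc)
millis = int(dt.timestamp() * 1000)
print(millis)  # 1522886400000
```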

Getting started

After starting the framework, verify that the REST endpoint is accessible by issuing a request to root:

curl -k -i -H "Accept: application/json" -X GET "https://localhost:8443/rest/fendodb?user=rest&pw=rest"

This will list all available database instances. If we started OGEMA with the default settings, the data/slotsdb instance should be shown. Now let us create a new instance fendodb0 for testing:

curl -k -i -X PUT "https://localhost:8443/rest/fendodb?db=fendodb0&target=database&user=rest&pw=rest"

Sending again a GET request to root, we should now see the new database. Verify that the new instance is empty:

curl -k -i -H "Accept: application/json" -X GET "https://localhost:8443/rest/fendodb?db=fendodb0&target=find&user=rest&pw=rest"
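All requests follow the same pattern of query parameters appended to the /rest/fendodb root, so when scripting against the API it can be convenient to build the URLs programmatically. A minimal Python sketch (the helper name is ours; host, port and credentials are the defaults from above):

```python
from urllib.parse import urlencode

BASE = "https://localhost:8443/rest/fendodb"

def fendodb_url(**params):
    """Build a FendoDB request URL from query parameters."""
    # Credentials are passed like any other query parameter.
    params.setdefault("user", "rest")
    params.setdefault("pw", "rest")
    return BASE + "?" + urlencode(params)

print(fendodb_url(db="fendodb0", target="find"))
# https://localhost:8443/rest/fendodb?db=fendodb0&target=find&user=rest&pw=rest
```

Building the query string with urlencode also takes care of escaping special characters in ids or property values.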

Create the first timeseries:

curl -k -i -X PUT "https://localhost:8443/rest/fendodb?db=fendodb0&target=timeseries&id=test0&updatemode=on_value_update&user=rest&pw=rest"

and add a data point:

curl -k -i -H "Content-Type: application/json" -d '{"time":"2018-04-05T00:00:00","value":23.2,"quality":"GOOD"}' -X POST "https://localhost:8443/rest/fendodb?db=fendodb0&target=value&id=test0&user=rest&pw=rest"

Read timeseries data:

curl -k -i -H "Accept: application/json" -X GET "https://localhost:8443/rest/fendodb?db=fendodb0&id=test0&target=data&user=rest&pw=rest"

Optionally, you may specify a start and end timestamp for the data to be read, using the start and end parameters respectively.

Note that you can also set multiple data points at once:

curl -k -i -H "Content-Type: application/json" -d $'{"time":"2018-04-06T01:00:00","value":1,"quality":"GOOD"}\n{"time":"2018-04-06T02:00:00","value":2,"quality":"GOOD"}' -X POST "https://localhost:8443/rest/fendodb?db=fendodb0&target=values&id=test0&user=rest&pw=rest"

The content does not strictly adhere to the JSON specification here; rather, it consists of a stream of JSON documents, one per line.
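Such a body can be assembled by serializing each point individually and joining the results with newlines, for instance in Python (illustration only):

```python
import json

points = [
    {"time": "2018-04-06T01:00:00", "value": 1, "quality": "GOOD"},
    {"time": "2018-04-06T02:00:00", "value": 2, "quality": "GOOD"},
]
# One JSON document per line; the concatenation is not itself valid JSON.
body = "\n".join(json.dumps(p) for p in points)
print(body)
```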

Reference

Below is a list of supported REST operations. All requests must be sent to the root "/rest/fendodb"; path information is not used. Authentication parameters (user, pw) are omitted here.

GET

  • get all database instances
    Required: none

  • get all timeseries in a db
    Required: target=data, db=<database id>

  • get all data points in a ts
    Required: target=data, db=<database id>, id=<timeseries id>
    Optional: start=<timestamp>, end=<timestamp>, interval=<sampling interval in ms>

  • get next/previous value
    Required: target=nextvalue|previousvalue, db=<database id>, id=<timeseries id>, time=<timestamp>

  • get size
    Required: target=size, db=<database id>, id=<timeseries id>
    Optional: start=<timestamp>, end=<timestamp>

  • get all tags (property keys)
    Required: target=tags, db=<database id>
    Optional: id=<timeseries id> (if not present, tags from all timeseries are returned)

  • search for timeseries
    Required: target=find, db=<database id>
    Optional: id=<list of timeseries ids>, properties=<properties as key=value pairs>, tags=<list of tags>, idexcluded=<list of timeseries ids>

  • get statistics on timeseries
    Required: target=stats, db=<database id>, provider=<list of statistics providers; example providers: avg, cnt, max, min, maxT, minT>
    Optional: id=<list of timeseries ids>, properties=<properties as key=value pairs>, tags=<list of tags>, idexcluded=<list of timeseries ids>, start=<timestamp>, end=<timestamp>

PUT

  • create a database instance
    Required: target=database, db=<database id>

  • create a timeseries, or update its update mode
    Required: target=timeseries, db=<database id>, id=<timeseries id>
    Optional: updatemode=on_value_update|on_value_changed|fixed_interval, interval=<interval in ms> (only if updatemode is fixed_interval)

  • add properties
    Required: target=properties, db=<database id>, id=<timeseries id>, properties=<list of key=value pairs>

POST

  • add a data point
    Required: target=value, db=<database id>, id=<timeseries id>
    Body: a serialized data point (see below)

  • add multiple data points
    Required: target=values, db=<database id>, id=<timeseries id>
    Body: a stream of serialized data points (see below)

A single data point has the following format:

JSON

{
    "time":1523094069000,
    "value":23.4,
    "quality":"GOOD"
}

XML

<entry xsi:type="SampledDouble">
    <time>1519030977043</time>
    <value>12.0</value>
    <quality>GOOD</quality>
</entry>

CSV

1519030977043;12.0;GOOD

The last column (quality) is optional for CSV; if not present, quality GOOD is assumed. In general, quality can be GOOD or BAD; the exact meaning of the two values may depend on the context. Typically, quality BAD is used to mark gaps in a timeseries.
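Parsing a single CSV data point with the optional quality column could look as follows (a hypothetical helper for illustration, not part of FendoDB):

```python
def parse_csv_point(line: str) -> dict:
    """Parse one 'timestamp;value[;quality]' CSV line; quality defaults to GOOD."""
    parts = line.strip().split(";")
    time, value = int(parts[0]), float(parts[1])
    quality = parts[2] if len(parts) > 2 else "GOOD"
    return {"time": time, "value": value, "quality": quality}

print(parse_csv_point("1519030977043;12.0;GOOD"))
print(parse_csv_point("1519030977043;12.0"))  # quality defaults to GOOD
```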

A stream of data points simply consists of multiple data points. Note that a stream of JSON or XML values is not itself valid JSON or XML, respectively; nevertheless, the same Content-Type and Accept headers are used for streams. A stream of CSV values is separated by the linebreak character '\n'.

DELETE

  • delete data before or after a given timestamp (note that this applies to all timeseries in the database; FendoDB does not support deleting individual data points)
    Required: target=data, db=<database id>, and at least one of start=<timestamp> (delete data younger than start), end=<timestamp> (delete data older than end)

  • delete a timeseries
    Required: target=timeseries, db=<database id>, id=<timeseries id>

  • delete a property (key=value pair)
    Required: target=properties, db=<database id>, id=<timeseries id>, properties=<list of key=value pairs>

  • delete a tag (all properties with a given key)
    Required: target=tag, db=<database id>, id=<timeseries id>, tags=<list of property keys>
