Thoth Storages

This repository provides a library called thoth-storages used in project Thoth. The library exposes core queries and methods for the Dgraph database, as well as adapters for manipulating data stored on Ceph via its S3-compatible API.

Installation and Usage

The library can be installed via pip or Pipenv from PyPI:

pipenv install thoth-storages

The library does not provide any CLI; it is a low-level library supporting other parts of Thoth.

You can run the prepared test suite via the following commands:

pipenv install --dev
pipenv run python3 test

# To generate docs:
pipenv run python3 build_sphinx

Automatically generate schema for the graph database

To automatically generate the schema for the graph database from the models defined in this module, run:

PYTHONPATH=. pipenv run python3 ./ --output thoth/storages/graph/schema.rdf

After running this command, the RDF file describing the schema will be updated based on changes in the models.

To use the graph database adapter from Python code:

from thoth.storages import GraphDatabase

# Also provide configuration if needed.
graph = GraphDatabase()

Running Dgraph locally

You can use the docker-compose present in this repository to run a local Dgraph instance. It does not use TLS certificates (so you must not provide the GRAPH_TLS_PATH environment variable).

$ docker-compose up

After running the command above (make sure the Docker daemon is up, e.g. using systemctl start docker), you should be able to access a local Dgraph instance at localhost:9080. This is also the default configuration for the Dgraph adapter - you don't need to provide GRAPH_SERVICE_HOST explicitly.
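The host-resolution precedence described above can be pictured with a small helper (a hypothetical sketch; graph_address is not part of the library, and treating GRAPH_SERVICE_HOST as a plain host:port string is an assumption here):

```python
import os

def graph_address(default="localhost:9080"):
    """Return the Dgraph address an adapter could use: the
    GRAPH_SERVICE_HOST environment variable wins, otherwise the
    local docker-compose default applies."""
    return os.getenv("GRAPH_SERVICE_HOST", default)
```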

The provided docker-compose also has Ratel enabled to provide a UI for browsing graph database content. To access it, visit http://localhost:8000/.

The provided docker-compose uses a volume mounted from /tmp, so the content will no longer be available after your computer restarts.

If you would like to experiment with Dgraph programmatically, you can use the following code snippet as a starting point:

from thoth.storages import GraphDatabase

graph = GraphDatabase()
# To clear database:
# graph.drop_all()
# To initialize schema in the graph database:
# graph.initialize_schema()

Schema adjustment in deployment

It's possible to perform adjustments of the schema in a deployment. It's important that there are no open transactions (simply retry schema creation until it succeeds). You can use the relevant endpoint on the Management API for this purpose.
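The retry-until-it-succeeds advice can be sketched with a small helper (hypothetical; retry_schema_init is not part of the library - in practice you would pass in something like graph.initialize_schema or call the Management API endpoint):

```python
import time

def retry_schema_init(initialize_schema, attempts=10, base_delay=0.1):
    """Call the given schema-initialization callable until it succeeds,
    backing off between attempts; returns the number of attempts used."""
    for attempt in range(attempts):
        try:
            initialize_schema()
            return attempt + 1
        except Exception:
            # Open transactions make schema adjustment fail transiently;
            # the concrete exception type is adapter-specific, so catch broadly.
            time.sleep(base_delay * 2 ** attempt)
    raise RuntimeError("schema adjustment did not succeed")
```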

If there are changes in types, Dgraph tries to automatically convert values from the old type to the new one described in the new schema (e.g. a float to a string). Invalid schema changes (e.g. parsing a string into a float when the string cannot be parsed as a float) result in schema change errors. These errors need to be handled programmatically by the deployment administrator (ideally, avoid such conversions altogether).
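The asymmetry is the same one plain Python exhibits: rendering a float as a string always works, while parsing an arbitrary string as a float may not (an analogy only - Dgraph performs these conversions internally):

```python
# A float always has a string representation...
assert str(0.5) == "0.5"

# ...but an arbitrary string need not parse as a float; this is the kind
# of conversion that surfaces as a schema change error in Dgraph.
try:
    float("not-a-number")
except ValueError as exc:
    print(f"conversion failed: {exc}")
```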

Creating your own performance indicators

You can create your own performance indicators. To do so, create a script which tests the desired functionality of a library. An example is the matrix multiplication script present in the performance repository. This script can be supplied to Dependency Monkey to validate a certain combination of libraries in a desired runtime and buildtime environment, or run directly on Amun API using the desired software and hardware configuration. Please follow the instructions on how to create a performance script in the README of the performance repository.
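Such a script might look like the following minimal sketch (pure-Python matrix multiplication for illustration; the exact reporting format expected by Amun, shown here as a JSON document with @parameters and @result keys, is an assumption based on the section names mentioned below):

```python
import json
import random
import time

def matmul(a, b):
    """Naive multiplication of two square matrices given as lists of lists."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def run_pi(matrix_size=32, dtype="float64", reps=3, device="cpu"):
    """Run the micro-benchmark and report its parameters and results."""
    a = [[random.random() for _ in range(matrix_size)] for _ in range(matrix_size)]
    b = [[random.random() for _ in range(matrix_size)] for _ in range(matrix_size)]
    start = time.perf_counter()
    for _ in range(reps):
        matmul(a, b)
    elapsed = time.perf_counter() - start
    rate = reps / elapsed if elapsed > 0 else float("inf")
    return {
        "@parameters": {"matrix_size": matrix_size, "dtype": dtype,
                        "reps": reps, "device": device},
        "@result": {"elapsed": elapsed, "rate": rate},
    }

if __name__ == "__main__":
    print(json.dumps(run_pi(), indent=2))
```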

To create the relevant models, adjust the relevant file under thoth/storages/graph/ and add your model. Describe the parameters (reported in the @parameters section of the performance indicator result) and the result (reported in the @result section). The name of the class should match the name reported by the performance indicator run.

class PiMatmul(PerformanceIndicatorBase):
    """A class for representing a matrix multiplication micro-performance test."""

    # Schema of parameters reported in the @parameters section.
    SCHEMA_PARAMETERS = Schema({
        Required("matrix_size"): int,
        Required("dtype"): str,
        Required("reps"): int,
        Required("device"): str,
    })

    # Schema of the result reported in the @result section.
    SCHEMA_RESULT = Schema({
        Required("elapsed"): float,
        Required("rate"): float,
    })

    # Device used during performance indicator run - CPU/GPU/TPU/...
    device = model_property(type=str, index="exact")
    matrix_size = model_property(type=int, index="int")
    dtype = model_property(type=str, index="exact")
    reps = model_property(type=int, index="int")
    elapsed = model_property(type=float)
    rate = model_property(type=float)

After you have created the relevant model, register it in ALL_PERFORMANCE_MODELS and re-generate the graph database schema (as discussed above).
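Registration can be pictured as extending a single collection that the schema generator iterates over (a self-contained sketch; the stand-in classes and the exact shape of ALL_PERFORMANCE_MODELS are assumptions for illustration):

```python
class PerformanceIndicatorBase:
    """Stand-in for the library's base class (illustration only)."""

class PiMatmul(PerformanceIndicatorBase):
    """The newly added performance indicator model."""

# Assumption: performance models are collected in one tuple so the schema
# generator can pick them up when regenerating the schema file.
ALL_PERFORMANCE_MODELS = (
    PiMatmul,
)
```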

Online debugging of queries issued to Dgraph

You can print to the logger all queries performed against a Dgraph instance. To do so, set the following environment variables:
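Independently of those variables, a generic way to surface a library's query logging is to enable DEBUG level for its logger (a plain Python logging sketch; whether the adapter emits queries at DEBUG level under the "thoth.storages" logger namespace is an assumption):

```python
import logging

# Show all log records, including DEBUG, on stderr.
logging.basicConfig(level=logging.DEBUG)

# Assumption: the adapter logs through a logger under the
# "thoth.storages" namespace; lower its threshold explicitly.
logging.getLogger("thoth.storages").setLevel(logging.DEBUG)
```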
