This repository was archived by the owner on May 31, 2019. It is now read-only.

[WIP] feat: Read all graphql configuration from postgres #9

Closed
david-martin wants to merge 8 commits into aerogear-attic:master from david-martin:read-gql-config-from-db

Conversation

Contributor

@david-martin david-martin commented Jul 2, 2018

feat: Read Data Source, Schema & Resolver configuration from postgres

The sequelize module is used for this.
A DataSource, GraphQLSchema & Resolver model is defined.
On startup, the first schema is retrieved from the database,
and all Resolvers & DataSources are read and passed into the schema parser
logic.

The local dev setup has changed due to the introduction of postgres.
See README for details.
The gist of it is:

```
npm run db:start
npm run db:seed      # Creates sequelize model tables & seeds some example data
npm run dev
```

Access graphiql on http://localhost:8000/graphiql

To stop:

```
npm run db:stop
```
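For context, the `db:*` scripts above are thin wrappers. A hypothetical `package.json` fragment is sketched below; the script names come from this PR, but the commands are illustrative only and not taken from the repo:

```json
{
  "scripts": {
    "db:start": "docker run --rm --name=postgres -p 5432:5432 -e POSTGRES_PASSWORD=mysecretpassword -d postgres",
    "db:seed": "node ./scripts/seed.js",
    "db:stop": "docker stop postgres",
    "dev": "nodemon ./server/server.js"
  }
}
```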

Other changes in this PR

  • remove example json files as they are no longer used
  • remove k8s setup from README (to be replaced by a link to APB repo/setup in future)
  • a change to the resolverMaker & datasourceParser logic to look for a `field` and `name` key respectively on each item in the array, rather than using the key of a wrapping object
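The last bullet can be sketched roughly as follows. This is a hypothetical shape, not the actual `dataSourceParser` code: each item in the array now carries its own `name` field, instead of the parser relying on the key of a wrapping object.

```javascript
// Rough sketch of the keying change: input is an array of items that
// each carry a `name` field; output is an object keyed by those names,
// which is how the rest of the parser logic expects to look things up.
function keyDataSourcesByName (dataSources) {
  const byName = {}
  dataSources.forEach(ds => {
    byName[ds.name] = { type: ds.type, config: ds.config }
  })
  return byName
}
```

For example, `keyDataSourcesByName([{ name: 'memeolist', type: 'Postgres', config: {} }])` (a hypothetical data source name) yields an object keyed by `memeolist`, matching how rows come back from sequelize as an array.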

TODO:

  • Add models for GraphQLSchema, Resolver & Data Sources, removing all file based config
  • Merge with in-memory poc changes
  • Verify schema is OK for the Admin UI
  • Tests

Separate follow up PR for:

  • Auto reload the schema if the database table changes (any creates/deletes/updates)

@david-martin david-martin requested review from aliok and darahayes July 2, 2018 16:13
```
  })

  return DataSource
}
```
Contributor Author

@david-martin david-martin Jul 2, 2018


@pb82 fyi, here is the sequelize model for a Data Source, based on the schema being used for the UI
(i.e. https://github.com/aerogear/data-sync-ui/blob/master/gql/schema.js#L4-L16)

I omitted an 'id' field as that is automatically added (auto increment primary key) by sequelize.
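A sketch of what the model factory might look like, based on the linked UI schema. The `ENUM` values and the `JSON` config column are assumptions matching the data source types discussed in this PR; the actual file may differ:

```javascript
// Sketch of a sequelize model factory for DataSource. `id` is omitted
// because sequelize adds an auto-increment primary key automatically.
const defineDataSource = (sequelize, DataTypes) => {
  const DataSource = sequelize.define('DataSource', {
    name: DataTypes.STRING,
    type: DataTypes.ENUM('InMemory', 'Postgres'),
    config: DataTypes.JSON
  })
  return DataSource
}

module.exports = defineDataSource
```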

Contributor


@david-martin thanks, we will use the same.

README.md Outdated
```
docker run --rm --name=postgres -p 5432:5432 -e POSTGRES_PASSWORD=mysecretpassword -v `pwd`/examples:/tmp/examples -d postgres
docker exec postgres psql -U postgres -f /tmp/examples/create_tables.example.sql
docker exec postgres psql -U postgres -f /tmp/examples/create_example_datasource.sql
```
Contributor


This fails because the table is not created before the insert operation. @david-martin

Contributor Author


Good catch.
I'll update it to start the server before creating the example datasource.

Contributor


So, what I understand is that Sequelize should create the table, but it didn't in my case. After I manually created the table and ran the server, I saw log statements like "CREATE TABLE ..." in the logs.

Here are the log statements:

```
Executing (default): SELECT "id", "name", "type", "config", "createdAt", "updatedAt" FROM "DataSources" AS "DataSource";
Executing (default): SELECT t.typname enum_name, array_agg(e.enumlabel ORDER BY enumsortorder) enum_value FROM pg_type t JOIN pg_enum e ON t.oid = e.enumtypid JOIN pg_catalog.pg_namespace n ON n.oid = t.typnamespace WHERE n.nspname = 'public' AND t.typname='enum_DataSources_type' GROUP BY 1
Executing (default): CREATE TYPE "public"."enum_DataSources_type" AS ENUM('InMemory', 'Postgres');
Executing (default): SELECT typname, typtype, oid, typarray FROM pg_type WHERE (typtype = 'b' AND typname IN ('hstore', 'geometry', 'geography')) OR (typtype = 'e')
Executing (default): CREATE TABLE IF NOT EXISTS "DataSources" ("id"   SERIAL , "name" VARCHAR(255), "type" "public"."enum_DataSources_type", "config" JSON, "createdAt" TIMESTAMP WITH TIME ZONE NOT NULL, "updatedAt" TIMESTAMP WITH TIME ZONE NOT NULL, PRIMARY KEY ("id"));
Executing (default): SELECT i.relname AS name, ix.indisprimary AS primary, ix.indisunique AS unique, ix.indkey AS indkey, array_agg(a.attnum) as column_indexes, array_agg(a.attname) AS column_names, pg_get_indexdef(ix.indexrelid) AS definition FROM pg_class t, pg_class i, pg_index ix, pg_attribute a WHERE t.oid = ix.indrelid AND i.oid = ix.indexrelid AND a.attrelid = t.oid AND t.relkind = 'r' and t.relname = 'DataSources' GROUP BY i.relname, ix.indexrelid, ix.indisprimary, ix.indisunique, ix.indkey ORDER BY i.relname;
```

It first does a "SELECT" and in later stages it does "CREATE TABLE IF NOT EXISTS".

Contributor


Other than this, progress so far looks good.

Contributor Author


@aliok I've updated the README steps, and changed the order on startup to fully initialise models first (i.e. create if not exists) before trying to list any.

i.e. this commit: 8f83102
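The startup-order fix described above (sync models first, then query) might look roughly like this. The function and model names are hypothetical; `sequelize.sync()` issues `CREATE TABLE IF NOT EXISTS` for every defined model, which is why it must run before any `SELECT`:

```javascript
// Minimal sketch of the fixed startup order: create tables if they
// don't exist, and only then list schemas, data sources and resolvers.
async function start (sequelize, { DataSource, GraphQLSchema, Resolver }) {
  await sequelize.sync() // CREATE TABLE IF NOT EXISTS for all models

  // Now it is safe to query: the tables are guaranteed to exist.
  const schema = await GraphQLSchema.findOne()
  const dataSources = await DataSource.findAll()
  const resolvers = await Resolver.findAll()
  return { schema, dataSources, resolvers }
}
```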

@darahayes

@david-martin is this ready to go?

@david-martin
Contributor Author

@darahayes not quite.
About to push resolver & schema models too.
Also, I'd like to try to get some tests in place.

@david-martin david-martin changed the title from "[WIP] feat: Read Data Source configuration from posgres" to "[WIP] feat: Read all graphql configuration from postgres" Jul 3, 2018
```
@@ -0,0 +1,19 @@
module.exports = (sequelize, DataTypes) => {
```
Contributor Author


```
@@ -0,0 +1,7 @@
module.exports = (sequelize, DataTypes) => {
```
Contributor Author


fyi @pb82, model for GraphQLSchema. See https://github.com/aerogear/data-sync-server/pull/9/files#diff-5709bb01e7a28a79644cde19374931d6R3 for an example of what gets inserted.

@david-martin
Contributor Author

@darahayes @aliok I've rebased and pushed a significant update to this.
Please see the PR description for info.

@wtrocki wtrocki self-requested a review July 3, 2018 12:15
Dara Hayes and others added 7 commits July 3, 2018 13:56
The `sequelize` module is used for this.
A `DataSource`, `GraphQLSchema` & `Resolver` model is defined.
On startup, the first schema is retrieved from the database,
and all Resolvers & DataSources are read and passed into the schema parser
logic.

The local dev setup has changed due to the introduction of postgres.
See README for details.
The gist of it is:

```
npm run db:start
npm run db:seed      # Creates sequelize model tables & seeds some example data
npm run dev
```

to stop:

```
npm run db:stop
```

The server is expected to fail starting at this time as the `InMemory`
data source is not implemented yet.
You should see an error like this on startup if the db is working OK.

```
Error: Unhandled data source type: InMemory
    at _.forEach (/home/dmartin/work/data-sync-server/server/lib/dataSourceParser.js:16:13)
    at arrayEach (/home/dmartin/work/data-sync-server/node_modules/lodash/lodash.js:516:11)
    at Function.forEach (/home/dmartin/work/data-sync-server/node_modules/lodash/lodash.js:9342:14)
    at module.exports (/home/dmartin/work/data-sync-server/server/lib/dataSourceParser.js:7:5)
    at module.exports (/home/dmartin/work/data-sync-server/server/lib/schemaParser.js:7:23)
    at module.exports (/home/dmartin/work/data-sync-server/server/server.js:32:14)
    at <anonymous>
```

This error should not happen after the `InMemory` data source and
resolver mappings have been added to the server.
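The stack trace above comes from the parser's dispatch on data source type. Schematically, it behaves like the sketch below (a hypothetical shape, not the actual `dataSourceParser.js` code): any type without a registered handler throws at startup.

```javascript
// Schematic of the type dispatch that produces the error above: a
// known type is built into a usable data source, anything else throws.
function buildDataSource (dataSource) {
  switch (dataSource.type) {
    case 'Postgres':
      // a real implementation would construct a postgres client
      // from dataSource.config here
      return { type: 'Postgres', client: null }
    default:
      throw new Error(`Unhandled data source type: ${dataSource.type}`)
  }
}
```

Once an `InMemory` case is added to this dispatch, seeding an `InMemory` row no longer crashes the server.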
@david-martin
Contributor Author

@darahayes I have rebased on top of your in-memory datasource branch.
It's now possible to run the server (after starting & seeding the db),
and make requests using graphiql against the in memory data source.

@darahayes

Closing this because changes were merged with #12

@darahayes darahayes closed this Jul 4, 2018