
reduce nesting at database level #32

Closed
jaime-ez wants to merge 7 commits
Conversation

@jaime-ez commented May 8, 2020

As with the other connectors, the deepstream data structure is now transformed when written to the database and transformed back when querying. This is a breaking change, so the version is bumped to 3.0.0.
It also adds some connection opts to make the tests more accurate, and a few files were linted.

@yasserf commented May 9, 2020

I do apologise for this, but I feel like converting this to typescript and bringing it up to spec with the rest of the connectors is a good idea before doing a major version jump.

Happy to do the legwork on this.

@jaime-ez commented

Ok on the typescript rewrite. I'm not fluent in typescript, so if you can start, I could contribute once the core is rewritten.
But I don't see how you could remove the transform-data functionality from the connector... you could put the functions inside the set and get methods, or is there another way? (I ask because for now I'll use my own version of the connector.)

@yasserf commented May 10, 2020

Yeah, no worries for asking!

Postgres is a bit of a tricky JSON store. Normally I would personally write custom adapters to insert into a normal column-based database.

However, the connector is generic, which means when we insert a record we store it as JSON with the following structure:

{
  [deepstream_version_key]: version,
  ...value
}

instead of

{ 
  _d: value,
  v: version 
}

and when reading we just do

const { [deepstream_version_key]: version, ...value } = dbData
callback(name, version, value)
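
The write side would presumably just do the inverse spread. A minimal sketch, assuming the storage API's usual set(name, version, value, callback) shape; deepstream_version_key and insertRow are stand-ins, not the connector's actual names:

const deepstream_version_key = '__ds' // assumed key name; configurable in the real connector

declare function insertRow (name: string, json: string, callback: (error: string | null) => void): void

// hypothetical write path, mirroring the destructure shown above for reads
function set (name: string, version: number, value: object, callback: (error: string | null) => void) {
  const dbData = { [deepstream_version_key]: version, ...value }
  insertRow(name, JSON.stringify(dbData), callback)
}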

There are caveats with the terrible way lists work.

@jaime-ez commented May 10, 2020 via email

@yasserf commented May 10, 2020

Okay, so I just pushed all the typescript changes in.

The issue is that this connector tries to be really smart, which works for some use cases but is a pain for others. Basically it assumes reads happen less often than writes, but it also has a write buffer. That ties into the assumption that you have a cache layer: you can lazily write to the DB and hence buffer updates, but a read means you missed the cache, so it's more urgent.
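
To illustrate the buffering idea (this is just a minimal sketch, not the connector's actual code): coalesce writes per key and flush only the latest values in one batch after a short delay.

declare function writeBatchToDb (batch: Array<[string, string]>): void // stand-in for one bulk insert

const pending = new Map<string, string>()
let flushTimer: ReturnType<typeof setTimeout> | null = null

function bufferWrite (key: string, value: string, flushAfterMs = 50) {
  pending.set(key, value) // later writes to the same key overwrite earlier ones
  if (!flushTimer) {
    flushTimer = setTimeout(() => {
      flushTimer = null
      const batch = Array.from(pending.entries())
      pending.clear()
      writeBatchToDb(batch)
    }, flushAfterMs)
  }
}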

Anyway, the point is that the majority of the functionality works, but it isn't complete yet. The tests are written in a way that makes it really hard to fix them without commenting things in and out, due to their linear nature.

The biggest and most important change is this:

| id       | version | val  |
|----------|---------|------|
| mercedes | 10      | "{}" |

which pretty much decouples versions entirely from us manipulating the data, which is epic.
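
For illustration, with the version in its own column you can patch the JSON value directly in SQL and bump the version in the same statement. A hedged sketch with the pg library (the cars table name and the color path are made up):

import { Client } from 'pg'

async function bumpColor () {
  const client = new Client() // connection settings come from PG* env vars
  await client.connect()
  // jsonb_set patches one path inside val, and version increments atomically
  await client.query(
    'UPDATE cars SET val = jsonb_set(val, $1, $2), version = version + 1 WHERE id = $3',
    ['{color}', JSON.stringify('red'), 'mercedes']
  )
  await client.end()
}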

@yasserf commented May 10, 2020

I won't be able to work on this for another week; I have a pretty busy schedule ahead of me, tbh. Typescript compiles, some of the tests pass, and the main issue is with notifications. If you're happy to look at it, awesome; otherwise I will look at it when it's next on my list. Already spent way more time on deepstream for this week lol

@jaime-ez commented

Looks awesome, I'll try to dive in in the near future. I too have spent more time than scheduled updating to deepstream v5 :), but I'm glad I did; it's been a great experience getting to know the codebase a bit and learning the patterns. Thanks for all the help!

@yasserf commented May 10, 2020

Super happy with all your feedback and help too! I'll try to sneak in some time to get Postgres done; all the heavy lifting is there already, so it should only take a couple more hours before releasing.

Not related to the PR, but what version were you upgrading from?

@yasserf commented May 10, 2020

I should also update the non-storage APIs in this repo to use async/await; it will make the tests way cleaner and is generally just a better experience.
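
The conversion itself is the usual promisification. A rough sketch of the pattern, not the repo's actual code (the get callback shape here is an assumption):

// wrap the callback-style get once, then tests can simply await it
function getRecord (connector: any, name: string): Promise<{ version: number, value: object }> {
  return new Promise((resolve, reject) => {
    connector.get(name, (error: string | null, version: number, value: object) => {
      error ? reject(new Error(error)) : resolve({ version, value })
    })
  })
}

// usage in a test: const { version, value } = await getRecord(connector, 'some-key')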

@yasserf commented May 10, 2020

One last comment as well: if I were to use Postgres and deepstream, I wouldn't make deepstream create the tables automatically. Instead I would just create the Postgres tables and the schema as normal, and then use INSERT ... ON CONFLICT DO UPDATE (upsert) across all the fields (instead of a single value column). That way you can do all the Postgres magic.
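
A hedged sketch of that approach with pg, where the schema is created by hand and writes go through one upsert (the cars table and its columns are assumptions):

import { Client } from 'pg'

// upsert one record into a hand-made schema instead of a generic value column
async function upsertCar (client: Client, id: string, color: string, price: number) {
  await client.query(
    `INSERT INTO cars (id, color, price) VALUES ($1, $2, $3)
     ON CONFLICT (id) DO UPDATE SET color = EXCLUDED.color, price = EXCLUDED.price`,
    [id, color, price]
  )
}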

Another benefit for me from that is

Postgres -> generate typescript -> generate JSON schemas

That way you can also validate all your records as they enter deepstream. It's a workflow I use often now and it makes life super easy.
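
For the validation step, something like AJV fits. A minimal sketch; the schema object is hand-written for illustration, standing in for one generated from the Postgres types:

import Ajv from 'ajv'

const ajv = new Ajv()

// stand-in for a schema generated from the Postgres table definition
const carSchema = {
  type: 'object',
  properties: {
    color: { type: 'string' },
    price: { type: 'number' }
  },
  required: ['color', 'price'],
  additionalProperties: false
}

const validateCar = ajv.compile(carSchema)

// reject a record before it ever reaches deepstream/Postgres
if (!validateCar({ color: 'red', price: 30000 })) {
  console.error(validateCar.errors)
}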

We could put that in this db connector as well, but it would be way easier to create another vanilla connector that does that, without all this schema-listening stuff and without all this buffering logic, which again isn't as useful with low-to-medium write volumes.

@yasserf commented May 10, 2020

Okay so this all works and is now in master.

I know this is probably a mean ask, but it would be great if you could:
a) Test it out locally (clone the repo, install dependencies, npm link)
b) Update the docs on deepstream.io (https://github.com/deepstreamIO/deepstream.io-website/blob/master/content/tutorials/60-plugins/30-database/postgres/index.md)

Generally it's just changing it to use promises:

const connector = new PostgresConnector( settings )

connector.init()
await connector.whenReady()

await connector.subscribe( event => {
  // event will be a map of event and table for CREATE_TABLE and DESTROY_TABLE
  // { event: 'CREATE_TABLE', table: 'some-table' }
  // or of event, table and key for INSERT, UPDATE and DELETE, e.g.
  // { event: 'INSERT', table: 'some-table', key: 'some-key' }
})

// subscriptions can be removed
await connector.unsubscribe()

// the connector also comes with a facility to get a map of all tables
// and the number of items within each
const result = connector.getSchemaOverview()
/* result will be e.g.
{
  'some-table': 2,
  'some-other-table': 1,
  'new-table': 1,
  'table-a': 2,
  'table-b': 2
}
*/

And updating the docs as above, and just testing the code as well. The tests all pass though, and there are a lot of them (50+).

@yasserf closed this May 10, 2020
@jaime-ez commented

> Not related to the PR, but what version were you upgrading from?

v3

> One last comment as well: if I were to use Postgres and deepstream, I wouldn't make deepstream create the tables automatically. Instead I would just create the Postgres tables and the schema as normal, and then use INSERT ... ON CONFLICT DO UPDATE (upsert) across all the fields (instead of a single value column). That way you can do all the Postgres magic.
>
> Another benefit for me from that is
>
> Postgres -> generate typescript -> generate JSON schemas

Is that just a logical progression, or is there a tool/script that generates JSON schemas from postgres?

> That way you can also validate all your records as they enter deepstream. It's a workflow I use often now and it makes life super easy.

That sounds really helpful.

> We could put that in this db connector as well, but it would be way easier to create another vanilla connector that does that, without all this schema-listening stuff and without all this buffering logic, which again isn't as useful with low-to-medium write volumes.

Until now using jsonb has been ok for me; jsonb queries are powerful enough. But this seems to be the next step.
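
For reference, the kind of jsonb query meant here: containment with the @> operator against the generic val column (the cars table name is assumed).

import { Client } from 'pg'

// find rows whose jsonb val contains the given key/value pair
async function findRedCars (client: Client) {
  const { rows } = await client.query(
    'SELECT id, version, val FROM cars WHERE val @> $1',
    [JSON.stringify({ color: 'red' })]
  )
  return rows
}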

> Okay so this all works and is now in master.
>
> I know this is probably a mean ask, but it would be great if you could:
> a) Test it out locally (clone the repo, install dependencies, npm link)
> b) Update the docs on deepstream.io (https://github.com/deepstreamIO/deepstream.io-website/blob/master/content/tutorials/60-plugins/30-database/postgres/index.md)

No problem, I'll try to do it during the upcoming week.

@yasserf commented May 11, 2020

> Is that just a logical progression, or is there a tool/script that generates JSON schemas from postgres?

There are tools that do this; you just need to point them at the schema and they print out all the types.

> That sounds really helpful. (JSON validation)

There is also a tool that generates JSON Schema, which AJV and swagger use for validation. It makes building APIs really nice; you just need to make sure you design your db and permissions properly.

> Until now using jsonb has been ok for me; jsonb queries are powerful enough. But this seems to be the next step.

This is really easy to do btw; it just means you need a reduce function to correctly build the sql query, but it would be generic.
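
A hedged sketch of that reduce idea, turning an arbitrary object into one parameterized upsert. The identifier quoting here is naive: table and column names cannot be parameterized, so they must come from trusted config.

// build INSERT ... ON CONFLICT from an arbitrary object via reduce
function buildUpsert (table: string, id: string, data: Record<string, unknown>) {
  const keys = Object.keys(data)
  const { cols, params, values } = keys.reduce(
    (acc, key, i) => {
      acc.cols.push(`"${key}"`)
      acc.params.push(`$${i + 2}`) // $1 is reserved for the id
      acc.values.push(data[key])
      return acc
    },
    { cols: [] as string[], params: [] as string[], values: [] as unknown[] }
  )
  const updates = keys.map(k => `"${k}" = EXCLUDED."${k}"`).join(', ')
  return {
    text: `INSERT INTO "${table}" (id, ${cols.join(', ')}) VALUES ($1, ${params.join(', ')}) ` +
          `ON CONFLICT (id) DO UPDATE SET ${updates}`,
    values: [id, ...values]
  }
}

The resulting { text, values } pair plugs straight into client.query, so the actual value escaping is left to pg.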

It would also be a lot more db-safe: the current connector does basic escaping, but the pg library is more powerful (and slower, but let's face it, node is slow, so if you want it to be super fast, don't save it in the DB).
