This repository has been archived by the owner on Apr 17, 2023. It is now read-only.

Ability to publish changes upon creation/update/deletion without necessarily creating the subscriptions GraphQL query/resolvers #1896

Open
machi1990 opened this issue Aug 18, 2020 · 16 comments

Comments

@machi1990
Contributor

Is your feature request related to a problem? Please describe.

Being able to publish (over an external pub/sub queue, e.g. a Kafka topic) from one Graphback process, while letting subscriptions be handled by completely separate, lightweight Graphback process(es) dedicated to subscriptions only.

Describe the solution you'd like

See the subXXX knobs in https://graphback.dev/docs/next/model/annotations#arguments

What I'd like is fine-grained pub/sub configuration knobs.

Right now we have subCreate, subDelete, and subUpdate, which handle both publishing and subscribing, without a way to opt into one or the other.

Essentially:

  • This also creates the subscription queries/resolvers even when you do not need them. Sometimes you may just want the changes to be published to the external queue for processing by another service/process/server, and not strictly by a GraphQL client.
  • This automatically publishes to the topic. Sometimes you might want to only consume the changes from an external queue after a Change Data Capture tool, e.g. Debezium, has done its magic. The Graphback process would then not be responsible for publishing; it would only react to the changes received from a topic, probably applying some transformation so that they correspond to the type defined in the schema, etc.
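The consume-only flow in the second point is mostly a small mapping step rather than a full publisher. A minimal sketch, assuming a Debezium-style change-event envelope with `op`/`before`/`after` fields (the real payload shape depends on connector configuration, so treat this as illustrative, not Graphback's actual wiring):

```typescript
// Hedged sketch of a Debezium-style change-event envelope.
interface ChangeEnvelope {
  payload: {
    op: "c" | "u" | "d";
    before: Record<string, unknown> | null;
    after: Record<string, unknown> | null;
  };
}

// Map a raw CDC message to the object the schema's Note type expects;
// deletes carry the row in `before`, creates/updates in `after`.
function toNotePayload(raw: string): Record<string, unknown> | null {
  const envelope = JSON.parse(raw) as ChangeEnvelope;
  return envelope.payload.op === "d"
    ? envelope.payload.before
    : envelope.payload.after;
}

const note = toNotePayload(
  JSON.stringify({ payload: { op: "c", before: null, after: { id: "1" } } })
);
```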

What I'd like is to split the subXXX knobs into:

  • subXXX (generate queries and resolvers for the subscription)
  • pubXXX (handle publishing without generating subscription queries/resolvers)

This will enable more lightweight processes:

  • subscription processes that only react to changes and expose them via subscription resolvers
  • data-change processes that only publish changes via GraphQL mutations

The two processes will have to share a strict "event" contract to smooth the communication between them.
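That contract could be pinned down as a shared type plus a runtime guard on the subscribing side. A sketch with invented names (`NoteChangeEvent` and its fields are illustrative, not an existing Graphback type):

```typescript
// Hypothetical shared contract between the publishing and subscribing
// processes; all field names here are invented for illustration.
interface NoteChangeEvent {
  operation: "CREATE" | "UPDATE" | "DELETE";
  model: "Note";
  payload: { id: string };
}

// Runtime guard so a subscriber can reject malformed events from the topic.
function isNoteChangeEvent(value: unknown): value is NoteChangeEvent {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  const validOp =
    v.operation === "CREATE" ||
    v.operation === "UPDATE" ||
    v.operation === "DELETE";
  const validPayload =
    typeof v.payload === "object" &&
    v.payload !== null &&
    typeof (v.payload as Record<string, unknown>).id === "string";
  return validOp && v.model === "Note" && validPayload;
}

const accepted = isNoteChangeEvent({
  operation: "CREATE",
  model: "Note",
  payload: { id: "42" },
});
const rejected = isNoteChangeEvent({ operation: "RENAME", model: "Note" });
```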

Describe alternatives you've considered

This can still be achieved with the current version, but not in as fine-tuned a way as I would have hoped.

@machi1990
Contributor Author

/cc @craicoverflow, @wtrocki
Automatically generated comment to notify maintainers

@craicoverflow

This makes a lot of sense, thank you for the issue! Do you see subXXX overriding pubXXX if pubXXX is false? As subCreate on its own would not have any use.

@machi1990
Contributor Author

I see them working independently.

> This makes a lot of sense, thank you for the issue! Do you see subXXX overriding pubXXX if pubXXX is false? As subCreate on its own would not have any use.

Taking this schema:

"""
@model(subCreate: true, create: false, update: false, pubCreate: false ....)
"""
type Note {
  id: ID!
}

In this schema, subCreate would create:

type Subscription {
  newNote(filter: NoteSubscriptionFilter): Note!
}

and the corresponding resolver, subscribing to the specific Note creation queue. How events get published to the queue would be up to the publisher: another Graphback process, another GraphQLCrud process, etc. (so long as they conform to the same event contract).
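The subscribe-only side could then look like the following sketch, using a tiny in-memory stand-in for the PubSub engine (the topic name `CREATE_NOTE` and the wiring are hypothetical; a real deployment would use e.g. a Kafka-backed PubSub):

```typescript
// Minimal in-memory stand-in for a PubSub engine, just to show the shape
// of a process that subscribes without ever publishing.
type Handler = (payload: unknown) => void;

class TinyPubSub {
  private handlers = new Map<string, Handler[]>();

  subscribe(topic: string, handler: Handler): void {
    const list = this.handlers.get(topic) ?? [];
    list.push(handler);
    this.handlers.set(topic, list);
  }

  publish(topic: string, payload: unknown): void {
    for (const handler of this.handlers.get(topic) ?? []) handler(payload);
  }
}

// With subCreate enabled and publishing disabled, this process only listens:
// nothing here ever publishes on the Note creation topic itself.
const pubSub = new TinyPubSub();
const delivered: unknown[] = [];
pubSub.subscribe("CREATE_NOTE", (event) => delivered.push(event));

// Some *other* process (another Graphback instance, Debezium, ...) publishes:
pubSub.publish("CREATE_NOTE", { id: "1" });
```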

@machi1990
Contributor Author

/cc @wtrocki

@wtrocki wtrocki added the triage label Aug 31, 2020
@wtrocki
Contributor

wtrocki commented Aug 31, 2020

We also need the opposite situation:

Having subscription handlers available but not publishing any events on CREATE, UPDATE, and DELETE.

A workaround exists for now:

What we need is to define the model and disable all CRUD operations on it apart from subscriptions. Then we use the Kafka PubSub to listen to events (topics are configurable and documented) and it should work with Debezium.
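As a rough picture of that workaround, the model could disable the CRUD queries and mutations while keeping the subscription side. The annotation names follow the docs linked earlier, but treat the exact combination as an illustrative sketch rather than a verified configuration:

```typescript
// Sketch of a subscription-only Graphback model: CRUD queries/mutations are
// disabled, so nothing in this process publishes; events arrive from the
// external topic via the Kafka PubSub instead. Annotation names are taken
// from the Graphback model-annotation docs but are not verified here.
const noteModel = `
"""
@model(create: false, update: false, delete: false, find: false, findOne: false)
"""
type Note {
  id: ID!
  title: String
}
`;
```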

We need to build a sample template to demo this better.

Also, currently events (topics) are just an internal part of Graphback. If we move to an event-streaming solution, we will need an extra capability to specify topics directly in the config or schema.

Moving to a generic streaming platform will enable us to process changes using Debezium directly from the DB or from external event sources.

@wtrocki
Contributor

wtrocki commented Aug 31, 2020

I trelloized this (added it to Trello): 7-datasync-kafka-debezium-enabled-event-streaming-approach

@machi1990
Contributor Author

> We also need the opposite situation:
>
> Having subscription handlers available but not publishing any events on CREATE, UPDATE, and DELETE.

Yes, this is described in the issue description.

> A workaround exists for now:
>
> What we need is to define the model and disable all CRUD operations on it apart from subscriptions. Then we use the Kafka PubSub to listen to events (topics are configurable and documented) and it should work with Debezium.

Indeed.

> We need to build a sample template to demo this better.
>
> Also, currently events (topics) are just an internal part of Graphback. If we move to an event-streaming solution, we will need an extra capability to specify topics directly in the config or schema.
>
> Moving to a generic streaming platform will enable us to process changes using Debezium directly from the DB or from external event sources.

+1 on this, plus the ability to specify any pre-processing operation (e.g. payload transformation) that needs to be done before the received event is sent to the subscribing client.

@machi1990 machi1990 added the enhancement New feature or request label Aug 31, 2020
@machi1990 machi1990 changed the title [Discussion] Ability to publish changes upon creation/update/deletion without necessarily creating the subscriptions GraphQL query /resolvers Ability to publish changes upon creation/update/deletion without necessarily creating the subscriptions GraphQL query /resolvers Aug 31, 2020
@wtrocki
Contributor

wtrocki commented Aug 31, 2020

For transformation we have a separate feature for content mapping.

@machi1990
Contributor Author

> For transformation we have a separate feature for content mapping.

Nice. Does it apply even in this context? E.g. suppose the source of the events is Debezium (which pushes events to a Kafka topic); on the Graphback side, we'll be subscribing to this topic. What would be desirable is not merely the subscription, but to supply some sort of transformation function to be applied to the event before it is sent to the client.

@craicoverflow

Considering this would be a breaking change and a new feature, do we see this happening for a 0.17.x release?

@wtrocki
Contributor

wtrocki commented Sep 7, 2020

> What would be desirable is not merely the subscription, but to supply some sort of transformation function to be applied to the event before it is sent to the client.

Yep. Since this will need to be applied to queries/mutations and subscriptions, we can reuse this logic.

> Considering this would be a breaking change and a new feature, do we see this happening for a 0.17.x release?

Post 1.0 release.
https://trello.com/c/1lH9SqKu/7-datasync-kafka-debezium-enabled-event-streaming-approach

The approach will be to do a POC (same as datasync) without touching core or affecting Graphback releases.

@craicoverflow

This appears to be resolved, closing (reopen if I am wrong)

@machi1990
Contributor Author

This solves only a part of it, but there is no way to avoid publishing changes from within the application.

See

if (this.pubSub && this.crudOptions.subCreate) {

You can play with this repository, especially this commit: https://github.com/aerogear/datasync-example/blob/245b324a08f6ad72ff5ed728273e9700f8b69952.

This line https://github.com/aerogear/datasync-example/blob/245b324a08f6ad72ff5ed728273e9700f8b69952/graphback-debezium-integeration/src/kafka-sub.ts#L30
should never be called.

@wtrocki
Contributor

wtrocki commented Sep 23, 2020

In a production-ready scenario, streaming platforms will never support edits (because the data is a stream), so we will always have two PubSub engines: one for classical pub/sub and one built specifically for a model that works for streaming only. If we are going to get this into an officially supported scenario, separate annotations might be needed.

@machi1990
Contributor Author

> ... If we are going to get this into an officially supported scenario, separate annotations might be needed.

I think we should support this use case by splitting the subXXX annotation key into two:

  • pubXXX: once this is activated, we'll allow publishing of the changes.
  • subXXX: once this is activated, we'll create the corresponding Subscription type in the schema, and subscribe to changes only (no publishing).

The current situation is that subXXX is responsible for both the publishing and the subscriptions.
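One way to picture the split: generation would branch independently on the two flags, so neither implies the other. A sketch where the option names mirror the proposal (they do not exist in Graphback today):

```typescript
// Hypothetical options once subXXX/pubXXX are split; neither flag implies
// the other, matching the "working independently" behaviour discussed above.
interface CreateOptions {
  subCreate: boolean; // generate the Subscription type + resolver
  pubCreate: boolean; // publish CREATE events from this process
}

function planCreate(options: CreateOptions): string[] {
  const steps: string[] = [];
  if (options.subCreate) steps.push("generate newNote subscription resolver");
  if (options.pubCreate) steps.push("publish to the Note creation topic");
  return steps;
}

// A subscription-only process ends up with just the resolver step.
const subscriberOnly = planCreate({ subCreate: true, pubCreate: false });
```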

@wtrocki
Copy link
Contributor

wtrocki commented Sep 23, 2020

Yep. This is how our competition seems to be doing subscriptions at the moment.
