Feature Request: map columns with the same prefix to nested objects #1050
Comments
If we implement this, I think it should also allow arbitrary groupings. In addition to the glob pattern, we could also allow something like:

We could potentially implement this in a backwards-compatible way by deprecating the existing fields and adding the nested object too. This could be an option.
For querying, you can already do this using a computed column:

That doesn't solve it for mutations, though; perhaps this is already sufficient for your use case.
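A minimal sketch of the computed-column approach suggested above. PostGraphile exposes a `STABLE` function named `<table>_<fieldname>` that takes the table's row type as a computed field on that table's GraphQL type; the `product_inventory` type and the specific `inventory_*` columns here are assumptions for illustration, not taken from the issue:

```sql
-- Hypothetical composite type to hold the grouped fields.
CREATE TYPE product_inventory AS (
  count integer,
  location text
);

-- Computed column: because this STABLE function is named
-- products_inventory and takes a `products` row, PostGraphile
-- exposes it as an `inventory` field on the Product type.
CREATE FUNCTION products_inventory(p products)
RETURNS product_inventory AS $$
  SELECT p.inventory_count, p.inventory_location;
$$ LANGUAGE sql STABLE;
```

As noted, this only covers reads: the nested field is derived at query time, so mutations still target the original flat columns.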
I just built a plugin for this in V5: https://gist.github.com/benjie/b9f3b6d46db0b6ae2524a6c9fd15fb9a

(In case you've not heard about V5: https://dev.to/graphile/intro-to-postgraphile-v5-part-1-replacing-the-foundations-3lh0 )
I'm submitting a feature request.

PostGraphile version: v4.3.3

Expected behavior:
Currently, custom composite types and separate tables are the only ways to expose nested objects in your generated GraphQL schema. It would be nice if there were an easy way to map columns named with the same prefix to nested properties on a single object. For example, assuming I have several columns with the prefix `products.inventory_*`, I'd like to be able to map those under a single object `Product.inventory.*` in the GraphQL schema.

This could be implemented on top of the existing `@name` smart comment, extending the current implementation to support nested paths. However, in addition to explicit mappings, a more dynamic option (where the user can specify a path prefix) is probably desirable to avoid unnecessary verbosity.
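A sketch of what the proposed extension might look like as smart comments. The dotted-path form of `@name` and the table-level `@nest` tag are hypothetical syntax for this feature request, not existing PostGraphile behavior:

```sql
-- Hypothetical: extend @name to accept a dotted path, grouping the
-- column under a nested object in the generated schema.
COMMENT ON COLUMN products.inventory_count IS E'@name inventory.count';
COMMENT ON COLUMN products.inventory_location IS E'@name inventory.location';

-- Hypothetical dynamic variant: one table-level comment mapping every
-- column matching a prefix into a single nested object.
COMMENT ON TABLE products IS E'@nest inventory_* inventory';
```

The per-column form covers explicit mappings; the prefix form avoids repeating a comment for every column in the group.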