
RFC: Fix ambiguity with null variable values and default values #418

Merged
merged 7 commits on Apr 18, 2018

7 participants
@leebyron
Collaborator

commented Mar 6, 2018

(Rendering Preview: https://out-nmocjoypvp.now.sh)

This is a NON-BREAKING validation change which allows some previously invalid queries. It is also a behavioral execution change which changes how values interact with variable and argument default values.

There is currently ambiguity and inconsistency in how null values are coerced and resolved as part of variable values, default values, and argument values. This inconsistency and ambiguity can allow null values to appear at non-null arguments, which might result in unforeseen null-pointer errors.

The version of this proposal to be merged includes the following:

  • NEW: Non-null arguments and input object fields are no longer required to be provided a value/variable if that argument also supplies a default value.
  • NEW: Similarly, non-null input object fields can be omitted if they supply a default value. (Note that this validation clause was previously missing from the spec, however it has long been included in reference implementations)
  • NEW: All variables can now have default values by way of removing a validation rule. Previously it was invalid to supply a default value with a non-null variable.
  • BUG FIX: Variables with a nullable type and a null default value cannot be provided to a non-null argument or input field.
  • NEW: Optional (nullable) variables can now be used for arguments or input object fields which supply default values.
  • CLARITY: Redefine CoerceVariableValues() to make it more clear how to treat a lack of runtime value vs the explicit value null with respect to default values and type coercion.
  • NEW: Redefine CoerceArgumentValues() to ensure providing a null variable value to a Non-Null argument type causes a field error, and to add clarity to how null values and default values should be treated.
  • NEW: Redefine Input object coercion rules to ensure providing a null value or runtime variable value to a non-null type causes a field error, and improve clarity for these coercion rules. This mirrors the changes to CoerceArgumentValues().
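Concretely, the following hypothetical schema and operations (names invented for illustration) show patterns affected by the merged proposal:

```graphql
# Hypothetical schema: `arg` is non-null but supplies a default,
# so it is no longer "required".
type Query {
  field(arg: Int! = 0): Int
}

# Now valid: a non-null variable may declare a default value.
query A($var: Int! = 3) {
  field(arg: $var)
}

# Now valid: a nullable variable may flow into the non-null `arg`
# because `arg` supplies a default value. Providing an explicit null
# for $var at runtime still produces a field error.
query B($var: Int) {
  field(arg: $var)
}
```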

Previous version of this proposal:


RFC: Fix ambiguity with null variable values and default values
> This is a **behavioral change** which changes how explicit `null` values interact with variable and argument default values. This also changes a validation rule which makes the rule more strict.

There is currently ambiguity and inconsistency in how `null` values are coerced and resolved as part of variable values, default values, and argument values. This inconsistency and ambiguity can allow `null` values to appear at non-null arguments, which might result in unforeseen null-pointer errors.

This appears in three distinct but related issues:

**Validation: All Variable Usages are Allowed**

The explicit value `null` may be used as a default value for a variable with a nullable type, however this rule asks to treat a variable's type as non-null if it has a default value. Instead this rule should specifically only treat the variable's type as non-null if the default value is not `null`.

Additionally, the `AreTypesCompatible` algorithm is underspecified, which could lead to further misinterpretation of this validation rule.

**Coercing Variable Values**

`CoerceVariableValues()` allows the explicit `null` value to be used instead of a default value. This can result in a null value flowing to a non-null argument due to the validation rule mentioned above. Instead a default value must be used even when an explicit `null` value is provided. This is also more consistent with the explanation for validation rule "Variable Default Value Is Allowed"

Also, how to treat an explicit `null` value is currently underspecified. While an input object explains that a `null` value should result in an explicit `null` value at the input object field, there is no similar explanation for typical scalar input types. Instead, `CoerceVariableValues()` should explicitly handle the `null` value to make it clear a `null` is the resulting value in the `coercedValues` Map.

**Coercing Argument Values**

The `CoerceArgumentValues()` algorithm is intentionally similar to `CoerceVariableValues()` and suffers from the same inconsistency. Explicit `null` values should not take precedence over default values, and should also be explicitly handled rather than left to underspecified input scalar coercion.
@leebyron

Collaborator Author

commented Mar 6, 2018

Since this changes behavior, I'd love your feedback @IvanGoncharov, @OlegIlyenko, et al

@andimarek


commented Mar 6, 2018

@IvanGoncharov

Member

commented Mar 6, 2018

@leebyron I fully agree with the changes from Validation: All Variable Usages are Allowed 👍

Coercing Variable Values
Coercing Argument Values

Explicit null values should not take precedence over default values.

This is very confusing at first glance, especially for argument values. Here is an example from JS:

function f(arg = 'foo') {
  return arg;
}

f()      // 'foo'
f(null)  // null
f('bar') // 'bar'

And the same holds in Python and probably all other languages, so I would expect the same from GraphQL.
But I fully understand the technical challenge here, so I need to think more about it.

@leebyron

Collaborator Author

commented Mar 6, 2018

Absolutely correct, @IvanGoncharov. There are two paths forward to solve for the ambiguity, each of which have some tradeoffs.

The path I'm proposing here preserves the ability to treat $var: String = "default" as non-nullable and usable in an argument expecting String!. The tradeoff is that a default value would take precedence over an explicit null value.

The alternative is to no longer treat $var: String = "default" as non-nullable while preserving explicit null over default values. This would mean queries which pass variables with default values to arguments expecting non-null types would no longer be valid.

My concern is that the change cost of this alternative is too high since passing a variable with a default value to a non-null argument is an existing pattern. However I'm less certain how common overriding a default value with an explicit null value is, though I certainly understand the incongruity with existing programming languages. I haven't found an example of it within queries at Facebook yet, so I would love to hear use cases to better understand change cost.
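A hypothetical example of this tradeoff (schema and operation names invented for illustration):

```graphql
# Assumed schema: type Query { field(arg: String!): String }
query Example($var: String = "default") {
  # Proposed path: valid, and passing {"var": null} resolves to "default".
  # Alternative path: invalid, since $var would be treated as nullable
  # while `arg` expects String!.
  field(arg: $var)
}
```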

@OlegIlyenko


commented Mar 6, 2018

It's an interesting change. I remember the time when I finally implemented support for null correctly in sangria. As far as I remember, my initial implementation did the same as is suggested in this PR: "Explicit null values should not take precedence over default values.". Later on I refactored it in order to comply with the reference implementation.

Along the way, I also faced these issues where a null value can potentially appear in a nullable argument with a default value (which might be treated as non-null since it has a default). As a solution to this problem I introduced the following two things:

  • I delay the value resolution to the last moment. This led to the introduction of a new Trinary container which is able to preserve information about the default value (NullWithDefault).
  • By default, nullable arguments with a default value are treated as non-null on a type level. But since I have a Trinary value around, I can force the usage of a default value, which might come from the variable definition (this one has precedence over other default values) or the argument/input field default. I also provide an extra function to get a nullable variation of the value (Option[Value]) - this one behaves in the same way as the reference implementation at the moment.

I find this solution quite robust - by default, it behaves similarly to the proposal in this PR, but it also provides a way to propagate explicit null value (and ignore all of the defaults) when this information is significant.

Though it is still possible to trick the system by setting the default value to null, like here:

type Query {
  article(id: ID = null): Article
}

Do you think it would make sense to disallow this? (for arguments, input fields, and variables)

I personally don't have practical insights on API that takes advantage of the null value. I know that some people use it to explicitly "unset" fields in the database. But in this scenario, I don't really see the need for a default value. So it's hard for me to assess the impact of this change, but I find the use case quite niche.

@IvanGoncharov

Member

commented Mar 10, 2018

@leebyron It's not an ideal solution since we are trying to fix a validation issue by changing execution behavior. Ideally, we would allow defaults to be specified on non-null arguments:

query foo($arg: String! = "foo")
  {}            => $arg === "foo"
  { arg: null } => Exception

query bar($arg: String = "bar")
  {}            => $arg === "bar"
  { arg: null } => $arg === null

But it's not a viable alternative since it will break a huge number of existing queries 😢

So practically speaking, if there are no GraphQL APIs that distinguish between null and the default value in field arguments, then I don't see a point in fighting for theoretical purity.

That said, I think it's a very important question so we need to be extremely careful and do some research beyond the Facebook use case.
I volunteer to do research on major public APIs (Shopify, Yelp, ...).

GitHub was my first victim: I tried to pass nulls in every field with a default argument. A few of them treat null the same way as an absent value, but in one case it actually broke the query:
[screenshot of the failing query response omitted]
Moreover, in the GitHub schema repository is defined as nullable, so the absence of this field inside data suggests that it wasn't a simple exception inside a resolver.

@eapache


commented Mar 14, 2018

IIUC this is ~ the same issue as #359.

I also find the "explicit nulls don't take precedence over defaults" confusing. Per the issue I originally filed, I think this can be solved by allowing non-null variables/arguments/input-fields to have default values. This is I think what you mean by

The alternative is to no longer treat $var: String = "default" as non-nullable while preserving explicit null over default values. This would mean queries which pass variables with default values to arguments expecting non-null types would no longer be valid.

The change cost of that approach is real, but the migration is fairly trivial and I think important: anywhere that this ambiguity currently occurs the author is forced to resolve it by adding a ! to the variable to make it clear that explicit nulls are forbidden. If they did intend to support nulls for that variable then their query is invalid no matter what and that's something they should be aware of.

Updated based on feedback.
This updates this proposal to be a bit broader in scope, however much narrower in breaking behavior changes.

Mirroring the changes in graphql/graphql-js#1274, this update better defines the difference between a "required" and a "non-null" argument / input field, since a non-null typed argument / input-field with a default value is no longer required. As such, the validation rule which prohibited queries from using non-null variables and default values has been removed. This also adds clarity to the input field validation - this rule has existed in the GraphQL.js reference implementation, however it was found missing within the spec.

This also updates the CoerceVariableValues() and CoerceArgumentValues() algorithms to retain explicit null values overriding a default value (minimizing breaking changes), however critically adding additional protection to CoerceArgumentValues() to explicitly block null values from variables - thus allowing the older pattern of passing a nullable variable into a non-null argument while limiting the problematic case of an explicit null value at runtime.

@leebyron leebyron force-pushed the null-variables branch from d3599a4 to 3dc6d1b Mar 30, 2018

@leebyron

Collaborator Author

commented Mar 30, 2018

Excellent feedback everyone. I've made serious revisions which I hope both protect against null values flowing to non-null arguments and preserve existing behavior. The result is a bit broader in scope, however it no longer breaks existing queries in either validation or execution.

The reference PR has also been updated in graphql/graphql-js#1274

This update better defines the difference between a "required" and a "non-null" argument / input-field: a "non-null" typed argument or input-field which includes a default value is no longer required. As such, the validation rule which prohibited queries from using non-null variables and default values has been removed completely. This change also adds clarity to the input field validation - this rule has existed in the GraphQL.js reference implementation, however it was found missing within the spec. It essentially mirrors the change of the required argument rule.

This also updates the CoerceVariableValues() and CoerceArgumentValues() algorithms to retain the behavior of explicit null values overriding a default value (minimizing breaking changes), however critically adding additional protection to CoerceArgumentValues() to explicitly block null variable values from passing into a non-null argument. This retains the existing common pattern of passing a nullable variable with a default value into a non-null argument while removing the problematic case of an explicit null value at runtime.
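A minimal sketch (not the normative spec text) of the updated precedence rules, assuming variable definitions of the hypothetical shape `{ name, nonNull, defaultValue }`: a missing runtime value falls back to the default (even for non-null types, since a default now satisfies "required"), an explicit null wins over a default for nullable types, and a null or missing value for a non-null type without a usable default raises an error:

```javascript
function coerceVariableValues(variableDefinitions, inputs) {
  const coercedValues = {};
  for (const def of variableDefinitions) {
    const hasValue = Object.prototype.hasOwnProperty.call(inputs, def.name);
    if (!hasValue && def.defaultValue !== undefined) {
      // No runtime value provided: use the default value.
      coercedValues[def.name] = def.defaultValue;
    } else if (def.nonNull && (!hasValue || inputs[def.name] === null)) {
      // Non-null type with no value and no default, or an explicit null.
      throw new Error(`Variable $${def.name} of non-null type must be provided.`);
    } else if (hasValue) {
      // Explicit values (including null for nullable types) are kept as-is.
      coercedValues[def.name] = inputs[def.name];
    }
  }
  return coercedValues;
}
```

For example, `coerceVariableValues([{ name: 'a', defaultValue: 'x' }], { a: null })` keeps the explicit null, while omitting `a` yields `'x'`.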

I'd love another look at this and the accompanying PR. I feel confident in this approach but would appreciate your feedback.

@@ -1358,6 +1359,31 @@ For example the following query will not pass validation.
### Input Object Required Fields

@leebyron

leebyron Mar 30, 2018

Author Collaborator

This rule exists as part of GraphQL.js's ValuesOfCorrectType but was missing from the spec

@@ -1494,44 +1520,6 @@ fragment HouseTrainedFragment {
### Variable Default Value Is Allowed

@leebyron

leebyron Mar 30, 2018

Author Collaborator

This rule is completely removed since we explicitly wish to support the queries it prohibited.

* If {variableType} is not a non-null and {defaultValue} is provided and not {null}:
* Let {variableType} be the non-null of {variableType}.
* Let {argumentType} be the type of the argument {variableUsage} is passed to.
* AreTypesCompatible({argumentType}, {variableType}) must be {true}.

@leebyron

leebyron Mar 30, 2018

Author Collaborator

Most of this change is for clarity, however the default value is now checked for {null} first

* Return {AreTypesCompatible(itemArgumentType, itemVariableType)}.
* Otherwise return {false}.
* If {variableType} is a list type return {false}.
* Return if {variableType} and {argumentType} are identical.

@leebyron

leebyron Mar 30, 2018

Author Collaborator

Other than the check of default values being removed from this algorithm, this is all for clarity and doesn't signify any other behavioral change

@eapache


commented Apr 2, 2018

I'm a bit torn - on the one hand I think this approach is probably the best we can do in terms of a clarification that is strictly non-breaking. On the other hand I'm not sure that strictly non-breaking is something we should be striving for in this case:

  • The last published spec is still marked as a draft, with no implicit or explicit compatibility guarantees.
  • Even Golang (whose explicit compatibility guarantee is one I use as a good model) reserves the right to clarify undefined or inconsistent behaviour in a way that may break programs.
  • The "from-scratch" design (if we weren't concerned about compatibility at all) is I think fairly uncontroversially to decouple default values from the nullability of arguments entirely. For anyone new to the spec, that behaviour seems the least surprising, and much simpler than the behaviour proposed here.

I realize that putting a breaking change like that in the spec causes its own huge set of pains for implementations that have to migrate, but I guess I'm not convinced that isn't still the best long-term solution.

All that said, I do believe this current approach is the best we can do while being non-breaking, so if that's the way that we want to go then I won't object further.

@leebyron

Collaborator Author

commented Apr 2, 2018

@eapache - great points. I actually think what I now have here is very close to what I would propose in a "from scratch" design.

The execution behavior is almost exactly as I would expect it in a from-scratch design as it respects explicit values over default values in the same way as other programming languages. The only difference I might have expected in a from-scratch design is to be even more trusting of variable values in CoerceArgumentValues(), where this proposal explicitly checks for null values and throws field errors. This isn't trading on correctness however, so I'm not concerned at all about that extra check. This is still technically a "breaking change" by the definition that execution behavior of existing queries will now be different. Previously null values could be provided to non-null typed arguments; after this proposal this is an error. I think it's pretty uncontroversial to explain this as a bug fix.

The validation behavior is really where we've made specific considerations to avoid breaking existing queries, however this proposal is still very close to what I would propose in a "from scratch" design.

This proposal makes 3 important changes to query validation:

  • Removal of the "Variable Default Value Is Allowed" rule. I argue this should have been removed when we introduced explicit null values with the expectation that they supersede default values. This rule originally sought to prevent "dead code" style misunderstandings, however in the present context it does not succeed at that and instead it prevents real possibilities. In a "from scratch" design I would include this change exactly as is.

  • Changes the definition of "required" arguments and input fields. Previously a non-null type meant that an argument or input field was required to be provided. However because arguments may have default values, this limits their expressivity in the same way as the previous rule. This proposal changes this such that non-null typed arguments/input-fields are only considered required if they also do not have a default value. This, again, I would propose unchanged in a "from scratch" design.

  • Changes the "Variable Usages are Allowed" rule, tweaking when nullable variables are treated as non-null for the sake of determining where they're allowed to be used. This is not what I would do in a from-scratch design. Instead I would propose removing the ability for nullable variables to flow into non-null arguments at all. Where the previous two changes expanded the possible set of queries that pass validation, this change would have restricted that possible set. This is also, unfortunately, a very common pattern, so that restriction could be a very costly breaking change. However, I definitely agree that enforcing that strictness for newly produced GraphQL services would be valuable!
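For instance, the second change means required-ness now depends on both the type and the presence of a default value (hypothetical schema for illustration):

```graphql
type Query {
  a(arg: Int!): Int      # `arg` is required: non-null type, no default
  b(arg: Int! = 0): Int  # `arg` is optional: the default satisfies Int!
}

query RequiresArg { a }  # invalid: the required argument `arg` is omitted
query UsesDefault { b }  # valid: `arg` uses its default value of 0
```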

I will look at editing this to both make it more clear where the caveat for existing systems is while making it possible (or even encouraged) for new systems not to adopt that caveat.

@eapache are there other differences from the current state of this PR that you would consider approaching differently in a from-scratch design?

@eapache


commented Apr 2, 2018

Change the "Variable Usages are Allowed" rule tweaking the rules for treating nullable variables as non-null for the sake of determining where they're allowed to be used. This is not what I would do in a from-scratch design. Instead I would propose removing the ability for nullable variables to flow into non-null arguments at all.

Yup, this is the one that is bugging me :)

I will look at editing this to both make it more clear where the caveat for existing systems is while making it possible (or even encouraged) for new systems not to adopt that caveat.

Awesome. I was going to suggest that we include an RFC 2119 header in the spec so we can use SHOULD/MAY appropriately here, but I note that you already did that, it just hasn't made it into http://facebook.github.io/graphql/ yet. I think that servers SHOULD prevent nullable variables from flowing into non-null arguments, but they MAY permit it in accordance with old versions of the spec in order to maintain compatibility. Thoughts?

are there other differences from the current state of this PR that you would consider approaching differently in a from-scratch design?

I'll go through again but I think that was the only one.

Arguments can be required. If the argument type is non-null and does not have a
default value, the argument is required and furthermore the explicit value
{null} may not be provided. Otherwise, the argument is optional.

@eapache

eapache Apr 2, 2018

I read this to imply (probably unintentionally) that a non-null argument with a default value is allowed to be passed an explicit null?

One step further towards the idealized "from scratch" proposal, this makes it more explicitly clear that changing the effective type of a variable definition is only relevant when supporting legacy clients and suggests that new services should not use this behavior.

I like that this balances a clear description of how this rule should work for existing services along with a stricter and therefore safer future path for new services.
@leebyron

Collaborator Author

commented Apr 3, 2018

@eapache - I just updated this with your feedback incorporated. I'm glad we agree on the ideal outcome of this proposal.

My last update is one step further towards the idealized "from scratch" proposal, this makes it more explicitly clear that changing the effective type of a variable definition is only relevant when supporting legacy clients and suggests that new services should not use this behavior. I like that this balances a clear description of how this rule should work for existing services along with a stricter and therefore safer future path for new services.

I also just updated graphql/graphql-js#1274 to include this and make this a breaking change by default while providing an API for using this legacy behavior.

I'd love a last look by anyone else interested

Editing AreTypesCompatible() to avoid trailing "Otherwise return false" statements for easier reading. Functionality is equivalent.
@leebyron

Collaborator Author

commented Apr 5, 2018

Updating this a bit. I'm still trying to convince myself that the breaking change is worth doing. I realized that it also has a side effect of not allowing an optional variable to be provided to an optional non-null argument.

query Example($var: Int!) {
  field(arg: $var)
}

type Query {
  field(arg: Int! = 0): Int
}

While the above query is valid, it is invalid to not provide a runtime value for $var, even though arg supplies a default value. Perhaps this is just an acceptable cost.

An alternative (which I'm also not sure about) is allowing nullable (optional) variables to flow into non-null optional positions, while still throwing at runtime for explicitly provided null values (already included in this proposal).

For example, this is currently invalid but would become valid in the alternative proposal:

query Example($var: Int) { # nullable
  field(arg: $var) # non-null but optional
}

type Query {
  field(arg: Int! = 0): Int
}

In that case, omitting $var at runtime could just use arg's default value.

Thoughts on this?

@leebyron

Collaborator Author

commented Apr 6, 2018

After discussing this in depth with @dschafer at a whiteboard - we've talked ourselves out of any breaking change - I think that having a fork in the expected behavior of validation rules is just too easy to get wrong, liable to cause issues during deployment, and could create a bifurcation of tooling environments. We came up with mitigation strategies for Facebook's codebases and APIs, however also agreed that public APIs like Github would be much much more challenging to solve for.

Even though the behavior is not an ideal "from scratch" - I think that it's worth avoiding the breaking change. I still think the spec should call out this potential issue directly in a non-normative note for clarity.

leebyron added some commits Apr 6, 2018

Update "All Variable Usages are Allowed" to remove breaking change.
Also attempts to improve clarity and formatting and adds an example case.
@leebyron

Collaborator Author

commented Apr 6, 2018

I've just updated the description to mirror the current change. I believe this is ready to merge and will first wait for a bit for others to review and comment.

@mike-marcacci

left a comment

These changes look good, with just one edit.

More generally, I've wrestled with the idea of making a change that breaks previously-valid queries, and as much as I'd prefer the "from-scratch" behavior described above, I think this is the right call.

However, I don't think we should be terribly concerned with a bifurcation of tooling. Because we're already altering the spec for query validation to expand the definition of validity, we're going to see a mismatch in the wild: some servers won't update their software, while new clients will expect looser queries to work. This is probably unavoidable, and makes me think that there should be a "graphql version" or programmatically readable "feature set" (as discussed in the WG meeting) supplied in the result of an introspection query.

If such a thing were implemented, tooling (clients, code generators, compilers, etc) could allow the developer to opt into different validation behaviors based on the target server. This is a ton of added complexity, of course, but the problem it would solve exists with the proposed changes, even without SHOULD clauses describing the preferred, stricter rules.

* Otherwise, if {argumentValue} is a {Variable}:
* Add an entry to {coercedValues} named {argumentName} with the
value {value}.
* Otherwise:

@mike-marcacci

mike-marcacci Apr 11, 2018

You probably don't want this "otherwise" here, since coercion and resulting failures need to happen if {argumentValue} is a {Variable}.

@leebyron

leebyron Apr 12, 2018

Author Collaborator

It is intentional. Variables are coerced during CoerceVariableValues() before execution of a query, at which point any resulting failures are also reported, so a second coercion is not necessary.

There's an existing note below this algorithm capturing this point

@mike-marcacci

mike-marcacci Apr 12, 2018

Ah I see – I missed that note below. 👍

@IvanGoncharov

Member

commented Apr 11, 2018

However, I don't think we should be terribly concerned with a bifurcation of tooling.

I agree with @mike-marcacci. I think most of the tooling doesn't need to be backward compatible. There is only one scenario we shouldn't break: existing queries compiled into clients that were deployed in production should still be supported by the GraphQL server.

That means that all client libraries, the ESLint plugin, client code generation, GraphiQL and other libraries/tools that are used only in development or during the build process can be easily updated.

The only part of the ecosystem that should be backward compatible is GraphQL servers and maybe some proxy servers (e.g. Apollo Engine).

That's why I think both the specification and graphql-js should use the "from-scratch" behavior by default. A compatibility section of the specification can provide an alternative version of the IsVariableUsageAllowed function, and graphql-js can have a VariablesInAllowedPositionCompatible validation rule.

Because we're already altering the spec for query validation to expand the definition of validity, we're going to see a mismatch in the wild: some servers won't update their software, while new clients will expect looser queries to work.

This is not a problem since if a new client stops working with old servers it would be caught during development. So I think we should distinguish between changes that will break already deployed clients and changes that will result in temporary inconvenience during development.
However, I fully agree that breaking changes are unavoidable in the future and we should prepare for them:

This is probably unavoidable, and makes me think that there should be a "graphql version" or programmatically readable "feature set" (as discussed in the WG meeting) supplied in the result of an introspection query.

I think any mechanism involving an introspection query is not a solution, since the majority of GraphQL clients don't do any handshake; they just send queries to the server. So the only solution I see is for the server to detect what the expected behavior was during client development and fall back to it.
So I think having some kind of "graphql version" delivered together with the GraphQL query is the right call.

After discussing this in depth with @dschafer at a whiteboard - we've talked ourselves out of any breaking change - I think that having a fork in the expected behavior of validation rules is just too easy to get wrong, liable to cause issues during deployment, and could create a bifurcation of tooling environments. We came up with mitigation strategies for Facebook's codebases and APIs, however also agreed that public APIs like Github would be much much more challenging to solve for.

@leebyron I want to propose a solution. Currently, all GraphQL clients send the GraphQL request document inside the query field, but ExecuteRequest names the same parameter document:

ExecuteRequest(schema, document, operationName, variableValues, initialValue)

So it makes sense to rename this field to document inside the GraphQL request.
In addition, we can make version a required field for all requests that use document. Both the specification and the reference implementation should provide an algorithm for converting request documents to a new version.

So a server can implement the following algorithm:

  • If the request contains document:
    • if the request also contains query, return a fatal error
    • if the request doesn't contain version, return a fatal error
    • otherwise, use version from the request
  • If the request contains query, assume version is equal to the legacy one (e.g. 0)
  • If version is newer than the latest version supported by the server, return a fatal error.
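The dispatch above can be sketched in a few lines of JavaScript. The field names (document, query, version) follow the proposal; the LEGACY_VERSION and SUPPORTED_VERSION constants are assumptions made for illustration:

```javascript
// Sketch of the proposed server-side version dispatch, assuming a
// parsed JSON request body. Constants are illustrative assumptions.
const LEGACY_VERSION = 0;
const SUPPORTED_VERSION = 1;

function resolveRequestVersion(request) {
  if (request.document !== undefined) {
    if (request.query !== undefined) {
      throw new Error('Fatal: request must not contain both "document" and "query"');
    }
    if (request.version === undefined) {
      throw new Error('Fatal: requests using "document" must specify "version"');
    }
    if (request.version > SUPPORTED_VERSION) {
      throw new Error('Fatal: requested version is newer than this server supports');
    }
    return request.version;
  }
  // Legacy requests use "query" and are assumed to predate versioning.
  return LEGACY_VERSION;
}
```

A server would then select the matching validation/execution behavior based on the resolved version.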

Pros:

  • Doesn't break any deployed client.
  • Adds versioning to queries, allowing bug fixes similar to this one.
  • Allows clear error messages.
  • Renames query to document, which makes sense on its own and solves a lot of terminology problems.

Cons:

  • One additional field in every request.
  • New clients stop working with old servers.

I also think we should use an integer for the version field and increment it only when a breaking change is made.

@leebyron

Collaborator Author

commented Apr 12, 2018

Thanks for the reviews, @mike-marcacci and @IvanGoncharov.

@IvanGoncharov I think there are too many critical issues with the versioning approach you're suggesting. Beyond being fundamentally inconsistent with one of GraphQL's core values - avoiding the need for versioning - the introduction of versions causes even more of the breaking changes we're trying to avoid.

Currently, all GraphQL clients send GraphQL request document inside query field. But ExecuteRequest names the same parameter document

This seems like an issue not for the spec but for HTTP services; it is just a naming convention, so it is probably not a viable mechanism for specification algorithms. The spec document should already be fairly consistent in referring to the "document" and "operation" terms.

I also think we should use an integer for version field and increment it only when breaking change is made.

I would much prefer to not make breaking changes at all - I still believe our goal for evolving the GraphQL specification should be to arrive at a stable base and project confidence and stability. An incrementing integer version implies that there will be a continuous stream of breaking changes. Also, having worked on API surfaces (including public facing) that have dealt with breaking changes in the past, incrementing integer versions do not adequately capture scenarios where a client has adopted support for one breaking change while it has not yet adopted support for another - the fixed ordering of the version numbers forces an adoption curve that does not align with client expectation or capability. I'd like to strongly avoid making the same mistake with GraphQL.

@mike-marcacci I know you had similar ideas:

This is probably unavoidable, and makes me think that there should be a "graphql version" or programmatically readable "feature set" (as discussed in the WG meeting) supplied in the result of an introspection query.

I think these are interesting ideas, but I'd love to avoid expanding the surface area of this proposal since I believe it is in a final reviewable state. I believe that finding consensus on programmable feature-set support in client-server communication is going to be quite challenging and require a lot of exploration. The two proposals the two of you offered are already very different.

My hope is that we can arrive at consensus around this proposal in its current form. I'm not comfortable proposing a change which breaks the validation of existing queries.

@mike-marcacci


commented Apr 12, 2018

I think these are interesting ideas, but I'd love to avoid expanding the surface area of this proposal since I believe it is in a final reviewable state.

Agreed; apologies for nudging the discussion off-topic. To reiterate what I intended to be the core point of my review: the current proposal successfully avoids breaking things that are already deployed while adding very little mess. This ended up being a much cleaner solution than initially seemed possible 👍

@leebyron

Collaborator Author

commented Apr 18, 2018

Thanks to everyone involved for the reviews and offline conversations. Super pleased with how this ended up.

@leebyron leebyron merged commit 63a508c into master Apr 18, 2018

2 checks passed

continuous-integration/travis-ci/pr - The Travis CI build passed
continuous-integration/travis-ci/push - The Travis CI build passed

@leebyron leebyron deleted the null-variables branch Apr 18, 2018

leebyron added a commit that referenced this pull request Apr 18, 2018

RFC: Fix ambiguity with null variable values and default values (#418)
This change corresponds to a spec proposal (#418) which solves an ambiguity in how variable values and default values behave with explicit null values, and changes validation rules to better define the behavior of default values. Without this change, the ambiguity can allow null values to appear in non-null argument positions, which may result in unforeseen null-pointer errors.

In summary this change includes:

* Removal of `VariablesDefaultValueAllowed` validation rule. All variables may now specify a default value.

* Change to `VariablesInAllowedPosition` rule to explicitly not allow a `null` default value when flowing into a non-null argument, and now allows optional (nullable) variables in non-null arguments that provide default values.

* Changes to `ProvidedRequiredArguments` rule (renamed from `ProvidedNonNullArguments`) to no longer require values to be provided to non-null arguments which provide a default value.

* Changes to `getVariableValues()` and `getArgumentValues()` to ensure a `null` value never flows into a non-null argument.

* Changes to `valueFromAST()` to ensure `null` variable values do not flow into non-null types.

* Adds to the `TypeInfo` API to allow referencing the expected default value at a given AST position.
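The key runtime distinction in this list - "not provided" falls back to the default, while an explicit null does not - can be illustrated with a minimal sketch of variable coercion. This uses a simplified variable definition shape ({ name, nonNull, defaultValue }) for illustration and is not graphql-js's actual `getVariableValues()` implementation:

```javascript
// Illustrative sketch of CoerceVariableValues' handling of an absent
// value vs. an explicit null. Simplified shapes; not graphql-js's API.
function coerceVariableValues(varDefs, inputs) {
  const coerced = {};
  for (const varDef of varDefs) {
    const hasValue = Object.prototype.hasOwnProperty.call(inputs, varDef.name);
    if (!hasValue && varDef.defaultValue !== undefined) {
      // An absent value falls back to the default; an explicit null does not.
      coerced[varDef.name] = varDef.defaultValue;
    } else if (varDef.nonNull && (!hasValue || inputs[varDef.name] === null)) {
      // A null value must never flow into a non-null variable.
      throw new Error(`Variable "$${varDef.name}" of non-null type must not be null`);
    } else if (hasValue) {
      // An explicit null for a nullable variable is preserved as null.
      coerced[varDef.name] = inputs[varDef.name];
    }
  }
  return coerced;
}
```

Note that a non-null variable with a default value may now be omitted entirely (the default applies), but passing an explicit null for it is still an error.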

leebyron added a commit to graphql/graphql-js that referenced this pull request Apr 18, 2018

SPEC/BUG: Ambiguity with null variable values and default values (#1274)
This change corresponds to a spec proposal (graphql/graphql-spec#418) which solves an ambiguity in how variable values and default values behave with explicit null values, and changes validation rules to better define the behavior of default values. Without this change, the ambiguity can allow null values to appear in non-null argument positions, which may result in unforeseen null-pointer errors.

In summary this change includes:

* **BREAKING:** Removal of `VariablesDefaultValueAllowed` validation rule. All variables may now specify a default value.

* Change to `VariablesInAllowedPosition` rule to explicitly not allow a `null` default value when flowing into a non-null argument, and now allows optional (nullable) variables in non-null arguments that provide default values.

* Changes to `ProvidedRequiredArguments` rule (**BREAKING:** renamed from `ProvidedNonNullArguments`) to no longer require values to be provided to non-null arguments which provide a default value.

* Changes to `getVariableValues()` and `getArgumentValues()` to ensure a `null` value never flows into a non-null argument.

* Changes to `valueFromAST()` to ensure `null` variable values do not flow into non-null types.

* Adds to the `TypeInfo` API to allow referencing the expected default value at a given AST position.

filipncs added a commit to filipncs/graphql-java that referenced this pull request May 14, 2018

bbakerman added a commit to graphql-java/graphql-java that referenced this pull request May 23, 2018

@lirsacc lirsacc referenced this pull request Jun 29, 2018

Closed

v1 / first release #1

19 of 25 tasks complete