Proposal: remove special front end behavior for await expressions with context dynamic
#3649
Labels: `feature` (proposed language feature that solves one or more problems)
Background: I'm trying to update the expression inference rules in inference.md to match what was implemented, so that we can have a rigorous specification of the `inference-update-3` logic I landed in https://dart-review.googlesource.com/c/sdk/+/353440. In the process I've discovered some subtle differences between the analyzer and front end type inference of `await` expressions. This issue addresses one such difference.

In the analyzer, type inferring an expression in a context of `dynamic` is always considered equivalent to type inferring it in a context of `_`. This is because the analyzer uses the method `TypeAnalyzer.analyzeExpression` whenever it supplies a context when type inferring an expression, and `TypeAnalyzer.analyzeExpression` changes a context of `dynamic` to a context of `_`.

The front end also has logic that changes a context of `dynamic` to a context of `_`, but this logic isn't used for all expression types; it is only used for generic invocations and for expressions appearing inside pattern constructs (such as the expressions inside a switch expression). I believe there are only two circumstances in which the difference is observable: when inferring an `await` expression and when inferring an if-null expression. This issue is about the observable difference when inferring an `await` expression.

When inferring an `await` expression in a context of `dynamic`, the front end analyzes the `await` expression's subexpression using a context of `dynamic`, whereas the analyzer analyzes the same `await` expression's subexpression using a context of `FutureOr<_>`.

For the most part, the way contexts feed into the type inference algorithm is by appearing to the right of `<#` in the subtype constraint generation algorithm. This algorithm treats a right hand side of `dynamic` nearly identically to a right hand side of `FutureOr<_>`, so it is difficult to come up with an example of code that the analyzer and front end treat differently. But we can exploit the special behavior of `e1 ?? e2`, which behaves as follows: if it is type inferred in context `_`, then `e2` is type inferred using the static type of `e1` as its context; whereas if it is type inferred in context `K`, then `e2` is type inferred using `K` as its context.

Here is an example program that is analyzed differently:
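(The example program itself didn't survive into this copy of the issue. The following Dart sketch is reconstructed from the walkthrough below, so the exact shapes of the declarations of `f`, `g`, `h`, and the extension are inferred rather than quoted; treat them as assumptions.)

```dart
// Reconstruction (assumed declarations): `g` returns `Future<T>`,
// `h` is a generic identity-like function, `f` has static type
// `Future<num>?`, and `foo` is an extension method on `Future<int>`.
Future<T> g<T>(T t) => Future.value(t);

T h<T>(T t) => t;

extension on Future<int> {
  void foo() {}
}

void main() async {
  Future<num>? f;
  // The `await` is analyzed in context `dynamic` because of the
  // declared type of `x`.
  dynamic x = await h(f ?? (g(0)..foo()));
}
```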
This example is accepted by the analyzer, because:

- `await h(f ?? (g(0)..foo()))` is inferred using a context of `dynamic`.
- `h(f ?? (g(0)..foo()))` is inferred using a context of `FutureOr<_>`.
- `f ?? (g(0)..foo())` is inferred using a context of `FutureOr<_>`.
- `g(0)..foo()` is inferred using a context of `FutureOr<_>`.
- `g(0)` is inferred using a context of `FutureOr<_>`.
- Since `g` returns `Future<T>`, and that satisfies `FutureOr<_>` for all `T`, downwards inference of `g(0)` does not constrain the type of `T`. So the type of `T` is set to `int` during upwards inference.
- `g(0)` has static type `Future<int>`.
- `..foo()` resolves to the extension method `foo` defined in `extension on Future<int>`.
But it is rejected by the front end, because:

- `await h(f ?? (g(0)..foo()))` is inferred using a context of `dynamic`.
- `h(f ?? (g(0)..foo()))` is inferred using a context of `dynamic`.
- Since `h` is a generic invocation, the context `dynamic` is changed to `_`, so `f ?? (g(0)..foo())` is inferred using a context of `_`.
- `g(0)..foo()` is inferred using a context of `Future<num>?` (the static type of `f`).
- `g(0)` is inferred using a context of `Future<num>?`.
- Since `g` returns `Future<T>`, and that satisfies `FutureOr<num>?` only if `T <: num`, the type of `T` is set to `num` during downwards inference.
- `g(0)` has static type `Future<num>`.
- `..foo()` does not resolve to the extension method `foo` defined in `extension on Future<int>`.

Whereas this program is accepted by both the analyzer and the front end:
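(The accepted variant is also missing from this copy of the issue. Going by the parenthetical that follows, it is presumably something along these lines; this is a reconstruction, with the declaration shapes inferred from the walkthroughs above rather than quoted.)

```dart
// Reconstruction (assumed declarations), identical to the rejected
// example except that `x` is implicitly typed.
Future<T> g<T>(T t) => Future.value(t);

T h<T>(T t) => t;

extension on Future<int> {
  void foo() {}
}

void main() async {
  Future<num>? f;
  // `x` is now implicitly typed, so the `await` expression is
  // analyzed using a context of `_`.
  var x = await h(f ?? (g(0)..foo()));
}
```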
(The only difference is that the variable `x` has been changed to be implicitly typed rather than having an explicit type of `dynamic`, so the `await` expression is now analyzed using a context of `_`.)

I propose to base my update of the expression inference rules in inference.md on the analyzer's behavior in this corner case, and to change the front end's behavior to match the analyzer's.
This change could in principle cause the front end to reject a program that it previously accepted, or cause a change in the behavior of a compiled program by changing a reified type parameter. But I believe it is extremely unlikely to produce an observable difference in practice, because contexts of `_` and `FutureOr<_>` are treated so similarly by type inference.

Nonetheless, I'm happy to push this change through the breaking change process.
@dart-lang/language-team any objections?