"auto ref" fails to match when IFTI succeeds (strip to level const) #18622
schveiguy (@schveiguy) commented on 2013-07-08T12:10:26Z
For clarification, the error displayed for the given code is:
Error: template testautoref.foo(T)(auto ref T[] i) cannot deduce template function from argument types !()(immutable(int[]))
And it does match immutable(int)[]. As implied, if you change to version(none), it does indeed work.
const(int[]) also fails.
A workaround that seems to work, but I don't really like it, is to add overloads for immutable(T[]) and const(T[]).
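That overload workaround can be sketched as follows (a hypothetical illustration, not code from the report; it assumes the qualified overloads win overload resolution when the generic template fails to deduce):

```d
// Primary template: matches rvalues and unqualified lvalues.
void foo(T)(auto ref T[] i) {}

// Workaround overloads for qualified lvalues, as described above.
void foo(T)(auto ref immutable(T[]) i) {}
void foo(T)(auto ref const(T[]) i) {}

void main()
{
    immutable(int[]) a = [1, 2, 3];
    const(int[]) b = [4, 5];
    foo(a); // generic T[] fails to deduce; the immutable(T[]) overload matches by ref
    foo(b); // likewise, the const(T[]) overload matches by ref
}
```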
code (@MartinNowak) commented on 2013-07-08T17:16:21Z
Just an educated guess: the problem seems to be that the argument is an lvalue, so the signature becomes (ref T[]), which cannot match immutable(int[]).
k.hara.pg commented on 2013-07-08T18:05:35Z
(In reply to comment #2)
> Just an educated guess. The problem seems to be that the value is a L-value so
> the signature becomes (ref T[]) which cannot match immutable(int[]).
That's exactly the current compiler's behavior. "auto ref" always behaves as a "ref" parameter for the lvalue argument `i`, and T then cannot be deduced for T[] from immutable(int[]).
monarchdodra commented on 2013-07-09T03:56:35Z
(In reply to comment #3)
> (In reply to comment #2)
> > Just an educated guess. The problem seems to be that the value is a L-value so
> > the signature becomes (ref T[]) which cannot match immutable(int[]).
>
> That's exactly the current compiler's behavior. "auto ref" always behave as
> "ref" parameter against lvalue argument `i`, then T[] cannot deduce type T from
> immutable(int[]).
More generally, it seems auto ref will never perform a conversion on an lvalue *even if* the parameter is not templatized. This is strange, because auto ref *will* do it for rvalues. Here is another (reduced) example that shows it.
//----
void foo()(auto ref long a);
void main()
{
int get();
int a;
foo(get()); //Fine, rvalue int is cast to long
foo(a); //Nope!
}
//----
main.d(7): Error: template main.foo does not match any function template declaration. Candidates are:
main.d(1): main.foo()(auto ref long a)
main.d(7): Error: template main.foo()(auto ref long a) cannot deduce template function from argument types !()(int)
//----
I also spotted this (which, IMO, is even more problematic):
//----
struct S
{
@property long get();
alias get this;
}
void foo()(auto ref long a);
void main()
{
S s;
foo(S()); //Fine.
foo(s); //Nope!
}
//----
This time, it gives the "cryptic" error:
main.d(14): Error: s.get() is not an lvalue
=> But that's strange: foo takes by auto ref... why should I care about lvalue-ness?
Because s is an lvalue, it would appear the auto ref is "primed" to take by ref. It then fails when an actual rvalue is given.
schveiguy (@schveiguy) commented on 2013-07-11T07:17:57Z
(In reply to comment #3)
> That's exactly the current compiler's behavior. "auto ref" always behave as
> "ref" parameter against lvalue argument `i`, then T[] cannot deduce type T from
> immutable(int[]).
To the user, auto ref should really mean "use ref if possible; otherwise, do not".
In other words, I think auto ref should be the equivalent of having two identical templates, one with ref and one without. Because IFTI can do some implicit conversion, and implicit conversion turns lvalues into rvalues, I think the rule is incorrect.
The algorithm should be:
If the argument is an lvalue, try ref. If that does not work, try the non-ref version. If that does not work either, error.
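The proposed rule can be sketched as two explicit overloads; under it, the OP's qualified lvalue falls through to the by-value template once ref deduction fails (hypothetical illustration, not current compiler behavior):

```d
// By-ref overload: used when the lvalue's type matches T[] exactly.
void foo(T)(ref T[] i) {}
// By-value fallback: IFTI may implicitly convert
// immutable(int[]) to immutable(int)[] before deducing T.
void foo(T)(T[] i) {}

void main()
{
    immutable(int[]) x = [1, 2, 3];
    foo(x); // ref overload fails to deduce; by-value deduces T = immutable(int)
}
```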
monarchdodra commented on 2013-07-11T10:18:31Z
(In reply to comment #5)
> (In reply to comment #3)
>
> > That's exactly the current compiler's behavior. "auto ref" always behave as
> > "ref" parameter against lvalue argument `i`, then T[] cannot deduce type T from
> > immutable(int[]).
>
> To the user, auto ref should really mean "use ref if possible, otherwise do
> not"
>
> In other words, I think auto ref should be the equivalent of having two
> identical templates, one with ref, and one without. Because IFTI can do some
> implicit casting, and implicit casting changes lvalues to rvalues, I think the
> rule is incorrect.
For the record (without putting any words in Kenji's mouth), I think he was just stating what the compiler was *doing*, and why the code was rejected. I don't think he meant to say that the current behavior was correct.
Kenji, could you confirm that this is what you meant, and that this is indeed a "rejects-valid" bug?
k.hara.pg commented on 2013-07-11T11:04:34Z
(In reply to comment #6)
> For the record (without putting any words in Kenji's mouth), I think he was
> just stating what the compiler was *doing*, and why the code was rejected. I
> don't think he meant to say that the current behavior was correct.
>
> Kenji, could you confirm that this is what you meant? That this is a correct
> "rejects-valid" ?
No, indeed I was just describing the current compiler's behavior, but I'm sure that the current behavior is correct.
The main point is what "auto ref" does: ref-ness deduction.
If the given argument is an lvalue, the auto ref parameter becomes ref; if it's an rvalue, it becomes non-ref. Nothing else is done.
Type deduction and ref-ness deduction are completely orthogonal. That means the combination of the two deduction results may ultimately reject a given argument. That is exactly what happened in the OP's code.
So I can say it's expected behavior.
I believe that one language feature (here, "auto ref") should do just one thing correctly. Orthogonality between features reduces the language's "special cases", and combining them brings abundant usage. For example, when you use "auto ref", ref-ness deduction reliably avoids copy-construction of the given argument.
A brief conclusion: "auto ref" is not a magic parameter.
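The copy-avoidance point can be illustrated with a struct whose postblit is observable (a sketch with made-up names, not code from the thread):

```d
struct Tracker
{
    static int copies;       // counts postblit invocations
    int[16] payload;
    this(this) { ++copies; } // runs whenever a copy is made
}

void consume()(auto ref Tracker t) {}

void main()
{
    Tracker t;
    consume(t);         // lvalue: parameter deduced as ref, no postblit runs
    consume(Tracker()); // rvalue: parameter is non-ref; the temporary is moved
    assert(Tracker.copies == 0);
}
```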
monarchdodra commented on 2013-07-11T11:17:31Z
(In reply to comment #7)
> (In reply to comment #6)
> > For the record (without putting any words in Kenji's mouth), I think he was
> > just stating what the compiler was *doing*, and why the code was rejected. I
> > don't think he meant to say that the current behavior was correct.
> >
> > Kenji, could you confirm that this is what you meant? That this is a correct
> > "rejects-valid" ?
>
> No, indeed I just talked about the current compiler's work, but I'm sure that
> current behavior is correct.
>
> The main point is what "auto ref" is doing - it's ref-ness deduction.
> If given argument is an lvalue, auto ref parameter would become ref. If it's an
> rvalue, would become non-ref. Nothing else is done.
> Type deduction and ref-ness deduction are completely orthogonal. It means that
> the combination of each deduction results might finally reject given argument.
> It's exactly what happened in the OP code.
>
> So I can say it's expected behavior.
I strongly disagree. "Ref-ness deduction" should be applied *after* type deduction; applying it *before* makes no sense.
Having a function that accepts an rvalue of type U, yet rejects an lvalue of type U, is unheard of and strongly counterintuitive. Finally, regardless of "how" auto ref works, this definitely goes against "what" auto ref is trying to solve: writing functions that can take arguments either by value or by ref, without having to write overloads.
As is, the only solution here is to *not* use auto ref, but instead write ref/value overloads: auto ref => failed to provide the solution it was designed for => the design is flawed.
schveiguy (@schveiguy) commented on 2013-07-11T11:32:46Z
(In reply to comment #7)
> The main point is what "auto ref" is doing - it's ref-ness deduction.
> If given argument is an lvalue, auto ref parameter would become ref. If it's an
> rvalue, would become non-ref. Nothing else is done.
> Type deduction and ref-ness deduction are completely orthogonal. It means that
> the combination of each deduction results might finally reject given argument.
> It's exactly what happened in the OP code.
I agree that that is what auto ref is supposed to do, but in this case it's hurting more than helping. I can't think of a valid reason to forbid this, can you?
This bug report should then be changed to an enhancement (and the description appropriately modified).
k.hara.pg commented on 2013-07-11T17:36:24Z
(In reply to comment #8)
> I strongly disagree. The argument of "ref-ness deduction" should be applied
> *after* type deduction. though. Applying it *before* makes no sense.
>
> Having a function that accepts an RValue of type U, yet reject an LValue of
> type U, is unheard of. It is strongly anti-expected. Finally, regardless of
> "how" auto ref works, this definitely goes against "what" auto ref is trying to
> solve: Writing functions that can take either by value or by ref, without
> having to write overloads.
No. I repeat: the "auto ref" parameter is not magic. I think ref-ness deduction is the essential purpose of "auto ref". Accepting both lvalues and rvalues is not its *direct* purpose; it's a secondary behavior derived from ref-ness deduction. It's a simple, consistent, and orthogonal rule.
> As is, the only solution here is to *not* use auto ref, but instead, write
> ref/value overloads: auto ref => Failed to provide solution it was designed for
> => design is flawed.
No; instead you can use the following signatures:
void foo(T)(auto ref T i) if (isDynamicArray!T) {}
// Accept any type as T with ref-ness deduction,
// then restrict T to dynamic arrays.
// (needs: import std.traits : isDynamicArray;)
or:
void foo(T)(auto ref inout(T[]) i) {}
// Capture the type qualifier with inout; receiving a qualified lvalue
// by ref then won't violate const correctness.
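A self-contained sketch of the first suggested signature, showing that it accepts the OP's qualified lvalue (assuming std.traits.isDynamicArray for the constraint):

```d
import std.traits : isDynamicArray;

// Deduce the whole type, qualifiers included, as T; then constrain it.
void foo(T)(auto ref T i) if (isDynamicArray!T) {}

void main()
{
    immutable(int[]) x = [1, 2, 3];
    foo(x); // T is deduced as the qualified array type; the parameter becomes ref
}
```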
k.hara.pg commented on 2013-07-11T17:51:05Z
(In reply to comment #9)
> (In reply to comment #7)
> > The main point is what "auto ref" is doing - it's ref-ness deduction.
> > If given argument is an lvalue, auto ref parameter would become ref. If it's an
> > rvalue, would become non-ref. Nothing else is done.
> > Type deduction and ref-ness deduction are completely orthogonal. It means that
> > the combination of each deduction results might finally reject given argument.
> > It's exactly what happened in the OP code.
>
> I agree that was what auto-ref is supposed to do, but in this case, it's
> hurting more than helping. I can't think of a valid reason to forbid this, can
> you?
Current "auto ref" does work under the IFTI mechanism, and IFTI does not implicitly convert the function arguments during the deduction process (In this case, convert immutable(int[]) to immutable(int)[]). It should occur *after* type deduction finished.
> This bug report should be changed to an enhancement, though (and the
> description appropriately modified).
I think we should move forward with a non-template auto ref design (in ref? scope ref?) rather than modifying the current auto ref. It has already been determined that such a design won't deduce ref-ness, so it should work on non-template functions.
monarchdodra reported this on 2013-07-08T10:05:36Z
Transferred from https://issues.dlang.org/show_bug.cgi?id=10574