
Proposal: Tuple deconstruction rules should be equivalent to method overload resolution rules #1998

Closed
Jeremy-Price opened this issue Nov 12, 2018 · 11 comments

Comments

@Jeremy-Price

The ‘Deconstruct’ declarations:

    class Foo
    {
        public void Deconstruct(out int a, out int b) => a = b = 0;
        public void Deconstruct(out string c, out string d) => c = d = null;
    }

and usages:

    (int a, int b) = foo;
    (string c, string d) = foo;

Lead to ambiguous invocation errors.

Calling these as regular methods, however, works fine:

    foo.Deconstruct(out int a, out int b);
    foo.Deconstruct(out string c, out string d);

Of course, using ‘var’ would be ambiguous here, and the behavior of deconstruction suggests that it is compiled as the explicit calls above, but with ‘var’.

But it’s not uncommon for a type to be deconstructable into two different tuples that just happen to have the same arity, so I think the deconstruction behavior should be equivalent to explicitly calling these methods.
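For reference, here is a minimal, compilable sketch (using a hypothetical Point type) of the case that works today: when a type declares only one Deconstruct of a given arity, the deconstruction assignment behaves like the explicit out-parameter call.

```csharp
using System;

class Point
{
    public int X { get; }
    public int Y { get; }

    public Point(int x, int y) => (X, Y) = (x, y);

    // A single arity-2 Deconstruct: resolution by arity succeeds.
    public void Deconstruct(out int x, out int y) => (x, y) = (X, Y);
}

class Program
{
    static void Main()
    {
        var p = new Point(3, 4);

        // Deconstruction assignment: the compiler picks Deconstruct by arity (2)...
        (int a, int b) = p;

        // ...which is effectively this explicit call:
        p.Deconstruct(out int c, out int d);

        Console.WriteLine(a == c && b == d); // True
    }
}
```

The ambiguity only appears once a second arity-2 overload is added, even though the explicit calls would still resolve fine.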

@HaloFour
Contributor

@Jeremy-Price

But it’s not uncommon for a type to be deconstructable into two different tuples that just happen to have the same arity,

What's your cite for this?

I'd argue that not only should it not be allowed, but it also doesn't make any sense. Deconstruction of a tuple (or tuple-likes) should be strictly by arity, not by some combination of the types of elements.

@Jeremy-Price
Author

The arity of a deconstructor is just a random number; it's an accident of how the type is factored and has nothing to do with its semantics.

Consider, for example, a type 'Resource' that can be deconstructed into a (name, path) or a (name, guid) pair. The fact that the second pair is (name, guid) rather than some other combination of implementation data structures is an accident, and I don't see any good reason to forbid this kind of behavior.
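Under the current rules, a sketch of this hypothetical Resource can only expose one of the two arity-2 forms; the second would have to be dropped or surfaced some other way.

```csharp
using System;

class Resource
{
    public string Name { get; }
    public string Path { get; }
    public Guid Id { get; }

    public Resource(string name, string path, Guid id)
        => (Name, Path, Id) = (name, path, id);

    // Only one arity-2 Deconstruct is allowed. Adding a second
    // (out string name, out Guid id) overload would make every
    // two-element deconstruction of Resource ambiguous.
    public void Deconstruct(out string name, out string path)
        => (name, path) = (Name, Path);
}

class Program
{
    static void Main()
    {
        var r = new Resource("config", "/etc/app", Guid.NewGuid());
        (string name, string path) = r;
        Console.WriteLine($"{name} {path}");
    }
}
```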

@HaloFour
Contributor

@Jeremy-Price

It's not a random number. Deconstruction is strictly positional and is based entirely on arity, not structure. Deconstruction was designed together with recursive pattern matching, where each position can be further matched against a type, so overloading by element type is not something that makes sense there. The idea that a type can be deconstructed into two separate tuple forms of the same arity but with different structures just doesn't make sense.

@theunrepentantgeek

Here's an example where it might make sense - what if an instance of FileInfo could be deconstructed in two different ways, like this ...

    (DirectoryInfo folder, string fileName) = fileReference;
    (string folderName, string fileName) = fileReference;

As a determined fan of semantic types, I'd argue that the first form is superior - but I know (and respect!) developers who would be equally certain that the second form is the best.

Currently, only one could be supported because of the arity rule.
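To make the constraint concrete, here is a compilable sketch with a hypothetical FileReference type standing in for FileInfo; only one of the two arity-2 forms can usefully exist, so the other is described in a comment.

```csharp
using System;

class FileReference
{
    public string FolderName { get; }
    public string FileName { get; }

    public FileReference(string folderName, string fileName)
        => (FolderName, FileName) = (folderName, fileName);

    // Only one arity-2 form is usable. Declaring a second
    // (out DirectoryInfo folder, out string fileName) overload would
    // compile, but every two-element deconstruction of FileReference
    // would then fail as ambiguous.
    public void Deconstruct(out string folderName, out string fileName)
        => (folderName, fileName) = (FolderName, FileName);
}

class Program
{
    static void Main()
    {
        var fileReference = new FileReference("/var/log", "app.log");
        (string folderName, string fileName) = fileReference;
        Console.WriteLine($"{folderName}/{fileName}");
    }
}
```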

TBH, I'm not convinced that either form should be available - it seems to be bordering on misuse of deconstruction. Then again, I'm not wholly against it either ... #conflicted 😕

@ufcpp

ufcpp commented Nov 13, 2018

#1092

@Austin-bryan

I'd argue that not only should it not be allowed, but it also doesn't make any sense. Deconstruction of a tuple (or tuple-likes) should be strictly by arity, not by some combination of the types of elements.

What makes you think that?

I was going to say nay on this feature because I thought order doesn't matter when we deconstruct, but since it does, I think it would make more sense to be consistent with methods.

At the same time, though, it could be confusing in someone else's code, as deconstructors aren't as discoverable as methods, where you can just hit . and start typing the keywords you think the method would have.

@DavidArno

Definitely worth reading the thread that @ufcpp links to (#1092). As @HaloFour says, allowing multiple Deconstruct overloads with the same number of parameters would cause all sorts of issues with recursive pattern matching. The latter is made far simpler by allowing, in effect, only one Deconstruct per arity.

It's a trade-off. Overloading of deconstructors would occasionally be useful. Recursive pattern matching will be incredibly useful. So it makes sense to favour language rules that benefit the latter over the former.

@gafter
Member

gafter commented Nov 13, 2018

The current rules are necessary to support pattern-matching. The rules are intentional and a result of deliberate decisions in C# 7.

@gafter gafter closed this as completed Nov 13, 2018
@Jeremy-Price
Author

I don't understand how this conflicts with pattern matching in the case where types are explicitly stated on the left-hand side. There is no nontrivial pattern matching in that case, and that rule does not conflict with more elaborate pattern-matching schemes. In those cases, the types would be absent from the left-hand side.

@gafter
Member

gafter commented Nov 13, 2018

Pattern-matching does not have a left-hand side. The more elaborate pattern-matching scheme is a parenthesized list of patterns to match, and we use the number of patterns to decide which Deconstruct to use.
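A minimal sketch of this (hypothetical Pair type; positional patterns as shipped in C# 8): the element count of each parenthesized pattern, not the element types, selects the Deconstruct overload.

```csharp
using System;

class Pair
{
    public int A { get; }
    public int B { get; }
    public int C { get; }

    public Pair(int a, int b, int c) => (A, B, C) = (a, b, c);

    // One Deconstruct per arity: the pattern's element count
    // picks an overload unambiguously.
    public void Deconstruct(out int a, out int b) => (a, b) = (A, B);
    public void Deconstruct(out int a, out int b, out int c) => (a, b, c) = (A, B, C);
}

class Program
{
    public static string Describe(Pair p) => p switch
    {
        (0, _)    => "first is zero",  // two patterns -> arity-2 Deconstruct
        (_, _, 0) => "third is zero",  // three patterns -> arity-3 Deconstruct
        _         => "other"
    };

    static void Main()
    {
        Console.WriteLine(Describe(new Pair(0, 5, 9))); // first is zero
        Console.WriteLine(Describe(new Pair(1, 5, 0))); // third is zero
    }
}
```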

@lantz83

lantz83 commented Apr 19, 2019

Any chance this will change in the future? It limits the "neatness" of deconstruction pretty severely, imo.
