Type inference failure involving binary operators, traits, references, and integer defaulting #36549
Comments
This is due to the fact that we "short-circuit" primitive operations: since we already know the primitive types involved, we don't go through full trait selection for them. In fact, this case is mentioned in a comment in the compiler explaining why we do this. For the middle two cases, the compiler effectively supplies the type hint for you, since it knows the primitive type of the other operand.

I'm not sure how to deal with this. I'm not sure if we even can. We need to know the types to select the trait implementation, but in this case, the implementation is required to select the correct type.
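A small illustration of the asymmetry described above (my own example, not from the thread; the diagnostics in the comments are paraphrased):

```rust
fn main() {
    // When the left-hand side is a known primitive, the compiler supplies the hint:
    let x: u32 = 5;
    let _ = (x >> 8) & 0xff;      // ok: `0xff` is hinted to `u32`

    // When the impl that would be selected is itself what determines the types,
    // inference gets stuck and the literal falls back to `i32`:
    let c: &u32 = &5;
    // let _ = (c >> 8) & 0xff;   // error: `u32: BitAnd<i32>` is not satisfied
}
```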
@Aatch I feel like the types applied by the integer type default should not be forced; namely, we could assign them as defaults that later inference is still allowed to override.
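A rough sketch of what a non-forced default might mean, under my reading of the suggestion (the "could still override" behavior is hypothetical, not what rustc does today):

```rust
fn main() {
    // Integer fallback as it works today: a literal with no other constraint becomes `i32`.
    let a = 1;                 // a: i32 after fallback
    let b = 1u8 + 1;           // the second literal unifies with `u8` before fallback applies

    // The suggestion, as I read it: apply the `i32` fallback only as a last resort that
    // constraints discovered later through trait selection (e.g. `u32: BitAnd<u32>`)
    // could still override.
    let _ = (a, b);
}
```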
Mark-Simulacrum added the A-inference label on Jun 23, 2017
Mark-Simulacrum added the C-bug label on Jul 26, 2017
Edit: I have moved this concern to a new issue: #57447

Original post:

A much simpler demonstration (relevant URLO thread):

```rust
let _: f32 = 1. - 1.;   // allowed
let _: f32 = 1. - &1.;  // type error
let _: f32 = &1. - 1.;  // type error
let _: f32 = &1. - &1.; // type error
```

To a mortal like me, it seems that the only reason the first line works is that the compiler must have a special case for binary operations between two unconstrained "floating-point flavored" type inference variables. Can the compiler not just special-case the latter three examples in the same way it special-cases the first?
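For contrast, here is a variant (my own, not from the linked thread) where spelling out the operand types sidesteps the special case entirely:

```rust
fn main() {
    let _: f32 = 1. - 1.;        // allowed: two unconstrained "float-flavored" variables
    let _: f32 = 1f32 - &1f32;   // allowed: both operand types are known, so the
                                 // `Sub<&f32> for f32` impl is selected directly
    // let _: f32 = 1. - &1.;    // rejected: the impl would reveal the types, but the
    //                           // types are needed to choose the impl
}
```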
```rust
// Finally, and most oddly, using an identity cast or type ascription
// from `u32` to `u32` also convinces the inference engine:
let c: &u32 = &5; ((c >> 8) as u32 & 0xff) as u8;
```

I confess that this example is surprising. It's hard to picture what the current implementation actually looks like. :/
Never mind, I see now. These may be different issues. In my understanding, that last example works because the identity cast pins the type of `c >> 8` to `u32`, which in turn lets `0xff` be inferred as `u32`.
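If that reading is right, naming the intermediate result with an explicit type should work for the same reason as the identity cast; a minimal sketch (my own example):

```rust
fn main() {
    let c: &u32 = &5;
    // Pinning the shift result to `u32` up front has the same effect as `as u32`:
    // once the left-hand side of `&` is a known primitive, `0xff` is hinted to `u32`.
    let shifted: u32 = c >> 8;
    let _ = (shifted & 0xff) as u8;
}
```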
solson commented on Sep 16, 2016
We expect this to compile without error (reasoning below):
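The code block itself is missing from this copy; the following is a reconstruction pieced together from the explanation and working examples below, so the `fn main` wrapper and the trailing `as u8` are assumptions on my part:

```rust
fn main() {
    let c: &u32 = &5;
    ((c >> 8) & 0xff) as u8;
    // rejected: `0xff` falls back to `i32`, and `u32: BitAnd<i32>` is not satisfied
    // (see "Problem" below)
}
```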
Compile error:
Problem: The `0xff` is getting inferred to `i32` instead of `u32`, which would work.

Explanation of my understanding of the code:

- `c` has type `&u32`
- `8` is defaulted to `i32` (perhaps `0xff` is also unhelpfully defaulted to `i32` at this stage?)
- `c >> 8` has type `u32` via `impl<'a> Shr<i32> for &'a u32` (to the best of my knowledge)
- we want `0xff` to infer to `u32` to use the `u32: BitAnd<u32>` impl, but it fails

Working examples (each with a slight twist):
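The working-example snippets did not survive either; the variants below are a best-effort reconstruction based on the identity-cast line quoted earlier in the thread and on the explanation above, so the exact forms may differ from the original post:

```rust
fn main() {
    let c: &u32 = &5;
    ((c >> 8) & 0xffu32) as u8;      // give the literal an explicit type
    ((*c >> 8) & 0xff) as u8;        // dereference `c`, so the left-hand type is primitive
    ((c >> 8) as u32 & 0xff) as u8;  // identity cast from `u32` to `u32`
}
```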
Who thought identity casts were useless?
cc @retep998 @nagisa