#[repr(C)] enum bit width determination algo does not match that of the C compiler #28925
Credit goes out to @retep998 for discovering this.
See also …
@briansmith: I think what the C compiler does by default is sometimes determined by the ABI. For example, I believe GCC's default for that flag sometimes differs depending on how GCC was configured when it was built. IMO, instead of matching the default of "the" C compiler (in fact, there is often more than one C compiler, each of which may have a different default), Rust should default to matching the ABI.
I tripped over this while trying to add …
This by itself is arguably a bug, even on boring platforms like x86 Linux. Simple constructive proof:

```c
#include <stdio.h>

enum E {
    L = 0,
    H = 0x100000000
};

int main() {
    printf("%zu > %zu\n", sizeof(enum E), sizeof(void*));
    return 0;
}
```
That isn't portable ISO C, but GCC and Clang accept it without warnings unless pedantic diagnostics are enabled. So, trying to summarize the problems here: …
As for …
In C, support for enum discriminants that cannot be represented by an `int` is implementation-defined. Many compilers choose a different basic type for the representation, based on the types or the values of the discriminants. This is where the algorithm currently used by Rust for `#[repr(C)]` enums may differ from the one used by the C compiler, although the difference can be observed only in fairly marginal cases.

GCC uses the types of the discriminant expressions to find the best fit, so this is a 32-bit type on 64-bit Linux:
Yet this is 64-bit:
This is because the type of the integer constant `0x80000000` is determined as `unsigned int` due to its hexadecimal notation, and the negation preserves the type.

Rust first coerces all discriminants to `isize` and then apparently works out the best-fitting representation type from the value range, where fitting means that negative values are preserved as such. So this ends up being 64-bit:

I'm uncertain as to which approach is best for fixing this. Trying to match the behavior of C compilers quirk for quirk does not seem feasible: there may be more than one compiler per target, and in fact the behavior even changes with compiler options, as e.g. the type of an integer constant is determined differently pre-C99 and post-C99. Perhaps a more conservative solution would be to lint on discriminant values that are outside the `libc::c_int` domain and suggest using fixed-width representations such as `#[repr(u32)]`.
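The Rust snippets referred to above are also missing from this copy. A hedged sketch of the two behaviors discussed (the enum names and discriminant values are illustrative, not the exact ones from the report): under value-range fitting, a `#[repr(C)]` enum whose discriminants span -1 through 2^31 needs a 64-bit representation, while the fixed-width representation suggested as a workaround pins the size explicitly:

```rust
use std::mem::size_of;

// Value-range fitting: no 32-bit type, signed or unsigned, can hold
// both -1 and 0x8000_0000 (= 2147483648), so this ends up 64-bit.
#[allow(dead_code)]
#[repr(C)]
enum Spanning {
    NegOne = -1,
    Big = 0x8000_0000,
}

// The fixed-width representation suggested at the end of the issue:
// the size is that of u32 regardless of what a C compiler would pick.
#[allow(dead_code)]
#[repr(u32)]
enum Fixed {
    Max = 0xffff_ffff,
}

fn main() {
    println!("{} {}", size_of::<Spanning>(), size_of::<Fixed>());
}
```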