compiler rejects valid ull UInt128 literals #313

Open
stepancheg opened this Issue Aug 7, 2012 · 2 comments

2 participants

@stepancheg
Collaborator

Seems like the compiler rejects valid uint128 literals, as in this code:

main() {
    var a = 0xff0102030405060708090a0b0c0d0e0f_ull;
}
###############################
main() {
    var a = 0xff0102030405060708090a0b0c0d0e0f_ull;
------------^
}

###############################
./tmp.clay(2,12): error: uint128 literal out of range

compilation context: 
    main()
  ./lib-clay/core/system/system.clay(22,30):
    runMain(Int32, Pointer[Pointer[Int8]], Static[main])
  ./lib-clay/core/system/system.clay(32,13):
    callMain(Static[main], Int32, Pointer[Pointer[Int8]])
    external main
@jckarter
Owner

Literals larger than 64 bits aren't supported yet. GCC and clang have similar limitations. Note that int128 will only work on Unix x86-64 platforms as well.

@stepancheg
Collaborator

In that case, a compiler message that explains it is a known issue would help.
