Closed
Description
I expect this program to print the precision of the internal multiprecision floats used for storing floating-point constants in the compiler. cmd/gc uses an Mpint with 16 29-bit values, which should be 463 bits plus a sign bit. The program prints 434 on the playground, which is 15*29-1, not 16*29-1.

It is possible that my logic here is wrong and that the program does not in fact compute the precision of the internal floating-point representation used for constants. However, being off by 29 in a compiler with 29-bit mp limbs cannot be a coincidence. I wonder if the final word in the compiler representation is being maintained or used incorrectly. It wouldn't be the first time.

```go
package main

import (
	"fmt"
	"math"
)

const ulp = (1.0 + (2.0 / 3.0)) - (5.0 / 3.0)

func main() {
	fmt.Println(math.Log(1/math.Abs(ulp)) / math.Log(2))
}
```

http://play.golang.org/p/_3XxPwYeVx
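For reference, the same probe can be run at an arbitrary mantissa width with math/big (a sketch, not part of the original report; `mantissaBits` is a hypothetical helper name). When `(1 + 2/3) - 5/3` is folded at p-bit precision with round-to-nearest-even, the residue works out to exactly ±2^(1-p), so log2(1/|ulp|) recovers p-1 — which is why 434 points at a 435-bit (15*29) mantissa rather than the expected 464-bit (16*29) one:

```go
package main

import (
	"fmt"
	"math/big"
)

// mantissaBits folds the constant expression (1 + 2/3) - 5/3 at the
// given mantissa precision, the way constant arithmetic in the
// compiler would, and recovers the precision from the size of the
// residue. For this expression the residue is exactly ±2^(1-prec),
// so the function returns prec-1.
func mantissaBits(prec uint) int {
	two := big.NewFloat(2)
	three := big.NewFloat(3)
	five := big.NewFloat(5)

	// Each operation rounds its result to prec mantissa bits.
	twoThirds := new(big.Float).SetPrec(prec).Quo(two, three)
	fiveThirds := new(big.Float).SetPrec(prec).Quo(five, three)
	sum := new(big.Float).SetPrec(prec).Add(big.NewFloat(1), twoThirds)
	ulp := new(big.Float).SetPrec(prec).Sub(sum, fiveThirds)

	// MantExp returns exp with |ulp| = mant * 2^exp and 0.5 <= |mant| < 1,
	// so for |ulp| = 2^(1-prec) it returns 2-prec; log2(1/|ulp|) = 1-exp.
	return 1 - ulp.MantExp(nil)
}

func main() {
	fmt.Println(mantissaBits(53))  // float64: prints 52
	fmt.Println(mantissaBits(464)) // 16 * 29 bits: prints 463
	fmt.Println(mantissaBits(435)) // 15 * 29 bits: prints 434
}
```

At prec 53 this matches the float64 result (2^-52), and at 435 bits it matches the 434 the playground printed.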