Description
`.clif` Test Case
```clif
test optimize
set opt_level=none
set preserve_frame_pointers=true
set enable_multi_ret_implicit_sret=true

function %main() -> i64, f32 fast {
block0():
    v8 = iconst.i64 -3524126683585344751
    v10 = f32const 0x1.66e07ap-1
    v36 = band.f32 v10, v10
    return v8, v36
}
; print: %main()
```
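For context, one way to drive the aarch64 compilation of this function from Rust (enough to hit the panic shown below) is sketched here. This is a rough illustration assuming recent `cranelift-codegen` / `cranelift-reader` crates (exact signatures vary between versions), not the harness that produced the outputs; `testcase.clif` is a placeholder for the file above.

```rust
// Rough reproduction sketch, assuming recent cranelift-codegen / cranelift-reader
// crates (exact signatures vary between versions). "testcase.clif" is a
// placeholder for the test case above.
use cranelift_codegen::isa;
use cranelift_codegen::settings::{self, Configurable};
use cranelift_codegen::Context;
use cranelift_reader::parse_functions;

fn main() {
    let clif = std::fs::read_to_string("testcase.clif").expect("read test case");

    // Mirror the `set` lines from the test case; change "none" to "speed"
    // to compare the two configurations.
    let mut builder = settings::builder();
    builder.set("opt_level", "none").unwrap();
    builder.set("preserve_frame_pointers", "true").unwrap();
    builder.set("enable_multi_ret_implicit_sret", "true").unwrap();

    // Build the aarch64 backend; the regalloc2 panic reported below fires
    // inside `compile`, it is not a returned error.
    let isa = isa::lookup_by_name("aarch64")
        .expect("aarch64 backend enabled")
        .finish(settings::Flags::new(builder))
        .expect("valid flags");

    for func in parse_functions(&clif).expect("parse CLIF") {
        let mut ctx = Context::for_function(func);
        if let Err(e) = ctx.compile(&*isa, &mut Default::default()) {
            eprintln!("compile error: {}", e.inner);
        }
    }
}
```

Flipping `opt_level` to `speed` in the sketch corresponds to the second set of results further down.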
When `opt_level` is set to `none`, the result is as follows:
```
[x86    ] %main() -> [-3524126683585344751, 0x1.66e07ap-1]
[riscv64] %main() -> [-3524126683585344751, 0x1.66e07ap-1]
[aarch64] thread 'main' panicked at /home/obfuscator/.cargo/registry/src/index.crates.io-1949cf8c6b5b557f/regalloc2-0.12.2/src/ion/process.rs:1253:17:
Could not allocate minimal bundle, but the allocation problem should be possible to solve
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
```
But when `opt_level` is switched to `speed`:
```
[x86    ] %main() -> [-3524126683585344751, 0x1.66e07ap-1]
[riscv64] %main() -> [-3524126683585344751, 0x1.66e07ap-1]
[aarch64] %main() -> [-3524126683585344751, 0x1.66e07ap-1]
```
In this example, there is only one `band` instruction. On aarch64, compilation panics when optimization is disabled (`opt_level=none`) but succeeds when it is enabled (`opt_level=speed`). What is the reason for that?