Hi there,
this issue was already discussed on Discourse. While we gathered some clarification there, it could not be solved, hence I am raising it as an issue here.
I am the author of ExtensibleEffects.jl and TypeClasses.jl and am running into serious performance difficulties due to bad type inference. Since I have received multiple requests from the community asking whether ExtensibleEffects.jl could be made fast, I want to tackle these problems.
ExtensibleEffects.jl is an advanced functional package, so please bear with me if it is not obvious from the following example why you would ever want to do something like this. It is a minified version of the actual ExtensibleEffects code, and I am sorry that I have not been able to simplify it any further so far. At least it is reproducible and fits into an issue.
My Motivation
If you want to understand the motivation for the example, this might help:
```julia
using ExtensibleEffects
using TypeClasses
using Test

vector_of_eff_of_vector = map(x -> noeffect([x]), [1, 20])
e1 = vector_of_eff_of_vector[1]
e2 = vector_of_eff_of_vector[2]

# some functional monadic helpers to work WITHIN the effects
mygoal(e1, e2) = @syntax_flatmap begin
    v1 = e1
    v2 = e2
    @pure [v1; v2]
end
```
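For intuition: the `@syntax_flatmap` block corresponds roughly (glossing over the exact macro expansion of TypeClasses.jl) to nested `flatmap`/`map` calls over the two effects; the hand-written reproduction further below is just a curried variant of this pattern.

```julia
# Rough sketch of what the block expresses -- not the literal macro expansion:
# flatmap over e1, map over e2, and combine the two inner vectors.
mygoal_sketch(e1, e2) =
    TypeClasses.flatmap(v1 -> TypeClasses.map(v2 -> [v1; v2], e2), e1)
```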
```julia
julia> mygoal(e1, e2)
Eff(effectful=NoEffect{Vector{Int64}}([1, 20]), length(cont)=0)

julia> @inferred mygoal(e1, e2)
ERROR: return type ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}} does not match inferred return type ExtensibleEffects.Eff

julia> @code_warntype mygoal(e1, e2)
# on the terminal this gives nice color output and will show that only the last step does not infer
```
The case boils down to something like the following:
```julia
function test_fails(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)
    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end

@inferred test_fails(e1, e2)  # same as before
# ERROR: return type ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}} does not match inferred return type ExtensibleEffects.Eff

@code_warntype test_fails(e1, e2)  # same as before
```
If this could be stabilized, much would be gained for ExtensibleEffects.jl.
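A convenient way to inspect this without hitting the `@inferred` error is `Base.return_types` (assuming the definitions above are loaded); it simply reports what inference concludes for the given argument types, which here is the abstract `ExtensibleEffects.Eff` rather than the concrete type from the error message.

```julia
# Report the inferred return type(s) of test_fails for these argument types.
# Per the @inferred error above, this yields the abstract ExtensibleEffects.Eff
# instead of ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}.
Base.return_types(test_fails, (typeof(e1), typeof(e2)))
```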
Same code, two execution orders, one fails, the other infers
Here is the one which works:
```julia
using ExtensibleEffects
using TypeClasses
using Test

vector_of_eff_of_vector = map(x -> noeffect([x]), [1, 20])
e1 = vector_of_eff_of_vector[1]
e2 = vector_of_eff_of_vector[2]

function prepare_test(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)
    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    f_flatmap, e1_f
end

f_flatmap, e1_f = prepare_test(e1, e2)
@inferred TypeClasses.flatmap(f_flatmap, e1_f)  # infers perfectly

function test_infers(e1, e2)
    f_flatmap, e1_f = prepare_test(e1, e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end

@inferred test_infers(e1, e2)  # infers perfectly
```
And here is the one which fails:
```julia
using ExtensibleEffects
using TypeClasses
using Test

vector_of_eff_of_vector = map(x -> noeffect([x]), [1, 20])
e1 = vector_of_eff_of_vector[1]
e2 = vector_of_eff_of_vector[2]

function prepare_test(e1, e2)
    combine(v1, v2) = [v1; v2]
    curried_combine(v1) = v2 -> combine(v1, v2)
    e1_f = map(curried_combine, e1)
    f_flatmap(f) = TypeClasses.map(v2 -> f(v2), e2)
    f_flatmap, e1_f
end

function test_infers(e1, e2)
    f_flatmap, e1_f = prepare_test(e1, e2)
    TypeClasses.flatmap(f_flatmap, e1_f)
end

@inferred test_infers(e1, e2)
# ERROR: return type ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}} does not match inferred return type ExtensibleEffects.Eff

f_flatmap, e1_f = prepare_test(e1, e2)
@inferred TypeClasses.flatmap(f_flatmap, e1_f)
# ERROR: return type ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}} does not match inferred return type ExtensibleEffects.Eff
```
It seems that a top-level call to TypeClasses.flatmap at the right place informs the compiler about things it usually does not have available (but should have available).
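As a stop-gap for downstream code (just a sketch, assuming the concrete `Eff` type is known at the call site, as it is in the error message above), a plain return-type assertion keeps callers type-stable even while the body itself only infers to the abstract `Eff`:

```julia
# Hypothetical wrapper, not part of ExtensibleEffects.jl or TypeClasses.jl:
# the assertion gives inference a concrete type at the function boundary.
function test_infers_asserted(e1, e2)
    f_flatmap, e1_f = prepare_test(e1, e2)
    TypeClasses.flatmap(f_flatmap, e1_f)::ExtensibleEffects.Eff{NoEffect{Vector{Int64}}, Tuple{}}
end
```
Of course this does not scale to the generic ExtensibleEffects.jl code, where the concrete `Eff` type depends on the inputs, so it is only a local band-aid and no substitute for fixing the inference itself.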
This drives me crazy 😄 I feel like a little child: 10 years of programming experience are not enough to solve this on my own. I am depending on you deep-core Julia developers and hope someone recognizes what is going on here.
(tested on Julia 1.7.1 and Julia 1.8.0-beta3.4)
```julia
julia> versioninfo()
Julia Version 1.7.1
Commit ac5cc99 (2021-12-22 19:35 UTC)
Platform Info:
  OS: Linux (x86_64-pc-linux-gnu)
  CPU: Intel(R) Core(TM) i7-1065G7 CPU @ 1.30GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-12.0.1 (ORCJIT, icelake-client)
```

```julia
julia> versioninfo()
Julia Version 1.8.0-beta3.4
Commit a4e69c5088 (2022-05-20 09:32 UTC)
Platform Info:
  OS: Linux (x86_64-unknown-linux-gnu)
  CPU: 8 × Intel(R) Core(TM) i7-1065G7 CPU @ 1.30GHz
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-13.0.1 (ORCJIT, icelake-client)
  Threads: 1 on 8 virtual cores
Environment:
  LD_LIBRARY_PATH = /run/opengl-driver/lib:/run/opengl-driver-32/lib:/usr/lib:/usr/lib32:/nix/store/0fih0yvy9lwxkaaci06gw0x1f5a5aqld-sane-config/lib/sane
```