
remove incorrect usage of generate function #39

Closed
wants to merge 3 commits into from

Conversation

vtjnash
Contributor

@vtjnash commented Apr 12, 2017

This usage cannot be inferred, as it calls the trait function at an undefined point in time. It is incorrect and invalid to treat generated as a pure annotation. Instead, assume that the user will provide inferrably-pure functions as traits.

(untested locally)

Edit (by @mauro3): Other issues touching on this:
- #39
- #40
- JuliaLang/julia#21356

@@ -152,13 +151,13 @@ macro traitimpl(tr)
     if !negated
         return quote
             $fnhead = $trname{$(paras...)}
-            $isfnhead = true # Add the istrait definition as otherwise
+            VERSION < "v0.6-" && ($isfnhead = true) # Add the istrait definition as otherwise


should be v"0.6-" instead?

             # method-caching can be an issue.
         end
     else
         return quote
             $fnhead = Not{$trname{$(paras...)}}
-            $isfnhead = false # Add the istrait definition as otherwise
+            VERSION < "v0.6-" && ($isfnhead = false) # Add the istrait definition as otherwise

@sbromberger Apr 12, 2017


as above.
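For readers following along, the distinction both review comments point at is that `v"0.6-"` is a `VersionNumber` literal while `"v0.6-"` is a plain `String`; only the former gives a meaningful comparison against `VERSION`. A minimal sketch (variable names hypothetical):

```julia
# `v"..."` constructs a VersionNumber; `"..."` is just a String.
prerelease_0_6 = v"0.6-"     # VersionNumber; sorts before v"0.6.0"
not_a_version  = "v0.6-"     # String; comparing VERSION against this
                             # is not a version comparison at all
on_old_julia = VERSION < prerelease_0_6   # what the diff intends
```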

@ChrisRackauckas

This doesn't seem to infer.

check_first_arg(f,T::Type) = check_first_arg(typeof(f),T)
function check_first_arg{F}(::Type{F}, T::Type)
    typ = Tuple{Any, T, Vararg}
    typ2 = Tuple{Any, Type{T}, Vararg} # This one is required for overloaded types
    method_table = Base.MethodList(F.name.mt) # F.name.mt gets the method-table
    for m in method_table
        (m.sig<:typ || m.sig<:typ2) && return true
    end
    return false
end

using SimpleTraits
# Standard
@traitdef HasJac{F}
__has_jac(f) = check_first_arg(f, Val{:jac})
@traitimpl HasJac{F} <- __has_jac(F)
@traitfn f{X; HasJac{X}}(x::X) = true
@traitfn f{X; !HasJac{X}}(x::X) = false

has_jac{T}(f::T)   = istrait(HasJac{T})

f123(x,y) = 1
f123(::Val{:jac}, x) = 2
g123(x,y) = 1

has_jac(f123)
has_jac(g123)

@code_llvm has_jac(f123)
@code_llvm has_jac(g123)

@traitfn f{X; HasJac{X}}(x::X) = _internal_func_1()
@traitfn f{X; !HasJac{X}}(x::X) = _internal_func_2()

_internal_func_1() = 1.0
_internal_func_2() = 1

@code_llvm f(f123)
@code_llvm f(g123)

before gave

; Function Attrs: uwtable
define i8 @julia_has_jac_73001() #0 {
top:
  ret i8 1
}

; Function Attrs: uwtable
define i8 @julia_has_jac_73014() #0 {
top:
  ret i8 0
}
; Function Attrs: uwtable
define double @julia_f_73052() #0 {
top:
  ret double 1.000000e+00
}

; Function Attrs: uwtable
define i64 @julia_f_73054() #0 {
top:
  ret i64 1
}

but now gives

; Function Attrs: uwtable
define i8 @julia_has_jac_72990() #0 {
top:
  %0 = call i8 @julia_istrait_72991(%jl_value_t* inttoptr (i64 2230100528 to %jl_value_t*)) #1
  ret i8 %0
}

; Function Attrs: uwtable
define i8 @julia_has_jac_73008() #0 {
top:
  %0 = call i8 @julia_istrait_73009(%jl_value_t* inttoptr (i64 2239456752 to %jl_value_t*)) #1
  ret i8 %0
}

; Function Attrs: uwtable
define %jl_value_t* @julia_f_73048() #0 {
top:
  %0 = call %jl_value_t*** @jl_get_ptls_states() #4
  %1 = alloca [9 x %jl_value_t*], align 8
  %.sub = getelementptr inbounds [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 0
  %2 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 6
  %3 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 2
  %4 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 5
  %5 = bitcast %jl_value_t** %2 to i8*
  call void @llvm.memset.p0i8.i32(i8* %5, i8 0, i32 24, i32 8, i1 false)
  %6 = bitcast [9 x %jl_value_t*]* %1 to i64*
  %7 = bitcast %jl_value_t** %3 to i8*
  call void @llvm.memset.p0i8.i64(i8* %7, i8 0, i64 24, i32 8, i1 false)
  store i64 14, i64* %6, align 8
  %8 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 1
  %9 = bitcast %jl_value_t*** %0 to i64*
  %10 = load i64, i64* %9, align 8
  %11 = bitcast %jl_value_t** %8 to i64*
  store i64 %10, i64* %11, align 8
  store %jl_value_t** %.sub, %jl_value_t*** %0, align 8
  store %jl_value_t* null, %jl_value_t** %4, align 8
  %12 = call i8 @julia_check_first_arg_72993(%jl_value_t* inttoptr (i64 2227553040 to %jl_value_t*), %jl_value_t* inttoptr (i64 2228459344 to %jl_value_t*)) #1
  %13 = and i8 %12, 1
  %14 = icmp eq i8 %13, 0
  br i1 %14, label %L3, label %if

L3:                                               ; preds = %top, %if
  %"#temp#.0" = phi %jl_value_t* [ inttoptr (i64 2230100528 to %jl_value_t*), %if ], [ inttoptr (i64 2232394096 to %jl_value_t*), %top ]
  %15 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 7
  store %jl_value_t* %"#temp#.0", %jl_value_t** %2, align 8
  store %jl_value_t* inttoptr (i64 2230100624 to %jl_value_t*), %jl_value_t** %15, align 8
  %16 = call %jl_value_t* @jl_f_isa(%jl_value_t* null, %jl_value_t** %2, i32 2)
  store %jl_value_t* %16, %jl_value_t** %3, align 8
  %17 = getelementptr inbounds %jl_value_t, %jl_value_t* %16, i64 -1, i32 0
  %18 = bitcast %jl_value_t** %17 to i64*
  %19 = load i64, i64* %18, align 8
  %20 = and i64 %19, -16
  %21 = inttoptr i64 %20 to %jl_value_t*
  %22 = icmp eq %jl_value_t* %21, inttoptr (i64 2147420144 to %jl_value_t*)
  br i1 %22, label %pass, label %fail

L4:                                               ; preds = %pass
  %23 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 3
  store %jl_value_t* %"#temp#.0", %jl_value_t** %2, align 8
  store %jl_value_t* inttoptr (i64 2232394192 to %jl_value_t*), %jl_value_t** %15, align 8
  %24 = call %jl_value_t* @jl_f_isa(%jl_value_t* null, %jl_value_t** %2, i32 2)
  store %jl_value_t* %24, %jl_value_t** %23, align 8
  %25 = getelementptr inbounds %jl_value_t, %jl_value_t* %24, i64 -1, i32 0
  %26 = bitcast %jl_value_t** %25 to i64*
  %27 = load i64, i64* %26, align 8
  %28 = and i64 %27, -16
  %29 = inttoptr i64 %28 to %jl_value_t*
  %30 = icmp eq %jl_value_t* %29, inttoptr (i64 2147420144 to %jl_value_t*)
  br i1 %30, label %pass.13, label %fail12

L6:                                               ; preds = %pass.13
  store %jl_value_t* %"#temp#.0", %jl_value_t** %15, align 8
  store %jl_value_t* inttoptr (i64 2147436272 to %jl_value_t*), %jl_value_t** %2, align 8
  store %jl_value_t* inttoptr (i64 2147434688 to %jl_value_t*), %jl_value_t** %34, align 8
  %31 = call %jl_value_t* @jl_apply_generic(%jl_value_t** %2, i32 3)
  br label %L8

L7:                                               ; preds = %pass.13, %pass
  %"#temp#1.0" = phi %jl_value_t* [ inttoptr (i64 2270796704 to %jl_value_t*), %pass ], [ inttoptr (i64 2270797424 to %jl_value_t*), %pass.13 ]
  store %jl_value_t* %"#temp#1.0", %jl_value_t** %4, align 8
  store %jl_value_t* %"#temp#.0", %jl_value_t** %15, align 8
  store %jl_value_t* inttoptr (i64 2147436272 to %jl_value_t*), %jl_value_t** %2, align 8
  store %jl_value_t* inttoptr (i64 2147434688 to %jl_value_t*), %jl_value_t** %34, align 8
  %32 = call %jl_value_t* @jl_invoke(%jl_value_t* %"#temp#1.0", %jl_value_t** %2, i32 3)
  br label %L8

L8:                                               ; preds = %L7, %L6
  %storemerge = phi %jl_value_t* [ %31, %L6 ], [ %32, %L7 ]
  %"#temp#2" = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 4
  store %jl_value_t* %storemerge, %jl_value_t** %"#temp#2", align 8
  %33 = load i64, i64* %11, align 8
  store i64 %33, i64* %9, align 8
  ret %jl_value_t* %storemerge

if:                                               ; preds = %top
  br label %L3

fail:                                             ; preds = %L3
  call void @jl_type_error_rt(i8* inttoptr (i64 580078736 to i8*), i8* inttoptr (i64 137083904 to i8*), %jl_value_t* inttoptr (i64 2147420144 to %jl_value_t*), %jl_value_t* %16)
  unreachable

pass:                                             ; preds = %L3
  %34 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 8
  %35 = icmp eq %jl_value_t* %16, inttoptr (i64 2147602544 to %jl_value_t*)
  br i1 %35, label %L4, label %L7

fail12:                                           ; preds = %L4
  call void @jl_type_error_rt(i8* inttoptr (i64 580078736 to i8*), i8* inttoptr (i64 137083904 to i8*), %jl_value_t* inttoptr (i64 2147420144 to %jl_value_t*), %jl_value_t* %24)
  unreachable

pass.13:                                          ; preds = %L4
  %36 = icmp eq %jl_value_t* %24, inttoptr (i64 2147602544 to %jl_value_t*)
  br i1 %36, label %L6, label %L7
}
#X<:Any) in module Main at C:\Users\Chris\.julia\v0.5\SimpleTraits\src\SimpleTraits.jl:307

; Function Attrs: uwtable
define %jl_value_t* @julia_f_73050() #0 {
top:
  %0 = call %jl_value_t*** @jl_get_ptls_states() #4
  %1 = alloca [9 x %jl_value_t*], align 8
  %.sub = getelementptr inbounds [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 0
  %2 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 6
  %3 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 2
  %4 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 5
  %5 = bitcast %jl_value_t** %2 to i8*
  call void @llvm.memset.p0i8.i32(i8* %5, i8 0, i32 24, i32 8, i1 false)
  %6 = bitcast [9 x %jl_value_t*]* %1 to i64*
  %7 = bitcast %jl_value_t** %3 to i8*
  call void @llvm.memset.p0i8.i64(i8* %7, i8 0, i64 24, i32 8, i1 false)
  store i64 14, i64* %6, align 8
  %8 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 1
  %9 = bitcast %jl_value_t*** %0 to i64*
  %10 = load i64, i64* %9, align 8
  %11 = bitcast %jl_value_t** %8 to i64*
  store i64 %10, i64* %11, align 8
  store %jl_value_t** %.sub, %jl_value_t*** %0, align 8
  store %jl_value_t* null, %jl_value_t** %4, align 8
  %12 = call i8 @julia_check_first_arg_73010(%jl_value_t* inttoptr (i64 2228462416 to %jl_value_t*), %jl_value_t* inttoptr (i64 2228459344 to %jl_value_t*)) #1
  %13 = and i8 %12, 1
  %14 = icmp eq i8 %13, 0
  br i1 %14, label %L3, label %if

L3:                                               ; preds = %top, %if
  %"#temp#.0" = phi %jl_value_t* [ inttoptr (i64 2239456752 to %jl_value_t*), %if ], [ inttoptr (i64 2239615792 to %jl_value_t*), %top ]
  %15 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 7
  store %jl_value_t* %"#temp#.0", %jl_value_t** %2, align 8
  store %jl_value_t* inttoptr (i64 2239615888 to %jl_value_t*), %jl_value_t** %15, align 8
  %16 = call %jl_value_t* @jl_f_isa(%jl_value_t* null, %jl_value_t** %2, i32 2)
  store %jl_value_t* %16, %jl_value_t** %3, align 8
  %17 = getelementptr inbounds %jl_value_t, %jl_value_t* %16, i64 -1, i32 0
  %18 = bitcast %jl_value_t** %17 to i64*
  %19 = load i64, i64* %18, align 8
  %20 = and i64 %19, -16
  %21 = inttoptr i64 %20 to %jl_value_t*
  %22 = icmp eq %jl_value_t* %21, inttoptr (i64 2147420144 to %jl_value_t*)
  br i1 %22, label %pass, label %fail

L4:                                               ; preds = %pass
  %23 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 3
  store %jl_value_t* %"#temp#.0", %jl_value_t** %2, align 8
  store %jl_value_t* inttoptr (i64 2239457232 to %jl_value_t*), %jl_value_t** %15, align 8
  %24 = call %jl_value_t* @jl_f_isa(%jl_value_t* null, %jl_value_t** %2, i32 2)
  store %jl_value_t* %24, %jl_value_t** %23, align 8
  %25 = getelementptr inbounds %jl_value_t, %jl_value_t* %24, i64 -1, i32 0
  %26 = bitcast %jl_value_t** %25 to i64*
  %27 = load i64, i64* %26, align 8
  %28 = and i64 %27, -16
  %29 = inttoptr i64 %28 to %jl_value_t*
  %30 = icmp eq %jl_value_t* %29, inttoptr (i64 2147420144 to %jl_value_t*)
  br i1 %30, label %pass.13, label %fail12

L6:                                               ; preds = %pass.13
  store %jl_value_t* %"#temp#.0", %jl_value_t** %15, align 8
  store %jl_value_t* inttoptr (i64 2147436272 to %jl_value_t*), %jl_value_t** %2, align 8
  store %jl_value_t* inttoptr (i64 2147434744 to %jl_value_t*), %jl_value_t** %34, align 8
  %31 = call %jl_value_t* @jl_apply_generic(%jl_value_t** %2, i32 3)
  br label %L8

L7:                                               ; preds = %pass.13, %pass
  %"#temp#1.0" = phi %jl_value_t* [ inttoptr (i64 2270800736 to %jl_value_t*), %pass ], [ inttoptr (i64 2270802176 to %jl_value_t*), %pass.13 ]
  store %jl_value_t* %"#temp#1.0", %jl_value_t** %4, align 8
  store %jl_value_t* %"#temp#.0", %jl_value_t** %15, align 8
  store %jl_value_t* inttoptr (i64 2147436272 to %jl_value_t*), %jl_value_t** %2, align 8
  store %jl_value_t* inttoptr (i64 2147434744 to %jl_value_t*), %jl_value_t** %34, align 8
  %32 = call %jl_value_t* @jl_invoke(%jl_value_t* %"#temp#1.0", %jl_value_t** %2, i32 3)
  br label %L8

L8:                                               ; preds = %L7, %L6
  %storemerge = phi %jl_value_t* [ %31, %L6 ], [ %32, %L7 ]
  %"#temp#2" = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 4
  store %jl_value_t* %storemerge, %jl_value_t** %"#temp#2", align 8
  %33 = load i64, i64* %11, align 8
  store i64 %33, i64* %9, align 8
  ret %jl_value_t* %storemerge

if:                                               ; preds = %top
  br label %L3

fail:                                             ; preds = %L3
  call void @jl_type_error_rt(i8* inttoptr (i64 580078736 to i8*), i8* inttoptr (i64 137083904 to i8*), %jl_value_t* inttoptr (i64 2147420144 to %jl_value_t*), %jl_value_t* %16)
  unreachable

pass:                                             ; preds = %L3
  %34 = getelementptr [9 x %jl_value_t*], [9 x %jl_value_t*]* %1, i64 0, i64 8
  %35 = icmp eq %jl_value_t* %16, inttoptr (i64 2147602544 to %jl_value_t*)
  br i1 %35, label %L4, label %L7

fail12:                                           ; preds = %L4
  call void @jl_type_error_rt(i8* inttoptr (i64 580078736 to i8*), i8* inttoptr (i64 137083904 to i8*), %jl_value_t* inttoptr (i64 2147420144 to %jl_value_t*), %jl_value_t* %24)
  unreachable

pass.13:                                          ; preds = %L4
  %36 = icmp eq %jl_value_t* %24, inttoptr (i64 2147602544 to %jl_value_t*)
  br i1 %36, label %L6, label %L7
}

@vtjnash
Contributor Author

vtjnash commented Apr 12, 2017

That's correct, check_first_arg is impossible to infer. The idea behind traits (specifically, the THTT post) was that basing them on applicable was wrong (uncomputable at compile-time).

@sbromberger

So I don't understand why check_first_arg isn't inferable (type-stable). It returns a Bool regardless of its arguments.

@ChrisRackauckas

It's at least not pure because it uses a global state (the methods table). That's what I would've thought the issue was. But before it would compile a version given the current state of the methods table, which worked really well. Sad that hack is gone...
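A small illustration of why a method-table query is global state (a sketch, not from the thread):

```julia
# Adding a method mutates the global method table, so any function
# whose answer depends on the table can change between calls and
# cannot soundly be folded to a compile-time constant.
g() = 1
before = length(methods(g))   # one method so far
g(x) = 2                      # mutates global state
after  = length(methods(g))   # same query, different answer
```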

@vtjnash
Contributor Author

vtjnash commented Apr 12, 2017

@sbromberger If you look at code_typed, you would be able to see that it is doing exactly what you suggest. It is type-stable and inference knows that it returns a Bool.

@sbromberger

sbromberger commented Apr 12, 2017

@vtjnash then I can't reconcile

That's correct, check_first_arg is impossible to infer.

and

It is type-stable

with your comment in JuliaLang/julia#21356 (comment).

I'm honestly not trying to be difficult; I'm having a really hard time understanding why things broke and what I need to do to get them back to the way they were so we can continue prototyping LightGraphs in the REPL.

@davidanthoff
Contributor

I don't think type-stability of the trait function is enough to get back the performance we had on julia 0.5. The design in this package really relies on SimpleTraits.trait methods that are so simple that they can be inlined, and they might even have to be essentially constants for things to work. That worked when these were generated functions: essentially the functions that were generated just returned a constant value, whereas this PR moves the logic that was previously in the generated function code into a normal function, which makes the actual trait methods too complicated for the holy-trait dispatch story.

Essentially this PR changes things such that we see a lot more "dynamic dispatch" (in the sense explained here) than we saw on julia 0.5, and that leads to way worse performance.

I don't see how one can get around this on julia 0.6, I think at the end of the day this was a breaking change that was not properly phased in via the normal deprecation cycle, and it broke programmed traits in this package.

I'm also not in favor of merging this PR. While it restores the functionality of programmed traits on julia 0.6 (albeit with a bad performance penalty), there are cases where the old implementation will still work on julia 0.6 and give good performance. I think the right strategy for julia 0.6 is to have two separate macros to define programmed traits: one that uses generated functions, and one that doesn't (i.e. behaves like the implementation in this PR). I think that is the best we can hope for here.

@ChrisRackauckas

ChrisRackauckas commented Apr 12, 2017

To copy from the Gitter, it's not about inferring check_first_arg, it's about inferring the @traitimpl function. Before, it was generating

@generated SimpleTraits.trait{F}(::Type{HasJac{F}}) = __has_jac(F) ? :(HasJac{F}) : :(Not{HasJac{F}})

But, if __has_jac(F) is pure, it does not need to be a generated function in order to be type-inferrable. To see this, notice

is_directed(x::Float64) = true
is_directed(x) = false
function test_isdirected(x)
  if is_directed(x)
    return 1.0
  else
    return 1
  end
end
@code_llvm test_isdirected(1)
@code_llvm test_isdirected(1.0)
@code_warntype test_isdirected(1)
@code_warntype test_isdirected(1.0)
julia> @code_llvm test_isdirected(1)

; Function Attrs: uwtable
define i64 @julia_test_isdirected_60956(i64) #0 !dbg !5 {
top:
  ret i64 1
}

julia> @code_llvm test_isdirected(1.0)

; Function Attrs: uwtable
define double @julia_test_isdirected_60958(double) #0 !dbg !5 {
top:
  ret double 1.000000e+00
}

julia> @code_warntype test_isdirected(1)
Variables:
  #temp#@_1
  #temp#@_2

Body:
  begin
      return $(QuoteNode(1))
  end::Int64

julia> @code_warntype test_isdirected(1.0)
Variables:
  #temp#@_1
  #temp#@_2

Body:
  begin
      return $(QuoteNode(1.0))
  end::Float64

That test_isdirected function is type-stable because is_directed is pure: the compiler can inline the constant, get rid of the branch, and directly know the result.

So if your function is pure,

SimpleTraits.trait{F}(::Type{HasJac{F}}) = __has_jac(F) ? HasJac{F} : Not{HasJac{F}}

would have always worked. And the return type here is inferrable, if __has_jac was pure.

This change disallows the hack that allowed my example, but my example wasn't necessarily well-behaved before anyways. It would set the trait using the global state at the first time of implication, and even if it was supposed to change, it wouldn't actually change in the future. This is the weird behavior of pre-265 which is now disallowed; it was essentially a hack that exploited the fact that Julia had the 265 problem.

@davidanthoff
Contributor

@ChrisRackauckas As a side note, is there any place that explains what a @pure function is? I can't find any mention of it in the docs...

@ChrisRackauckas

I think the right strategy for julia 0.6 is to have two separate macros to define programmed traits. One that uses generated functions, and one that doesn't (i.e. behaves like the implementation in this PR). I think that is the best we can hope for here.

Not sure this should be the case. The cases which required this to be an @generated function were never well-behaved to begin with. They would have given the wrong trait if the result changed due to a global state change. That was always an issue if the trait implication function was not pure. The 265 fix makes sure this type of incorrectness cannot occur, and thus breaks the hack that allowed that case to "work".

@ChrisRackauckas As a side note, is there any place that explains what a @pure function is? I can't find any mention of it in the docs...

I could be wrong, @mbauman would know better, but it's a function whose result is directly computable from the inputs. Arrays have pointers, so that's not allowed. Closures use the environment, so they aren't allowed. It's essentially, inputs are immutables, outputs are immutables, and so it could compile on each input to know what the result would be, and inline that result during compilation? Is that close, @mbaumann?

I too would like to see a more thorough definition of it (or indeed the existence of one). But essentially, things like

is_directed(x::Float64) = true
is_directed(x) = false

will act as compilation time values, and in some cases you can make them more complex than that and still act like compilation time values by adding Base.@pure in front.
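A sketch of that last point, with a hypothetical predicate whose body is a computation rather than a literal (0.5-era syntax; `Base.@pure` is an unchecked promise, only sound if the function truly has no observable state and no methods are added to it later):

```julia
immutable TrA end   # `struct` on 0.6+
immutable TrB end

# Hypothetical example: the body computes its answer rather than
# returning a literal, so we annotate it pure to let inference
# fold it to a constant anyway.
Base.@pure is_small{T}(::Type{T}) = sizeof(T) <= 8

traitfn2{T}(x::T) = is_small(T) ? TrA : TrB
```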

@vtjnash
Contributor Author

vtjnash commented Apr 12, 2017

It's sort of like a limited generated function – it promises that the result will always be the same constant regardless of when the method is called. But note that this is separate from the pure trait mentioned by Chris above, which is instead an attribute of inference being able to tell that the function has no side-effects and returns a constant.

@ChrisRackauckas

ChrisRackauckas commented Apr 12, 2017

I think @mauro3 is gone still... he'll have a nice discussion to read when he comes back 👍 .

But yes, I think this is pretty uncontroversial and should be merged. You can get around the change by directly changing

@traitimpl HasJac{G} <- __has_jac(G)

to

@generated SimpleTraits.trait{F}(::Type{HasJac{F}}) = __has_jac(F) ? :(HasJac{F}) : :(Not{HasJac{F}})

and you'll keep the generated functions and all of the problems that come with them (doesn't work in the REPL and doesn't actually guarantee correctness). It'll hurt DiffEq, but I knew all along this would be a problem (but it's unlikely the way we use it that someone would add a Jacobian AFTER they solve the differential equation). But the true answer for us is to just do something different (more along the lines of Optim.jl's wrappers for functions with Jacobians).

But LightGraphs.jl will be fine because its uses are pure. There's no performance loss there either, since it should work like the example I posted for test_isdirected. I am not sure about yours, @davidanthoff.

@vtjnash
Contributor Author

vtjnash commented Apr 12, 2017

That's true, although I would probably recommend updating the __has_jac function instead. It should usually be sufficient to mark it as @pure (if inference isn't already figuring it out), just don't try to add more methods to it later.
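Concretely, the suggestion amounts to something like the following untested sketch, which is only sound if `check_first_arg`'s answer never changes (i.e. no relevant methods are defined after first use):

```julia
# Mark the trait predicate pure so its result may be folded at
# inference time; this is a promise the compiler does not check.
Base.@pure __has_jac(f) = check_first_arg(f, Val{:jac})
```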

@davidanthoff
Contributor

I have to admit that I still don't understand why pure helps here, but then I don't fully understand what pure actually implies... So I'm happy to go along with whatever you guys suggest here.

For my own use case I don't need performance and I have a whole bunch of options to work around this.

@mauro3
Owner

mauro3 commented Apr 18, 2017

@vtjnash: thanks for looking into this!

I've played around with the script below and found that 0.5 and 0.6 behave pretty differently (see comments in the code). In fact, I did check this before on 0.5, as I hoped @pure would make the generated functions unnecessary, but it did not work back then.

immutable Tr1 end
immutable Tr2 end

f(x) = _f(x, traitfn(x))
_f(x,::Type{Tr1}) = 1
_f(x,::Type{Tr2}) = 2

# Generates fast code on 0.6 but not on 0.5  (3x @pure does not help either)
istr(::Int) = true
istr(::Float64) = false
traitfn(x) = istr(x) ? Tr1 : Tr2

# # Generates fast code on 0.5 and 0.6
# istr(::Type{Int}) = true
# istr(::Type{Float64}) = false
# @generated traitfn(x) = istr(x) ? :(Tr1) : :(Tr2)

# # Generates fast code on 0.5 and throws an error on 0.6
# # (as istr is defined after generated function)
# @generated traitfn(x) = istr(x) ? :(Tr1) : :(Tr2)
# istr(::Type{Int}) = true
# istr(::Type{Float64}) = false

f(5.0)
@code_native f(5.0)

So it would probably make sense to return a generated function on 0.5 to keep performance. My understanding is that in 0.5 this type of generated function was still allowed (or at least it just works).

Also, this generated function probably needs to be removed as it suffers the same problem as @ChrisRackauckas's applicable example above, right? This means that the IsIterator trait would dispatch dynamically (slow) as that function is not pure, right?

Edit: this is issue #40. In particular see comment: #40 (comment).

Edit: and this will become slow with this change.

@ChrisRackauckas

ChrisRackauckas commented Apr 18, 2017

My understanding is that in 0.5 this type of generated functions was still allowed (or at least it just works).

In v0.5 it had the problems I mentioned, but with the 265 problems Julia just never cared and let it slide even if it could be inaccurate. I believe @pure should also work here on v0.5, just not on the examples which are not actually pure. Is that not the case?

Also, this generated function probably needs to be removed as that suffers the same problem of @ChrisRackauckas applicable example above, right? This means that the IsIterator trait would dispatch dynamically (slow) as that function is not pure, right?

Yes.

@mauro3
Owner

mauro3 commented Apr 18, 2017

I believe @pure should also work here on v0.5, no?

No, using this:

Base.@pure istr(::Int) = true
Base.@pure istr(::Float64) = false
Base.@pure traitfn(x) = istr(x) ? Tr1 : Tr2

generates slow code on 0.5 (and does not matter on 0.6).

@mauro3
Owner

mauro3 commented Apr 20, 2017

As I haven't heard back from @vtjnash, my plan is to update this PR to keep the "evil" generated function in 0.5.

@mbaumann

mbaumann commented Apr 20, 2017 via email

@mauro3
Owner

mauro3 commented Apr 20, 2017

Sorry about this! The correct one is @mbauman.

@sbromberger

@mauro3 thanks for your work on this. When the fix gets merged, would you mind tagging a new version so I can include it in my dependencies?

mauro3 added a commit that referenced this pull request Apr 23, 2017
mauro3 added a commit that referenced this pull request Apr 23, 2017
@mauro3
Owner

mauro3 commented Apr 23, 2017

I updated this PR over in #41. In particular:

  • I kept the generated functions for 0.5, i.e. disabled @vtjnash's update for 0.5.
  • I added an (ugly) macro to test whether slow or fast dispatch happens.
  • I added tests for the BaseTraits ensuring that their dispatch is fast.
  • I did not update the IsIterator trait and it still uses a generated function.

Currently Travis shows test errors which I cannot reproduce locally; please let me know if you see them too.

For those of you using SimpleTraits in Julia 0.6, could you check that this still works as expected? You can do this with (using PR #41's branch):

@check_fast_traitdispatch MyTrait SomeType true

checks that a trait function

@traitfn fn(x::::MyTrait) = 1

does not dispatch dynamically.

mauro3 added a commit that referenced this pull request Apr 23, 2017
@mauro3 mauro3 closed this Apr 23, 2017
@mauro3
Owner

mauro3 commented Apr 23, 2017

@sbromberger, can you check that this works for LightGraphs? Others affected also. If all good I'll tag tomorrow.

@sbromberger

sbromberger commented Apr 23, 2017

@mauro3 looks like the issues affecting LightGraphs have been fixed on latest master of SimpleTraits. Thank you.
