
Remove eval from at-memoize macro #48

Open
kmsquire opened this issue Feb 19, 2020 · 4 comments · May be fixed by #59

Comments

@kmsquire
Member

Right now, the @memoize macro creates the cache dictionary using eval:

fcachename = Symbol("##", f, "_memoized_cache")
fcache = isdefined(Main, fcachename) ?
    getfield(Main, fcachename) :
    Core.eval(Main, :(const $fcachename = ($dicttype)()))

This is generally considered a bad idea in macros, and until recently it also caused an error here, because the code was eval'ing into the Memoize package itself (see #32).

Right now, using eval gives the following behavior:

  1. all memoized methods of a given function share the memoization cache
  2. if a method is overwritten, the cache is cleared, and any other global resources created/held by the previous version of that method are released:

    Memoize.jl/test/runtests.jl

    Lines 257 to 266 in 1709785

    finalized = false
    @memoize function method_rewrite()
        x = []
        finalizer(x -> (global finalized; finalized = true), x)
        x
    end
    method_rewrite()
    @memoize function method_rewrite() end
    GC.gc()
    @test finalized

I made one attempt to simply use a different cache dictionary for each method of a given function (JuliaCollections:1709785...JuliaCollections:a9170e5). This mostly works, but as-is causes the test above to fail, because the old cache is not released when the method is overwritten.

There may be an easy way around this, but it's not obvious to me right now.
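One eval-free direction, shown here as a minimal sketch rather than a proposed final design: have the macro emit the cache definition directly in its expansion, so the cache lands in the caller's module instead of being eval'd into Main. Everything below is hypothetical (the macro name, and the restriction to the short form `f(x) = body` with a single untyped argument), and the cache still ends up as a non-const global:

```julia
# Hypothetical sketch of an eval-free @memoize: the macro emits the cache
# definition as part of its expansion, so it is created in the caller's
# module rather than eval'd into Main. Handles only `f(x) = body`.
macro memoize_sketch(ex)
    ex.head == :(=) || error("expected short-form `f(x) = body`")
    sig, body = ex.args                    # :(f(x)) and the body
    fname, arg = sig.args[1], sig.args[2]
    cache = gensym(string(fname, "_cache"))
    esc(quote
        $cache = Dict()                    # plain (non-const) global in the caller's module
        function $fname($arg)
            get!(() -> $body, $cache, $arg)
        end
    end)
end

calls = Ref(0)   # counts how often the body actually runs
@memoize_sketch fib(n) = (calls[] += 1; n < 2 ? n : fib(n - 1) + fib(n - 2))
```

Since each distinct argument is computed exactly once, `fib(10)` leaves `calls[]` at 11, and repeated calls do not increase it. This shares the downside discussed later in this thread: the cache is an untyped global, which costs performance.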

Thoughts and/or pull requests welcome!

Cc: @cstjean

@o314

o314 commented Oct 13, 2020

Hello,

"if a method is overwritten"

It does not seem to me that multiple dispatch is handled. I will not address the world-age issue here, and will only consider the multiple-dispatch case.

Using factorial as an example, I will try to make this code run.

CASE A - ONE CACHE PER FUNC

const _FACT_LUT = Dict{Type{<:Tuple},Dict}()

# memoizing only(methods(factorial, Tuple{Int64}))
_FACT_LUT[Tuple{Int64}] = Dict{Int64,Int64}()
let _inst_lut = _FACT_LUT[Tuple{Int64}]
    _inst_lut[1] = 1
    for n in 2:20
        _inst_lut[n] = _inst_lut[n-1] * n
    end
end

# memoizing only(methods(factorial, Tuple{Int128}))
_FACT_LUT[Tuple{Int128}] = Dict{Int128,Int128}()
let _inst_lut = _FACT_LUT[Tuple{Int128}]
    _inst_lut[1] = 1
    for n in 2:33  # 34! overflows Int128
        _inst_lut[n] = _inst_lut[n-1] * n
    end
end

CASE B - ONE CACHE FOR ALL FUNC

const _LUT = Dict{Type{<:Tuple},Dict}() 
# ...


# memoizing only(methods(factorial, Tuple{Int64}))
sig = Tuple{typeof(factorial),Union{Int64, UInt64}}
sigarg = Union{Int64, UInt64}
_LUT[sig] = Dict{sigarg,sigarg}()
let _inst_lut = _LUT[sig]
    # init (3rd arg ?)
    _inst_lut[1] = 1
    for n in 2:20
        _inst_lut[n] = _inst_lut[n-1] * n
    end
end

# memoizing only(methods(factorial, Tuple{Int128}))
sig = Tuple{typeof(factorial),Int128}
sigarg = Int128
_LUT[sig] = Dict{sigarg,sigarg}()
let _inst_lut = _LUT[sig]
    # init (3rd arg ?)
    _inst_lut[1] = 1
    for n in 2:33  # 34! overflows Int128
        _inst_lut[n] = _inst_lut[n-1] * n
    end
end

To get that from the macro:

  • force typed arguments in the function signature expression
  • collect them
  • rebuild the tuple type
    • case A: done, with one LUT per function computed elsewhere
    • case B: prepend typeof(function_name) to the tuple type
  • later, dispatch to the right dict according to it

The main point is to have one subcache per method instance (i.e. per dispatch tuple / tuple type), and a supercache to collect the subcaches.

Does that help?
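The supercache/subcache design sketched above can be written out as runnable code; all names here (`_SUPERCACHE`, `subcache`, `fact`) are hypothetical illustrations, not the package's API:

```julia
# Runnable sketch of the supercache/subcache design: one global supercache
# keyed by the dispatch tuple type, holding one typed subcache per method.
const _SUPERCACHE = Dict{Type{<:Tuple},Dict}()

# Fetch (or lazily create) the subcache for one method of `f`.
function subcache(f, argtypes::Type{<:Tuple}, K::Type, V::Type)
    sig = Tuple{typeof(f), argtypes.parameters...}
    get!(() -> Dict{K,V}(), _SUPERCACHE, sig)::Dict{K,V}
end

# Two methods of the same function, each hitting its own subcache:
fact(n::Int64)  = get!(() -> n <= 1 ? one(n) : fact(n - 1) * n,
                       subcache(fact, Tuple{Int64}, Int64, Int64), n)
fact(n::Int128) = get!(() -> n <= 1 ? one(n) : fact(n - 1) * n,
                       subcache(fact, Tuple{Int128}, Int128, Int128), n)
```

After calling `fact(20)` and `fact(Int128(25))`, `_SUPERCACHE` holds two entries, one per method, so overwriting a method could in principle be handled by replacing its entry.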

@o314

o314 commented Oct 22, 2020

Update:

After

@memoize factorial(n::Int) =
    n == 1 ?
        1 :
        factorial(n-1) * n

factorial(10)

try to generate something like

_CACHE[Tuple{typeof(factorial), Int}] =
    Dict{Tuple{Int64},Any}(
        (7,) => 5040,
        (4,) => 24,
        (8,) => 40320,
        (3,) => 6,
        (6,) => 720,
        (9,) => 362880,
        (2,) => 2,
        (5,) => 120,
        (10,) => 3628800,
        (1,) => 1)

P.S. The Any in the Dict value type should be Int; that has to be fixed.
P.P.S. I am storing 1-tuples as keys; one may unwrap them.
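For illustration, the generated shape above can be written out by hand; `_CACHE` and `factorial2` are hypothetical names, and the value type is Int rather than Any, as the P.S. suggests:

```julia
# Hand-written version of what the macro would generate: one global cache
# keyed by Tuple{typeof(f), argtypes...}, whose entries map argument tuples
# to results. Value type is Int, per the P.S. above.
const _CACHE = Dict{Type{<:Tuple},Dict}()

function factorial2(n::Int)
    # fetch (or lazily create) this method's subcache
    c = get!(() -> Dict{Tuple{Int},Int}(), _CACHE, Tuple{typeof(factorial2), Int})
    get!(c, (n,)) do          # runs the body only on a cache miss
        n == 1 ? 1 : factorial2(n - 1) * n
    end
end
```

After `factorial2(10)`, `_CACHE[Tuple{typeof(factorial2), Int}]` holds the ten entries shown above, with Int values instead of Any.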

cstjean added a commit that referenced this issue Oct 28, 2020
Fix #48, at the cost of putting the variable in a poorly-performing global.

Not sure if this is acceptable. It's frustrating that Julia seemingly lacks the tools to deal with
this elegantly.

- If `const` worked in a local function, we'd just put `const` and be done with it.
- If `typeconst` existed for global variables, that would work too.

Memoization.jl uses generated functions, which causes other problems. And it feels like the
wrong solution too.
@cstjean cstjean linked a pull request Oct 28, 2020 that will close this issue
@cstjean
Collaborator

cstjean commented Nov 8, 2020

It also causes all instances of a closure to share the same cache, which is wrong:

julia> g = []
       for x in 1:10
           push!(g, @memoize f(y) = x + y)
       end
       end

julia> g[1](1)
2

julia> g[2](1)
2

@cstjean
Collaborator

cstjean commented Dec 30, 2020

Ah, this solution by @yuyichao might work out for us:

julia> begin
       local foo=Ref(1)
       g() = foo[]
       end
g (generic function with 1 method)

julia> @btime g()
  1.538 ns (0 allocations: 0 bytes)
1
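Applied to memoization, the same trick would give a closure-captured cache that is neither eval'd into Main nor a slow untyped global. A sketch with hypothetical names (the hit counter is only there to show that cached values are not recomputed):

```julia
# Sketch: the cache lives in a `local` inside a top-level `begin` block, so
# the memoized function captures it as a closure variable. No eval, no
# untyped global.
begin
    local cache = Dict{Int,Int}()
    local hits = Ref(0)
    # Both functions below capture `cache`/`hits`; neither escapes as a global.
    function cached_square(x::Int)
        get!(cache, x) do
            hits[] += 1      # runs only on a cache miss
            x^2
        end
    end
    square_computations() = hits[]
end
```

Repeated calls with the same argument leave `square_computations()` unchanged, confirming the cache is hit.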

willow-ahrens pushed a commit to willow-ahrens/Memoize.jl that referenced this issue Jan 5, 2021
Fix JuliaCollections#48, at the cost of putting the variable in a poorly-performing global.
