A plugin for Gen, enabling the use of any of Flux's optimizers for parameter learning in generative functions from Gen's static or dynamic modeling languages.
The only new function exposed by GenFluxOptimizers.jl is the `FluxOptimConf` constructor: it takes a Flux optimizer type (e.g., `ADAM`, `RMSProp`, `ADAGrad`, etc.), as well as a tuple of arguments for that optimizer's constructor. The resulting `FluxOptimConf` object can be used the same way as any Gen update configuration, i.e., as the first argument to `ParamUpdate`. For example:
```julia
using Gen
using Flux.Optimise
using GenFluxOptimizers

@gen function f()
    @param p::Float64
    # ... model code that uses `p`; an illustrative example:
    x ~ normal(p, 1.0)
end

# `@param` parameters must be initialized before generating traces
init_param!(f, :p, 0.0)

# Observations to condition on (illustrative value)
data = choicemap((:x, 2.5))
tr, = generate(f, (), data)

# Use Flux ADAM optimizer
adam_update = ParamUpdate(FluxOptimConf(Optimise.ADAM, (0.1, (0.9, 0.999))), f)
for i in 1:1000
    accumulate_param_gradients!(tr)  # accumulate gradients w.r.t. `p`
    apply!(adam_update)              # apply the update and reset gradients
end
```