
Optimizers
==========

The default optimizer is ``OptimizerWrapper``, which offers additional update-modifier options. So, instead of using ``TFOptimizer`` directly, a customized Adam optimizer with update modifiers can be specified via:

.. code-block:: python

    Agent.create(
        ...
        optimizer=dict(
            optimizer='adam', learning_rate=1e-3, clipping_threshold=1e-2,
            multi_step=10, subsampling_fraction=64, linesearch_iterations=5,
            doublecheck_update=True
        ),
        ...
    )
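As a rough sketch of what this shorthand expands to, the options correspond to the update-modifier classes listed below, each wrapping an inner optimizer. The explicit nesting shown here, including the modifier ``type`` names, parameter names (``num_steps``, ``fraction``, ``max_iterations``, ``threshold``) and the nesting order, is an assumption for illustration; ``OptimizerWrapper`` performs the actual composition internally:

```python
# Hypothetical explicit nesting equivalent to the OptimizerWrapper shorthand
# above. Modifier type/parameter names and nesting order are assumptions, not
# the library's guaranteed internal layout.
optimizer_spec = dict(
    type='multi_step', num_steps=10,                  # multi_step=10
    optimizer=dict(
        type='subsampling_step', fraction=64,         # subsampling_fraction=64
        optimizer=dict(
            type='linesearch_step', max_iterations=5, # linesearch_iterations=5
            optimizer=dict(
                type='clipping_step', threshold=1e-2, # clipping_threshold=1e-2
                optimizer=dict(type='adam', learning_rate=1e-3)
            )
        )
    )
)

# The spec is a plain nested dict, so it can be inspected or modified before
# being passed as Agent.create(..., optimizer=optimizer_spec).
inner = optimizer_spec
while 'optimizer' in inner:
    inner = inner['optimizer']
assert inner == dict(type='adam', learning_rate=1e-3)
```

Because each modifier simply wraps another optimizer specification, modifiers can in principle be combined in any order or subset; the shorthand keyword form above is the more compact way to express the common case.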

The available optimizer and update-modifier classes:

- ``tensorforce.core.optimizers.OptimizerWrapper``
- ``tensorforce.core.optimizers.TFOptimizer``
- ``tensorforce.core.optimizers.NaturalGradient``
- ``tensorforce.core.optimizers.Evolutionary``
- ``tensorforce.core.optimizers.ClippingStep``
- ``tensorforce.core.optimizers.MultiStep``
- ``tensorforce.core.optimizers.DoublecheckStep``
- ``tensorforce.core.optimizers.LinesearchStep``
- ``tensorforce.core.optimizers.SubsamplingStep``
- ``tensorforce.core.optimizers.Synchronization``
- ``tensorforce.core.optimizers.Plus``