added an explanation of how explorers can be added to Learning Agents
DavoudTaghawiNejad committed Aug 4, 2012
1 parent c96e46c commit 05d7226
Showing 1 changed file with 11 additions and 1 deletion.
12 changes: 11 additions & 1 deletion pybrain/rl/explorers/explorer.py
@@ -8,6 +8,16 @@ class Explorer(Module):
""" An Explorer object is used in Agents, receives the current state
and action (from the controller Module) and returns an explorative
action that is executed instead the given action.
Explorer have to be added to the learner before adding the learner
to the LearningAgent.
For Example::
controller = ActionValueNetwork(2, 100)
learner = SARSA()
learner.explorer = NormalExplorer(1, 0.1)
self.learning_agent = LearningAgent(controller, learner)
"""

def activate(self, state, action):
@@ -20,4 +30,4 @@ def activate(self, state, action):

def newEpisode(self):
""" Inform the explorer about the start of a new episode. """
pass
pass
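
For context, a self-contained sketch of the setup described in the new docstring, with the imports filled in. The import paths and constructor arguments below are assumptions (they can differ between PyBrain versions), and the environment/experiment wiring is omitted::

    # Assumed import paths; adjust to your PyBrain version if they differ.
    from pybrain.rl.learners.valuebased import ActionValueNetwork
    from pybrain.rl.learners import SARSA
    from pybrain.rl.explorers import NormalExplorer
    from pybrain.rl.agents import LearningAgent

    # Controller module: 2 state dimensions, 100 action values.
    controller = ActionValueNetwork(2, 100)

    # Attach the explorer to the learner *before* constructing the
    # LearningAgent, so the agent picks it up when it wraps the learner.
    learner = SARSA()
    learner.explorer = NormalExplorer(1, 0.1)  # 1 action dimension, sigma=0.1

    learning_agent = LearningAgent(controller, learner)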
