Learning syntax #344
I like 3 of these suggestions. Another possibility is: Pro is that you have fewer lines, and I believe will scale? But I'm not sure I'm grokking the scaling concern.

On 30 April 2014 18:34:29 GMT-04:00, Trevor Bekolay notifications@github.com wrote:

> The scaling concern is how would you apply both a BCM and PES learning rule to the same connection?
A simple way to use a
That makes huge connection calls, but that is another possibility. I'll add it to the list.
It almost seems that if you want to do that, you should define another new rule, since you'll have to handle their interactions somehow.

On 30 April 2014 20:12:57 GMT-04:00, Trevor Bekolay notifications@github.com wrote:

> List updated. I think some rules are composable (hPES = BCM + PES, after all)... if they do interact, that's a new rule, sure. But there's combinatorial explosion if you don't make it possible to compose rules together at all. I'm thinking also about synaptic scaling or other rules that maintain some kind of homeostasis; they also don't really interact with another rule that might be happening.
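Composability in the sense described above (hPES = BCM + PES) could amount to each non-interacting rule contributing its own weight update, with the updates summed on the shared connection. A rough sketch with made-up toy classes — none of this is Nengo internals:

```python
class ScaledRule:
    """Toy 'learning rule': proposes a weight change of rate * error."""
    def __init__(self, rate):
        self.rate = rate

    def delta(self, error):
        return self.rate * error


def apply_rules(weight, rules, error):
    """Non-interacting rules compose by summing their independent deltas."""
    return weight + sum(rule.delta(error) for rule in rules)


rules = [ScaledRule(0.1), ScaledRule(0.05)]  # stand-ins for a BCM-like + PES-like pair
new_w = apply_rules(1.0, rules, error=2.0)   # 1.0 + 0.1*2 + 0.05*2 = 1.3
```

Rules that genuinely interact wouldn't fit this additive picture, which is why they'd warrant a new combined rule (as with hPES).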
I am not sure whether the fourth option is exactly what @celiasmith meant. I don't like option 2. To me the different learning rules specify how something behaves, but are not something that exists as an "object" (like a node) in the network. My concern with @celiasmith's suggestion is that it lengthens the list of function arguments (though the problem of long argument lists is a bit alleviated by keyword arguments in Python). Finally, to throw out another possibility:

```python
with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    with nengo.BCM(tau=0.01), nengo.PES(error_conn):
        conn = nengo.Connection(pre, post)
```

I guess that fits the usual Nengo idiom, but it might overdo it.
Hmm.. could we edit those examples to also show the creation of the

Here's option 3, I think:

```python
with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    conn = nengo.Connection(pre, post)
    nengo.LearningRule(conn, nengo.BCM(tau=0.01))
    nengo.LearningRule(conn, nengo.PES(nengo.Connection(error, post)))
```

Is that the intended sort of use? What about this for 4:

```python
with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    nengo.LearningRule(nengo.Connection(pre, post),
                       [nengo.BCM(tau=0.01), nengo.PES(nengo.Connection(error, post))])
```

None of the options are feeling right to me yet -- this is me just brainstorming how some of these might look in practice... :)
Personally I like encapsulating the learning rule inside the Connection object. As @jgosmann says, it feels odd to have learning rules as a top-level modelling object. Something like:

```python
with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    error = nengo.Connection(..., post)
    conn = nengo.Connection(pre, post,
                            learning_rule=[nengo.BCM(tau=0.01), nengo.PES(error)])
```

It does add some length to the
I feel similarly. I wonder if we might be able to do another trick like the Networks, where we have something that helps users create the learning connection, but it's something that just gets turned into more basic components by the time the backend sees it?
+1
This might also make sense as an option instead of specifying

```python
with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    error = nengo.Connection(..., post)
    conn = nengo.Connection(pre, post)
    conn.learning_rule = [nengo.BCM(tau=0.01), nengo.PES(error)]
```

(I think if we go this way we should be able to either specify it like this or in the constructor)
Cool, that's what I want too, but Trevor said I was bad for asking... :( Seriously though, +1

On 2014-05-01, at 5:15 PM, Travis DeWolf notifications@github.com wrote:
I thought @tcstewar's snippet was essentially what I put for option 4, but I see now that... I'm crazy? Or something? Anyway, I propose those of us who will be at the sprint and care about learning rule syntax should have a meeting / argument about it at some point. Perhaps tomorrow after the lab meeting or Saturday morning.
Fisticuffs!

On 2014-05-02, at 12:13 AM, Trevor Bekolay notifications@github.com wrote:
Conclusion: @drasmuss gets his way yet again

```python
conn = nengo.Connection(pre, post)
conn.learning_rule = [nengo.BCM(tau=0.01), nengo.PES(error)]
```

or

```python
conn = nengo.Connection(pre, post, learning_rule=[nengo.BCM(tau=0.01), nengo.PES(error)])
```

or

```python
conn = nengo.Connection(pre, post)
conn.learning_rule = nengo.BCM(tau=0.01)
```

or

```python
conn = nengo.Connection(pre, post, learning_rule=nengo.BCM(tau=0.01))
```
Aww yeah I knew my boys got my back ✊
Implemented in #363. |
We've had this discussion a few times in various places, but a fair bit has changed since then both in terms of the codebase and the people that have touched the learning code, so let's reopen this discussion.
How to specify learning rules?
1. Connection function
Add a function to connections for applying a learning rule to this connection.
A sub-decision here is what to call the function.
- apply?
- learn?
- add_rule?

Pros:
Cons:
2. Connection argument to each learning rule
Accept a connection as an argument.
Pros:
Cons:
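To make options 1 and 2 concrete, here is a hypothetical sketch of what each might look like at the call site, using stub classes — none of this is real Nengo API:

```python
class Connection:
    """Stub connection that collects the rules attached to it."""
    def __init__(self, pre, post):
        self.pre, self.post = pre, post
        self.rules = []

    def apply(self, rule):
        """Option 1: a function on Connection that attaches a rule."""
        self.rules.append(rule)
        return rule


class BCM:
    """Stub rule object; the `connection` argument models option 2."""
    def __init__(self, tau=0.01, connection=None):
        self.tau = tau
        self.connection = connection
        if connection is not None:
            connection.rules.append(self)  # option 2: rule registers itself


# Option 1: connection function
conn1 = Connection("pre", "post")
conn1.apply(BCM(tau=0.01))

# Option 2: connection argument to each learning rule
conn2 = Connection("pre", "post")
BCM(tau=0.01, connection=conn2)
```

Both end up with the rule recorded on the connection; they differ mainly in which object owns the attachment step, and option 2 leaves a rule constructed without a connection in a dangling state.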
3. LearningRule object
This is kind of an intermediate option between 1 and 2.
Pros:
Cons:
4. Connection argument can accept an iterable
Have I missed any options? Are there more pros/cons to each option? Which option would you prefer?