
Learning syntax #344

Closed
tbekolay opened this issue Apr 30, 2014 · 18 comments


@tbekolay (Member)

We've had this discussion a few times in various places, but a fair bit has changed since then both in terms of the codebase and the people that have touched the learning code, so let's reopen this discussion.

How to specify learning rules?

1. Connection function

Add a function to connections for applying a learning rule to this connection.

with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    conn = nengo.Connection(pre, post)
    conn.apply(nengo.BCM(tau=0.01))
    conn.apply(nengo.PES(error_conn))

A sub-decision here is what to call the function. apply? learn? add_rule?

Pros:

  • It's obvious that the learning rule is stored in the connection (doesn't need global state)
  • Same learning rule can be applied to multiple connections

Cons:

  • We don't often use functions on our objects, so it's not really "idiomatic" in terms of the Nengo codebase

2. Connection argument to each learning rule

Accept a connection as an argument.

with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    conn = nengo.Connection(pre, post)
    nengo.BCM(conn, tau=0.01)
    nengo.PES(conn, error_conn)

Pros:

  • Looks more like the rest of Nengo (which is mostly object constructors, even for "verb"y things like connections)

Cons:

  • Requires handling this in every learning rule object, which is a lot of unnecessary boilerplate
  • Must make a new object for each learned connection

3. LearningRule object

This is kind of an intermediate option between 1 and 2.

with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    conn = nengo.Connection(pre, post)
    nengo.LearningRule(conn, nengo.BCM(tau=0.01))
    nengo.LearningRule(conn, nengo.PES(error_conn))

Pros:

  • Same learning rule can be applied to multiple connections
  • More idiomatic than 1

Cons:

  • Would have to be stored in the Network or the Connection, neither of which is obvious from looking at the code (though we do often store things in the network sneakily)

4. Connection argument can accept an iterable

with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    conn = nengo.Connection(pre, post)
    nengo.LearningRule(conn, [nengo.BCM(tau=0.01), nengo.PES(error_conn)])

Have I missed any options? Are there more pros/cons to each option? Which option would you prefer?

@celiasmith (Contributor)

I like 3 of these suggestions. Another possibility is:

conn = nengo.LearnedConnection(pre, post, nengo.BCM(...))

The pro is that you have fewer lines, and I believe it will scale? But I'm not sure I'm grokking the scaling concern.

On 30 April 2014 18:34:29 GMT-04:00, Trevor Bekolay notifications@github.com wrote:

Things that won't work

My original PES implementation added a learning_rule argument to Connection. As has been discussed in the past, this won't work because then only one learning rule can be applied to each connection, which doesn't scale.

@tbekolay (Member, Author)

tbekolay commented May 1, 2014

The scaling concern is how would you apply both a BCM and PES learning rule to the same connection?

@hunse (Collaborator)

hunse commented May 1, 2014

A simple way to use a learning_rule argument would be to have it accept either a single learning rule or an iterable of learning rules.
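The single-or-iterable idea could be sketched like this (a hypothetical helper, not actual Nengo code; `normalize_rules` and `FakeRule` are invented names for illustration):

```python
# Hypothetical sketch: a Connection could normalize its learning_rule
# argument so that both a single rule and an iterable of rules work.
def normalize_rules(learning_rule):
    """Return a list of rules, whether given None, one rule, or an iterable."""
    if learning_rule is None:
        return []
    try:
        return list(learning_rule)  # an iterable of rule objects
    except TypeError:
        return [learning_rule]  # a single rule object

class FakeRule:
    """Stand-in for a rule object like nengo.BCM or nengo.PES."""
```

The Connection constructor would then only ever deal with a list internally, regardless of how the user passed the argument.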

@tbekolay (Member, Author)

tbekolay commented May 1, 2014

That makes huge connection calls, but that is another possibility. I'll add it to the list.

@celiasmith (Contributor)

It almost seems that if you want to do that, you should define a new rule, since you'll have to handle their interactions somehow.


@tbekolay (Member, Author)

tbekolay commented May 1, 2014

List updated. I think some rules are composable (hPES = BCM + PES after all)... if they do interact that's a new rule, sure. But there's combinatorial explosion if you don't make it possible to compose rules together at all. I'm thinking also about synaptic scaling or other rules that maintain some kind of homeostasis; they also don't really interact with another rule that might be happening.
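The composability point above could be sketched as follows (a hypothetical illustration, not how Nengo implements learning; `bcm_like`, `pes_like`, and `apply_rules` are invented names with placeholder dynamics):

```python
import numpy as np

# Hypothetical sketch: if each rule computes an additive weight delta
# per timestep, rules compose by summing their deltas -- the sense in
# which hPES can be seen as BCM + PES.
def bcm_like(weights, dt):
    """Toy stand-in for a BCM-style update; returns a weight delta."""
    return dt * weights  # placeholder dynamics, for illustration only

def pes_like(weights, dt):
    """Toy stand-in for a PES-style update; returns a weight delta."""
    return -dt * np.ones_like(weights)  # placeholder, for illustration

def apply_rules(weights, rules, dt=0.001):
    """Compose rules by summing each rule's weight delta."""
    delta = np.zeros_like(weights)
    for rule in rules:
        delta = delta + rule(weights, dt)
    return weights + delta
```

A homeostatic rule like synaptic scaling would just be one more entry in the list, contributing its own delta without needing to know about the others.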

@jgosmann (Collaborator)

jgosmann commented May 1, 2014

I am not sure whether the fourth option is exactly what @celiasmith meant.

I don't like option 2. To me the different learning rules specify how something behaves, but they aren't something that exists as an “object” (like a node) in the network.

My concern with @celiasmith's suggestion is that it lengthens the list of function arguments (though the problem of long argument lists is somewhat alleviated by keyword arguments in Python).

Finally, to throw out another possibility:

with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    with nengo.BCM(tau=0.01), nengo.PES(error_conn):
        conn = nengo.Connection(pre, post)

I guess that fits the usual Nengo idiom, but it might overdo it.

@tcstewar (Contributor)

tcstewar commented May 1, 2014

Hmm.. could we edit those examples to also show the creation of the error_conn connection? I think that really affects how the examples look, and should be part of this decision:

Here's option 3, I think

with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    conn = nengo.Connection(pre, post)
    nengo.LearningRule(conn, nengo.BCM(tau=0.01))
    nengo.LearningRule(conn, nengo.PES(nengo.Connection(error, post)))

Is that the intended sort of use? What about this for 4:

with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    nengo.LearningRule(nengo.Connection(pre, post),
        [nengo.BCM(tau=0.01), nengo.PES(nengo.Connection(error, post))])

None of the options are feeling right to me yet -- this is me just brainstorming how some of these might look in practice... :)

@drasmuss (Member)

drasmuss commented May 1, 2014

Personally I like encapsulating the learning rule inside the Connection object. As @jgosmann says, it feels odd to have learning rules as a top level modelling object. Something like

with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    error = nengo.Connection(..., post)
    conn = nengo.Connection(pre, post,
                 learning_rule=[nengo.BCM(tau=0.01), nengo.PES(error)])

It does add some length to the Connection call, but I don't really mind that. I really like the cleanliness we have right now, where Ensembles, Connections, Nodes, and Probes are the only objects at the network level.

@tcstewar (Contributor)

tcstewar commented May 1, 2014

Personally I like encapsulating the learning rule inside the Connection object. [...] I really like the cleanliness we have right now, where Ensembles, Connections, Nodes, and Probes are the only objects at the network level.

I feel similarly.

I wonder if we might be able to do another trick like the Networks, where we have something that helps users create the learning connection, but it's something that just gets turned into more basic components by the time the backend sees it?
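The "expand before the backend sees it" idea might look roughly like this (all names here are invented for illustration; this is a sketch of the pattern, not a proposed Nengo API):

```python
# Hypothetical sketch of the "trick like the Networks" idea: a helper
# the user calls, which expands into plain records that a backend
# already understands (a base connection plus per-rule attachments).
def make_learned_connection(pre, post, rules):
    """Expand a learned connection into basic components."""
    conn = {"type": "connection", "pre": pre, "post": post}
    attachments = [{"type": "rule", "conn": conn, "rule": r} for r in rules]
    return [conn] + attachments
```

The backend would then only ever see the basic component records, the same way nengo.networks helpers reduce to plain Ensembles and Connections before the build step.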

@studywolf (Collaborator)

Personally I like encapsulating the learning rule inside the Connection object.

+1

@tcstewar (Contributor)

tcstewar commented May 1, 2014

This might also make sense as an option instead of specifying learning_rule in the constructor:

with nengo.Network():
    pre = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    post = nengo.Ensemble(nengo.LIF(10), dimensions=1)
    error = nengo.Connection(..., post)
    conn = nengo.Connection(pre, post)
    conn.learning_rule = [nengo.BCM(tau=0.01), nengo.PES(error)]

(I think if we go this way we should be able to either specify it like this or in the constructor)

@celiasmith (Contributor)

Cool that's what I want too but Trevor said I was bad for asking... :(

Seriously though, +1


@tbekolay (Member, Author)

tbekolay commented May 2, 2014

I thought @tcstewar's snippet was essentially what I put for option 4, but I see now that... I'm crazy? Or something? Anyway, I propose those of us who will be at the sprint and care about learning rule syntax should have a meeting / argument about it at some point. Perhaps tomorrow after the lab meeting or Saturday morning.

@celiasmith (Contributor)

Fisticuffs!


@tbekolay (Member, Author)

tbekolay commented May 2, 2014

Conclusion: @drasmuss gets his way yet again

conn = nengo.Connection(pre, post)
conn.learning_rule = [nengo.BCM(tau=0.01), nengo.PES(error)]

or

conn = nengo.Connection(pre, post, learning_rule=[nengo.BCM(tau=0.01), nengo.PES(error)])

or

conn = nengo.Connection(pre, post)
conn.learning_rule = nengo.BCM(tau=0.01)

or

conn = nengo.Connection(pre, post, learning_rule=nengo.BCM(tau=0.01))

@drasmuss (Member)

drasmuss commented May 2, 2014

Aww yeah I knew my boys got my back ✊

@tbekolay mentioned this issue May 31, 2014
@tbekolay (Member, Author)

tbekolay commented Jun 3, 2014

Implemented in #363.
