Added amplitude to LIF #1325

Merged · 1 commit · Oct 25, 2017
Conversation

hunse (Collaborator) commented Jun 14, 2017

Motivation and context:

Allow scaling on the output of LIF neurons. This can be useful when working with learning, since it allows the neuron output to lie in a more normalized range.

How has this been tested?

I added a unit test that checks that the scaled output of a normal neuron (amplitude == 1) matches that of a neuron with amplitude set, both for the static rates and the dynamic spikes.
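The static-rates half of such a check can be sketched as below. `lif_rate` here is an illustrative standalone implementation of the closed-form LIF rate curve, not Nengo's actual code or test:

```python
import numpy as np

def lif_rate(J, tau_rc=0.02, tau_ref=0.002, amplitude=1.0):
    """Steady-state LIF firing rate for input current J, scaled by amplitude."""
    J = np.asarray(J, dtype=float)
    out = np.zeros_like(J)
    active = J > 1  # below threshold, the neuron is silent
    out[active] = amplitude / (
        tau_ref + tau_rc * np.log1p(1.0 / (J[active] - 1.0)))
    return out

J = np.linspace(0.5, 5.0, 50)
# A scaled copy of the default-amplitude rates should match the
# rates computed with amplitude set directly.
assert np.allclose(0.1 * lif_rate(J), lif_rate(J, amplitude=0.1))
```

The dynamic-spikes half of the test would do the analogous comparison on filtered spike trains from a simulation.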

How long should this take to review?

  • Quick (less than 40 lines changed or changes are straightforward)

Types of changes:

  • New feature (non-breaking change which adds functionality)

Checklist:

  • I have read the CONTRIBUTING.rst document.
  • I have updated the documentation accordingly.
  • I have included a changelog entry.
  • I have added tests to cover my changes.
  • All new and existing tests passed.

drasmuss (Member) commented Jun 14, 2017

I usually do this by changing max_rates instead. For example, instead of doing

ens = nengo.Ensemble(1, 1, max_rates=[100], neuron_type=nengo.LIFRate(amplitude=0.1))

I'd do

ens = nengo.Ensemble(1, 1, max_rates=[10], neuron_type=nengo.LIFRate())

These are obviously not exactly equivalent (the former scales the output of the nonlinearity, the latter scales the input), but they give you a similar result.

Another option is to incorporate the amplitude into the connection weights on any efferent connections (e.g., transform=amplitude), or the gains of postsynaptic neurons (these would be exactly the same, but slightly more complicated).

So even though I would actually find the amplitude parameter helpful for training/learning, as you point out, I am kind of inclined against it because

  1. It is somewhat redundant with existing functionality
  2. It adds an extra parameter
  3. It kind of muddies the waters of max_rates (would we say that nengo.Ensemble(1, 1, max_rates=[100], neuron_type=nengo.LIFRate(amplitude=0.1)) has a maximum firing rate of 100Hz, or 10Hz?)

That being said, I'm not strongly opposed, so if other people like it I'm totally fine with adding amplitude in.
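As a toy numeric illustration of point 3, using the closed-form LIF rate with default tau_rc/tau_ref (an assumption-laden sketch, not Nengo code):

```python
import numpy as np

tau_rc, tau_ref = 0.02, 0.002
rate = lambda J: 1.0 / (tau_ref + tau_rc * np.log1p(1.0 / (J - 1.0)))

J = 2.0
# Scaling the output (amplitude=0.1): the neuron still fires at ~63 Hz,
# but its downstream contribution is 10x smaller.
out_amplitude = 0.1 * rate(J)
# Scaling the input by 0.1 instead pushes the current below threshold
# (0.1 * J = 0.2 < 1), so the neuron does not fire at all.
below_threshold = 0.1 * J < 1.0
```

So "maximum firing rate" and "maximum output" come apart once amplitude != 1, which is exactly the ambiguity raised above.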

arvoelke (Contributor) commented:
Could this be made a parameter / feature of the learning rule(s) instead?

hunse (Collaborator, Author) commented Jun 14, 2017

The problem with max_rates is that it actually changes the firing rates. This affects how the network operates. In my networks, I want the neurons to be firing at ~100 Hz, to help minimize the noise caused by spikes.

Incorporating amplitude into postsynaptic connection weights would certainly be the "normal" way to do it. In fact, if I wanted to describe my model to anybody either in neuroscience or machine learning, I would tell them the amplitude just gets folded in to the efferent connection weights, such that my model has the same structure as traditional models. Really the reason that it helps me is that I can scale my neurons to be more like sigmoids or ReLUs, so if I train a network using one of those neuron types with backprop, then it's easy for me to swap in a LIF neuron type instead and still have things work with similar learning rates and initial weight magnitudes.

This is also kind of the reason that I wouldn't want to have it be part of the learning rule, because whether you want this scaling or not really depends on the neuron model that you're using.
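The weight-folding equivalence described above is just linearity of the connection. A minimal sketch with made-up shapes (nothing here is Nengo API):

```python
import numpy as np

rng = np.random.default_rng(0)
activities = rng.random((100, 50))       # hypothetical neuron outputs over time
W = rng.standard_normal((50, 3))         # efferent connection weights
amplitude = 0.005

# Scaling the neuron output before the connection ...
y_amplitude = (amplitude * activities) @ W
# ... equals folding the scale into the weights themselves,
# so the model keeps the structure of traditional networks.
y_folded = activities @ (amplitude * W)

assert np.allclose(y_amplitude, y_folded)
```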

jgosmann (Collaborator) commented Jul 6, 2017

How does this affect different backends?

hunse (Collaborator, Author) commented Jul 6, 2017

> How does this affect different backends?

Like pretty much anything in the Nengo API, they could decide whether to support it or not. For the most part, I don't think it should be too hard to support, because in essence it can be folded into the connection weights when you're actually computing it (so all the computations involved would be things current hardware can do).

jgosmann (Collaborator) left a comment:

Some minor things, otherwise LGTM.

nengo/neurons.py Outdated
@@ -231,11 +231,13 @@ class LIFRate(NeuronType):

     tau_rc = NumberParam('tau_rc', low=0, low_open=True)
     tau_ref = NumberParam('tau_ref', low=0)
+    amplitude = NumberParam('amplitude', low=0)
Review comment (Collaborator):

I wonder if this should be low_open=True? I suppose one could set the amplitude to 0 without crashing Nengo, but it seems like it wouldn't make much sense.

Also, this parameter needs to be added to the docstring.

hunse (Collaborator, Author):

Done.

nengo/neurons.py Outdated
@@ -298,8 +300,9 @@ class LIF(LIFRate):

     min_voltage = NumberParam('min_voltage', high=0)

-    def __init__(self, tau_rc=0.02, tau_ref=0.002, min_voltage=0):
+    def __init__(self, tau_rc=0.02, tau_ref=0.002, amplitude=1, min_voltage=0):
         super(LIF, self).__init__(tau_rc=tau_rc, tau_ref=tau_ref)
Review comment (Collaborator):

Technically this breaks backwards compatibility if anyone provided min_voltage as a positional argument. But I'm fine with that, I think min_voltage was rarely used if at all.

hunse (Collaborator, Author):

Yeah, people shouldn't rely on the position of keyword arguments. But there's also no need to have amplitude in front of min_voltage, so I'll switch them.

hunse (Collaborator, Author) commented Jul 14, 2017

I've implemented @jgosmann's recommended changes.

jgosmann (Collaborator) left a comment:

Thank you, @hunse! :)
