Plateau Neuron - Fixed Point Model #781

Open
wants to merge 20 commits into main
Conversation

@kds300 (Contributor) commented Sep 1, 2023

Issue Number: #775

Objective of pull request: Add Plateau neuron model to lava as described in the above issue.

Pull request checklist

Your PR fulfills the following requirements:

  • Issue created that explains the change and why it's needed
  • Tests are part of the PR (for bug fixes / features)
  • Docs reviewed and added / updated if needed (for bug fixes / features)
  • PR conforms to Coding Conventions
  • PR applies BSD 3-clause or LGPL2.1+ Licenses to all code files
  • Lint (flakeheaven lint src/lava tests/) and (bandit -r src/lava/.) pass locally
  • Build tests (pytest) pass locally

Pull request type

Please check your PR type:

  • Bugfix
  • Feature
  • Code style update (formatting, renaming)
  • Refactoring (no functional changes, no API changes)
  • Build related changes
  • Documentation changes
  • Other (please describe):

What is the current behavior?

  • There is no current behavior, this is a new model.

What is the new behavior?

  • The model will take two separate inputs, one to the "soma" and another to the "dendrite". Both the soma and dendrite potentials will follow sub-threshold LIF dynamics. When the dendrite potential crosses its threshold it resets and sends the neuron to the UP state. The UP state will last for a specified duration of time steps. During this time, the soma is able to spike and reset if its potential crosses its threshold. If the neuron is in the DOWN state, the soma potential will follow subthreshold LIF dynamics regardless of whether it crosses the threshold.
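The dynamics described above can be sketched in plain Python. This is an illustrative fixed-point sketch only, not the actual Lava ProcessModel from the PR; parameter names follow the discussion below, and the decay scaling constant is an assumption.

```python
DECAY_UNITY = 4096  # assumed fixed-point unity for the dv decays (illustrative)

def plateau_step(v_dend, v_soma, up_ctr, a_dend, a_soma,
                 dv_dend, dv_soma, vth_dend, vth_soma, up_dur):
    """One illustrative time step of the plateau neuron described above.

    Returns updated (v_dend, v_soma, up_ctr, s_out).
    """
    # Sub-threshold leaky integration on both compartments
    v_dend = v_dend * (DECAY_UNITY - dv_dend) // DECAY_UNITY + a_dend
    v_soma = v_soma * (DECAY_UNITY - dv_soma) // DECAY_UNITY + a_soma

    # Dendrite threshold crossing: reset and enter the UP state
    if v_dend >= vth_dend:
        v_dend = 0
        up_ctr = up_dur

    # The soma may only spike and reset while the neuron is in the UP state;
    # in the DOWN state it follows sub-threshold dynamics regardless of v_soma
    s_out = 0
    if up_ctr > 0:
        if v_soma >= vth_soma:
            v_soma = 0
            s_out = 1
        up_ctr -= 1

    return v_dend, v_soma, up_ctr, s_out
```

For example, a dendrite input that crosses threshold opens the plateau window, during which a supra-threshold soma potential produces a spike.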

Does this introduce a breaking change?

  • Yes
  • No

Supplemental information

@kds300 changed the title from "Plateau neuron" to "Plateau Neuron - Fixed Point Model" on Sep 5, 2023
Signed-off-by: Marcus G K Williams <Marcus G K Williams 168222+mgkwill@users.noreply.github.com>
@mgkwill (Contributor) commented Oct 18, 2023

Hi @kds300, Thanks for this PR. Sorry it took us so long to take a look.

Pushed a PR to kds300#7 that should fix the codacy issues.

I will do more review in the next few days and I've added a few others for comment and review.

Comment on lines 110 to 114
def run_spk(self):
"""The run function that performs the actual computation during
execution orchestrated bgy a PyLoihiProcessModel using the
LoihiProtocol.
"""
Contributor:

typo in orchestrated bgy a PyLoihiProcessModel, should be orchestrated by a PyLoihiProcessModel

@mgkwill requested a review from tim-shea on October 18, 2023
class Plateau(AbstractProcess):
"""Plateau Neuron Process.

Couples two modified LIF dynamics. The neuron posesses two potentials,
Contributor:

Thanks for providing such an extensive, well written docstring!

Is 'potential' the right word, or 'voltage'?

Contributor Author:

I'm not sure, I'm used to treating them (mostly) interchangeably. I can change it to voltage since that's more consistent with other neuron models.

from lava.magma.core.process.ports.ports import InPort, OutPort


class Plateau(AbstractProcess):
Contributor:

Just a minor point. Since this is a LIF neuron, did you consider to let it inherit from the AbstractLIF process, and adding a 'LIF' in the class name? Not sure if it makes sense in this specific example, though. Up to you.

Contributor Author:

I originally didn't do this since I thought of it as two combined LIF neurons instead of a modified LIF neuron. I'll look over the AbstractLIF process and see if it makes sense to inherit from that.

from lava.tests.lava.proc.lif.test_models import VecSendProcess, VecRecvProcess


class SpikeGen(AbstractProcess):
Contributor:

Could you check whether the Spiker Process in
'lava\src\lava\proc\spiker\process.py'
already implements the behavior that you expect from SpikeGen?

Contributor:

Since these two processes are very similar, it would be good to avoid redundant processes.

Contributor:

Maybe better to extend the Spiker, inherit from it, or write a common abstract spiker, rather than having two independent processes.

Contributor Author:

I've replaced the custom SpikeGen process with implementing source RingBuffers. I didn't use the Spiker because that sends periodic spikes and I only wanted single spikes to be sent.
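For reference, a source RingBuffer of this kind is typically driven by a (neurons × time steps) array. A minimal sketch of building single-spike input data follows; the helper name is hypothetical, and only the shape convention is assumed from the discussion.

```python
import numpy as np

def single_spikes(n_neurons, n_steps, spike_times):
    """Build a (neurons x time) array with one spike per listed (neuron, t).

    spike_times is an iterable of (neuron_index, time_step) pairs.
    """
    data = np.zeros((n_neurons, n_steps), dtype=int)
    for neuron, t in spike_times:
        data[neuron, t] = 1
    return data
```

Such an array can then be passed as the input data of a source RingBuffer process.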

def run_spk(self):
"""Send the appropriate spikes for the given time step
"""
self.s_out.send(self.spike_data[:, self.time_step - 1])
Contributor:

One of the advantages of event-based, neuromorphic computing is that we don't need to communicate '0's. Thus, could you check whether the SpikeGen actually sends a meaningful information at the respective time step, and ensure not to send anything instead of a 0?
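The idea can be sketched as follows; this is illustrative only, and `send_fn` is a hypothetical stand-in for a port's send method, not the Lava API.

```python
import numpy as np

def send_if_active(spike_vec, send_fn):
    """Only communicate time steps that actually contain spikes.

    Returns True if something was sent, False if the step was silent.
    """
    if np.any(spike_vec):
        send_fn(spike_vec)
        return True
    return False
```

Silent time steps are skipped entirely, so no '0's are communicated.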

Contributor Author:

I've removed this process, but I'll definitely keep an eye out for this in any future processes I make!

@@ -0,0 +1,178 @@
# Copyright (C) 2023 Intel Corporation
Contributor:

Thanks for providing extensive unit tests!

0, 6400, 4800, 3600, 2700, 2025, 1518, 1138, 853, 639
]
self.assertListEqual(expected_v_dend, test_v_dend)
self.assertListEqual(expected_v_soma, test_v_soma)
Contributor:

Could you check if linting passes?
'flakeheaven lint src/lava tests'

My guess is that there are a few points that must change, including missing lines at the end of files. Not functionally relevant, but important to keep a clean code base :-)

Contributor Author:

I've run the linting, (flakeheaven and bandit) and they pass on my local copy of the code. Also, I have the empty line at the end of the files locally, but it doesn't seem to show up on the github versions. Does github just not show the empty line at the end?


Precisions of state variables

- du_dend : unsigned 12-bit integer (0 to 4095)
Contributor:

In the process, you specify that the variables are float. If you want them to be integers, I would specify that in the Process.
Also, is this model really using fixed point precision? Or is it bit accurate to Loihi?
In the latter case, we tend to call it "...ModelBitAcc", rather than "...ModelFixed"

Contributor Author:

I wasn't sure if bit-accurate would be appropriate for this, since I don't have a version of this that runs on Loihi. I tried to give the variables the same precision as on Loihi.

Precisions of state variables

- du_dend : unsigned 12-bit integer (0 to 4095)
- du_soma : unsigned 12-bit integer (0 to 4095)
Contributor:

Please note: On Loihi, variables typically have 8, 16, or 24 bit precision. Thus, you may want to avoid 12bit.

Contributor Author:

I chose 12-bit for the du and dv values since that was the precision of those values in the PyLifModelBitAcc. Since I'm choosing the precisions based on bit-accurate models, should I change this model to bit-accurate, as suggested in a different comment?

- du_dend : unsigned 12-bit integer (0 to 4095)
- du_soma : unsigned 12-bit integer (0 to 4095)
- vth_dend : unsigned 17-bit integer (0 to 131071)
- vth_soma : unsigned 17-bit integer (0 to 131071)
Contributor:

Also, could you add input validation? I.e. a method that checks that the provided variables are in fact integer, and in the required range?

Contributor:

I know that we are also sloppy on this sometimes ;-)

Contributor Author:

I've added input validation for the dv_dend, dv_soma, vth_dend, vth_soma, and up_dur variables. With the restriction on the dv's being set to 4095, I also had to make some changes to the decay process, setting the decay_unity = 4095. I'm not sure if this model could still be considered bit-accurate at this point.
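The kind of check described can be sketched as below; the helper name is hypothetical and this is not the PR's actual validation code.

```python
def validate_unsigned(name, value, n_bits):
    """Check that a parameter is an integer in the unsigned n-bit range."""
    if not isinstance(value, int):
        raise TypeError(f"{name} must be an int, got {type(value).__name__}")
    if not 0 <= value <= (1 << n_bits) - 1:
        raise ValueError(
            f"{name} must be in [0, {(1 << n_bits) - 1}], got {value}"
        )
```

For a 12-bit dv this accepts 0 through 4095 and rejects anything outside that range.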

Contributor Author:

Also, I fixed the typo in the docstring referenced above, du_dend and du_soma should both be dv's

@phstratmann (Contributor) left a comment:

Thanks for your contribution. Great code quality!

Just a few minor points I would ask you to change.

Please reach out if you have any questions.

@kds300 (Contributor Author) commented Oct 18, 2023

> Thanks for your contribution. Great code quality!
>
> Just a few minor points I would ask you to change.
>
> Please reach out if you have any questions.

Thanks for the feedback! I'll go through code and make the requested changes.

…nd, dv_soma, vth_dend, and vth_soma

Signed-off-by: kevin <kds300@psu.edu>
…ateauModelFixed docstring

Signed-off-by: kevin <kds300@psu.edu>
…nd and dv_soma (0, 4095)

Signed-off-by: kevin <kds300@psu.edu>
@kds300 (Contributor Author) commented Oct 19, 2023

I updated my local branch with several of the recommended changes, and commented on some of the requested changes that I have questions about.

@kds300 (Contributor Author) commented Oct 19, 2023

> Hi @kds300, Thanks for this PR. Sorry it took us so long to take a look.
>
> Pushed a PR to kds300#7 that should fix the codacy issues.
>
> I will do more review in the next few days and I've added a few others for comment and review.

@mgkwill Thanks for fixing the codacy issues! I'm still a bit new to github, especially using it collaboratively, so I had started making changes before merging that pull request. I added a commit to sort out some of the merge conflicts I created and then merged the pull request.
