
My changes #684

Open · wants to merge 23 commits into base: master

Conversation


@salah-daddi-nounou commented May 5, 2024

Bi-sigmoid rule integration in BindsNET

This PR extends the BindsNET framework to incorporate the Bi-sigmoid STDP rule.
This learning rule is derived directly from the behavior of physical spintronic synapses. It is the result of accurate electrical simulations that determined how the synapse's conductance changes with the relative delay between electrical pulses at its terminals.
After fitting the simulation data (normalized conductance), the rule was formalized by the following equation:

$\Delta w(\Delta t) = \frac{-\frac{A}{1 + e^{-k_0(\Delta t-t_0)}} - \frac{A}{1 + e^{-k_1(\Delta t-t_1)}} + A}{A}$
Parameters: $A = 7.95 \times 10^5$, $k_0 = 0.474723045$, $t_0 = 20.77893753$, $k_1 = 0.757072031$, $t_1 = 48.93860322$.
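As a sanity check, the fitted rule can be written directly in Python. This is a sketch based only on the equation and parameters above; the function name `delta_w` is illustrative and not part of the PR:

```python
import math

# Fitted parameters from the PR description (normalized conductance)
A = 7.95e5
K0, T0 = 0.474723045, 20.77893753
K1, T1 = 0.757072031, 48.93860322

def delta_w(dt: float) -> float:
    """Normalized bi-sigmoid weight change as a function of the
    pre-to-post spike delay dt."""
    s0 = A / (1.0 + math.exp(-K0 * (dt - T0)))
    s1 = A / (1.0 + math.exp(-K1 * (dt - T1)))
    return (-s0 - s1 + A) / A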

Before hardware implementation, it is crucial to simulate the SNN with this rule in a functional framework such as BindsNET. The rule achieved more than 91% accuracy on MNIST.

The process of integrating the Bi-sigmoid STDP rule into the BindsNET framework is as follows:

  1. We developed a custom training script named salah_example.py, inspired by the BindsNET example eth_mnist.py. This script trains the SNN and then saves the trained weights and network parameters as PyTorch objects (.pt files).

  2. Within the nodes.py base class, we introduced a new trace, x2, defined by the bisigmoid_trace(t_) function. We added x2 to the block that manages the traces, enabling all neuron types to record this new trace as they spike.

  3. In learning.py, we introduced the Bi-sigmoid learning rule, which updates synaptic weights based on the x2 trace. Here is how the rule updates weights in the network:

    • Trace recording: Both input and output neurons record two key traces: x2, defined by the bisigmoid function, and s, indicating spikes. Here, source represents the input neuron and target the output (excitatory) neuron.
    • Rule application: The Bi-sigmoid learning rule is applied exclusively when the output neuron (target.s) spikes. At that moment, the connection between the input and output neurons is updated based on the value of source.x2 at the time of the target.s spike.
  4. A new network model, Salah_model, was introduced in models.py. It is similar to DiehlAndCook2015 but notably uses the Bi-sigmoid rule as its update_rule.

  5. Finally, for inference, we created evaluate_plot.py. This script loads the trained weights, evaluates the model, and generates plots that visualize the network's post-training performance. Training and testing live in separate files; saving the weights after training is crucial for later analysis.
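The update described in steps 2 and 3 can be sketched as a minimal, dependency-free loop. The names and signature below are illustrative (the PR itself implements this with PyTorch tensors inside learning.py); the key property shown is that weights change only on a postsynaptic spike, by an amount given by the presynaptic x2 trace:

```python
def bisigmoid_connection_update(source_x2, target_s, w,
                                nu=0.01, wmin=0.0, wmax=1.0):
    """Illustrative sketch of the Bi-sigmoid update.

    source_x2: list of presynaptic bi-sigmoid trace values (x2)
    target_s:  list of 0/1 postsynaptic spike indicators (s)
    w:         weight matrix as list of rows; w[i][j] connects input i to output j
    """
    for j, spiked in enumerate(target_s):
        if spiked:  # the rule applies exclusively on a postsynaptic spike
            for i, x2 in enumerate(source_x2):
                # Potentiate or depress by the trace value, clamped to [wmin, wmax]
                w[i][j] = min(wmax, max(wmin, w[i][j] + nu * x2))
    return w
```

A positive trace potentiates the active connection and a negative trace depresses it, with the clamp keeping all weights inside [wmin, wmax].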

To go further into the physical simulations of the spintronic synapse, check out our article.

@Hananel-Hazan
Collaborator

Thanks @salah-daddi-nounou for the PR, I will review later on

# language=rst
"""
Bi_sigmoid STDP rule involving only post-synaptic spiking activity. The weight update
quantity is poisitive if the post-synaptic spike occures shortly after the presynatpic spike,
Collaborator

poisitive --> positive

Collaborator
For performance, please create a new neuron with the relevant parameters for your model. Maybe LIF_Bi-sigmoid.

@@ -389,6 +389,11 @@ def _connection_update(self, **kwargs) -> None:
"""
Post-pre learning rule for ``Connection`` subclass of ``AbstractConnection``
class.

Collaborator

This needs to be generalized; MNIST is only an example.

Bi_sigmoid learning rule for ``Connection`` subclass of ``AbstractConnection``
class.

self.source.s : 28*28 array of 0 and 1
source_s : array converted to a 1D vector (784*1)
Collaborator

Same as before, the description needs to be generalized; MNIST is only one way to use BindsNET.
