
Welcome to the Josephson_Junction_Neuromorphic-Raspi4- wiki!

This project involves the creation of a simple neural network by simulating Josephson junctions. Here the physics of the system is described in greater detail. A Josephson junction is a special circuit component that makes use of quantum tunneling to produce a macroscopic effect. The tunneling of Cooper pairs across this component produces a current that depends not on the potential difference across the junction but rather on the phase difference. A Josephson junction is made up of two superconductors connected by a weak link: an insulator, a normal (non-superconducting) metal, or a physically constricted section of superconductor (SIS, SNS, and SsS junctions respectively).

The Josephson Junction ('Single Junction.ipynb')

Current in a Josephson junction must obey Kirchhoff's current law:

$$I = I_s + I_N + I_D + I_F$$

This model of the Josephson junction is known as the Resistively and Capacitively Shunted Junction (RCSJ) model (Gross, p. 97).

The term $I_s$ is our supercurrent, which depends solely on the phase difference $\varphi$ across the junction and the critical current $I_c$:

$$I_s = I_c \sin\varphi$$

The Josephson junction has a voltage-phase relation of:

$$V = \frac{\Phi_0}{2\pi}\frac{d\varphi}{dt} = \frac{\hbar}{2e}\frac{d\varphi}{dt}$$

Here our normal current is carried by the quasiparticles created by the thermal breakup of Cooper pairs in the junction when a voltage, given by the voltage-phase relation, is applied. If no voltage were applied these particles would not contribute to the current, but when one is we get the relation:

$$I_N = G_N V$$

where $G_N$ is the normal conductance.

If the voltage changes, then the junction has a displacement current from the relation:

$$I_D = C\frac{dV}{dt}$$

Lastly, there is also a fluctuation current $I_F$ from the thermal noise in the junction.

Using these equations, we can write out the current terms to get the basic equation of a Josephson junction:

$$I = I_c\sin\varphi + G_N V + C\frac{dV}{dt} + I_F$$

Which can be rewritten, using our voltage-phase relation, as

$$I = I_c\sin\varphi + \frac{G_N\Phi_0}{2\pi}\frac{d\varphi}{dt} + \frac{C\Phi_0}{2\pi}\frac{d^2\varphi}{dt^2} + I_F$$

Note that the flux quantum $\Phi_0 = h/2e$ has absorbed the charge and Planck's constant.

If the fluctuation current is negligible and we normalize time by taking $\tau = \omega_p t$, with the plasma frequency $\omega_p = \sqrt{2\pi I_c/(\Phi_0 C)}$, we can nondimensionalize the entire equation so that it is tunable by a single parameter:

$$\ddot{\varphi} + \Gamma\dot{\varphi} + \sin\varphi = i$$

where the time derivatives are with respect to the normalized time, $i = I/I_c$ is the normalized current, and $\Gamma = G_N\sqrt{\Phi_0/(2\pi I_c C)}$ is the damping parameter.

Examining the equation shows that this is the model of a damped, driven pendulum, which can easily be solved numerically with a simple finite-difference (forward Euler) update for each derivative:

$$\dot{\varphi}_{n+1} = \dot{\varphi}_n + \Delta\tau\left(i - \Gamma\dot{\varphi}_n - \sin\varphi_n\right), \qquad \varphi_{n+1} = \varphi_n + \Delta\tau\,\dot{\varphi}_{n+1}$$

These methods are stable enough to reproduce the figures from the MIT lecture notes (slides 16-17), as shown in the 'Single Junction.ipynb' file.
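To make this concrete, here is a minimal sketch of that update loop in Python. The values of $\Gamma$, $i$, and the step size are illustrative, not necessarily those used in 'Single Junction.ipynb':

```python
import numpy as np

# Minimal sketch of integrating the damped-pendulum (RCSJ) equation
# phi'' + Gamma*phi' + sin(phi) = i with forward-Euler updates.
# Gamma and i are illustrative values, not the notebook's exact ones.
Gamma = 0.5          # damping parameter
i = 1.2              # normalized drive current (i > 1 gives a running phase)
dt, n_steps = 1e-3, 200_000

phi, dphi = 0.0, 0.0             # phase and its normalized time derivative
v = np.empty(n_steps)            # record the 'voltage' v = dphi/dtau
for n in range(n_steps):
    ddphi = i - Gamma * dphi - np.sin(phi)   # phi'' from the RCSJ equation
    dphi += dt * ddphi                       # update phi'
    phi += dt * dphi                         # update phi (semi-implicit)
    v[n] = dphi
```

Plotting `v` against the normalized time shows the junction's dynamics in the voltage state.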

The Single Neuron ('Single Neuron.ipynb')

When you take two junctions and assemble them in the configuration known as a Rapid Single Flux Quantum (RSFQ) circuit (pictured in Crotty, p. 1), the two junctions become coupled, forming a system of equations:

$$\ddot{\varphi}_p + \Gamma\dot{\varphi}_p + \sin\varphi_p = i_p$$

and

$$\ddot{\varphi}_c + \Gamma\dot{\varphi}_c + \sin\varphi_c = i_c$$

where each junction (the pulse junction $p$ and the control junction $c$) obeys the same damped-pendulum equation as before, and the currents $i_p$ and $i_c$ through the two junctions are coupled through the bias current, the input current, and the flux-quantization condition of the superconducting loop (see Crotty, p. 2, for the exact coupled system).

Again, even though these are coupled ODEs, Euler's method can still be used to solve them. With the correct parameters, given by Crotty (p. 4), we see behavior that mimics a biological neuron's activation; physically, this periodic behavior is the phase slipping of a double pendulum. This was used to recreate the figure showing the neuromorphic behavior of the system, Fig. 2b of Crotty, in the 'Single Neuron.ipynb' notebook, where $v$ is the first normalized time derivative of the phase.
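Below is a sketch of how the same Euler scheme extends to the coupled system. Since the exact circuit equations are not reproduced on this page, the term `lam * (phi_p + phi_c)` is an assumed stand-in for the loop's flux-quantization coupling, and all parameter values are illustrative; the real system and values are in Crotty (pp. 2, 4):

```python
import numpy as np

# Sketch of integrating two coupled junction equations with Euler's method.
# The coupling term lam * (phi_p + phi_c) and all parameter values are
# assumptions for illustration; see Crotty for the exact circuit equations.
Gamma, lam, i_b = 1.0, 0.5, 1.9
dt, n_steps = 1e-3, 100_000

phi_p = dphi_p = phi_c = dphi_c = 0.0
v = np.empty(n_steps)                  # v = dphi_p/dtau, as in Fig. 2b
for n in range(n_steps):
    i_in = 0.3 if n * dt < 50 else 0.0          # square input current pulse
    coupling = lam * (phi_p + phi_c)
    ddphi_p = i_b + i_in - coupling - Gamma * dphi_p - np.sin(phi_p)
    ddphi_c = coupling - Gamma * dphi_c - np.sin(phi_c)
    dphi_p += dt * ddphi_p
    phi_p += dt * dphi_p
    dphi_c += dt * ddphi_c
    phi_c += dt * dphi_c
    v[n] = dphi_p
```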


The behavior of this system mimics the biological process of a neuron, with the profile matching the shape of the action potential in the Hodgkin-Huxley model (Crotty p. 2). This is because each junction individually acts like one of the sodium or potassium ion channels in a neuron (Crotty p. 1). With this, it is now possible to assemble the neurons into a network, which would in theory run on the order of picoseconds, whereas real neurons run on the order of milliseconds.
Crotty also provides a table comparing the relative speeds of different neuron models along with the estimated speed of the Josephson junction model.

Comparing all the models, the estimated time for the Josephson junction model is orders of magnitude lower than for the other models, especially as the number of neurons increases. This holds for both sparse and dense networks, in which the neurons are sparsely connected or all connected together, respectively. Therefore, provided that a network can be created from these junctions, the overall speed of simulating biological neural networks should increase quite dramatically.

Neural Networking ('Simple Neural Network.ipynb')

The first step in developing a neural network out of Josephson junctions is to know how to make a network in the first place. I was able to use a guide and code found online (see Get Started! below). My architecture is simple: two inputs, A and B, are sent into a hidden layer of four neurons, and then there is a single output.

The math behind the neural network is explained in much better detail in Loy's guide (linked under Get Started! and in the References), so I highly recommend reading it and trying it yourself. My neural network's task was to determine the outputs of an XOR (exclusive-or) logic gate, and it was able to minimize its loss after about 400 iterations of training. The next step is to create the network for the Josephson junctions.
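For reference, here is a minimal NumPy sketch of such a 2-4-1 network trained on XOR, following the general approach of Loy's guide; the variable names and iteration count are my own:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_deriv(a):
    # derivative of the sigmoid written in terms of its output a
    return a * (1.0 - a)

# XOR truth table: two inputs (A, B) and the target output
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))   # input layer -> 4 hidden neurons
W2 = rng.normal(size=(4, 1))   # hidden layer -> single output

for _ in range(5000):
    # forward pass
    hidden = sigmoid(X @ W1)
    out = sigmoid(hidden @ W2)
    # backpropagate the squared-error loss
    d_out = (y - out) * sigmoid_deriv(out)
    d_hidden = (d_out @ W2.T) * sigmoid_deriv(hidden)
    W2 += hidden.T @ d_out
    W1 += X.T @ d_hidden

print(out.round(3))   # should approach [0, 1, 1, 0]
```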

This, however, is not as simple as just reusing that code, because the system is no longer based on simple iterative processing: the model now produces a continuous current (which is merely discretized so it can be computed). This means a specific type of neural network is needed, called a Spiking Neural Network.

Spiking neural networks are for systems that behave much like neurons, firing when they reach a specific threshold of input, which is how they transmit information. The model of the Josephson junctions is exactly what is needed for a spiking neural network. Now, instead of inputs, weights, and outputs, there are input neurons, synapses, and output neurons.

Connecting Neurons and Forming a Network ('Connecting Neurons.ipynb')

In order to create a network of neurons, the first thing needed is to give them the ability to connect to one another. Biologically, this is the synapse between neurons, and it is required to make other neurons fire. The synapses are also the connections in our neural network and are what the network's weights are applied to. In Crotty, the potential and the current of the synapse follow a system of coupled differential equations:

$$c_s\frac{dv_s}{d\tau} = i_{\text{fire}} - i_s - \frac{v_s}{r_s}$$

and

$$l_s\frac{di_s}{d\tau} = v_s$$

where $v_s$ and $i_s$ are the synapse potential and the postsynaptic current, $i_{\text{fire}}$ is the drive from the firing neuron, and $r_s$, $l_s$, and $c_s$ set the synapse's timescale. (This is the schematic form of a driven RLC-type filter; see Crotty, p. 3, for the exact equations and parameter values.)

In the Connecting Neurons notebook, I show a single neuron driving another neuron when it fires, by applying an input current that is only on for a finite length of time. When the first neuron stops firing, so does the driven one.
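A minimal sketch of stepping those synapse equations forward, under the same assumed RLC-filter form as above (the parameter values are placeholders, not the notebook's):

```python
# Sketch of one Euler step of the assumed synapse filter equations above.
# r_s, l_s, c_s are placeholder parameters; the exact synapse model and
# values are in Crotty and 'Connecting Neurons.ipynb'.
def synapse_step(v_s, i_s, i_fire, dt, r_s=1.0, l_s=1.0, c_s=1.0):
    """Advance the synapse state (v_s, i_s) one step.

    v_s    : synapse potential
    i_s    : postsynaptic current delivered to the next neuron
    i_fire : drive from the firing (presynaptic) neuron
    """
    dv = (i_fire - i_s - v_s / r_s) / c_s
    di = v_s / l_s
    return v_s + dt * dv, i_s + dt * di
```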

Spike Train, Membrane Potential and Learning ('Weights.ipynb')

Spiking neural networks have the added complexity that everything must be done in 'real time': weights cannot simply be back-propagated through a loss function. Typically, the output of a neuron is converted into a spike train where, at each reception of a spike, the potential increases and then decays with time, such that if enough spikes are received within a certain window, the neuron spikes. In the model used here, the connection is already physically defined, so there is no need to convert the output flux into a spike train and then into a potential.
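For comparison, here is a small sketch of that generic spike-train-to-potential conversion, written as an assumed leaky integrate-and-fire response (the jump size, decay timescale, and threshold are illustrative):

```python
import numpy as np

# Assumed leaky integrate-and-fire sketch: each incoming spike bumps the
# potential, which decays exponentially; crossing the threshold fires the
# neuron and resets the potential. All parameters are illustrative.
def membrane_response(spike_times, t_grid, jump=1.0, tau=5.0, threshold=2.5):
    dt = t_grid[1] - t_grid[0]
    spike_idx = set(np.round(np.asarray(spike_times) / dt).astype(int))
    v = np.zeros_like(t_grid)
    out_spikes = []
    for n in range(1, len(t_grid)):
        v[n] = v[n - 1] * np.exp(-dt / tau)   # decay between spikes
        if n in spike_idx:
            v[n] += jump                      # bump on each received spike
        if v[n] >= threshold:
            out_spikes.append(t_grid[n])      # neuron fires...
            v[n] = 0.0                        # ...and resets
    return v, out_spikes

# Three closely spaced spikes push the potential over threshold:
t = np.arange(0.0, 20.0, 0.01)
v, fired = membrane_response([1.0, 1.5, 2.0], t)
```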

Learning

Training the network is a bit more complicated. The most common way to train a spiking neural network is through an unsupervised learning process: the system is stimulated with the desired inputs, and the neurons change their connections so that the output neuron spikes in response. This method of learning was taken from a similar network built to recognize a specific pattern (Christophe et al.). The outputs now need to be binary, spiking or not spiking, which makes things a little easier. The most common learning rule of this type is Hebbian learning, in the form of spike-timing-dependent plasticity: the weight change is a learning rate multiplied by an exponential in the time difference between input and output spikes, divided by a timescale. Here are the equations for it:

$$\Delta w = A\,e^{-(t_o - t_i)/\tau} \quad \text{for } t_i < t_o$$

$$\Delta w = -B\,e^{-(t_i - t_o)/\tau} \quad \text{for } t_i > t_o$$

$A$ and $B$ are the learning rates of the system, $t_o$ and $t_i$ are the output and input spike times respectively, and $\tau$ is the timescale of the firings. This allows the weights to change over time. With this, it is possible to set up a network of neurons that forms a spiking neural network.
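Written as code, a sketch of this weight update might look like the following, where A, B, and tau are illustrative values:

```python
import numpy as np

# Sketch of the STDP weight update above. A, B, and tau are illustrative;
# t_out and t_in are the output and input spike times.
def stdp_delta_w(t_out, t_in, A=0.1, B=0.1, tau=10.0):
    if t_in < t_out:                  # input fired before output: strengthen
        return A * np.exp(-(t_out - t_in) / tau)
    else:                             # input fired after output: weaken
        return -B * np.exp(-(t_in - t_out) / tau)

# An input spike 2 time units before the output spike strengthens the weight:
print(stdp_delta_w(t_out=12.0, t_in=10.0))   # ~ +0.082
```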

The Network ('Josephson Neural Network.ipynb')

The implementation and results are discussed in greater detail in the 'Josephson Neural Network.ipynb' file. It is again the same network discussed above, with the goal of acting like an XOR gate. This code ended up having mixed success: the system can be trained to respond to specific stimuli, but the current implementation has only a limited ability to inhibit a response. (In the figures, flux is plotted in black and the currents in blue and red.)

There is some learning happening: the system responds better to the first input than the second, so the network appears to remember the first part of the training better than the second. However, both inputs still make the network spike, probably because there is no inhibition of the input signals. The outcome seems more like a simpler OR gate that reacts more strongly to the first neuron in the input layer. Overall, it looks like the network could work properly with more development, but unfortunately I was unable to reach that desired conclusion.

Further Work

The network does have the beginnings of something that can actually work. There are flaws in the learning methods, discussed in the notebook, and the implementation of the code could be improved. Feel free to give this a try yourself and improve upon the code; studying the source papers and looking further into the learning methods would be the best place to start. The neuron model itself works properly, and it is now the infrastructure that needs to be tinkered with. Overall, the foundation of the project is that Josephson junctions can be modeled to behave like neurons, and this system fits the bill. Had I used a different neuron model, I would almost certainly have run into the same issues present here, so the project is still largely a success.

Get Started!

Creating the simple neural network I used for the baseline is detailed here: https://towardsdatascience.com/how-to-build-your-own-neural-network-from-scratch-in-python-68998a08e4f6. If you are coming from the video in the networking folder, this is the link you want! It goes through the basics of making this neural network yourself. All it needs is numpy, and you are set to start coding! Like I said before, feel free to use my code however you like.

References

Gross, R., & Marx, A. (2005). Applied Superconductivity: Josephson Effect and Superconducting Electronics. Manuscript, Walther-Meißner-Institut, Garching. Retrieved from https://www.wmi.badw.de/teaching/Lecturenotes/AS/AS_Chapter3.pdf

Crotty, P., Schult, D., & Segall, K. (2010). Josephson junction simulation of neurons. Physical Review E, 82(1). doi:10.1103/physreve.82.011914

Generalized Josephson Junctions. (2003). Lecture Notes, Massachusetts Institute of Technology, Cambridge. Retrieved from http://web.mit.edu/6.763/www/FT03/Lectures/Lecture13.pdf

Cheng, R., Goteti, U. S., & Hamilton, M. C. (2019). Superconducting neuromorphic computing using Quantum Phase-Slip Junctions. IEEE Transactions on Applied Superconductivity, 29(5), 1-5. doi:10.1109/tasc.2019.2892111

Loy, J. (2020, March 4). How to build your own Neural Network from scratch in Python. Medium. https://towardsdatascience.com/how-to-build-your-own-neural-network-from-scratch-in-python-68998a08e4f6

LeNail, A. NN-SVG [neural network diagram tool]. Retrieved from http://alexlenail.me/NN-SVG/index.html

Christophe, F., Mikkonen, T., Andalibi, V., Koskimies, K., & Laukkarinen, T. (2015). Pattern recognition with Spiking Neural Networks: a simple training method. Retrieved from http://ceur-ws.org/Vol-1525/paper-21.pdf