
Komplex

Keras extension for Complex tensors and complex functions.


An amazing repository by @omrijsharon (https://github.com/omrijsharon).

All credit whatsoever goes to him!

I just translated his work to use the Keras & TensorFlow backend.

You can think of me as some sort of a pretty bad NLP translation algorithm. Nothing more!


Inspired by https://github.com/williamFalcon/pytorch-complex-tensor.

Based on the papers cited throughout the documentation below.

Table of Contents:

Functions

  • exp(z)
  • log(z)
  • sin(z)
  • cos(z)
  • tan(z)
  • tanh(z)
  • sigmoid(z)
  • softmax(z)

ReLU function versions for complex numbers

More information in the documentation below

  • CReLU(z)
  • zReLU(z)
  • modReLU(z, bias)

ComplexTensor Operation

  • addition (z + other and other + z)
  • subtraction (z - other and other - z)
  • multiplication (z * other and other * z)
  • matrix multiplication (z @ other and other @ z)
  • division (z / other and other / z)

ComplexTensor Functions and Properties

  • z.real (real part of z)
  • z.imag (imaginary part of z)
  • z.PDF(dim) (Probability density function, more information in the documentation below)
  • z.wave(dim) (returns a normalized ComplexTensor which can be used as a wave function (more information below))
  • z.size() (tensor size)
  • len(z) (tensor length)
  • z.euler() (returns 2 tensors: R and θ in Euler's representation)
  • abs(z) (absolute value, |z|)
  • z.magnitude() (magnitude, |z|; same as abs(z))
  • z.angle() (angle of a complex element)
  • z.phase() (phase of a complex element (can be negative or positive))
  • z.tensor() or z.z (get the raw tensor)
  • z.conj() (conjugate)
  • z.T or z.t() (transpose)
  • z.H or z.h() (Hermitian conjugate)

Examples

  • Defaults
  • 5 ways to create a ComplexTensor
  • Using komplex functions
  • Euler representation

Quantum Learning:

  • Probability density function
  • Wave function

Additional information

Probability density function

z.PDF(dim)

dim plays the same role as in the K.softmax function. This function returns the probability density function of your ComplexTensor, which is the equivalent of the expectation value in quantum mechanics. The function divides (normalizes) the ComplexTensor by the sum of abs(z) in dimension dim and takes the abs of the result. If left empty or dim=None, the ComplexTensor will be divided by the sum of abs(z) over all dimensions.
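The normalization described above can be sketched in plain numpy. The `pdf` helper below is my own illustration, not the komplex API; it follows the description literally: divide by the sum of abs(z) along `dim`, then take the abs.

```python
import numpy as np

def pdf(z, dim=None):
    # Normalize by the sum of abs(z) along `dim`, then take abs of the result.
    # With dim=None the sum runs over all dimensions, as documented.
    return np.abs(z / np.abs(z).sum(axis=dim, keepdims=dim is not None))

z = np.random.randn(3, 5) + 1j * np.random.randn(3, 5)
p = pdf(z, dim=0)
assert np.allclose(p.sum(axis=0), 1.0)  # each column is a valid distribution
```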

Wave function

z.wave(dim)

dim plays the same role as in the K.softmax function. This function returns a normalized ComplexTensor, which is the equivalent of a quantum wave function. The function divides the ComplexTensor by the sum of abs(z) in dimension dim. If left empty or dim=None, the ComplexTensor will be divided by the sum of abs(z) over all dimensions.
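Unlike z.PDF, the result here stays complex; only the scale changes. A numpy sketch of the documented normalization (the `wave` helper name is mine):

```python
import numpy as np

def wave(z, dim=None):
    # Divide z by the sum of abs(z) along `dim`; the result remains complex.
    return z / np.abs(z).sum(axis=dim, keepdims=dim is not None)

z = np.random.randn(100, 5) + 1j * np.random.randn(100, 5)
psi = wave(z, dim=0)
# After normalization, abs values along dim=0 sum to 1:
assert np.allclose(np.abs(psi).sum(axis=0), 1.0)
```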

Softmax

Eq.(36) in the paper Complex-valued Neural Networks with Non-parametric Activation Functions

https://arxiv.org/pdf/1802.08026.pdf

Simone Scardapane, Steven Van Vaerenbergh, Amir Hussain and Aurelio Uncini

ReLU function versions for complex numbers

CReLU(z)

Deep Complex Networks Eq.(5).

https://arxiv.org/pdf/1705.09792.pdf

Chiheb Trabelsi, Olexa Bilaniuk, Ying Zhang, Dmitriy Serdyuk, Sandeep Subramanian, João Felipe Santos, Soroush Mehri, Negar Rostamzadeh, Yoshua Bengio & Christopher J Pal
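Eq.(5) of that paper defines CReLU as ReLU applied separately to the real and imaginary parts. A minimal numpy sketch (the helper name is mine, not the komplex implementation):

```python
import numpy as np

def crelu(z):
    # Apply ReLU independently to the real and imaginary parts.
    return np.maximum(z.real, 0) + 1j * np.maximum(z.imag, 0)

z = np.array([1 + 2j, -1 + 2j, 1 - 2j, -1 - 2j])
# Each part survives only where it is positive:
# [1+2j, 0+2j, 1+0j, 0+0j]
print(crelu(z))
```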

zReLU(z)

Pages 15-16 in the dissertation: On complex valued convolutional neural networks.

https://arxiv.org/pdf/1602.09046.pdf

Nitzan Guberman, Amnon Shashua.

Also referred to as Guberman ReLU in Deep Complex Networks, Eq.(5) (https://arxiv.org/pdf/1705.09792.pdf).
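zReLU passes z through only where both the real and the imaginary parts are positive (i.e. the phase lies in the first quadrant), and outputs 0 otherwise. A numpy sketch (helper name is mine):

```python
import numpy as np

def zrelu(z):
    # Keep z only where both real and imaginary parts are positive.
    mask = (z.real > 0) & (z.imag > 0)
    return np.where(mask, z, 0)

z = np.array([1 + 2j, -1 + 2j, 1 - 2j])
print(zrelu(z))  # only 1+2j survives; the other elements become 0
```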

modReLU(z, bias)

Eq.(8) in the paper: Unitary Evolution Recurrent Neural Networks

https://arxiv.org/pdf/1511.06464.pdf

Martin Arjovsky, Amar Shah, and Yoshua Bengio.

Notice that |z| (z.magnitude()) is always non-negative, so if b ≥ 0 then |z| + b ≥ 0 always. In order to have any non-linearity effect, b must be negative (b < 0).
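modReLU rescales the magnitude while preserving the phase: modReLU(z) = ReLU(|z| + b) · z / |z|, which is why the remark above requires b < 0. A numpy sketch (the eps guard against division by zero is my own addition):

```python
import numpy as np

def modrelu(z, bias, eps=1e-12):
    # ReLU on the magnitude, direction (phase) of z preserved.
    mag = np.abs(z)
    return np.maximum(mag + bias, 0) * z / (mag + eps)

z = np.array([0.5 + 0.5j, 3 + 4j])
# |0.5+0.5j| ~ 0.707 < 1, so the first element is zeroed out;
# |3+4j| = 5, so the second becomes (5-1)/5 * (3+4j) = 2.4+3.2j
print(modrelu(z, bias=-1.0))
```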

Examples

At the beginning of your code you must import the library:

import komplex

Defaults:

  • ComplexTensor default is complex=True. See explanation in 4.a.
  • ComplexTensor default is requires_grad=True.

5 ways to create a ComplexTensor

  1. Inserting a tuple of Keras tensors or numpy arrays with the same size and dimensions. The first tensor/array will be the real part of the new ComplexTensor and the second tensor/array will be the imaginary part.
a = tf.random.normal((3, 5))
b = tf.random.normal((3, 5))
z = komplex.ComplexTensor((a, b))
  2. Converting a complex numpy array to a ComplexTensor:
z_array = np.random.randn(3, 5) + 1j*np.random.randn(3, 5)
z = komplex.ComplexTensor(z_array)
  3. Inserting a ComplexTensor into a ComplexTensor. A completely redundant operation and a waste of computation power; it comes with a warning.
z_array = np.random.randn(3, 5) + 1j*np.random.randn(3, 5)
z_complex = komplex.ComplexTensor(z_array)
z = komplex.ComplexTensor(z_complex)
  4. a. Inserting a tf tensor / numpy array which contains only the real part of the ComplexTensor:
x = np.random.randn(3, 5)
# or
x = tf.random.normal((3, 5))
z = komplex.ComplexTensor(x, complex=False)
  4. b. Inserting a tf tensor which contains the real and the imaginary parts of the ComplexTensor. The last dimension size must be 2. Does not work with numpy arrays.
x = tf.random.normal((3, 5, 2))
z = komplex.ComplexTensor(x, complex=True)
  5. Inserting a list of complex numbers into a ComplexTensor:
x = [1, 1j, -1-1j]
z = komplex.ComplexTensor(x)
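However the ComplexTensor is built, its arithmetic (addition, multiplication, matrix multiplication, and so on) ultimately reduces to real-tensor operations on the real and imaginary parts. For instance, a complex matrix product decomposes into four real matrix products. A numpy sketch of that standard identity (not necessarily komplex's internals):

```python
import numpy as np

# (A + iB)(C + iD) = (AC - BD) + i(AD + BC)
A, B = np.random.randn(3, 4), np.random.randn(3, 4)  # real/imag of left operand
C, D = np.random.randn(4, 2), np.random.randn(4, 2)  # real/imag of right operand

real = A @ C - B @ D
imag = A @ D + B @ C

# Cross-check against numpy's native complex matmul:
expected = (A + 1j * B) @ (C + 1j * D)
assert np.allclose(real + 1j * imag, expected)
```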

Using komplex functions

exp(log(z)) should be equal to z:

x = [1,1j,-1-1j]
z = komplex.ComplexTensor(x, requires_grad=False)
log_z = komplex.log(z)
exp_log_z = komplex.exp(log_z)

we get:

ComplexTensor([ 1.000000e+00+0.j       , -4.371139e-08+1.j       ,
       -9.999998e-01-1.0000001j], dtype=complex64)

which is the original [1,1j,-1-1j] with a small numerical error.
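For comparison, the same round trip in numpy exhibits a similarly small numerical error:

```python
import numpy as np

# exp(log(z)) should recover z up to floating-point error.
x = np.array([1, 1j, -1 - 1j])
print(np.exp(np.log(x)))
assert np.allclose(np.exp(np.log(x)), x)
```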

Euler representation

We can get r and θ of Euler's representation. Let's compare ComplexTensor with numpy:

x = [1,1j,-1-1j]
z = komplex.ComplexTensor(x, requires_grad=False)
r, theta = z.euler()
print("ComplexTensor\nr = ", r, '\ntheta = ', theta)
z_np = np.array(x)
print("\nNumpy\nr = ", abs(z_np), '\ntheta = ', np.angle(z_np))

we get:

ComplexTensor
r =  tensor([1.0000, 1.0000, 1.4142]) 
theta =  tensor([0.0000, 1.5708, 3.9270])

Numpy
r =  [1.         1.         1.41421356] 
theta =  [ 0.          1.57079633 -2.35619449]

the last element of theta seems to be different, yet the difference between the two outputs is 2π, which means it is the same angle.
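We can verify numerically that the two theta values for the last element differ by exactly 2π (values copied from the outputs above):

```python
import numpy as np

theta_komplex = 3.9270       # from the ComplexTensor output above
theta_numpy = -2.35619449    # from the numpy output above

# Same angle, represented in different branches: they differ by 2*pi.
assert np.isclose(theta_komplex - theta_numpy, 2 * np.pi, atol=1e-3)
```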

Quantum Learning

Probability density function

If z is 2x2 ComplexTensor, then

abs_psi = z.PDF()

returns a probability (Categorical) tensor of measuring the ij-th state. This Categorical can be sampled at will by:

abs_psi.sample()

Wave function

If z is 100x5 ComplexTensor, then

psi = z.wave(dim=0)

is a collection of 5 wave functions with 100 states each. This ComplexTensor can be used with quantum operators, e.g. ⟨ψ|P|ψ⟩, where P is an operator of your choice. For instance, in 1D, ψ will be a (1D) vector and P will be a (2D) matrix.
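A numpy sketch of such an expectation value ⟨ψ|P|ψ⟩. For the sketch I use the standard quantum normalization Σ|ψ|² = 1 and a random Hermitian matrix as P; both choices are mine, for illustration only:

```python
import numpy as np

n = 4
psi = np.random.randn(n) + 1j * np.random.randn(n)
psi = psi / np.linalg.norm(psi)      # normalize so that sum(|psi|^2) = 1

H = np.random.randn(n, n) + 1j * np.random.randn(n, n)
P = (H + H.conj().T) / 2             # symmetrize to make P Hermitian

# <psi|P|psi>: conjugate row vector times matrix times column vector.
expectation = psi.conj() @ P @ psi
assert np.isclose(expectation.imag, 0)  # Hermitian P yields a real expectation
```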
