A PyTorch extension for complex tensors and complex functions.
Inspired by the unfinished work of William Falcon: pytorch-complex-tensor.
Based on the papers:
- Deep Complex Networks
- Unitary Evolution Recurrent Neural Networks
- On Complex Valued Convolutional Neural Networks
- exp(z)
- log(z)
- sin(z)
- cos(z)
- tan(z)
- tanh(z)
- sigmoid(z)
- softmax(z) (more information in the documentation below)
- CReLU(z)
- zReLU(z)
- modReLU(z, bias)
- addition (z + other and other + z)
- subtraction (z - other and other - z)
- multiplication (z * other and other * z)
- matrix multiplication (z @ other and other @ z)
- division (z / other and other / z)
- z.real (real part of z)
- z.imag (imaginary part of z)
- z.PDF(dim) (Probability density function, more information in the documentation below)
- z.wave(dim) (returns a normalized ComplexTensor which can be used as a wave function (more information below))
- z.size() (tensor size)
- len(z) (tensor length)
- z.euler() (returns 2 tensors: R and θ in Euler's representation z = R·e^(iθ))
- abs(z) (absolute value, |z|)
- z.magnitude() (magnitude, |z|)
- z.angle() (angle of a complex element in radians, in the range [0, 2π))
- z.phase() (phase of a complex element in radians (can be negative or larger than 2π))
- z.tensor() or z.z (Get raw torch Tensor)
- z.conj() (Conjugate)
- z.T or z.t() (Transpose)
- z.H or z.h() (Hermitian Conjugate)
- z.requires_grad_() (same as pytorch's requires_grad_())
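As a quick illustration of a few of the operations above (a minimal sketch assuming the API behaves exactly as listed; printed values will vary):
import torch
import torchlex

a = torch.randn(3, 5)
b = torch.randn(3, 5)
z = torchlex.ComplexTensor((a, b))   # z = a + ib

w = z * z.conj()                     # elementwise z * conj(z) = |z|^2
m = z @ z.H                          # 3x5 @ 5x3 matrix product with the Hermitian conjugate
print(w.real, m.size(), z.magnitude())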
- Defaults
- 5 ways to create a ComplexTensor
- Using torchlex functions
- Euler representation
- Probability density function
- Wave function
z.PDF(dim)
dim plays the same role as in the torch.softmax function. This function returns the probability density function of your ComplexTensor, which is the equivalent of the probability of measuring each state in quantum mechanics. The function divides (normalizes) the ComplexTensor by the sum of abs(z) along dimension dim and takes the abs of the result. If left empty or dim=None, the ComplexTensor is divided by the sum of abs(z) over all dimensions.
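For example (a minimal sketch, assuming the normalization behavior described above):
import numpy as np
import torchlex

z = torchlex.ComplexTensor(np.random.randn(3, 5) + 1j*np.random.randn(3, 5))
p_all = z.PDF()        # normalized over all dimensions; entries sum to 1
p_col = z.PDF(dim=0)   # each column normalized separately, as with torch.softmax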
z.wave(dim)
dim plays the same role as in the torch.softmax function. This function returns a normalized ComplexTensor, the equivalent of a quantum wave function. The function divides the ComplexTensor by the sum of abs(z) along dimension dim. If left empty or dim=None, the ComplexTensor is divided by the sum of abs(z) over all dimensions.
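For example (a minimal sketch under the same assumptions):
import numpy as np
import torchlex

z = torchlex.ComplexTensor(np.random.randn(100, 5) + 1j*np.random.randn(100, 5))
psi = z.wave(dim=0)    # 5 normalized wave functions, one per column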
softmax(z)
Eq.(36) in the paper Complex-valued Neural Networks with Non-parametric Activation Functions by Simone Scardapane, Steven Van Vaerenbergh, Amir Hussain and Aurelio Uncini.
CReLU(z)
Eq.(5) in the paper Deep Complex Networks by Chiheb Trabelsi, Olexa Bilaniuk, Ying Zhang, Dmitriy Serdyuk, Sandeep Subramanian, João Felipe Santos, Soroush Mehri, Negar Rostamzadeh, Yoshua Bengio & Christopher J Pal.
zReLU(z)
Pages 15-16 in the dissertation On Complex Valued Convolutional Neural Networks by Nitzan Guberman and Amnon Shashua. Also referred to as Guberman ReLU in Deep Complex Networks, Eq.(5).
modReLU(z, bias)
Eq.(8) in the paper Unitary Evolution Recurrent Neural Networks by Martin Arjovsky, Amar Shah, and Yoshua Bengio.
Notice that |z| (z.magnitude()) is always positive, so if b > 0 then |z| + b > 0 always. In order to have any non-linearity effect, b must be smaller than 0 (b < 0).
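For example (a minimal sketch; passing the bias as a plain Python float is an assumption about the expected argument type):
import numpy as np
import torchlex

z = torchlex.ComplexTensor(np.random.randn(3, 5) + 1j*np.random.randn(3, 5))
out = torchlex.modReLU(z, -0.5)    # b < 0, so elements with |z| < 0.5 are zeroed out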
At the beginning of your code you must import the library:
import torchlex
- The ComplexTensor default is complex=True (see the explanation in 4.a below).
- The ComplexTensor default is requires_grad=True.
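For example (a minimal sketch of overriding a default):
import torch
import torchlex

x = torch.randn(3, 5, 2)                               # last dimension holds (real, imag)
z = torchlex.ComplexTensor(x)                          # complex=True, requires_grad=True
z_ng = torchlex.ComplexTensor(x, requires_grad=False)  # opt out of gradient tracking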
1. Inserting a tuple of torch tensors or numpy arrays with the same size and dimensions. The first tensor/array becomes the real part of the new ComplexTensor and the second becomes the imaginary part.
a = torch.randn(3,5)
b = torch.randn(3,5)
z = torchlex.ComplexTensor((a,b))
2. Converting a complex numpy array to a ComplexTensor:
z_array = np.random.randn(3,5) + 1j*np.random.randn(3,5)
z = torchlex.ComplexTensor(z_array)
3. Inserting a ComplexTensor into another ComplexTensor. A completely redundant operation and a waste of computation power; it comes with a warning.
z_array = np.random.randn(3,5) + 1j*np.random.randn(3,5)
z_complex = torchlex.ComplexTensor(z_array)
z = torchlex.ComplexTensor(z_complex)
4.a. Inserting a torch tensor / numpy array which contains only the real part of the ComplexTensor:
x = np.random.randn(3,5)
#or
x = torch.randn(3,5)
z = torchlex.ComplexTensor(x, complex=False)
4.b. Inserting a torch tensor which contains the real and the imaginary parts of the ComplexTensor. The last dimension's size must be 2. Does not work with numpy arrays.
x = torch.randn(3,5,2)
z = torchlex.ComplexTensor(x, complex=True)
5. Inserting a list of complex numbers into a ComplexTensor:
x = [1, 1j, -1-1j]
z = torchlex.ComplexTensor(x)
exp(log(z)) should be equal to z:
x = [1,1j,-1-1j]
z = torchlex.ComplexTensor(x, requires_grad=False)
log_z = torchlex.log(z)
exp_log_z = torchlex.exp(log_z)
we get:
ComplexTensor([ 1.000000e+00+0.j , -4.371139e-08+1.j ,
-9.999998e-01-1.0000001j], dtype=complex64)
which is the original [1,1j,-1-1j] with a small numerical error.
We can get r and θ of Euler's representation. Let's compare ComplexTensor with NumPy:
x = [1,1j,-1-1j]
z = torchlex.ComplexTensor(x, requires_grad=False)
r, theta = z.euler()
print("ComplexTensor\nr = ", r, '\ntheta = ', theta)
z_np = np.array(x)
print("\nNumpy\nr = ", abs(z_np), '\ntheta = ', np.angle(z_np))
we get:
ComplexTensor
r = tensor([1.0000, 1.0000, 1.4142])
theta = tensor([0.0000, 1.5708, 3.9270])
Numpy
r = [1. 1. 1.41421356]
theta = [ 0. 1.57079633 -2.35619449]
the last element of theta seems to be different, yet the difference between the two outputs is 2π, which means it is the same angle.
If z is a 2x2 ComplexTensor, then
abs_psi = z.PDF()
returns a probabilities/Categorical tensor with the probability of measuring the (i, j) state. This Categorical can be sampled at will by:
abs_psi.sample()
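Putting that together (a minimal sketch, assuming z.PDF() returns an object that exposes .sample() as shown above):
import numpy as np
import torchlex

z = torchlex.ComplexTensor(np.random.randn(2, 2) + 1j*np.random.randn(2, 2))
abs_psi = z.PDF()             # probability of measuring each (i, j) state
outcome = abs_psi.sample()    # draw one measurement outcome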
If z is a 100x5 ComplexTensor, then
psi = z.wave(dim=0)
is a collection of 5 wave functions with 100 states each. This ComplexTensor can be used with quantum operators to compute ⟨ψ|P|ψ⟩, where P is an operator of your choice. For instance, in 1D, ψ will be a (1D) vector and P will be a (2D) matrix.
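A minimal sketch of such an expectation value ⟨ψ|P|ψ⟩ (the random matrix P and the column-vector shape are illustrative assumptions, chosen so that @ and .H behave like ordinary matrix operations):
import numpy as np
import torchlex

z = torchlex.ComplexTensor(np.random.randn(100, 1) + 1j*np.random.randn(100, 1))
psi = z.wave()                     # one normalized wave function with 100 states

P = torchlex.ComplexTensor(np.random.randn(100, 100) + 0j)   # stand-in operator as a complex array
expectation = psi.H @ (P @ psi)    # <psi|P|psi>, a 1x1 result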