Power of linear neural networks
Problem 4
Power of linear neural networks: Let $\mathrm{MAJ}_n \colon \{0,1\}^n \to \{-1,+1\}$ denote the majority function,

$$\mathrm{MAJ}_n(x_1, \dots, x_n) = \operatorname{sign}\!\left(\sum_{i=1}^{n} x_i - \frac{n}{2}\right).$$

Consider fully connected neural networks with one hidden layer, $n$ input neurons and one output neuron (the hidden layer can have any number of neurons). All hidden and output neurons have identity activation, that is $\sigma(z) = z$. Given $n$, define a neural network as above that computes the function $\mathrm{MAJ}_n$, in the sense that the sign of the network's output equals $\mathrm{MAJ}_n(x)$ for every input $x$.
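For concreteness, here is a minimal Python sketch of $\mathrm{MAJ}_n$ under the conventions reconstructed above (inputs in $\{0,1\}$, output in $\{-1,+1\}$); mapping the even-$n$ tie case $\sum_i x_i = n/2$ to $-1$ is our assumption, not part of the problem statement:

```python
# Reference implementation of the majority function, under the
# conventions above: inputs in {0, 1}, output in {-1, +1}.
# Assumption: for even n, an exact tie sum(x) = n/2 is mapped to -1.

def maj(x):
    """MAJ_n(x) = sign(sum(x) - n/2) for x in {0, 1}^n."""
    n = len(x)
    return 1 if sum(x) > n / 2 else -1

print(maj([1, 1, 1, 0, 0]))  # 1  (three of five inputs are 1)
print(maj([1, 0, 0, 0, 0]))  # -1 (only one of five inputs is 1)
```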
We can construct a fully connected neural network with one hidden layer, $n$ input neurons, and one output neuron that computes the $\mathrm{MAJ}_n$ function.

First, note that

$$\mathrm{MAJ}_n(x_1, \dots, x_n) = \operatorname{sign}\!\left(\frac{1}{n}\sum_{i=1}^{n} x_i - \frac{1}{2}\right),$$

since dividing by $n > 0$ does not change the sign. This suggests that we can use a linear combination of the input neurons as the input to the output neuron.

Let $w_1, \dots, w_n$ denote the weights on the $n$ inputs and $b$ the bias. Then we can define the output neuron's input as:

$$z = \sum_{i=1}^{n} w_i x_i + b.$$

If we choose the weights and bias appropriately, the sign of $z$ will be the same as the sign of the sum $\sum_{i=1}^{n} x_i - \frac{n}{2}$.
Specifically, we can set:

\begin{eqnarray*}
w_1 = w_2 = \cdots = w_n &=& 1/n \\
b &=& -1/2
\end{eqnarray*}
With these choices, we have:

$$z = \frac{1}{n}\sum_{i=1}^{n} x_i - \frac{1}{2} = \frac{1}{n}\left(\sum_{i=1}^{n} x_i - \frac{n}{2}\right).$$

If more than half of the inputs are $1$ (that is, $\sum_i x_i > n/2$), then $z > 0$; if fewer than half are $1$, then $z < 0$. Thus, the sign of $z$ is the same as the sign of the sum $\sum_{i=1}^{n} x_i - \frac{n}{2}$, which is the desired output $\mathrm{MAJ}_n(x)$.
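A quick exhaustive check of this sign argument, for a small odd $n$ (odd, so the tie case cannot occur); the choice $n = 5$ is arbitrary, for illustration:

```python
# Exhaustive check: with w_i = 1/n and b = -1/2, the output
# z = (1/n) * sum(x) - 1/2 is positive exactly when a majority of the
# inputs are 1, for every x in {0, 1}^n.
from itertools import product

n = 5  # arbitrary small odd n, so sum(x) = n/2 is impossible
for x in product([0, 1], repeat=n):
    z = sum(x) / n - 1 / 2
    assert (z > 0) == (sum(x) > n / 2), x
print(f"sign(z) matches the majority for all 2**{n} = {2**n} inputs")
```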
Moreover, a single hidden neuron suffices for this network: since the identity activation means that the output of each hidden neuron is just an affine function of the input neurons, the one hidden neuron can compute $z$ itself, and the output neuron passes it through with weight $1$ and bias $0$. Therefore, a neural network with one hidden layer (containing one neuron), $n$ input neurons, and one output neuron, with the above weights and bias, computes the $\mathrm{MAJ}_n$ function.
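A NumPy sketch of this final two-layer construction, under the same conventions; the helper name `make_majority_net` is ours, not from the problem:

```python
# The final network: one hidden neuron computes z = (1/n) * sum(x) - 1/2,
# and the output neuron passes z through with weight 1 and bias 0
# (all activations are the identity).
import numpy as np

def make_majority_net(n):
    W1 = np.full((1, n), 1.0 / n)  # input -> hidden weights
    b1 = np.array([-0.5])          # hidden bias
    W2 = np.ones((1, 1))           # hidden -> output weight
    b2 = np.zeros(1)               # output bias

    def net(x):
        h = W1 @ x + b1          # hidden layer (identity activation)
        return (W2 @ h + b2)[0]  # output neuron, also identity
    return net

net = make_majority_net(5)
print(np.sign(net(np.array([1, 0, 1, 1, 0]))))  # 1.0: majority of 1s
print(np.sign(net(np.array([0, 0, 1, 0, 0]))))  # -1.0: minority of 1s
```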