bfortuner committed Apr 25, 2017
2 parents ddb0cec + dda2d70 commit 239a1c3
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions docs/forwardpropagation.rst
@@ -62,13 +62,13 @@ To accommodate arbitrarily large inputs or outputs, we need to make our code more
Weight Initialization
---------------------

-Unlike last time where ``Wh`` and ``Wo`` were scalar numbers, our new weight variables will be numpy arrays. Each array will hold all the weights for its own layer — one weight for each synapse. Below we initialize each array with the numpy's ``np.random.rand(rows, cols)`` method, which returns a matrix of random numbers drawn from a normal distribution (mean 0, variable 1).
+Unlike last time where ``Wh`` and ``Wo`` were scalar numbers, our new weight variables will be numpy arrays. Each array will hold all the weights for its own layer — one weight for each synapse. Below we initialize each array with NumPy's ``np.random.randn(rows, cols)`` method, which returns a matrix of random numbers drawn from a normal distribution with mean 0 and variance 1.

.. literalinclude:: ../code/nn_matrix.py
:language: python
:pyobject: init_weights

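The referenced ``init_weights`` function is pulled in from ``nn_matrix.py`` and not shown here. As a minimal sketch of the idea described above, assuming hypothetical layer sizes (the actual function in ``nn_matrix.py`` may differ):

```python
import numpy as np

def init_weights(input_size=1, hidden_size=2, output_size=2):
    # Hypothetical layer sizes for illustration only.
    # One weight per synapse: rows = units feeding in, cols = units fed into.
    Wh = np.random.randn(input_size, hidden_size)   # hidden-layer weights
    Wo = np.random.randn(hidden_size, output_size)  # output-layer weights
    return Wh, Wo

Wh, Wo = init_weights()
print(Wh.shape, Wo.shape)  # (1, 2) (2, 2)
```

Because ``randn`` samples from a standard normal distribution, the initial weights are small values centered on zero rather than all-positive values in [0, 1) as ``rand`` would give.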
-Here's an example of calling ``random.rand()``:
+Here's an example of calling ``random.randn()``:

::

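The body of the literal block above is collapsed in this diff view. A hedged sketch of what such an example call might look like (the seed and shape are assumptions, not from the original):

```python
import numpy as np

np.random.seed(0)  # assumed, for reproducibility; not in the original
arr = np.random.randn(2, 3)  # 2x3 matrix of standard-normal samples
print(arr.shape)  # (2, 3)
```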
