
Commit caec028

Adding layer-type specific weight initialisation ranges. Improves convergence of Sigmoid slightly on `plot_mlp.py`, for example. These numbers are more standard.

Closes #9.
alexjc committed Apr 21, 2015
1 parent cce4cc3 commit caec028
Showing 1 changed file with 9 additions and 1 deletion.

sknn/mlp.py
@@ -268,7 +268,15 @@ def _create_mlp(self, X, y, nvis=None, input_space=None):
         for i, layer in enumerate(self.layers[:-1]):
             fan_in = self.unit_counts[i] + 1
             fan_out = self.unit_counts[i + 1]
-            lim = numpy.sqrt(6) / (numpy.sqrt(fan_in + fan_out))
+
+            if layer[0] == "Tanh":
+                lim = numpy.sqrt(6) / (numpy.sqrt(fan_in + fan_out))
+            elif layer[0] in ("Rectifier", "Maxout"):
+                lim = 2.0 / fan_in
+            elif layer[0] == "Sigmoid":
+                lim = 4.0 * numpy.sqrt(6) / (numpy.sqrt(fan_in + fan_out))
+            else:
+                lim = 0.005
 
             layer_name = "Hidden_%i_%s" % (i, layer[0])
             hidden_layer = self._create_hidden_layer(layer_name, layer, irange=lim)
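For context, `lim` is passed to `_create_hidden_layer` as `irange`, which in pylearn2 (the backend sknn used at the time) is conventionally a symmetric uniform initialisation range. Below is a minimal standalone sketch, not sknn's implementation, of how these layer-type specific ranges would translate into sampled weight matrices; `init_weights` is a hypothetical helper introduced only for illustration.

# Minimal sketch, not sknn's code: layer-type specific ranges sampled as
# uniform weights in [-lim, +lim], assuming pylearn2's `irange` convention.
import numpy

def init_weights(fan_in, fan_out, layer_type, rng=None):
    rng = rng or numpy.random.RandomState(0)
    if layer_type == "Tanh":
        lim = numpy.sqrt(6) / numpy.sqrt(fan_in + fan_out)        # Glorot/Xavier uniform range
    elif layer_type in ("Rectifier", "Maxout"):
        lim = 2.0 / fan_in                                        # fan-in based range for ReLU-style units
    elif layer_type == "Sigmoid":
        lim = 4.0 * numpy.sqrt(6) / numpy.sqrt(fan_in + fan_out)  # Glorot range scaled by 4 for sigmoid
    else:
        lim = 0.005                                               # small fallback range
    return rng.uniform(-lim, +lim, size=(fan_in, fan_out))

# Example: weights feeding a 64-unit Sigmoid layer from a 128-unit layer
# (fan_in includes the +1 bias term, as in the diff above).
W = init_weights(128 + 1, 64, "Sigmoid")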

0 comments on commit caec028
