
Conversation

DimitriF
Contributor

A few changes:

  • the option use_bias in trainr triggers the use of a bias in the network calculation
  • during the network crossing I went step by step: weight multiplication, optional bias addition, then unit activation (the last one being the sigmoid/logistic/etc.); a sketch of this step follows the list
  • if use_bias == TRUE, a new object is added to the output list: bias_synapse
  • in predictr, if bias_synapse is present in the list, the bias is added during network crossing
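
A minimal sketch (not the package's actual code) of the forward step described above, assuming a weight matrix `synapse`, an optional bias vector `bias_synapse`, and the logistic activation used elsewhere in rnn:

```r
sigmoid <- function(x) 1 / (1 + exp(-x))

forward_step <- function(input, synapse, bias_synapse = NULL, use_bias = FALSE) {
  # weight multiplication
  z <- input %*% synapse
  # optional bias addition, one bias value per output unit
  if (use_bias && !is.null(bias_synapse)) {
    z <- sweep(z, 2, bias_synapse, "+")
  }
  # unit activation
  sigmoid(z)
}
```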

I didn't worry about efficiency and compute the bias regardless; I just take it into account or not during the network crossing, so I don't think there is a bottleneck here.

The RNG output changed because of the bias generation, so I added a seed to test_rnn.R and updated the expected result (I needed it to check that use_bias = FALSE wasn't changing behaviour compared to the previous version).
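
A hypothetical illustration (not the actual test_rnn.R) of that reproducibility check: pin the RNG before training so a run with use_bias = FALSE can be compared against the pre-bias version. Data shapes follow the package's binary-addition example; the exact trainr/predictr arguments beyond use_bias are assumptions.

```r
library(rnn)

set.seed(1)
X1 <- int2bin(sample(0:127, 500, replace = TRUE))
X2 <- int2bin(sample(0:127, 500, replace = TRUE))
Y  <- int2bin(sample(0:127, 500, replace = TRUE))
X  <- array(c(X1, X2), dim = c(dim(X1), 2))

model <- trainr(Y = Y, X = X, learningrate = 0.1, hidden_dim = 6,
                numepochs = 1, use_bias = FALSE)
pred  <- predictr(model, X)  # no bias_synapse in the model, so no bias is added
```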

@coveralls

Coverage Status

Coverage decreased (-1.2%) to 87.047% when pulling e7ffc91 on DimitriF:master into fd06501 on bquast:sigmoid.

@bquast
Owner

bquast commented May 19, 2016

Perfect. Again, I'm going to have to read this in more detail later, but the fact that it passes the tests should say enough.

At some point in the future I will add some more tests, just to be sure.

bquast merged commit 29974e6 into bquast:sigmoid on May 19, 2016