connection.rb
65 lines (54 loc) · 2.49 KB
# frozen_string_literal: true

# Neuronet module / Connection class
module Neuronet
  # Connections between neurons are their own separate objects. In Neuronet, a
  # neuron contains its bias and a list of its connections. Each connection
  # contains its weight (strength) and the connected neuron.
  class Connection
    attr_accessor :neuron, :weight

    # Connection#initialize takes a neuron and a weight with a default of 0.0.
    def initialize(neuron = Neuron.new, weight: 0.0)
      @neuron = neuron
      @weight = weight
    end

    # The connection's mu is the activation of the connected neuron.
    def mu = @neuron.activation
    alias activation mu

    # The connection's mju is 𝑾𝓑𝒂'.
    def mju = @weight * @neuron.derivative

    # The connection's kappa is a component of the neuron's sum kappa:
    #   𝜿 := 𝑾 𝝀'
    def kappa = @weight * @neuron.lamda

    # The weighted activation of the connected neuron.
    def weighted_activation = @neuron.activation * @weight
    # Consistent with #update
    alias partial weighted_activation

    # Connection#update returns the updated activation of a connection, which
    # is the weighted updated activation of the neuron it's connected to:
    #   weight * neuron.update
    # This is the method to use whenever the values of the inputs have changed
    # (or right after training). Otherwise, update and weighted_activation
    # give the same result. When back-calculations are not needed, use
    # Connection#weighted_activation instead.
    def update = @neuron.update * @weight

    # Connection#backpropagate modifies the connection's weight in proportion
    # to the error given and passes that error to its connected neuron via the
    # neuron's backpropagate method.
    def backpropagate(error)
      @weight += @neuron.activation * Neuronet.noise[error]
      @weight = @weight.clamp(-Neuronet.maxw, Neuronet.maxw)
      @neuron.backpropagate(error)
      self
    end
    # The above makes it clear how to interpret the equipartition of errors
    # among the connections: backpropagation is symmetric to the forward
    # propagation of errors. The error variable is the reduced error, 𝛆
    # (see the wiki notes).

    # A connection inspects itself as "weight*label:...".
    def inspect = "#{Neuronet.format % @weight}*#{@neuron.inspect}"

    # A connection puts itself as "weight*label".
    def to_s = "#{Neuronet.format % @weight}*#{@neuron}"
  end
end
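
The weight update in `Connection#backpropagate` can be sketched in isolation. This is a minimal, self-contained illustration, not library code: `NOISE` and `MAXW` are stand-ins for `Neuronet.noise` and `Neuronet.maxw`, and their values here are assumptions chosen for the demo, not the library's defaults.

```ruby
# Sketch of the backpropagate weight update: the weight moves in proportion
# to the connected neuron's activation times the (noised) error, then is
# capped at +/- MAXW. NOISE and MAXW are demo stand-ins, not Neuronet's.
NOISE = ->(error) { error } # identity "noise" so the demo is deterministic
MAXW  = 9.0                 # stand-in weight cap

activation = 0.6
weight     = 8.8
error      = 1.0

weight += activation * NOISE[error]  # 8.8 + 0.6 * 1.0 = 9.4
weight = weight.clamp(-MAXW, MAXW)   # capped back down to 9.0
puts weight # => 9.0
```

Note that the cap keeps weights bounded during training, so a single large error cannot blow up a connection's strength.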