Drhuffman12/cmn basic rnn part 6 (#54)
* drhuffman12/cmn_basic_rnn_part_6 comment out RnnConcerns::TrainInSequence while I refactor RnnConcerns::SplitTrainingDat

* drhuffman12/cmn_basic_rnn_part_6 refactor RnnConcerns::SplitTrainingDat

* drhuffman12/cmn_basic_rnn_part_6 replace '#split_for_training' with '#indexes_for_training_and_eval' and 'TrainingData' with 'TrainingIndexes'

* unset ameba version

* set ameba to master branch

* disable ameba (for now)

* disable ameba (for now)

* disable ameba

* drhuffman12/cmn_basic_rnn_part_6 reorg RNN related under `Ai4cr::NeuralNetwork::Rnn`

* drhuffman12/cmn_basic_rnn_part_6 reorg RNN related under `Ai4cr::NeuralNetwork::Rnn`

* drhuffman12/cmn_basic_rnn_part_6 code cleanup

* drhuffman12/cmn_basic_rnn_part_6 try to force full docker rebuild

* drhuffman12/cmn_basic_rnn_part_6 try to force full docker rebuild

* drhuffman12/cmn_basic_rnn_part_6 adjust version/branch of ascii_bar_charter

* drhuffman12/cmn_basic_rnn_part_6 reorg CI steps (so the Crystal version is visible)

* drhuffman12/cmn_basic_rnn_part_6 shuffle dependencies

* drhuffman12/cmn_basic_rnn_part_6 shuffle dependencies

* drhuffman12/cmn_basic_rnn_part_6 Cleanup requires

* drhuffman12/cmn_basic_rnn_part_6 refactoring re error_distance and related

* drhuffman12/cmn_basic_rnn_part_6 refactoring re error_distance and related (part 2)

* drhuffman12/cmn_basic_rnn_part_6 refactoring re error_distance and related (part 3)

* drhuffman12/cmn_basic_rnn_part_6 refactoring re error_distance and related (part 4)

* drhuffman12/cmn_basic_rnn_part_6 refactoring re error_distance and related (part 5)

* drhuffman12/cmn_basic_rnn_part_6 error_distance should decrease each training round

* drhuffman12/cmn_basic_rnn_part_6 add CalcAndGuess for RnnSimpleTeam

* drhuffman12/cmn_basic_rnn_part_6 initial code and tests for RnnSimpleTeamConcerns::TrainAndAdjust

* drhuffman12/cmn_basic_rnn_part_6 comment out 'debug_msg'

* drhuffman12/cmn_basic_rnn_part_6 add BreedUtils and Breeder

* drhuffman12/cmn_basic_rnn_part_6 formatting

* drhuffman12/cmn_basic_rnn_part_6 comment out test for Ai4cr::NeuralNetwork::Rnn::RnnSimpleConcerns::TrainAndAdjust until that is refactored with Breeder

* drhuffman12/cmn_basic_rnn_part_6 refactor; remove RnnSimpleTeam (take a different approach in a later PR)

* drhuffman12/cmn_basic_rnn_part_6 bump version
drhuffman12 committed Jan 19, 2021
1 parent f4ef197 commit a08a797
Showing 46 changed files with 1,213 additions and 1,068 deletions.
4 changes: 2 additions & 2 deletions .circleci/config.yml
@@ -5,11 +5,11 @@ jobs:
- image: crystallang/crystal:nightly
steps:
- checkout
- run: shards install
- run: mkdir -p test-results/spec
- run: crystal -v > test-results/crystal_version.txt
- run: shards install
- run: scripts/version_info > test-results/app_version.txt
- run: time bin/ameba --no-color > test-results/static_code_analysis.ameba.txt
# - run: time bin/ameba --no-color > test-results/static_code_analysis.ameba.txt
- run: time scripts/test_always > test-results/spec/results.txt
- run: time scripts/test_always_junit_format

13 changes: 9 additions & 4 deletions .github/workflows/crystal.yml
@@ -16,16 +16,21 @@ jobs:

steps:
- uses: actions/checkout@v2
- name: Install dependencies
run: shards install

- name: Show Crystal version
run: crystal -v

- name: Install dependencies
run: shards install && shards update

- name: Show repo version
run: scripts/version_info
- name: Run static code analysis
run: bin/ameba --no-color

# - name: Run static code analysis
# run: bin/ameba --no-color
- name: Run tests
run: scripts/test_always

# run: crystal spec
- name: Run tests (w/ junit format)
run: scripts/test_always_junit_format
1 change: 1 addition & 0 deletions Dockerfile
@@ -1,3 +1,4 @@
# FROM crystallang/crystal:nightly-build
FROM crystallang/crystal:nightly-alpine-build

WORKDIR /app
32 changes: 23 additions & 9 deletions shard.yml
@@ -1,32 +1,46 @@
name: ai4cr
version: 0.1.17
version: 0.1.18

authors:
- Daniel Huffman <drhuffman12@yahoo.com>

crystal: ">= 0.36.0"
# crystal: 1.0

license: MIT

development_dependencies:
ameba:
github: crystal-ameba/ameba
version: ~> 0.13.3
## REVERT to crystal-ameba/ameba after it is Crystal 1.0 compatible.
## See also: https://github.com/crystal-ameba/ameba/pull/173

## Un-comment/edit after Crystal 0.36.0 or 1.0 compatible:
# ameba:
# github: crystal-ameba/ameba
# branch: master
# # version: ~> 0.13.3
# ## REVERT to crystal-ameba/ameba after it is Crystal 1.0 compatible.
# ## See also: https://github.com/crystal-ameba/ameba/pull/173
# github: drhuffman12/ameba
# branch: drhuffman12/bump_crystal_version_to_1
# # branch: drhuffman12/bump_crystal_version_to_1

ascii_bar_charter:
github: drhuffman12/ascii_bar_charter
# version: ~> 1.4.0
branch: master

## Un-comment after Crystal 1.0 compatible:
## Un-comment/edit after Crystal 0.36.0 or 1.0 compatible:
# icr:
# github: crystal-community/icr
# branch: master

## Un-comment/edit after Crystal 0.36.0 or 1.0 compatible:
spectator:
# gitlab: arctic-fox/spectator
# branch: master
# version: ">= 0.9.31"

github: drhuffman12/spectator
branch: drhuffman12/bump_crystal_version_to_1
# branch: drhuffman12/bump_crystal_version_to_1
# branch: drhuffman12/master
branch: drhuffman12/bump_crystal_version_to_0_36_0b

## Un-comment after Crystal 1.0 compatible:
# aasm:
146 changes: 146 additions & 0 deletions spec/ai4cr/breed_utils_spec.cr
@@ -0,0 +1,146 @@
require "./../spec_helper"
require "./../spectator_helper"
require "./../../src/ai4cr/breed_utils.cr"

# class Breeder
# include Ai4cr::BreedUtils
# end

Spectator.describe Ai4cr::BreedUtils do
let(parents1) { (-10..0).to_a.map { |i| i/10.0 } }
let(parents2) { (0..10).to_a.map { |i| i/10.0 } }

let(breeder) {
Ai4cr::Breeder.new
}

# before(:each) do
# include Ai4cr::BreedUtils
# end

describe "breed_value" do
let(parent_a) { 0.0 }
let(parent_b) { 1.0 }
let(p_dist) { 1.0 * (parent_b - parent_a) }
let(c_min) { parent_a - 0.5 }
let(c_max) { parent_a + 2 * p_dist - 0.5 }

context "debug" do
it "p_dist" do
expect(p_dist).to eq(1.0)
end

it "c_min" do
expect(c_min).to eq(-0.5)
end

it "c_max" do
expect(c_max).to eq(1.5)
end
end

context "returns" do
it "a Float64" do
child = breeder.breed_value(0, 1)

expect(child).to be_a(Float64)
end

context "between" do
it "expected 'min'" do
child = breeder.breed_value(0, 1)

expect(child).to be >= c_min
end

it "expected 'max'" do
child = breeder.breed_value(0, 1)

expect(c_max).to be >= child
end
end
end
end

describe "breed_nested" do
context "given two Int32 values" do
let(parent_a) { 0 }
let(parent_b) { 1 }

it "does not raise" do
expect {
breeder.breed_nested(parent_a, parent_b)
}.not_to raise_error
end

context "returns" do
it "a Float64" do
child = breeder.breed_nested(parent_a, parent_b)

expect(child).to be_a(Float64)
end
end
end

context "given two Float64 values" do
let(parent_a) { 0.0 }
let(parent_b) { 1.0 }

it "does not raise" do
expect {
breeder.breed_nested(parent_a, parent_b)
}.not_to raise_error
end

context "returns" do
it "a Float64" do
child = breeder.breed_nested(parent_a, parent_b)

expect(child).to be_a(Float64)
end
end
end

context "given two Array(Float64) values" do
let(parent_a) { [0.0, 0.1] }
let(parent_b) { [0.9, 1.0] }

it "does not raise" do
expect {
breeder.breed_nested(parent_a, parent_b)
}.not_to raise_error
end

context "returns" do
it "a Float64" do
child = breeder.breed_nested(parent_a, parent_b)

expect(child).to be_a(Array(Float64))
end
end
end

# context "given two Hash(String, Float64) values" do
# let(parent_a) {
# Hash{"zero" => 0.0}
# }
# let(parent_b) {
# Hash{"one" => 1.0}
# }

# it "does not raise" do
# expect {
# breeder.breed_nested(parent_a, parent_b)
# }.not_to raise_error
# end

# context "returns" do
# it "a Float64" do
# child = breeder.breed_nested(parent_a, parent_b)

# expect(child).to be_a(parent_a.class)
# end
# end
# end
end
end
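
For orientation, a minimal usage sketch of the breeding helpers exercised by the spec above. It only uses calls the spec itself shows (Ai4cr::Breeder.new, #breed_value, #breed_nested); the sample output values are illustrative, not asserted by the spec.

require "ai4cr"

# Breeder is the concrete class the BreedUtils spec instantiates.
breeder = Ai4cr::Breeder.new

# Blend two scalar parents into a child Float64; per the spec expectations,
# the child may land a little outside the parents' range.
child_value = breeder.breed_value(0, 1)

# breed_nested also accepts Int32, Float64, or Array(Float64) parents,
# breeding arrays element-wise.
child_array = breeder.breed_nested([0.0, 0.1], [0.9, 1.0])

puts child_value # roughly between -0.5 and 1.5
puts child_array # an Array(Float64)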
4 changes: 2 additions & 2 deletions spec/ai4cr/neural_network/backpropagation_spec.cr
@@ -201,8 +201,8 @@ describe Ai4cr::NeuralNetwork::Backpropagation do
assert_approximate_equality_of_nested_list net.activation_nodes, net2.activation_nodes
end

it "@calculated_error_total of the dumped net approximately matches @calculated_error_total of the loaded net" do
assert_approximate_equality_of_nested_list net.calculated_error_total, net2.calculated_error_total
it "@error_distance of the dumped net approximately matches @error_distance of the loaded net" do
assert_approximate_equality_of_nested_list net.error_distance, net2.error_distance
end
end
end
28 changes: 14 additions & 14 deletions spec/ai4cr/neural_network/cmn/chain_spec.cr
@@ -26,8 +26,8 @@ describe Ai4cr::NeuralNetwork::Cmn::Chain do
context "#init_network" do
it "the 'outputs_guessed' start as zeros" do
# prep net vvv
net0 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 2, width: 4, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_SIGMOID, disable_bias: false)
net1 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 4, width: 3, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_SIGMOID, disable_bias: true)
net0 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 2, width: 4, learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID, disable_bias: false)
net1 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 4, width: 3, learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID, disable_bias: true)

net0.init_network
net0.learning_rate = 0.25
@@ -57,8 +57,8 @@ describe Ai4cr::NeuralNetwork::Cmn::Chain do
context "#eval" do
it "the 'outputs_guessed' are updated as expected" do
# prep net vvv
net0 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 2, width: 4, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_SIGMOID, disable_bias: false)
net1 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 4, width: 3, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_SIGMOID, disable_bias: true)
net0 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 2, width: 4, learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID, disable_bias: false)
net1 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 4, width: 3, learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID, disable_bias: true)

net0.init_network
net0.learning_rate = 0.25
@@ -91,8 +91,8 @@ describe Ai4cr::NeuralNetwork::Cmn::Chain do
# TODO: FIX!!!

# prep net vvv
net0 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 2, width: 4, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_SIGMOID, disable_bias: false)
net1 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 4, width: 3, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_SIGMOID, disable_bias: true)
net0 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 2, width: 4, learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID, disable_bias: false)
net1 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 4, width: 3, learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID, disable_bias: true)

net0.init_network
net0.learning_rate = 0.25
@@ -120,10 +120,10 @@ describe Ai4cr::NeuralNetwork::Cmn::Chain do

delta = 0.001

puts
puts "hard_coded_weights0: #{hard_coded_weights0.pretty_inspect}"
puts "net0.weights: #{net0.weights.pretty_inspect}"
puts
# puts
# puts "hard_coded_weights0: #{hard_coded_weights0.pretty_inspect}"
# puts "net0.weights: #{net0.weights.pretty_inspect}"
# puts
assert_approximate_inequality_of_nested_list(hard_coded_weights0, net0.weights, delta)
end
end
@@ -136,10 +136,10 @@ describe Ai4cr::NeuralNetwork::Cmn::Chain do
layer_3_size_without_bias = 6
layer_4_size_without_bias = 7

nt = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: layer_0_size_without_bias, width: layer_1_size_without_bias, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_TANH, disable_bias: false)
nr = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: layer_1_size_without_bias, width: layer_2_size_without_bias, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_RELU, disable_bias: true)
np = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: layer_2_size_without_bias, width: layer_3_size_without_bias, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_PRELU, disable_bias: true)
ne = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: layer_3_size_without_bias, width: layer_4_size_without_bias, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_SIGMOID, disable_bias: true)
nt = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: layer_0_size_without_bias, width: layer_1_size_without_bias, learning_style: Ai4cr::NeuralNetwork::LS_TANH, disable_bias: false)
nr = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: layer_1_size_without_bias, width: layer_2_size_without_bias, learning_style: Ai4cr::NeuralNetwork::LS_RELU, disable_bias: true)
np = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: layer_2_size_without_bias, width: layer_3_size_without_bias, learning_style: Ai4cr::NeuralNetwork::LS_PRELU, disable_bias: true)
ne = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: layer_3_size_without_bias, width: layer_4_size_without_bias, learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID, disable_bias: true)

arr = Array(Ai4cr::NeuralNetwork::Cmn::MiniNet).new
arr << nt
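
As a quick illustration of the namespace change these chain specs pick up (the LS_* learning-style constants now resolve under Ai4cr::NeuralNetwork rather than Ai4cr::NeuralNetwork::Cmn), a minimal sketch; it mirrors only the calls visible in the spec above.

require "ai4cr"

# Learning styles are now referenced as Ai4cr::NeuralNetwork::LS_*
# (previously Ai4cr::NeuralNetwork::Cmn::LS_*).
net0 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(
  height: 2, width: 4,
  learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID,
  disable_bias: false
)
net1 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(
  height: 4, width: 3,
  learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID,
  disable_bias: true
)

net0.init_network
net0.learning_rate = 0.25
net1.init_network
net1.learning_rate = 0.25

# The nets can then be collected for chaining, as the spec does:
arr = Array(Ai4cr::NeuralNetwork::Cmn::MiniNet).new
arr << net0
arr << net1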
19 changes: 8 additions & 11 deletions spec/ai4cr/neural_network/cmn/mini_net_spec.cr
@@ -3,18 +3,18 @@ require "./../../../spec_helper"
describe Ai4cr::NeuralNetwork::Cmn::MiniNet do
describe "when importing and exporting as JSON" do
[
Ai4cr::NeuralNetwork::Cmn::LS_PRELU,
Ai4cr::NeuralNetwork::Cmn::LS_RELU,
Ai4cr::NeuralNetwork::Cmn::LS_SIGMOID,
Ai4cr::NeuralNetwork::Cmn::LS_TANH,
Ai4cr::NeuralNetwork::LS_PRELU,
Ai4cr::NeuralNetwork::LS_RELU,
Ai4cr::NeuralNetwork::LS_SIGMOID,
Ai4cr::NeuralNetwork::LS_TANH,
].each do |learning_style|
context "when given height: 2, width: 3, learning_style: #{learning_style}" do
context "when exporting to JSON" do
np1 = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 2, width: 3, learning_style: learning_style)
np1_json = np1.to_json
np1_hash = JSON.parse(np1_json).as_h

expected_keys = ["width", "height", "height_considering_bias", "width_indexes", "height_indexes", "inputs_given", "outputs_guessed", "weights", "last_changes", "error_total", "outputs_expected", "input_deltas", "output_deltas", "disable_bias", "learning_rate", "momentum", "error_distance_history_max", "error_distance_history", "learning_style", "deriv_scale"]
expected_keys = ["width", "height", "height_considering_bias", "width_indexes", "height_indexes", "inputs_given", "outputs_guessed", "weights", "last_changes", "error_distance", "outputs_expected", "input_deltas", "output_deltas", "disable_bias", "learning_rate", "momentum", "error_distance_history_max", "error_distance_history", "learning_style", "deriv_scale"]
expected_keys.each do |key|
it "it has top level key of #{key}" do
(np1_hash.keys).should contain(key)
@@ -33,12 +33,9 @@ describe Ai4cr::NeuralNetwork::Cmn::MiniNet do
# it "re-exported JSON matches imported JSON" do
# (np1_json).should eq(np2_json)
# end
# e.g.:
# Expected: "{\"width\":3,\"height\":2,\"height_considering_bias\":3,\"width_indexes\":[0,1,2],\"height_indexes\":[0,1,2],\"inputs_given\":[0.0,0.0,1.0],\"outputs_guessed\":[0.0,0.0,0.0],\"weights\":[[0.7318031568424814,0.534853051161922,0.21857644593495615],[-0.6591430323844467,-0.2012854441173063,-0.3036688821984831],[0.3937028443098609,-0.1193921136297592,-0.5135509965693288]],\"last_changes\":[[0.0,0.0,0.0],[0.0,0.0,0.0],[0.0,0.0,0.0]],\"error_total\":0.0,\"outputs_expected\":[0.0,0.0,0.0],\"input_deltas\":[0.0,0.0,0.0],\"output_deltas\":[0.0,0.0,0.0],\"disable_bias\":false,\"learning_rate\":0.18325052338453365,\"momentum\":0.8206852816702831,\"error_distance\":1.0,\"error_distance_history_max\":10,\"error_distance_history\":[],\"learning_style\":10,\"deriv_scale\":0.001}"
# got: "{\"width\":3,\"height\":2,\"height_considering_bias\":3,\"width_indexes\":[0,1,2],\"height_indexes\":[0,1,2],\"inputs_given\":[0.0,0.0,1.0],\"outputs_guessed\":[0.0,0.0,0.0],\"weights\":[[0.7318031568424814,0.534853051161922,0.21857644593495618],[-0.6591430323844467,-0.2012854441173063,-0.3036688821984831],[0.3937028443098609,-0.11939211362975921,-0.5135509965693288]],\"last_changes\":[[0.0,0.0,0.0],[0.0,0.0,0.0],[0.0,0.0,0.0]],\"error_total\":0.0,\"outputs_expected\":[0.0,0.0,0.0],\"input_deltas\":[0.0,0.0,0.0],\"output_deltas\":[0.0,0.0,0.0],\"disable_bias\":false,\"learning_rate\":0.18325052338453365,\"momentum\":0.8206852816702831,\"error_distance\":1.0,\"error_distance_history_max\":10,\"error_distance_history\":[],\"learning_style\":10,\"deriv_scale\":0.001}"

# However, it seems to be fine when you split it out by top-level keys:
expected_keys = ["width", "height", "height_considering_bias", "width_indexes", "height_indexes", "inputs_given", "outputs_guessed", "weights", "last_changes", "error_total", "outputs_expected", "input_deltas", "output_deltas", "disable_bias", "learning_rate", "momentum", "error_distance_history_max", "error_distance_history", "learning_style", "deriv_scale"]
expected_keys = ["width", "height", "height_considering_bias", "width_indexes", "height_indexes", "inputs_given", "outputs_guessed", "weights", "last_changes", "error_distance", "outputs_expected", "input_deltas", "output_deltas", "disable_bias", "learning_rate", "momentum", "error_distance_history_max", "error_distance_history", "learning_style", "deriv_scale"]
expected_keys.each do |key|
it "re-exported JSON matches imported JSON for top level key of #{key}" do
(np1_json[key]).should eq(np2_json[key])
@@ -60,7 +57,7 @@ describe Ai4cr::NeuralNetwork::Cmn::MiniNet do
# NOTE Below are all for learing style Sigmoid; tests should be added to cover the other learning styles
describe "#eval" do
describe "when given a net with structure of [3, 2]" do
net = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 3, width: 2, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_SIGMOID)
net = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 3, width: 2, learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID)

inputs = [0.1, 0.2, 0.3]
hard_coded_weights = [
@@ -99,7 +96,7 @@ describe Ai4cr::NeuralNetwork::Cmn::MiniNet do

describe "#train" do
describe "when given a net with structure of [3, 2]" do
net = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 3, width: 2, learning_style: Ai4cr::NeuralNetwork::Cmn::LS_SIGMOID)
net = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(height: 3, width: 2, learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID)
hard_coded_weights = [
[-0.9, 0.7],
[-0.9, 0.6],
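
Finally, a small sketch of the error_total -> error_distance rename that these MiniNet specs check in the exported JSON. The calls shown (#to_json, JSON.parse(...).as_h) are taken from the spec; the key expectations are only what the spec asserts.

require "json"
require "ai4cr"

net = Ai4cr::NeuralNetwork::Cmn::MiniNet.new(
  height: 2, width: 3,
  learning_style: Ai4cr::NeuralNetwork::LS_SIGMOID
)

exported = JSON.parse(net.to_json).as_h

# After this commit the serialized state carries "error_distance";
# the old "error_total" key is gone.
puts exported.keys.includes?("error_distance") # expected: true
puts exported.keys.includes?("error_total")    # expected: false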
