Neural network and matrix library with parallelized learning, built from the ground up. Educational project.
Lommix/snail_nn
[WIP] Snail NN - smol neural network library

Minimalistic CPU based neural network library with backpropagation and parallelized stochastic gradient descent.
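The "parallelized stochastic gradient descent" mentioned above typically means splitting a mini-batch across threads, computing partial gradients in parallel, and averaging them before the weight update. A minimal sketch of that idea using only the standard library (the `grad` function here is a toy stand-in, not snail_nn's backpropagation):

```rust
use std::thread;

// Toy per-sample "gradient": derivative of x^2. In a real network this
// would be the backpropagated gradient for one training sample.
fn grad(sample: f64) -> f64 {
    2.0 * sample
}

// Split the batch into chunks, compute partial gradient sums on scoped
// threads, then average over the whole batch.
fn parallel_gradient(batch: &[f64], workers: usize) -> f64 {
    let chunk_size = (batch.len() + workers - 1) / workers;
    let partials: Vec<f64> = thread::scope(|s| {
        let handles: Vec<_> = batch
            .chunks(chunk_size)
            .map(|chunk| s.spawn(move || chunk.iter().map(|&x| grad(x)).sum::<f64>()))
            .collect();
        handles.into_iter().map(|h| h.join().unwrap()).collect()
    });
    partials.iter().sum::<f64>() / batch.len() as f64
}

fn main() {
    let batch = [1.0, 2.0, 3.0, 4.0];
    // Mean of 2*x over the batch: (2 + 4 + 6 + 8) / 4 = 5
    println!("{}", parallel_gradient(&batch, 2));
}
```

Because the per-sample gradients are independent, the averaged result is identical to the serial computation regardless of how the batch is chunked.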

Examples

Storing images inside the neural network, upscaling them, and interpolating between them.

cargo run --example imagepol --release



The mandatory xor example

cargo run --example xor --release



Example Code:

use snail_nn::prelude::*;

fn main(){
    let mut nn = Model::new(&[2, 3, 1]);
    nn.set_activation(Activation::Sigmoid);

    let mut batch = TrainingBatch::empty(2, 1);
    let rate = 1.0;

    // AND - training data
    batch.add(&[0.0, 0.0], &[0.0]);
    batch.add(&[1.0, 0.0], &[0.0]);
    batch.add(&[0.0, 1.0], &[0.0]);
    batch.add(&[1.0, 1.0], &[1.0]);

    for _ in 0..10000 {
        let (w_gradient, b_gradient) = nn.gradient(&batch.random_chunk(2));
        nn.learn(w_gradient, b_gradient, rate);
    }

    println!("output {:?} expected: 0.0", nn.forward(&[0.0, 0.0]));
    println!("output {:?} expected: 0.0", nn.forward(&[1.0, 0.0]));
    println!("output {:?} expected: 0.0", nn.forward(&[0.0, 1.0]));
    println!("output {:?} expected: 1.0", nn.forward(&[1.1, 1.0]));
}
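`batch.random_chunk(2)` in the loop above presumably draws a random mini-batch for each SGD step. A hypothetical sketch of that sampling idea, using a tiny linear congruential generator since Rust's standard library has no RNG (names and behavior are illustrative, not snail_nn's actual implementation):

```rust
// Minimal LCG so the sketch needs no external crates.
struct Lcg(u64);

impl Lcg {
    fn next(&mut self) -> u64 {
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        self.0
    }
}

// Pick `n` random samples (with replacement, for simplicity) from the
// training data -- the core idea behind a `random_chunk` style call.
fn random_chunk<'a, T>(data: &'a [T], n: usize, rng: &mut Lcg) -> Vec<&'a T> {
    (0..n.min(data.len()))
        .map(|_| &data[(rng.next() % data.len() as u64) as usize])
        .collect()
}

fn main() {
    let samples = [10.0, 20.0, 30.0, 40.0];
    let mut rng = Lcg(42);
    let chunk = random_chunk(&samples, 2, &mut rng);
    println!("chunk size: {}", chunk.len());
}
```

Training on a small random chunk per step (rather than the full batch) is what makes the gradient descent "stochastic": updates are noisier but far cheaper per iteration.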

Features

  • Sigmoid, Tanh & ReLU activation functions
  • Parallelized stochastic gradient descent
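For reference, the three listed activation functions can be sketched as below, together with the derivative forms backpropagation needs (function names here are illustrative, not snail_nn's API; note that the sigmoid and tanh derivatives are conveniently expressed in terms of the activation's own output):

```rust
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// Derivative of sigmoid in terms of its output y = sigmoid(x).
fn sigmoid_prime(y: f64) -> f64 {
    y * (1.0 - y)
}

// Derivative of tanh in terms of its output y = tanh(x).
fn tanh_prime(y: f64) -> f64 {
    1.0 - y * y
}

fn relu(x: f64) -> f64 {
    x.max(0.0)
}

fn relu_prime(x: f64) -> f64 {
    if x > 0.0 { 1.0 } else { 0.0 }
}

fn main() {
    println!("{}", sigmoid(0.0)); // 0.5
    println!("{}", relu(-2.0)); // 0
}
```

Expressing the sigmoid/tanh derivatives via the stored forward-pass outputs avoids recomputing the exponentials during the backward pass.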

Todos

  • Wgpu compute shaders
