mxhagen/aicaramba

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

7 Commits
 
 
 
 
 
 
 
 
 
 
 
 

Repository files navigation

aicaramba ✨

a simple neural network implementation from the perspective of linear algebra.

the library is developed mostly for my own recreational and educational purposes, and depends only on the rand crate, used to randomize newly created weight and bias matrices.
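to illustrate the idea of randomized weight/bias initialization (the library itself uses the rand crate; the `Lcg` type and `random_matrix` function below are illustrative stand-ins, not the library's API), here is a dependency-free sketch:

```rust
// Illustrative sketch only -- a tiny linear congruential generator
// stands in for the rand crate so the example has no dependencies.
struct Lcg(u64);

impl Lcg {
    // Returns a pseudo-random f64 in [-1.0, 1.0).
    fn next_unit(&mut self) -> f64 {
        self.0 = self
            .0
            .wrapping_mul(6364136223846793005)
            .wrapping_add(1442695040888963407);
        (self.0 >> 11) as f64 / (1u64 << 53) as f64 * 2.0 - 1.0
    }
}

// Builds a rows x cols matrix of random values, as one might do
// when creating fresh weight or bias matrices for a layer.
fn random_matrix(rows: usize, cols: usize, rng: &mut Lcg) -> Vec<Vec<f64>> {
    (0..rows)
        .map(|_| (0..cols).map(|_| rng.next_unit()).collect())
        .collect()
}

fn main() {
    let mut rng = Lcg(42);
    let w = random_matrix(2, 3, &mut rng); // a 2x3 weight matrix
    assert_eq!(w.len(), 2);
    assert!(w.iter().all(|row| row.len() == 3));
    assert!(w.iter().flatten().all(|&x| (-1.0..1.0).contains(&x)));
}
```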

for a usage example, see src/bin/xor.rs, which simulates an XOR logic gate using a small neural network.


features

currently available features of the library:

  • ReLU and Sigmoid activation functions
  • the MSE loss function
  • a single, down-to-earth struct that contains the whole network
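the listed building blocks can be sketched as standalone functions (illustrative only; the library's actual signatures may differ):

```rust
// ReLU: passes positive inputs through, clamps negatives to zero.
fn relu(x: f64) -> f64 {
    x.max(0.0)
}

// Sigmoid: squashes any real input into (0, 1).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// Mean squared error averaged over a batch of predictions and targets.
fn mse(predictions: &[f64], targets: &[f64]) -> f64 {
    predictions
        .iter()
        .zip(targets)
        .map(|(p, t)| (p - t).powi(2))
        .sum::<f64>()
        / predictions.len() as f64
}

fn main() {
    assert_eq!(relu(-2.0), 0.0);
    assert_eq!(relu(3.0), 3.0);
    assert!((sigmoid(0.0) - 0.5).abs() < 1e-12);
    assert!((mse(&[1.0, 2.0], &[0.0, 2.0]) - 0.5).abs() < 1e-12);
}
```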

roadmap

what might happen down the road:

  • a BCE loss function (requires output-layer sigmoid activation, which is not a trivial addition)
  • serde (de)serialization to easily store checkpoints/training progress
  • perhaps an MNIST example (?)
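a sketch of why BCE presupposes a sigmoid output layer: the loss takes logarithms of the predictions, so they must lie strictly in (0, 1), which sigmoid guarantees. (the `bce` function here is illustrative, not part of the library.)

```rust
// Sigmoid squashes any real output into (0, 1).
fn sigmoid(x: f64) -> f64 {
    1.0 / (1.0 + (-x).exp())
}

// Binary cross-entropy, averaged over the batch. Each prediction must
// be in (0, 1), otherwise ln() produces -inf or NaN -- hence the
// output-layer sigmoid requirement noted in the roadmap.
fn bce(predictions: &[f64], targets: &[f64]) -> f64 {
    predictions
        .iter()
        .zip(targets)
        .map(|(p, t)| -(t * p.ln() + (1.0 - t) * (1.0 - p).ln()))
        .sum::<f64>()
        / predictions.len() as f64
}

fn main() {
    // Raw network outputs are squashed into (0, 1) before the loss.
    let logits = [2.0, -1.0];
    let preds: Vec<f64> = logits.iter().map(|&x| sigmoid(x)).collect();
    let targets = [1.0, 0.0];
    assert!(bce(&preds, &targets) > 0.0);
    // Predictions closer to the targets give a smaller loss.
    assert!(bce(&[0.9], &[1.0]) < bce(&[0.5], &[1.0]));
}
```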
