# MuLaPeGASim

Multi-Layer Perceptron Feedforward neural network. Backpropagation and Genetic learning algorithm. Plus WinForms UI.

This was originally developed by Rene Schulte and Torsten Bär in 2004 but still works with Visual Studio 2015!

MuLaPeGASim is a Multi-Layer Perceptron feedforward neural network developed as an assignment for the courses "Artificial Intelligence" and "Genetic Algorithms". The name "MuLaPeGASim" stands for (Mu)lti(la)yer (Pe)rceptron (G)enetic (A)lgorithm (Sim)ulator. :) As the name implies, MuLaPeGASim is a multilayer perceptron neural network simulator with some special features for Optical Character Recognition (OCR) problems. You can design a multilayer feedforward network, create training patterns, and train the network either with a Backpropagation learning algorithm (batch or online, with Momentum and Flat Spot Elimination) or with a genetic learning algorithm. Patterns can be entered manually or generated automatically for OCR learning, and characters can also be extracted from an image.
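The training options above can be illustrated with a minimal sketch. The original simulator is written in C#/.NET; the Python code below is only a rough, illustrative reconstruction of online backpropagation with the two tweaks mentioned (a momentum term, and "flat spot elimination", i.e. adding a small constant to the sigmoid derivative so learning does not stall where the derivative is near zero). All class names and hyperparameter values here are made up for the example and are not taken from MuLaPeGASim.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyMLP:
    """Illustrative 2-layer perceptron; not the MuLaPeGASim implementation."""

    def __init__(self, n_in, n_hid, n_out,
                 lr=0.5, momentum=0.8, flat_spot=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.uniform(-0.5, 0.5, (n_in, n_hid))
        self.W2 = rng.uniform(-0.5, 0.5, (n_hid, n_out))
        self.dW1 = np.zeros_like(self.W1)  # previous updates, for momentum
        self.dW2 = np.zeros_like(self.W2)
        self.lr, self.mom, self.fse = lr, momentum, flat_spot

    def forward(self, x):
        self.h = sigmoid(x @ self.W1)
        self.o = sigmoid(self.h @ self.W2)
        return self.o

    def train_step(self, x, t):
        """One online update; returns the squared error for this pattern."""
        o = self.forward(x)
        # sigmoid derivative o*(1-o), plus the flat-spot constant
        d_out = (t - o) * (o * (1 - o) + self.fse)
        d_hid = (d_out @ self.W2.T) * (self.h * (1 - self.h) + self.fse)
        # momentum: new update = lr * gradient + momentum * previous update
        self.dW2 = self.lr * np.outer(self.h, d_out) + self.mom * self.dW2
        self.dW1 = self.lr * np.outer(x, d_hid) + self.mom * self.dW1
        self.W2 += self.dW2
        self.W1 += self.dW1
        return float(np.sum((t - o) ** 2))

# Online training on the classic XOR problem
net = TinyMLP(2, 4, 1)
data = [([0, 0], [0]), ([0, 1], [1]), ([1, 0], [1]), ([1, 1], [0])]
for epoch in range(2000):
    for x, t in data:
        net.train_step(np.array(x, float), np.array(t, float))
```

Batch mode would simply accumulate the gradients over all patterns before applying one update; the genetic alternative would instead evolve a population of flattened weight vectors with the network error as fitness.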

The code was originally implemented in 2004 with state-of-the-art .NET 1.1 and a WinForms UI. Looking back at it, there are some serious software design flaws in there, considering it was one of our first larger projects with .NET and C#. Not to mention one of the worst UIs ever seen. The actual algorithms are still nice though.

For further information, check out the project homepage: https://teichgraf.github.io/MuLaPeGASim/web/. The app manual of MuLaPeGASim is written in English, and the project homepage also includes German descriptions of the implemented algorithms.

Enjoy it!