
Verceptron – a perceptron written in V

Table of contents

  - Quickstart
  - General Information
  - Known Limitations
  - References
  - Further Reading
  - Contributing
  - License

Quickstart

  1. Clone the repository:

```shell
git clone https://github.com/418Coffee/verceptron.git
cd verceptron/
```

  2. Optionally, edit some configuration variables in verceptron.v.
  3. Train and test the model by running:

```shell
v run .
```

  4. Or, to visualise the model, run (requires ffmpeg to be installed and added to PATH):

```shell
./visualise.sh
```

General Information

A perceptron is essentially the simplest neural "network", consisting of a single neuron. It was invented in 1943 by McCulloch and Pitts, and the first implementation was built in 1958 by Rosenblatt, who designed his perceptron to classify two sets of images from a 20x20 array of cadmium sulfide photocells. Albert Novikoff proved that a perceptron always converges when trained on a linearly separable data set.

This program is similar to the perceptron built by Rosenblatt: it consists of a 20x20 input grid and is trained to distinguish between rectangles and circles.
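The training procedure behind such a perceptron fits in a few lines. The sketch below is an illustrative Python version (not the repo's V source): it applies Rosenblatt's update rule, nudging the weights toward each misclassified example, on the linearly separable AND function, where Novikoff's theorem guarantees convergence. The function names and hyperparameters are assumptions chosen for the example.

```python
# Minimal perceptron sketch (illustrative Python, not the repo's V code).
# The weights are nudged toward each misclassified sample until an epoch
# passes with no errors, which Novikoff's convergence theorem guarantees
# will happen for linearly separable data.

def predict(weights, bias, x):
    # Step activation: fire if the weighted sum clears the threshold.
    s = sum(w * xi for w, xi in zip(weights, x))
    return 1 if s + bias > 0 else 0

def train(samples, lr=0.1, epochs=100):
    weights = [0.0] * len(samples[0][0])
    bias = 0.0
    for _ in range(epochs):
        errors = 0
        for x, target in samples:
            err = target - predict(weights, bias, x)
            if err != 0:
                errors += 1
                weights = [w + lr * err * xi for w, xi in zip(weights, x)]
                bias += lr * err
        if errors == 0:  # an error-free epoch means we have converged
            break
    return weights, bias

# AND is linearly separable, so training converges to a perfect classifier.
and_set = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(and_set)
print([predict(w, b, x) for x, _ in and_set])  # → [0, 0, 0, 1]
```

The same loop scales to the 20x20 grid by flattening each image into a 400-element input vector; only the weight count changes.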

Known Limitations

Single-layer perceptrons are only capable of learning linearly separable patterns; famously, Marvin Minsky and Seymour Papert proved that a single-layer perceptron is incapable of learning the XOR function. The perceptron is guaranteed to converge on a solution when trained with a linearly separable training set, but many solutions may exist for a given training set, each of different quality, and it is difficult to determine whether the solution converged on is the best one.
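The XOR limitation is easy to observe empirically. This hedged Python sketch (again, not the repo's code) runs the same update rule on XOR's four samples; because no line separates XOR's classes, no weight vector classifies all four correctly, so every epoch ends with at least one misclassification no matter how long training runs.

```python
# Illustrative Python sketch (not the repo's V code): a single-layer
# perceptron cannot represent XOR, so its training loop never converges.

def predict(weights, bias, x):
    return 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0

xor_set = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
weights, bias, lr = [0.0, 0.0], 0.0, 0.1

for epoch in range(1000):
    errors = 0
    for x, target in xor_set:
        err = target - predict(weights, bias, x)
        if err != 0:
            errors += 1
            weights = [w + lr * err * xi for w, xi in zip(weights, x)]
            bias += lr * err

# Even after 1000 epochs at least one sample stays misclassified.
print(errors)  # always > 0 for XOR
```

Stacking a second layer of neurons removes this limitation, which is what multi-layer networks do.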

References

Further Reading

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

License

MIT