
Implementations of perceptron algorithm with 2 hidden layers, learning vector quantization, k-means clustering algorithm, for the course Computational Intelligence @uoi


billgewrgoulas/Neural-networks.


Contents

Perceptron

  • The implementation consists of 4 layers of nodes: an input layer, two hidden layers and an output layer. Every node except the input nodes is a neuron; in our case the first hidden layer uses two different sigmoid activation functions, while the second uses a simple linear function. Training uses forward propagation, backpropagation and gradient descent. Moreover, to achieve the best results we test different numbers of epochs. The goal is to train our model to classify a set of randomly generated points (x, y) into 3 different categories, and to evaluate the results on the test data using the Root Mean Squared Error metric. Lastly, we measure how well our model generalizes from the given data to unseen points.
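The training loop described above (forward pass through two hidden layers, backpropagation of the squared error, gradient descent) can be sketched roughly as follows. This is a minimal NumPy illustration, not the repository's actual code: the layer sizes, learning rate, sigmoid output layer and quadrant-style toy labels are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 2 inputs (x, y), two hidden layers, 3 output classes.
D, H1, H2, K = 2, 8, 8, 3

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Small random weight initialisation, zero biases.
W1 = rng.normal(0, 0.5, (D, H1));  b1 = np.zeros(H1)
W2 = rng.normal(0, 0.5, (H1, H2)); b2 = np.zeros(H2)
W3 = rng.normal(0, 0.5, (H2, K));  b3 = np.zeros(K)

def forward(X):
    a1 = sigmoid(X @ W1 + b1)      # first hidden layer: sigmoid activation
    a2 = a1 @ W2 + b2              # second hidden layer: linear (identity)
    out = sigmoid(a2 @ W3 + b3)    # output layer: sigmoid, one unit per class
    return a1, a2, out

def train_step(X, T, lr=0.1):
    """One gradient-descent step on the mean squared error."""
    global W1, b1, W2, b2, W3, b3
    n = len(X)
    a1, a2, out = forward(X)
    # Backpropagation of the squared-error loss.
    d3 = (out - T) * out * (1 - out)     # delta at the sigmoid output
    d2 = d3 @ W3.T                       # linear layer: derivative is 1
    d1 = (d2 @ W2.T) * a1 * (1 - a1)     # sigmoid derivative
    W3 -= lr * a2.T @ d3 / n; b3 -= lr * d3.mean(0)
    W2 -= lr * a1.T @ d2 / n; b2 -= lr * d2.mean(0)
    W1 -= lr * X.T  @ d1 / n; b1 -= lr * d1.mean(0)

# Toy data: random points labelled 0, 1 or 2 by a simple sign rule (assumed),
# with one-hot target vectors.
X = rng.uniform(-1, 1, (300, 2))
labels = (X[:, 0] > 0).astype(int) + (X[:, 1] > 0).astype(int)
T = np.eye(K)[labels]

_, _, out0 = forward(X)
rmse0 = np.sqrt(np.mean((out0 - T) ** 2))   # RMSE before training

for epoch in range(2000):
    train_step(X, T)

_, _, out = forward(X)
rmse = np.sqrt(np.mean((out - T) ** 2))     # RMSE after training
```

After training, `rmse` should be noticeably lower than `rmse0`; in the actual project the RMSE would be computed on held-out test data to gauge generalization.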

Kmeans and Learning Vector Quantization(LVQ)

  • In this part we implement Kmeans and LVQ and use them to group a set of points (x, y). We test and compare the two algorithms with different numbers of clusters.
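A rough sketch of the two algorithms on 2-D points, again as an assumed illustration rather than the repository's code: Kmeans alternates a nearest-centroid assignment step with a centroid-mean update, while LVQ (here the basic LVQ1 rule) moves the winning prototype toward a sample of the same class and away from a sample of a different class. The blob data, prototype counts and learning rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, iters=50):
    # Initialise centroids with k distinct random data points.
    centroids = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        # Assignment step: nearest centroid per point (Euclidean distance).
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        # Update step: move each centroid to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

def lvq(X, y, k_per_class=2, lr=0.05, epochs=30):
    # Initialise prototypes from samples of each class (LVQ1).
    classes = np.unique(y)
    protos = np.vstack([X[y == c][:k_per_class] for c in classes])
    proto_y = np.repeat(classes, k_per_class)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            j = np.linalg.norm(protos - X[i], axis=1).argmin()
            # Attract the winning prototype if labels agree, repel otherwise.
            sign = 1.0 if proto_y[j] == y[i] else -1.0
            protos[j] += sign * lr * (X[i] - protos[j])
    return protos, proto_y

# Two assumed well-separated blobs around (0, 0) and (5, 5).
A = rng.normal(0, 0.3, (60, 2))
B = rng.normal(5, 0.3, (60, 2))
X = np.vstack([A, B])
y = np.array([0] * 60 + [1] * 60)

centroids, labels = kmeans(X, 2)
protos, proto_y = lvq(X, y)
```

Comparing the two at different cluster counts, as the project does, amounts to varying `k` (and `k_per_class`) and inspecting the resulting groupings; unlike Kmeans, LVQ uses the class labels during training.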
