
Pure Java Rumelhart's Multilayer Perceptron


No fat JAR is required to start from scratch: the JAR's total size is about 40 kB!

Implements a pure Java multilayer perceptron with a choice of activation function and the backpropagation training algorithm.

📰 See an example of use.

Activation Functions

The following activation functions are currently implemented (a code sketch follows the list):

  • Identity
  • Sigmoid
  • ReLU
  • Leaky ReLU
  • ELU
  • SiLU
  • Softmax
  • Softplus
  • Tanh
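
For illustration, here is how two of these functions, and the derivatives that backpropagation needs, can be written in plain Java. This is a minimal sketch, not the library's actual interface:

import java.util.function.DoubleUnaryOperator;

class ActivationSketch {
    // Sigmoid: f(x) = 1 / (1 + e^(-x)); derivative f'(x) = f(x) * (1 - f(x))
    static final DoubleUnaryOperator SIGMOID =
            x -> 1.0 / (1.0 + Math.exp(-x));
    static final DoubleUnaryOperator SIGMOID_DERIVATIVE = x -> {
        double f = SIGMOID.applyAsDouble(x);
        return f * (1.0 - f);
    };

    // Leaky ReLU: f(x) = x for x > 0, else a * x (a = 0.01 here)
    static final DoubleUnaryOperator LEAKY_RELU =
            x -> x > 0 ? x : 0.01 * x;
    static final DoubleUnaryOperator LEAKY_RELU_DERIVATIVE =
            x -> x > 0 ? 1.0 : 0.01;
}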

Backpropagation Algorithm

The perceptron is trained with the backpropagation algorithm. The key formulas are described below.

At each iteration, the weight between the i-th and j-th nodes is changed by:

Δw_ij = - η * y_i * δ_j                (1)

where:
η - learning rate;
y_i - output of the i-th node;
δ_j - delta coefficient for the j-th node.
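
In code, update rule (1) amounts to one in-place subtraction per weight. A minimal sketch, assuming plain array storage (names are illustrative, not the library's API):

// Eq. (1): gradient-descent update for all weights between two layers.
// w[i][j] connects node i (previous layer) to node j (next layer);
// eta is the learning rate, y holds the previous layer's outputs,
// delta holds the next layer's delta coefficients (derived below).
static void updateWeights(double[][] w, double[] y, double[] delta, double eta) {
    for (int i = 0; i < w.length; i++) {
        for (int j = 0; j < w[i].length; j++) {
            w[i][j] -= eta * y[i] * delta[j];   // Δw_ij = -η * y_i * δ_j
        }
    }
}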

How the delta coefficient for the j-th node is calculated depends on the node's location and on the type of loss function.

For an output layer node with the Softmax activation function and the Cross-entropy loss function, the delta is evaluated by:

δ_j = y_j - e_j                        (2)

where:
y_j - actual output of the j-th node;
e_j - expected output of the j-th node.

For an output layer node with any other activation function (except Softmax) and the Least Squares loss function, the delta is evaluated by:

δ_j = (y_j - e_j) * f'(S_j)            (3)

where:
f'(S_j) - derivative of the activation function;
S_j - input signal of the j-th node.

The node's input signal is:

S_j = sum(y_i * w_ij)                  (4)

where:
y_i - output of the i-th node (located closer to the input layer), which is connected to the j-th node (located closer to the output layer);
w_ij - weight between the i-th and j-th nodes.

For a hidden layer node, the delta coefficient is evaluated by:

δ_j = f'(S_j) * sum(w_jk * δ_k)        (5)

where:
w_jk - weight between the j-th node (located closer to the input layer) and the k-th node (located closer to the output layer);
δ_k - delta coefficient for the k-th node (located closer to the output layer).
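
Taken together, Eqs. (2)-(5) translate into a few small routines. The following is a minimal sketch assuming plain array storage, not the library's actual API:

import java.util.function.DoubleUnaryOperator;

class BackpropSketch {

    // Eq. (4): input signal of node j, summed over previous-layer outputs
    static double inputSignal(double[] y, double[][] w, int j) {
        double s = 0.0;
        for (int i = 0; i < y.length; i++) {
            s += y[i] * w[i][j];
        }
        return s;
    }

    // Eq. (2): output-layer deltas for Softmax + Cross-entropy loss
    static double[] outputDeltasSoftmaxCrossEntropy(double[] y, double[] e) {
        double[] delta = new double[y.length];
        for (int j = 0; j < y.length; j++) {
            delta[j] = y[j] - e[j];
        }
        return delta;
    }

    // Eq. (3): output-layer deltas for Least Squares loss,
    // with fPrime the derivative of the layer's activation function
    static double[] outputDeltasLeastSquares(double[] y, double[] e, double[] s,
                                             DoubleUnaryOperator fPrime) {
        double[] delta = new double[y.length];
        for (int j = 0; j < y.length; j++) {
            delta[j] = (y[j] - e[j]) * fPrime.applyAsDouble(s[j]);
        }
        return delta;
    }

    // Eq. (5): hidden-layer deltas, backpropagated from the next layer
    static double[] hiddenDeltas(double[] s, double[][] wNext, double[] deltaNext,
                                 DoubleUnaryOperator fPrime) {
        double[] delta = new double[s.length];
        for (int j = 0; j < s.length; j++) {
            double sum = 0.0;
            for (int k = 0; k < deltaNext.length; k++) {
                sum += wNext[j][k] * deltaNext[k];   // Σ_k w_jk * δ_k
            }
            delta[j] = fPrime.applyAsDouble(s[j]) * sum;
        }
        return delta;
    }
}

Each training iteration computes these deltas from the output layer back toward the input layer, then applies the weight update from Eq. (1).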

How to Use the JAR Library

There are two ways to get the JAR package (~40 kB).

  1. Get the JAR from GitHub Packages. In this case you need a GitHub Personal Access Token (PAT). Configure the Maven repository as described in this GitHub tutorial. After that you can add the package to your dependencies:
<dependency>
    <groupId>ai.neuromachines</groupId>
    <artifactId>perceptron</artifactId>
    <version>2.0</version>
</dependency>
  2. Get the JAR package from the JitPack repository, without a PAT. Add the repositories to your Maven project:
<repositories>
    <repository>
        <id>central</id>
        <name>Central Repository</name>
        <url>https://repo.maven.apache.org/maven2</url>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
    </repository>
    <repository>
        <id>jitpack.io</id>
        <url>https://jitpack.io</url>
    </repository>
</repositories>

and add the dependency:

<dependency>
    <groupId>com.github.NeuroMachinesLab</groupId>
    <artifactId>perceptron</artifactId>
    <version>2.0</version>
</dependency>

About

Pure Java (~40 kB JAR) implementation of Rumelhart's multilayer perceptron. Offers a choice of activation function implementations and one of two loss functions: Least Squares or Cross-Entropy.
