
Implementation of an ANN for recognition of the Iris plant family


tech-espm/pic-plantrecogniser


PlantRecogniser - A Neural Network Approach

This artificial neural network was developed in the IntelliJ IDEA 2017.3.4 IDE using Java 1.8 to recognise, based on its inputs, members of the Iris plant family.


Inputs

The input layer has 4 neurons and accepts the following normalized values:

  1. sepal length
  2. sepal width
  3. petal length
  4. petal width

Obs: You can find the input values in the data folder.
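Min-max scaling is a common way to normalize these four features. The sketch below is illustrative, not code from this repository; the per-feature ranges are an assumption taken from the classic Iris dataset extremes, and the data folder may use different ones:

```java
// Hypothetical min-max normalization of one Iris sample into [0, 1].
// Feature order follows the input layer: sepal length, sepal width,
// petal length, petal width (in cm).
public class Normalizer {
    // Assumed per-feature minima/maxima (classic Iris dataset extremes;
    // the repository's data folder may use different ranges).
    static final double[] MIN = {4.3, 2.0, 1.0, 0.1};
    static final double[] MAX = {7.9, 4.4, 6.9, 2.5};

    public static double[] normalize(double[] sample) {
        double[] out = new double[sample.length];
        for (int i = 0; i < sample.length; i++) {
            out[i] = (sample[i] - MIN[i]) / (MAX[i] - MIN[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        // A typical Iris-Setosa sample.
        double[] n = normalize(new double[]{5.1, 3.5, 1.4, 0.2});
        System.out.printf("%.3f %.3f %.3f %.3f%n", n[0], n[1], n[2], n[3]);
    }
}
```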


Structure

In the packages paes.training.c1 and paes.training.c2 you can find ANNs with one and two hidden layers, respectively. These files contain the propagation and backpropagation (training) algorithms. In paes.test.c1 and paes.test.c2, you can find the classes responsible for testing the success rate.

Variables

  • int n = 3 - the number of neurons in the hidden layer;
  • int minValue = 0 - the index used to walk through the input matrix;
  • int age = 1 - counts the number of ages (epochs);

Obs: 1 age equals 120 iterations.
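How the three variables relate can be sketched as a training loop. The stopping condition and the commented method calls below are illustrative assumptions, not the repository's actual code:

```java
// Illustrative training loop: one "age" (epoch) presents all 120 training
// samples once, with minValue walking through the rows of the input matrix.
public class TrainingLoop {
    static final int ITERATIONS_PER_AGE = 120; // 1 age = 120 iterations

    public static int totalIterations(int ages) {
        return ages * ITERATIONS_PER_AGE;
    }

    public static void main(String[] args) {
        int age = 1;     // counts the number of ages (epochs)
        int maxAges = 3; // assumed stopping condition
        while (age <= maxAges) {
            for (int minValue = 0; minValue < ITERATIONS_PER_AGE; minValue++) {
                // propagate and backpropagate one sample here (see Methods below)
            }
            age++;
        }
        System.out.println(totalIterations(maxAges)); // prints 360
    }
}
```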

Methods

Propagation

  • ponderationL1();
  • activationL1();
  • ponderationL2();
  • activationL2();
  • ponderationL3() (only in the c2 classes);
  • activationL3() (only in the c2 classes);
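Each ponderation/activation pair amounts to a weighted sum followed by a non-linearity. The sketch below shows the general shape with a sigmoid activation; the method signatures and weight layout are assumptions, not the repository's actual fields:

```java
// Illustrative forward pass for one layer: ponderation (weighted sum plus
// bias) followed by activation (here, the sigmoid function).
public class ForwardPass {
    public static double[] ponderation(double[] in, double[][] w, double[] bias) {
        double[] net = new double[w.length];
        for (int j = 0; j < w.length; j++) { // one net value per neuron
            net[j] = bias[j];
            for (int i = 0; i < in.length; i++) {
                net[j] += w[j][i] * in[i];
            }
        }
        return net;
    }

    public static double[] activation(double[] net) {
        double[] out = new double[net.length];
        for (int j = 0; j < net.length; j++) {
            out[j] = 1.0 / (1.0 + Math.exp(-net[j])); // sigmoid
        }
        return out;
    }

    public static void main(String[] args) {
        double[] in = {0.2, 0.6, 0.1, 0.0};    // 4 input neurons
        double[][] w = {{0.1, -0.2, 0.3, 0.4},
                        {0.0, 0.5, -0.1, 0.2},
                        {0.3, 0.3, 0.3, 0.3}}; // n = 3 hidden neurons
        double[] bias = {0.0, 0.0, 0.0};
        double[] hidden = activation(ponderation(in, w, bias));
        System.out.println(hidden.length); // prints 3
    }
}
```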

Error Calculation

  • errorCalculation();
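A common choice for an error calculation in networks like this is the summed squared error; the sketch below assumes that choice and is not necessarily the repository's exact formula:

```java
// Illustrative error calculation: half the sum of squared differences
// between the target vector and the network's output vector.
public class ErrorCalc {
    public static double squaredError(double[] target, double[] output) {
        double sum = 0.0;
        for (int k = 0; k < target.length; k++) {
            double e = target[k] - output[k];
            sum += e * e;
        }
        return 0.5 * sum; // the 1/2 factor simplifies the gradient
    }

    public static void main(String[] args) {
        // Target: Iris-Versicolor (0 1 0) versus an imperfect output.
        double error = squaredError(new double[]{0, 1, 0},
                                    new double[]{0.1, 0.8, 0.2});
        System.out.println(error > 0.0); // prints true
    }
}
```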

BackPropagation

  • gradientCalculationL3() (only in the c2 classes);
  • gradientCalculationL2();
  • gradientCalculationL1();
  • weightsUpdateL3() (only in the c2 classes);
  • weightsUpdateL2();
  • weightsUpdateL1();
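For a sigmoid output layer, the gradient-calculation and weight-update steps typically look like the sketch below; the learning rate and method signatures are assumptions for illustration, not the repository's values:

```java
// Illustrative backpropagation step for the output layer of a sigmoid
// network: compute the local gradient, then update the weights in place.
public class BackPass {
    static final double LEARNING_RATE = 0.1; // assumed value

    // delta_k = (target_k - out_k) * out_k * (1 - out_k), using the
    // sigmoid derivative out * (1 - out).
    public static double[] outputGradient(double[] target, double[] out) {
        double[] delta = new double[out.length];
        for (int k = 0; k < out.length; k++) {
            delta[k] = (target[k] - out[k]) * out[k] * (1 - out[k]);
        }
        return delta;
    }

    // w[k][j] += eta * delta_k * hidden_j
    public static void updateWeights(double[][] w, double[] delta, double[] hidden) {
        for (int k = 0; k < w.length; k++) {
            for (int j = 0; j < hidden.length; j++) {
                w[k][j] += LEARNING_RATE * delta[k] * hidden[j];
            }
        }
    }

    public static void main(String[] args) {
        double[] delta = outputGradient(new double[]{1.0}, new double[]{0.5});
        double[][] w = {{0.0}};
        updateWeights(w, delta, new double[]{1.0});
        System.out.println(w[0][0] > 0.0); // prints true: weight moved toward the target
    }
}
```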

Outputs

The output layer has 3 neurons. Each of the following combinations represents one of the possible outputs that can be printed by the ANN:

  • Iris-Setosa: 0 0 1;
  • Iris-Versicolor: 0 1 0;
  • Iris-Virginica: 1 0 0;
  • Unidentified Species detected!: 0 0 0;
  • Unidentified Species detected!: 1 1 1.
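Mapping the three output neurons to these labels can be sketched as below; thresholding each neuron at 0.5 is an assumption, since the repository's decision rule is not shown here:

```java
// Illustrative decoder: threshold each output neuron to 0 or 1, then match
// the pattern against the combinations listed above.
public class OutputDecoder {
    public static String decode(double[] out) {
        int a = out[0] >= 0.5 ? 1 : 0;
        int b = out[1] >= 0.5 ? 1 : 0;
        int c = out[2] >= 0.5 ? 1 : 0;
        if (a == 0 && b == 0 && c == 1) return "Iris-Setosa";
        if (a == 0 && b == 1 && c == 0) return "Iris-Versicolor";
        if (a == 1 && b == 0 && c == 0) return "Iris-Virginica";
        return "Unidentified Species detected!"; // 0 0 0, 1 1 1, and any other pattern
    }

    public static void main(String[] args) {
        System.out.println(decode(new double[]{0.1, 0.2, 0.9})); // prints Iris-Setosa
    }
}
```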

Results - c1 (one hidden layer)

Sigmoid Function

  • 2 Neurons:

    • NaN;
  • 3 Neurons:

    • 93.33 % Success (28/30 successes);
  • 8 Neurons:

    • 93.33 % Success (28/30 successes);

Hyperbolic Tangent (tanh)

  • 2 Neurons:

    • NaN;
  • 3 Neurons:

    • 3.33 % Success (1/30 successes);
  • 8 Neurons:

    • 13.33 % Success (4/30 successes);

ReLU (Rectified linear unit)

  • 2 Neurons:

    • Time Exception;
  • 3 Neurons:

    • Time Exception;
  • 8 Neurons:

    • Time Exception;

Obs:

  • NaN: the network was unable to reach a numeric value, probably because a floating-point value overflowed;
  • Time Exception: the network was unable to converge in polynomial time, or at least in a time close to that of the sigmoid and tanh runs.
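For reference, the three activation functions compared in these results have the standard definitions below (generic Java, not code from the repository); ReLU's unbounded positive side is one plausible contributor to the overflow and convergence problems observed:

```java
// Standard definitions of the three activation functions compared in the
// results: sigmoid is bounded to (0, 1), tanh to (-1, 1), and ReLU is
// unbounded above, which makes it easier for values to blow up.
public class Activations {
    public static double sigmoid(double x) { return 1.0 / (1.0 + Math.exp(-x)); }
    public static double tanh(double x)    { return Math.tanh(x); }
    public static double relu(double x)    { return Math.max(0.0, x); }

    public static void main(String[] args) {
        System.out.println(sigmoid(0.0)); // prints 0.5
        System.out.println(tanh(0.0));    // prints 0.0
        System.out.println(relu(-3.0));   // prints 0.0
    }
}
```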

Results - c2 (two hidden layers)

Sigmoid Function

  • 2 Neurons:

    • NaN;
  • 3 Neurons:

    • NaN;
  • 8 Neurons:

    • 33.33 % Success (10/30 successes);

Hyperbolic Tangent (tanh)

  • 2 Neurons:

    • Time Exception;
  • 3 Neurons:

    • Time Exception;
  • 8 Neurons:

    • 0.00 % Success (0/30 successes);

ReLU (Rectified linear unit)

  • 2 Neurons:

    • Time Exception;
  • 3 Neurons:

    • Time Exception;
  • 8 Neurons:

    • Time Exception;

Obs: NaN and Time Exception are as defined in the c1 results above.

Reference Links

Activation Functions:

FACURE, Matheus:

HAYKIN, Simon:

  • Neural Networks. 3rd Edition. Artmed, 2008.

Rectifier Nonlinearities Improve Neural Network Acoustic Models:

Sample Repository:

