Training Tips

Raymond Phan edited this page Apr 23, 2017 · 4 revisions

If you have tried to configure the neural network but can't get it to converge and classify your data with high accuracy, here are some tips you can try:

  • Introduce regularization: The model may be overfitting the data, so try introducing regularization at a small rate, then gradually increase it until you're satisfied.

  • Decrease the learning rate: If training doesn't converge and the error oscillates or even diverges, try decreasing the learning rate.

  • Increase the learning rate: If training is very slow and the output is only gradually approaching the desired target, try increasing the learning rate.

  • Change the activation function: If you are performing linear regression, a network with no hidden layers and a linear output activation function should suffice. If you are performing non-linear regression, use hidden layers with Tanh, Sigmoid or ReLU activation functions, then use a linear activation for the output layer.
    For classification, use Tanh, Sigmoid or ReLU for the hidden layers and either Tanh or Sigmoid for the output layer. To perform logistic regression, specify no hidden layers and choose the Sigmoid activation function for the output layer.

  • Change the number of hidden layers: If the dataset is difficult to train on, try increasing the number of hidden layers.

  • Change the number of hidden neurons per hidden layer: A general rule is to set the same number of neurons for each hidden layer; if you decide to vary this, let the first couple of layers have more neurons than the later ones.
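The learning-rate and regularization knobs above can be sketched in a minimal NumPy example. This is not this project's API — just a hypothetical logistic-regression training loop showing where the learning rate and an L2 regularization strength plug into gradient descent:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, y, lr=0.1, l2=0.0, epochs=500):
    """Batch gradient descent with a learning rate (lr) and an
    L2 regularization strength (l2); returns weights and final loss."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)
        # Gradient of the cross-entropy loss plus the L2 penalty term.
        grad = X.T @ (p - y) / len(y) + l2 * w
        w -= lr * grad
    p = sigmoid(X @ w)
    loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    return w, loss

# Toy separable data: first column is a bias term, label is 1 when
# the feature is positive.
X = np.array([[1.0, x] for x in (-2, -1, -0.5, 0.5, 1, 2)])
y = np.array([0, 0, 0, 1, 1, 1], dtype=float)

_, loss_plain = train(X, y, lr=0.5, l2=0.0)
_, loss_reg = train(X, y, lr=0.5, l2=0.1)
# Regularization shrinks the weights, so training loss goes up slightly;
# the payoff is better generalization on unseen data.
print(loss_plain < loss_reg)
```

Too large an `lr` here makes the loss oscillate or diverge; too small an `lr` makes convergence crawl — the same trade-off the two learning-rate tips describe.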
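The activation-function pairings above can also be illustrated with a small forward-pass sketch. The layer sizes and weights below are made up for illustration and do not reflect this project's internals:

```python
import numpy as np

def tanh(z): return np.tanh(z)
def sigmoid(z): return 1.0 / (1.0 + np.exp(-z))
def linear(z): return z

rng = np.random.default_rng(0)
x = rng.normal(size=3)        # a single 3-feature input
w_out = rng.normal(size=3)

# Linear regression: no hidden layers, linear output activation.
y_linreg = linear(w_out @ x)

# Non-linear regression: Tanh hidden layer, linear output activation.
W_h = rng.normal(size=(4, 3))
w_o = rng.normal(size=4)
y_nonlin = linear(w_o @ tanh(W_h @ x))

# Classification / logistic regression: Sigmoid output, so the
# prediction is a probability in (0, 1).
y_class = sigmoid(w_out @ x)
print(0.0 < y_class < 1.0)  # prints True
```

The key point is that the output activation matches the target: unbounded linear outputs for regression, and a squashing function (Sigmoid or Tanh) for class labels.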