Implements a basic L-Layer neural network. Current features are
- Inputs are images only (classification is based on folder location)
- Binary classification only
- Supports only relu, tanh and sigmoid activation functions
- Auto-initialize weights
- Augment data by flipping images horizontally
- Normalize data sets
- Use L2 regularization
- Use dropout
- Split training set into mini batches
- Use gradient descent with momentum
Define the L-layer net by using the neuralnet.NewHyperParametersBuilder()
with the following options:
AddLayers(a ActivationFuncName, neurons ...uint)
- adds layers with the specified neurons and activation function
AddNLayer(a ActivationFuncName, neurons uint, n uint)
- adds n identical layers to the net
SetLearningRate(learningRate float64)
- the learning rate to use; defaults to 0.01
SetIterations(iterations uint)
- the number of iterations used to train the model; defaults to 1000
SetRegularizationFactor(regularizationFactor float64)
- the regularization factor to use; 0 indicates not to regularize
SetDropoutKeepProbability
- enables dropout by specifying the probability that neurons should be kept (i.e. not dropped); 0 indicates not to use dropout
SetMiniBatchSize
- splits the training set into mini batches for large data sets
UseGradientDescentWithMomentum(beta float64)
- uses an exponential moving average of the gradients when minimizing, allowing a higher learning rate as it dampens out oscillations
The last layer must be a single neuron using the sigmoid
activation function for binary classification.
e.g.
hyperParams, err := neuralnet.NewHyperParametersBuilder().
AddLayers(neuralnet.ActivationFuncNameReLU, 3, 2).
AddLayers(neuralnet.ActivationFuncNameSigmoid, 1).
SetLearningRate(0.15).
SetIterations(5000).
SetRegularizationFactor(0.5).
SetDropoutKeepProbability(0.75).
SetMiniBatchSize(1024).
UseGradientDescentWithMomentum(0.9).
Build()
Once defined, create a training set using the neuralnet.NewImageSetBuilder()
with the following options:
WithPathPrefix(pathPrefix string)
- defines a root folder to use
AddFolder(pathToFolder string, classification bool)
- adds a folder to the training set, with the classification to use
AddImage(pathToImage string, classification bool)
- adds a single image, with the classification to use
ResizeImages(width, height uint)
- resizes all the images
AugmentFlipHorizontal()
- doubles the data set by also considering the images flipped horizontally
Normalize()
- normalizes the data; note that if the training set is normalized, the test set will also need to be normalized
If the images are not being resized, they must all have the same height and width.
e.g.
trainingDataSet, err := neuralnet.NewImageSetBuilder().
AugmentFlipHorizontal().
WithPathPrefix("datasets/Vegetable Images/train").
AddFolder("Cabbage", false).
AddFolder("Carrot", true).
ResizeImages(32, 32).
Normalize().
Build()
model, err := hyperParams.TrainModel(trainingDataSet)
model.Predict(trainingDataSet)
e.g.
testDataSet, err := neuralnet.NewImageSetBuilder().
WithPathPrefix("../datasets/Vegetable Images/test").
AddFolder("Cabbage", false).
AddFolder("Carrot", true).
ResizeImages(32, 32).
Normalize().
Build()
model.Predict(testDataSet)