
Conversation

@wrzadkow

Added the missing activation functions to multilayer_perceptron.py, choosing ReLU. This resolves Issue #209. Used the low-level tf.maximum instead of tf.nn.relu to keep the low-level style of the existing code.
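A minimal sketch of the change described above, not the exact diff: each hidden layer's affine transform is followed by ReLU expressed as tf.maximum(x, 0.) rather than tf.nn.relu. The layer sizes and the weights/biases dictionary names are assumptions modeled on common MLP examples in the repo.

```python
import tensorflow as tf

# Assumed network dimensions (illustrative, not taken from the PR).
n_input, n_hidden_1, n_hidden_2, n_classes = 784, 256, 256, 10

weights = {
    'h1': tf.Variable(tf.random.normal([n_input, n_hidden_1])),
    'h2': tf.Variable(tf.random.normal([n_hidden_1, n_hidden_2])),
    'out': tf.Variable(tf.random.normal([n_hidden_2, n_classes])),
}
biases = {
    'b1': tf.Variable(tf.random.normal([n_hidden_1])),
    'b2': tf.Variable(tf.random.normal([n_hidden_2])),
    'out': tf.Variable(tf.random.normal([n_classes])),
}

def multilayer_perceptron(x):
    # Hidden layer 1: affine transform, then ReLU via the low-level op,
    # as the PR does, instead of tf.nn.relu.
    layer_1 = tf.add(tf.matmul(x, weights['h1']), biases['b1'])
    layer_1 = tf.maximum(layer_1, 0.)
    # Hidden layer 2: same pattern.
    layer_2 = tf.add(tf.matmul(layer_1, weights['h2']), biases['b2'])
    layer_2 = tf.maximum(layer_2, 0.)
    # Output layer: raw logits; softmax is typically applied in the loss.
    return tf.add(tf.matmul(layer_2, weights['out']), biases['out'])
```

tf.maximum(x, 0.) broadcasts the scalar 0 against the layer's activations and is mathematically identical to tf.nn.relu(x); the choice here is purely stylistic, matching the elementwise-op style of the surrounding example code.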

