Applying a neural network with the Adam optimizer to the heart failure clinical records dataset to compare the test errors of the sigmoid, tanh, and ReLU activation functions
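A minimal sketch of the comparison this repository describes: a one-hidden-layer network trained with a hand-rolled Adam optimizer, run once per hidden activation (sigmoid, tanh, ReLU) and scored by held-out misclassification rate. Synthetic data stands in for the actual heart failure clinical records dataset, and all function names and hyperparameters here are illustrative assumptions, not the repository's code.

```python
import numpy as np

# (act, derivative-in-terms-of-activation) pairs for each hidden nonlinearity
ACTIVATIONS = {
    "sigmoid": (lambda z: 1 / (1 + np.exp(-z)), lambda a: a * (1 - a)),
    "tanh":    (np.tanh,                        lambda a: 1 - a ** 2),
    "relu":    (lambda z: np.maximum(0.0, z),   lambda a: (a > 0).astype(float)),
}

def train_and_eval(act, X_tr, y_tr, X_te, y_te,
                   hidden=8, epochs=200, lr=0.01, seed=0):
    """Train a 1-hidden-layer binary classifier with Adam; return test error."""
    f, df = ACTIVATIONS[act]
    rng = np.random.default_rng(seed)
    n, d = X_tr.shape
    W1 = rng.normal(0, 0.5, (d, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    params = [W1, b1, W2, b2]
    m = [np.zeros_like(p) for p in params]
    v = [np.zeros_like(p) for p in params]
    beta1, beta2, eps = 0.9, 0.999, 1e-8
    for t in range(1, epochs + 1):
        a1 = f(X_tr @ W1 + b1)                    # hidden layer
        p = 1 / (1 + np.exp(-(a1 @ W2 + b2)))     # sigmoid output unit
        dz2 = (p - y_tr[:, None]) / n             # grad of mean cross-entropy
        dz1 = (dz2 @ W2.T) * df(a1)
        grads = [X_tr.T @ dz1, dz1.sum(0), a1.T @ dz2, dz2.sum(0)]
        for i, (prm, g) in enumerate(zip(params, grads)):
            # Adam: biased moment estimates, bias correction, then update
            m[i] = beta1 * m[i] + (1 - beta1) * g
            v[i] = beta2 * v[i] + (1 - beta2) * g ** 2
            mhat = m[i] / (1 - beta1 ** t)
            vhat = v[i] / (1 - beta2 ** t)
            prm -= lr * mhat / (np.sqrt(vhat) + eps)
    p_te = 1 / (1 + np.exp(-(f(X_te @ W1 + b1) @ W2 + b2)))
    return float(((p_te[:, 0] > 0.5) != y_te).mean())  # misclassification rate

# Synthetic placeholder data (the real project would load the clinical CSV)
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] ** 2 > 0.5).astype(float)
X_tr, X_te, y_tr, y_te = X[:150], X[150:], y[:150], y[150:]

errors = {name: train_and_eval(name, X_tr, y_tr, X_te, y_te)
          for name in ACTIVATIONS}
print(errors)
```

In a project like this, only the hidden-layer activation varies between runs; the output unit stays sigmoid so the binary cross-entropy loss and its gradient are identical across the three comparisons.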
Updated Nov 19, 2020 - Python
The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks.
How Neural Networks work inside