# sigmoid-activation

Here are 40 public repositories matching this topic...

## xor-neural-network

A neural network implemented with various activation functions (e.g., sigmoid, ReLU, leaky ReLU, softmax) and optimizers (e.g., gradient descent, AdaGrad, RMSProp, Adam). You can also choose among different loss functions, e.g., cross-entropy loss, hinge loss, and mean squared error (MSE).

  • Updated Aug 15, 2022
  • Jupyter Notebook
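
The listing above ships no code here, so the snippet below is a minimal sketch of the topic it describes: a tiny 2-4-1 network trained on XOR with sigmoid activations, MSE loss, and plain full-batch gradient descent. All names, layer sizes, and hyperparameters are illustrative assumptions, not taken from the repository itself.

```python
import numpy as np

def sigmoid(z):
    # Sigmoid activation: maps any real input into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    # Derivative of the sigmoid, needed for backpropagation.
    s = sigmoid(z)
    return s * (1.0 - s)

# XOR dataset: four input pairs and their labels.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Assumed 2-4-1 architecture; weights start as small random values.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))
lr = 0.5  # assumed learning rate for plain gradient descent

for _ in range(10_000):
    # Forward pass through both sigmoid layers.
    z1 = X @ W1 + b1
    a1 = sigmoid(z1)
    z2 = a1 @ W2 + b2
    a2 = sigmoid(z2)

    # Backward pass for MSE loss (constant factors folded into lr).
    d2 = (a2 - y) * sigmoid_grad(z2)
    d1 = (d2 @ W2.T) * sigmoid_grad(z1)

    # Full-batch gradient descent updates.
    W2 -= lr * (a1.T @ d2)
    b2 -= lr * d2.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d1)
    b1 -= lr * d1.sum(axis=0, keepdims=True)

print(np.round(a2, 3))  # should approach [[0], [1], [1], [0]]
```

Swapping sigmoid for ReLU or leaky ReLU, or the update rule for AdaGrad, RMSProp, or Adam, would only change the activation/gradient functions and the weight-update lines; the forward/backward structure stays the same.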
