add activation function to each layer #327

Open

wants to merge 2 commits into master

Commits on Sep 18, 2019

  1. add activation function to each layer

    The network doesn't make much sense without one: otherwise the whole NN collapses into an elaborate linear combination of its inputs. (A sketch of the change follows below.)
    MaverickMeerkat committed Sep 18, 2019
    Full SHA: 539c34f
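A minimal sketch of what this commit amounts to, written in the TF 1.x style of the original neural_network_raw.ipynb. The layer sizes, the weights/biases dictionaries, and the choice of tf.nn.relu are illustrative assumptions, not necessarily the exact diff:

```python
import tensorflow as tf  # TF 1.x, as used in the original notebook

# Illustrative MNIST-style sizes (assumptions chosen to match the notebook's setup)
n_input, n_hidden_1, n_hidden_2, n_classes = 784, 256, 256, 10

weights = {
    'h1': tf.Variable(tf.random_normal([n_input, n_hidden_1])),
    'h2': tf.Variable(tf.random_normal([n_hidden_1, n_hidden_2])),
    'out': tf.Variable(tf.random_normal([n_hidden_2, n_classes])),
}
biases = {
    'b1': tf.Variable(tf.random_normal([n_hidden_1])),
    'b2': tf.Variable(tf.random_normal([n_hidden_2])),
    'out': tf.Variable(tf.random_normal([n_classes])),
}

def neural_net(x):
    # Without a non-linearity, each layer is affine, and a stack of affine maps
    # collapses into a single affine map ("an elaborate linear combination").
    # ReLU is an illustrative choice of activation here.
    layer_1 = tf.nn.relu(tf.add(tf.matmul(x, weights['h1']), biases['b1']))
    layer_2 = tf.nn.relu(tf.add(tf.matmul(layer_1, weights['h2']), biases['b2']))
    # The output layer stays linear: it returns the logits consumed by the loss
    # (see the second commit below).
    out_layer = tf.matmul(layer_2, weights['out']) + biases['out']
    return out_layer
```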

Commits on Oct 7, 2019

  1. Update neural_network_raw.ipynb

    No need for a softmax activation on the output layer: softmax_cross_entropy_with_logits expects the logits, i.e. the values before the softmax is applied. (See the sketch below.)
    MaverickMeerkat committed Oct 7, 2019
    Full SHA: 4758f22
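Continuing the sketch above, this is roughly what the loss computation looks like once the explicit softmax is dropped from the model output. The placeholders, optimizer, and accuracy metric are illustrative assumptions in the spirit of the notebook, not the exact diff:

```python
# Placeholders for inputs and one-hot labels (shapes reuse the sizes above)
X = tf.placeholder(tf.float32, [None, n_input])
Y = tf.placeholder(tf.float32, [None, n_classes])

# The model output is raw logits; no tf.nn.softmax is applied here.
logits = neural_net(X)

# softmax_cross_entropy_with_logits applies the softmax internally, so feeding
# it already-softmaxed values would apply the softmax twice and distort the loss.
loss_op = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=Y))
train_op = tf.train.AdamOptimizer(learning_rate=0.001).minimize(loss_op)

# A softmax is still useful at evaluation time, to turn logits into probabilities.
prediction = tf.nn.softmax(logits)
accuracy = tf.reduce_mean(
    tf.cast(tf.equal(tf.argmax(prediction, 1), tf.argmax(Y, 1)), tf.float32))
```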