
Fold batch norm to weights and biases #779

Merged 7 commits on Mar 6, 2019
Commits on Mar 4, 2019

  1. Quick patch for negative gamma

    Still wrong, but at least it's consistently wrong.
    Ttl committed Mar 4, 2019 (03ca2f5)
  2. Fold BN to weights and biases

    Ttl committed Mar 4, 2019 (6bf928c)
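
The Mar 4 commits above fold the batch-normalization parameters into the preceding convolution's weights and biases, so that at inference time the layer reduces to a convolution plus a per-channel bias. A minimal sketch of the standard folding arithmetic, under assumed parameter names and weight layout (this is not the PR's actual code):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Generic batch-norm folding sketch (illustrative, not the code in this PR).
// Assumes per-output-channel BN parameters gamma, beta, mean, var and a
// convolution with weights laid out as [out_ch][in_ch * k * k], plus an
// optional per-channel bias.
void fold_batchnorm(std::vector<float>& weights,       // out_ch * spatial
                    std::vector<float>& biases,        // out_ch
                    const std::vector<float>& gamma,
                    const std::vector<float>& beta,
                    const std::vector<float>& mean,
                    const std::vector<float>& var,
                    std::size_t out_channels,
                    float epsilon = 1e-5f) {
    const std::size_t spatial = weights.size() / out_channels;
    for (std::size_t c = 0; c < out_channels; c++) {
        // BN(x) = gamma * (x - mean) / sqrt(var + eps) + beta,
        // so scaling the conv weights and shifting the bias reproduces it.
        const float scale = gamma[c] / std::sqrt(var[c] + epsilon);
        for (std::size_t i = 0; i < spatial; i++) {
            weights[c * spatial + i] *= scale;
        }
        biases[c] = (biases[c] - mean[c]) * scale + beta[c];
    }
}
```

Note that `scale` is negative when gamma is negative, which is the case the first commit patches around.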

Commits on Mar 5, 2019

  1. Fix handling of biases

    Ttl committed Mar 5, 2019 (3bef34f)
  2. Fold batch norm in network_legacy.

    No need to do duplicate work in the backends.
    Ttl committed Mar 5, 2019 (2e7c951)
  3. Use biases in OpenCL

    Ttl committed Mar 5, 2019 (804ed2f)
  4. Cleanup and small fixes

    Remove unused functions. Fix blas. Disable tensorflow backend.
    Ttl committed Mar 5, 2019 (de4b1f1)
  5. clang-format

    Ttl committed Mar 5, 2019 (fc302ec)
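
With the fold done once in network_legacy when the weights are loaded, the Mar 5 commits let each backend drop its own batch-norm handling and instead add the folded per-channel bias after the convolution. A minimal sketch of that bias step, with illustrative names (not the PR's OpenCL or BLAS code):

```cpp
#include <cstddef>
#include <vector>

// After folding, the per-layer forward pass is just: convolution output
// plus a per-channel bias (names and layout are assumptions).
void add_folded_bias(std::vector<float>& output,            // out_ch * plane_size
                     const std::vector<float>& folded_bias, // out_ch
                     std::size_t out_channels,
                     std::size_t plane_size) {
    for (std::size_t c = 0; c < out_channels; c++) {
        for (std::size_t i = 0; i < plane_size; i++) {
            output[c * plane_size + i] += folded_bias[c];
        }
    }
}
```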