
A bunch of supporting functions for Golgi #306

merged 28 commits into from Dec 8, 2019

chewxy commented Jul 20, 2019

No description provided.

chewxy added 3 commits Jul 19, 2019
Wrote a program to generate those broadcasted ops
Renamed BroadcastMul to BroadcastHadamardProd. BroadcastMul is coming soon
…to create dense triangular matrices

coveralls commented Jul 20, 2019

Coverage Status

Coverage increased (+0.3%) to 63.372% when pulling e0e3652 on golgisupports into 9ee42cb on master.

chewxy added 25 commits Jul 21, 2019
…e interfaces more consistent. Previously inversef32 gave the wrong ʘUnaryOperatorType
…now the last axis. This allows for SoftMax to be done across ndarrays

Added more examples
This allows for easier lifting of the return value, though its
utility is not yet known.
commit 592126c
Author: Ben Leitner <>
Date:   Sun Nov 17 15:09:08 2019 -0800

    Refactor the max/sum ops to share common code. Have the type/inferShape/Do methods behave in a consistent manner: (#346)

    * Dimensions specified in the "along" parameter are reduced to size 1, but not removed. (Note: this caused TestRepeatOpDoDiff to fail, but this version fixes it. Perhaps we should make preserving the size-1 dimensions an option of the reduction op?)
    * If all dimensions are included, the result will be a scalar.
    * If all but one dimension is included, the result is a vector, regardless of which dimension is left intact.

    Tests verify that the resulting nodes have the expected shape.

    Note: While here, fix a warning on Max's SymDiff where retVal[0] is set when retVal has not been initialized.  I wonder if this is related to #323 where SymDiff for StableSoftMax (which uses Max) was failing with a panic (probably not, as the error message there seems unrelated, but probably a good fix anyway).

    Closes #326

commit 6fd05db
Author: Olivier Wulveryck <>
Date:   Tue Nov 12 09:15:56 2019 +0100

    Examples/readme (#351)

    * chore(readme): add references to the gorgonia website

commit e6bc7dd
Merge: 9ecd7d0 d1d231f
Author: gareth <>
Date:   Sat Nov 9 06:47:29 2019 +1100

    Merge pull request #350 from mattn/fix-gomod

    Fix go.mod

commit d1d231f
Author: Yasuhiro Matsumoto <>
Date:   Fri Nov 8 21:35:58 2019 +0900

    Fix go.mod

commit 9ecd7d0
Author: Olivier Wulveryck <>
Date:   Thu Nov 7 09:59:37 2019 +0100

    Gap operator (#302)

    * feat(wip): scratch space for a Global Average Pooling operator

    * chore: skeleton of the operator

    * feat: Global Average Pool

commit 6cc7466
Author: mattn <>
Date:   Sat Nov 2 03:16:02 2019 +0900

    Improvement of example/iris (#348)

commit 6f8c10a
Author: Olivier Wulveryck <>
Date:   Thu Oct 31 22:10:37 2019 +0100

    Iris example (#347)

    * fix: do not overwrite the channel if it already exists

    * feat: multivariate linear regression

commit b7b4b2c
Author: Olivier Wulveryck <>
Date:   Wed Oct 16 15:34:26 2019 +0200

    Create FUNDING.yml (#342)
@chewxy chewxy merged commit a8bd935 into master Dec 8, 2019
1 check passed
continuous-integration/travis-ci/pr The Travis CI build passed