A bunch of supporting functions for Golgi #306

Merged: 28 commits, Dec 8, 2019

Commits on Jul 19, 2019

  1. Added KeepDims as an additional function to "decorate" another function

    Cleaned up Ones and ones
    chewxy committed Jul 19, 2019
    1ffd427
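
A minimal sketch of how the KeepDims decorator described above might be used. The signature assumed here, KeepDims(a, expandLeft, fn), and the exact reshaping behaviour are inferred from the commit message rather than confirmed API details.

```go
package main

import (
	"fmt"

	G "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := G.NewGraph()
	x := G.NewMatrix(g, tensor.Float64, G.WithShape(3, 4), G.WithName("x"), G.WithInit(G.RangedFrom(0)))

	// Sum along axis 1 ordinarily collapses that axis: (3, 4) -> (3).
	// Wrapping the reduction in KeepDims "decorates" it so the reduced axis
	// is retained as a size-1 axis, (3, 4) -> (3, 1), which is convenient
	// for later broadcasted operations.
	summed, err := G.KeepDims(x, false, func(a *G.Node) (*G.Node, error) {
		return G.Sum(a, 1)
	})
	if err != nil {
		panic(err)
	}
	fmt.Println(summed.Shape()) // expected (3, 1) under the assumed semantics
}
```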

Commits on Jul 20, 2019

  1. Added broadcasted operations to api_gen

    Wrote program to generate those broadcasted ops
    Renamed BroadcastMul to BroadcastHadamardProd. BroadcastMul is coming soon
    chewxy committed Jul 20, 2019
    67db2fb (see the broadcasting sketch after this list)
  2. added an example to show how one may use the broadcasting operations to create dense triangular matrices

    chewxy committed Jul 20, 2019
    561e93d
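
A short, hedged sketch covering both commits above. The pattern arguments ([]byte lists of the axes along which each operand is broadcast) and the idea of building dense triangular matrices from a broadcast comparison of row and column indices are inferred from the commit messages; the exact BroadcastHadamardProd signature is an assumption.

```go
package main

import (
	"fmt"

	G "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := G.NewGraph()

	// A (3, 1) column and a (1, 4) row.
	col := G.NewMatrix(g, tensor.Float64, G.WithShape(3, 1), G.WithName("col"), G.WithInit(G.RangedFrom(1)))
	row := G.NewMatrix(g, tensor.Float64, G.WithShape(1, 4), G.WithName("row"), G.WithInit(G.RangedFrom(1)))

	// Broadcast the column along axis 1 and the row along axis 0, then take the
	// elementwise (Hadamard) product, giving a dense (3, 4) outer-product-style matrix.
	// A dense triangular matrix can be built the same way, by broadcasting a
	// comparison between a column of row indices and a row of column indices.
	outer, err := G.BroadcastHadamardProd(col, row, []byte{1}, []byte{0})
	if err != nil {
		panic(err)
	}

	m := G.NewTapeMachine(g)
	defer m.Close()
	if err := m.RunAll(); err != nil {
		panic(err)
	}
	fmt.Println(outer.Value()) // expected: a 3x4 matrix of pairwise products
}
```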

Commits on Jul 21, 2019

  1. 103dde9

Commits on Jul 22, 2019

  1. Added unaryOp interface to genapi. Generating the interfaces makes the interfaces more consistent. Previously inversef32 gave the wrong ʘUnaryOperatorType

    chewxy committed Jul 22, 2019
    cb4b2f2
  2. Allow axis to be defined in SoftMax. Furthermore, the default axis is now the last axis. This allows for SoftMax to be done across ndarrays

    Added more examples
    chewxy committed Jul 22, 2019
    6b61480
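
A sketch of the axis-aware SoftMax described above: with no axis given it is applied across the last axis, and an explicit axis overrides that. The variadic-axis call form is taken from the commit description and should be treated as an assumption.

```go
package main

import (
	"fmt"

	G "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := G.NewGraph()

	// A 3-dimensional input; per the commit, SoftMax can now be applied to ndarrays.
	x := G.NewTensor(g, tensor.Float64, 3, G.WithShape(2, 3, 4), G.WithName("x"), G.WithInit(G.RangedFrom(0)))

	// Default: softmax across the last axis (axis 2 here).
	smLast, err := G.SoftMax(x)
	if err != nil {
		panic(err)
	}

	// Explicit axis: softmax across axis 1 instead.
	smAxis1, err := G.SoftMax(x, 1)
	if err != nil {
		panic(err)
	}

	m := G.NewTapeMachine(g)
	defer m.Close()
	if err := m.RunAll(); err != nil {
		panic(err)
	}
	fmt.Println(smLast.Value())
	fmt.Println(smAxis1.Value())
}
```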

Commits on Jul 23, 2019

  1. ae26713
  2. Added some things for future

    chewxy committed Jul 23, 2019
    b88acb6
  3. bf86bb3

Commits on Jul 24, 2019

  1. 628211e
  2. 45485a7

Commits on Jul 25, 2019

  1. 71670b7

Commits on Jul 27, 2019

  1. ec7c2f6

Commits on Jul 31, 2019

  1. added some helper functions

    chewxy committed Jul 31, 2019
    962a029

Commits on Aug 26, 2019

  1. Updated Unconcat to use Nodes instead of []*Node

    This allows for easier lifting of the return value; however, its
    utility is not known at the moment. (A hedged usage sketch follows this list.)
    chewxy committed Aug 26, 2019
    1d16dff
  2. 62b6a2a
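
A hedged sketch of consuming the Nodes return value of Unconcat. The argument order shown (input, axis, number of pieces) is an assumption; the point is only that the result is a gorgonia.Nodes value rather than a bare []*Node, so it lifts directly into helpers defined on Nodes.

```go
package main

import (
	"fmt"

	G "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := G.NewGraph()
	x := G.NewMatrix(g, tensor.Float64, G.WithShape(2, 6), G.WithName("x"), G.WithInit(G.RangedFrom(0)))

	// Split the (2, 6) node into 3 pieces along axis 1.
	// The argument order here is an assumption.
	pieces, err := G.Unconcat(x, 1, 3)
	if err != nil {
		panic(err)
	}

	// pieces is a gorgonia.Nodes value, so it can be passed straight to anything
	// that expects Nodes, without converting from []*Node.
	for _, p := range pieces {
		fmt.Println(p.Shape()) // expected (2, 2) each, if the argument order above is right
	}
}
```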

Commits on Aug 30, 2019

  1. 00ce7c0

Commits on Sep 7, 2019

  1. 828e521

Commits on Oct 8, 2019

  1. 5681366

Commits on Oct 16, 2019

  1. 25c62a2

Commits on Oct 20, 2019

  1. Added HeEtAl InitWFn

    chewxy committed Oct 20, 2019
    7362f50 (see the HeEtAl sketch after this list)
  2. 926923d
  3. f1e5c6d
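
A minimal sketch of using the new He et al. initialiser with WithInit. The gain argument mirrors the existing Glorot initialisers; the exact HeEtAl signature is an assumption based on the commit title, not a confirmed API detail.

```go
package main

import (
	"fmt"

	G "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := G.NewGraph()

	// A weight matrix initialised with the He et al. scheme via WithInit.
	// The gain value of 2.0 (typical for ReLU layers) and the HeEtAl(gain)
	// signature are assumptions.
	w := G.NewMatrix(g, tensor.Float64,
		G.WithShape(128, 64),
		G.WithName("w"),
		G.WithInit(G.HeEtAl(2.0)),
	)
	fmt.Println(w.Shape())
}
```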

Commits on Nov 7, 2019

  1. 04c440d

Commits on Nov 18, 2019

  1. Squashed commit of the following:

    commit 592126c
    Author: Ben Leitner <7515022+bdleitner@users.noreply.github.com>
    Date:   Sun Nov 17 15:09:08 2019 -0800
    
        Refactor the max/sum ops to share common code. Have the type/inferShape/Do methods behave in a consistent manner: (#346)
    
        * Dimensions specified in the "along" parameter are reduced to size 1, but not removed. (Note: this caused TestRepeatOpDoDiff to fail, but this version fixes it. Perhaps we should make preserving the size-1 dimensions an option of the reduction op?)
        * If all dimensions are included, the result will be a scalar.
        * If all dimensions but 1 are included, the result is a vector, regardless of which dimension is left intact.
    
        Tests verify that the resulting nodes have the expected shape.
    
        Note: While here, fix a warning on Max's SymDiff where retVal[0] is set when retVal has not been initialized.  I wonder if this is related to #323 where SymDiff for StableSoftMax (which uses Max) was failing with a panic (probably not, as the error message there seems unrelated, but probably a good fix anyway).
    
        Closes #326
    
    commit 6fd05db
    Author: Olivier Wulveryck <owulveryck@users.noreply.github.com>
    Date:   Tue Nov 12 09:15:56 2019 +0100
    
        Examples/readme (#351)
    
        * chore(readme): add references to the gorgonia website
    
    commit e6bc7dd
    Merge: 9ecd7d0 d1d231f
    Author: gareth <31232838+jokebroker@users.noreply.github.com>
    Date:   Sat Nov 9 06:47:29 2019 +1100
    
        Merge pull request #350 from mattn/fix-gomod
    
        Fix go.mod
    
    commit d1d231f
    Author: Yasuhiro Matsumoto <mattn.jp@gmail.com>
    Date:   Fri Nov 8 21:35:58 2019 +0900
    
        Fix go.mod
    
    commit 9ecd7d0
    Author: Olivier Wulveryck <owulveryck@users.noreply.github.com>
    Date:   Thu Nov 7 09:59:37 2019 +0100
    
        Gap operator (#302)
    
        * feat(wip): scratch space for a Global Average Pooling operator
    
        * chore: skeleton of the operator
    
        * feat: Global Average Pool
    
    commit 6cc7466
    Author: mattn <mattn.jp@gmail.com>
    Date:   Sat Nov 2 03:16:02 2019 +0900
    
        Improvement of example/iris (#348)
    
    commit 6f8c10a
    Author: Olivier Wulveryck <owulveryck@users.noreply.github.com>
    Date:   Thu Oct 31 22:10:37 2019 +0100
    
        Iris example (#347)
    
        * fix: do not overwrite the channel if it already exists
    
        * feat: multivariate linear regression
    
    commit b7b4b2c
    Author: Olivier Wulveryck <owulveryck@users.noreply.github.com>
    Date:   Wed Oct 16 15:34:26 2019 +0200
    
        Create FUNDING.yml (#342)
    chewxy committed Nov 18, 2019
    84f497a (see the reduction-shape sketch after this list)
  2. f270287
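
A hedged sketch of the reduction-shape rules described in the squashed commit above, using Sum and Max. The expected shapes in the comments restate the commit's description (reduced axes kept at size 1; all axes but one reduced gives a vector; all axes reduced gives a scalar); actual behaviour should be verified against the library.

```go
package main

import (
	"fmt"

	G "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := G.NewGraph()
	x := G.NewTensor(g, tensor.Float64, 3, G.WithShape(2, 3, 4), G.WithName("x"), G.WithInit(G.RangedFrom(0)))

	// Reduce a single axis: per the description, the reduced axis is kept
	// at size 1, so the expected shape is (2, 1, 4).
	partial, err := G.Sum(x, 1)
	if err != nil {
		panic(err)
	}

	// Reduce all axes but one: the result is a vector, here of shape (3),
	// regardless of which axis is left intact.
	vec, err := G.Max(x, 0, 2)
	if err != nil {
		panic(err)
	}

	// Reduce every axis: the result is a scalar.
	scalar, err := G.Sum(x, 0, 1, 2)
	if err != nil {
		panic(err)
	}

	fmt.Println(partial.Shape(), vec.Shape(), scalar.Shape())
}
```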

Commits on Dec 7, 2019

  1. 64416fe
  2. Fixed Softmax

    chewxy committed Dec 7, 2019
    e0e3652