
Release 0.1.0/sparsify 2019 02 18 #11

Merged · 8 commits · Feb 19, 2019

Commits on Feb 18, 2019

  1. Attempt to test for #4

    PyTorch's elementwise boolean comparisons aren't useful here and make
      it a pain to test exact tensor values.
    * Will resume later
    stephenjfox committed Feb 18, 2019
    76e101d
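The pain point above is that `==` on two tensors returns an elementwise boolean tensor rather than a single truth value, so a bare `assert` on multi-element tensors raises instead of passing. `torch.equal` (exact match) and `torch.allclose` (tolerant match) are the usual workarounds; a minimal sketch:

```python
import torch

a = torch.tensor([1.0, 2.0, 3.0])
b = torch.tensor([1.0, 2.0, 3.0])

# `a == b` yields an elementwise mask, not a bool, which is why
# asserting exact tensor values is painful.
mask = a == b

# torch.equal checks shape and exact values in one call.
exact = torch.equal(a, b)

# torch.allclose tolerates floating-point noise.
close = torch.allclose(a, b + 1e-9)
```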
  2. Skipping sparsify test

    It's a painfully simple function that has worked every time
      I've used it.
    - No, it doesn't handle every edge case
    + Yes, it gets the job done and can be packaged for the general case
    stephenjfox committed Feb 18, 2019
    ed1adf1
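The body of `sparsify` isn't shown in this log; a plausible minimal version consistent with the name (the signature and threshold are assumptions) zeroes out entries whose magnitude falls below a cutoff:

```python
import torch

def sparsify(tensor: torch.Tensor, threshold: float = 1e-3) -> torch.Tensor:
    # Hypothetical sketch: zero out entries with magnitude below `threshold`.
    return torch.where(tensor.abs() < threshold, torch.zeros_like(tensor), tensor)

t = torch.tensor([0.5, 1e-5, -0.2])
s = sparsify(t)  # 1e-5 is below the cutoff, so it becomes 0.0
```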
  3. c2939da
  4. 6bb771f
  5. WIP: Implement shrink() in terms of resize_layers()

    It was as easy as I wanted it to be.
    * The complexity is how to handle a given nested layer
      + Those will get implemented with a given feature
      - Need to program feature detection
    
    TODO:
    + Implement the resizing on a layer-by-layer basis, so the
      shrinking can differ per layer
      + Instead of applying the data transformation uniformly,
        each layer gets its own factor
      + Those factors will be computed as 1 - percent_waste(layer)
    stephenjfox committed Feb 18, 2019
    b4c27d1
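The TODO's per-layer factor, 1 - percent_waste(layer), can be sketched as follows; `percent_waste` isn't defined in this log, so reading it as "fraction of near-zero weights" is an assumption:

```python
import torch
import torch.nn as nn

def percent_waste(layer: nn.Linear, threshold: float = 1e-3) -> float:
    # Assumed meaning: fraction of weights that are effectively zero.
    w = layer.weight.detach()
    return (w.abs() < threshold).float().mean().item()

def shrink_factor(layer: nn.Linear) -> float:
    # Each layer gets its own factor, 1 - percent_waste(layer),
    # instead of one uniform transformation across the whole network.
    return 1.0 - percent_waste(layer)

dead = nn.Linear(4, 4)
with torch.no_grad():
    dead.weight.zero_()  # a fully wasted layer shrinks to factor 0.0
```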
  6. ace72af

Commits on Feb 19, 2019

  1. shrink_layer() is simple

    stephenjfox committed Feb 19, 2019
    7fe00cd
  2. Justification for giving Shrinkage an 'input_dimensions' property:

    > The thought is that channel depth doesn't change the output dimensions for CNNs, and that's
      the attribute we're concerned with in the convolutional case...
      * Linear layers only have two dimensions, so it's a huge deal there.
      * RNNs do linear things over 'timesteps', so it's a big deal there.
      * Residual/identity/skip-connections in CNNs need this.
    
    > __It's decided__. The attribute stays
    stephenjfox committed Feb 19, 2019
    55b8043
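The CNN claim above follows from the standard convolution output-size formula, out = (in + 2*padding - kernel) // stride + 1, which contains no channel term; a quick check:

```python
def conv_out_size(in_size: int, kernel: int, stride: int = 1, padding: int = 0) -> int:
    # Standard conv arithmetic: channel depth never appears, so pruning
    # channels leaves the spatial output dimensions intact.
    return (in_size + 2 * padding - kernel) // stride + 1

# A 32x32 input with a 3x3 kernel gives 30x30 output whether the
# layer has 16 or 64 channels.
print(conv_out_size(32, 3))  # 30
```

For a Linear layer, by contrast, shrinking its feature counts directly changes the shapes downstream layers must accept, which is why tracking 'input_dimensions' matters there.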