Release 0.1.0/sparsify 2019 02 18 #11
Merged: stephenjfox merged 8 commits into release/v0.1.0 from release-0.1.0/sparsify_2019-02-18, Feb 19, 2019
Commits on Feb 18, 2019
- PyTorch's boolean comparison behavior isn't useful and makes it a pain to test exact tensor values. Will resume later. (commit 76e101d)
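A minimal illustration (assuming PyTorch is installed) of the annoyance this commit describes: `==` on tensors returns an element-wise boolean tensor rather than a single bool, so plain assertions on exact values don't work directly; `torch.equal` is the whole-tensor comparison.

```python
import torch

a = torch.tensor([1.0, 2.0, 3.0])
b = torch.tensor([1.0, 2.0, 3.0])

# '==' gives an element-wise boolean tensor, not a bool; using it in an
# 'if' or a bare 'assert' on multi-element tensors raises an error.
elementwise = (a == b)

# torch.equal returns a single Python bool for exact, whole-tensor equality.
exact_match = torch.equal(a, b)
print(elementwise, exact_match)
```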
- It's a painfully simple function that has worked every time I've used it. No, it doesn't handle every edge case; yes, it gets the job done and can be packaged for the general case. (commit ed1adf1)
- (commit c2939da)
- (commit 6bb771f)
- WIP: Implement shrink() in terms of resize_layers()
  It was as easy as I wanted it to be.
  * The complexity is how to handle a given nested layer
    + Those will get implemented with a given feature
    - Need to program feature detection
  TODO:
  + Implement the resizing on a layer-by-layer case, to make the shrinking a bit different
    + Instead of applying the data transformation uniformly, each layer gets its own factor
    + Those factors will be computed as 1 - percent_waste(layer)
  (commit b4c27d1)
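A sketch of the per-layer idea in the TODO above, assuming the simplest reading: each layer is resized by its own factor of `1 - percent_waste(layer)` instead of one uniform factor. The `percent_waste` and `resize_layers` bodies here are stand-ins (layers modeled as plain lists of weights), not the repository's actual implementations.

```python
def percent_waste(layer, eps=1e-6):
    """Stand-in: fraction of a layer's weights that are effectively zero."""
    if not layer:
        return 0.0
    return sum(1 for w in layer if abs(w) < eps) / len(layer)

def resize_layers(layers, factors):
    """Stand-in: keep round(len(layer) * factor) largest-magnitude weights per layer."""
    resized = []
    for layer, factor in zip(layers, factors):
        keep = max(1, round(len(layer) * factor))
        resized.append(sorted(layer, key=abs, reverse=True)[:keep])
    return resized

def shrink(layers):
    """shrink() in terms of resize_layers(), with per-layer factors."""
    factors = [1 - percent_waste(layer) for layer in layers]
    return resize_layers(layers, factors)

# A layer that is half zeros shrinks by half; a dense layer is untouched.
print(shrink([[0.9, 0.0, 0.4, 0.0], [0.7, 0.2, 0.1]]))
# → [[0.9, 0.4], [0.7, 0.2, 0.1]]
```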
- (commit ace72af)
Commits on Feb 19, 2019
- (commit 7fe00cd)
- Justification for giving Shrinkage an 'input_dimensions' property:
  > The thought is that channel depth doesn't change the output dimensions for CNNs, and that's the attribute we're concerned with in the convolutional case...
  * Linear layers only have two dimensions, so it's a huge deal there.
  * RNNs do linear things over 'timesteps', so it's a big deal there.
  * Residual/identity/skip-connections in CNNs need this.
  > __It's decided__. The attribute stays.
  (commit 55b8043)
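One way the decision above could look in code, as a purely hypothetical sketch (the field names and `scaled` helper are illustrative, not the repository's actual Shrinkage API): keeping `input_dimensions` as its own attribute lets input-sensitive layers (Linear, RNN, skip-connections) be rescaled without disturbing their output dimensions.

```python
from dataclasses import dataclass

@dataclass
class Shrinkage:
    layer_kind: str          # e.g. 'conv', 'linear', 'rnn' (illustrative)
    input_dimensions: tuple  # dimensions feeding into the layer
    output_dimensions: tuple # dimensions the layer produces

    def scaled(self, factor):
        """Scale only the input side; the output dimensions stay fixed,
        which is the reason the attribute is kept separate."""
        new_inputs = tuple(max(1, round(d * factor)) for d in self.input_dimensions)
        return Shrinkage(self.layer_kind, new_inputs, self.output_dimensions)

linear = Shrinkage('linear', (512,), (10,))
print(linear.scaled(0.5))
# → Shrinkage(layer_kind='linear', input_dimensions=(256,), output_dimensions=(10,))
```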