Remove pooling #114
Conversation
This wasn't working before because we were always flattening the idx, which implicitly assumes there's only one channel and one batch.
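To illustrate the class of bug (a hypothetical minimal example, not the actual plenoptic code): flattening across all dimensions only yields a meaningful index when there is exactly one batch and one channel, whereas flattening only the spatial dimensions gives a valid index per batch/channel:

```python
import torch

# (batch, channel, height, width) tensor with 2 batches
x = torch.arange(18.).reshape(2, 1, 3, 3)

# Flattening everything gives one global index, silently assuming
# batch == channel == 1:
global_idx = x.flatten().argmax()  # index into all 18 elements

# Flattening only the spatial dims gives one index per batch/channel:
per_image_idx = x.flatten(start_dim=2).argmax(dim=-1)

print(global_idx.item())       # 17
print(per_image_idx.tolist())  # [[8], [8]]
```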
With this, the `PooledVentralStream` models are all gone. In addition to removing the actual code, this required going through the tests and replacing those models with either the `Linear_Nonlinear` or `Steerable_Pyramid_Freq` models (they're still in the notebooks). Also:
- caught a bug where `test_synthesis_all_plot` wasn't running `plot_synthesis_status` every time
- removed a couple other tests of `animate` and plotting from `test_metamers.py`, since those are now covered in `test_display.py`
Since my `PooledVentralStream` models were the only ones that used them.
Removes support for two synthesis options: `fraction_removed` and `loss_change_fraction`. Together, they supported the ability to compute the gradient with respect to only a subset of the representation when the loss looked like it had stopped changing. This was not being used by anyone and was getting more complicated to support (my initial implementation assumed only one batch and channel in the representation and failed if that wasn't true). `loss_change_thresh` and `loss_change_iter` remain, because they're used for coarse-to-fine optimization.
`save` and `load` no longer handle models. Instead, `load` is now an instance method (not a class method), so the object must be initialized first; we then call `load`, modifying the object in place. We check whether some attributes (those set on initialization) are equal and raise an exception if not. Also:
- `loss_1/2_norm` should be set in `MADCompetition`'s initialization
- removed some unnecessary calls to `plot` and `animate` in `test_mad`
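A minimal sketch of this load-in-place pattern (hypothetical class and attribute names; the real synthesis-object signatures may differ):

```python
import tempfile
import torch

class Synthesis:
    """Hypothetical sketch of the initialize-then-load-in-place pattern."""
    def __init__(self, model_name):
        self.model_name = model_name  # set at initialization, checked on load
        self.synthesized = None       # filled in by synthesis

    def save(self, path):
        # note: the model itself is no longer saved
        torch.save(self.__dict__, path)

    def load(self, path, **pickle_load_args):
        saved = torch.load(path, **pickle_load_args)
        # attributes set at initialization must match the saved values
        if saved['model_name'] != self.model_name:
            raise ValueError("Saved model_name doesn't match initialized value!")
        self.__dict__.update(saved)  # modify the object in place

old = Synthesis('LNL')
old.synthesized = torch.zeros(3)
path = tempfile.mkstemp(suffix='.pt')[1]
old.save(path)

new = Synthesis('LNL')  # must initialize first...
new.load(path)          # ...then load in place
```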
Removes SWA support and testing, which also removes the `torchcontrib` requirement.
Also allows `width_ratios` to be set as arguments to the top-level `plot_synthesis_status` and `animate` functions (which made using the LNL model in the Metamer notebook much easier).
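Exposing `width_ratios` at the top level presumably just forwards it to matplotlib's gridspec machinery; a sketch of that underlying mechanism (panel names here are illustrative, not plenoptic's):

```python
import matplotlib
matplotlib.use('Agg')  # headless backend, for scripting
import matplotlib.pyplot as plt

# width_ratios is a GridSpec property: here it makes the image panel
# twice as wide as the loss panel.
fig, (ax_img, ax_loss) = plt.subplots(
    1, 2, figsize=(9, 3), gridspec_kw={'width_ratios': [2, 1]})
ax_img.set_title('synthesized image')
ax_loss.set_title('loss')
```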
Codecov Report

@@            Coverage Diff            @@
##           master    #114      +/-  ##
=========================================
+ Coverage   76.55%  77.28%   +0.73%
=========================================
  Files          36      33       -3
  Lines        3463    2628     -835
  Branches      741     527     -214
=========================================
- Hits         2651    2031     -620
+ Misses        620     460     -160
+ Partials      192     137      -55
=========================================
lyndond (2021-03-17): This is a nicer example than before. Maybe add a line about interpreting the synthesized image, saying how the model is invariant to structured high-frequency noise.
billbrod (2021-03-19): Agreed.
lyndond (2021-03-17): Should the titles say something like "difference from target"?
billbrod (2021-03-19): Yeah, almost certainly.
billbrod (2021-03-19): Actually, not trivial to make that happen. For now, I'm going to leave this, because I think we can take care of it when we overhaul how the display code works.
lyndond (2021-03-17): This section onward should probably get its own notebook.
lyndond (2021-03-17): This is what I meant by having weird duplicated progress bars using `tqdm.tqdm`.
billbrod (2021-03-19): Agreed. I'll go ahead and switch to `tqdm.auto`.
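The switch discussed above is a one-line import change; a minimal sketch:

```python
# tqdm.auto picks the right progress bar for the environment: the
# widget-based bar in Jupyter notebooks, the plain-text bar in a
# terminal. This avoids the duplicated bars that plain tqdm.tqdm
# can leave behind in notebooks.
from tqdm.auto import tqdm

iterates = []
for i in tqdm(range(5), desc='synthesis'):
    iterates.append(i)  # stand-in for one synthesis iteration
```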
lyndond (2021-03-17): I feel like Advanced again should be its own notebook.
kbonnen (2021-03-19): I'm in agreement. But I'm wondering if we can declare that change outside the scope of this pull request?
billbrod (2021-03-19): Yeah, I think these should end up somewhere else (maybe not even a notebook, since they're all text), but I think they're outside the scope of this PR.
Notebook comments: mainly that they're long and we should consider splitting advanced usage into its own notebook. Don't have to do that in this PR though.
General comments: mainly curious whether users really need this much flexibility in which attributes to save/load, now that we've stripped models out?
`rotate_image`, `crop_image_centered`, `flip_image` -- all of these have native PyTorch equivalents and are unused.
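For reference, a sketch of the native replacements in pure torch (`torchvision.transforms.functional` also provides `rotate`, `center_crop`, and `hflip`); the helper names being replaced are the ones listed above:

```python
import torch

img = torch.arange(16.).reshape(1, 1, 4, 4)  # (batch, channel, h, w)

# flip_image -> torch.flip (dims=[-1] is horizontal, [-2] vertical)
flipped = torch.flip(img, dims=[-1])

# rotate_image (90-degree multiples) -> torch.rot90 on the spatial dims
rotated = torch.rot90(img, k=1, dims=[-2, -1])

# crop_image_centered -> plain centered slicing (torchvision's
# center_crop does the same, plus padding when the crop is larger)
h, w = img.shape[-2:]
ch, cw = 2, 2
top, left = (h - ch) // 2, (w - cw) // 2
cropped = img[..., top:top + ch, left:left + cw]

print(flipped[0, 0, 0].tolist())  # [3.0, 2.0, 1.0, 0.0]
print(cropped[0, 0].tolist())     # [[5.0, 6.0], [9.0, 10.0]]
```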
g2g
I asked for 1-2 localized changes and made a couple of comments (some are in notebooks). But I'm going to go ahead and submit this review as approved so that you're not blocked because of me.
I found two places where the arguments seem not to be used by the functions.
…ic into remove_pooling
- `objective_function_kwargs` no longer used, removed
- `pickle_load_args` needs to be passed to `torch.load`
Shouldn't be an arg; needs to be passed to `plot_synthesis_status`.
00_simple_example:
- created model in notebook to show all that's necessary
- added some explanatory text about how to interpret the outcomes of synthesis

others: re-ran to get use of `tqdm.auto`
lyndond (2021-03-19): Is the goal of this notebook to showcase each of the 4 synthesis methods really quickly? I would put section headers and such between different methods. Otherwise it looks fine to me for now.
billbrod (2021-03-22): I'm not really sure -- my intention was to just show a simple model that does nothing complicated, and how it can be dropped into multiple synthesis methods. I think we should probably leave this where it is in this PR and spend some more time on the notebook later?
lyndond (2021-03-19): Would we be opposed to using `nn.functional.pad()` for the input image in the `forward()` method? Or would that complicate things too much?
billbrod (2021-03-22): I think that's probably fine, but I like that we can see the impact of not using a pad -- the point isn't that this is a reasonable model for doing anything, but that we can see why the synthesized image looks the way it does, and I think having extra differences is helpful for that.
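For reference, a hedged sketch of the padding approach being discussed (a hypothetical model, not the notebook's): padding in `forward()` keeps the conv output at the input's spatial size instead of letting the convolution shrink it.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvModel(nn.Module):
    """Hypothetical sketch: pad the input in forward() so the output
    keeps the input's spatial size."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(1, 1, kernel_size, bias=False)
        self.pad = kernel_size // 2

    def forward(self, x):
        # 'reflect' avoids the dark border a zero pad would introduce
        x = F.pad(x, (self.pad,) * 4, mode='reflect')
        return self.conv(x)

m = ConvModel()
out = m(torch.rand(1, 1, 32, 32))
print(out.shape)  # torch.Size([1, 1, 32, 32])
```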
shouldn't be there
This pull request removes large swathes of code that I was using for my foveated metamer project and that we don't want to support generally:

- `pooling.py` and `pooling_windows.py`, the reimplementation and extension of the windows used in Jeremy's 2011 paper, have been moved to a new public LCV repo, with tests and a readme, though no tutorial (and not on pip)
- `ventral_stream.py`, the `PooledVentralStream` models I built, has been moved to my (currently-private) research repo for the project
- `fraction_removed`, `loss_change_fraction`, and support for SWA (stochastic weight averaging, and thus `torchcontrib`). All of these made the synthesis optimization code more complicated, for little benefit.
- (in `test_metamers.py` and `test_mad.py`)
- `plot_representation` methods and coarse-to-fine optimization (in particular, I edited out blocks of code, which still refer to `PooledV1`)