
Abstractions #38

Merged: 117 commits, merged Aug 13, 2020

Commits on Jun 20, 2019

  1. starts mad_competition

    billbrod committed Jun 20, 2019 · 670683f

Commits on Jul 29, 2019

  1. f5cb748

Commits on Dec 6, 2019

  1. adds load_images function

    helper function for loading in images and making sure they're the
    right shape
    
    with tests
    billbrod committed Dec 6, 2019 · c25bf1c
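
The commit above describes a helper for loading images into the right shape. A minimal sketch of what such a helper might look like, assuming scikit-image for I/O and a (batch, channel, height, width) target shape (the actual plenoptic signature may differ):

```python
import torch
from skimage import io, color, img_as_float32

def load_images(paths, as_gray=True):
    """Load images (all of the same size) into a 4d float32 tensor with
    shape (batch, channel, height, width)."""
    images = []
    for p in paths:
        im = img_as_float32(io.imread(p))        # float32 in [0, 1]
        if as_gray and im.ndim == 3:
            im = color.rgb2gray(im).astype('float32')
        images.append(torch.as_tensor(im))
    images = torch.stack(images)                 # (batch, height, width) when grayscale
    if images.ndim == 3:
        images = images.unsqueeze(1)             # add the channel dimension
    return images
```
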
  2. 4953e10

Commits on Dec 13, 2019

  1. e899974
  2. starts abstraction for synthesis methods

    moves a bunch of code from Metamer to Synthesis. currently there's not
    much change to those methods, but there will be as we attempt to use the
    abstraction for geodesics, eigendistortions, and MAD
    billbrod committed Dec 13, 2019 · 236dbed
  3. adds scikit-image dependency

    billbrod committed Dec 13, 2019 · 5145d18

Commits on Jan 10, 2020

  1. 282c5a4
  2. d35d316
  3. 2e3741b

Commits on Jan 24, 2020

  1. Adds initial MAD competition implementation

    with this commit, we add an initial implementation of MAD
    competition. it currently only minimizes model 1 while holding model 2
    constant, and only works on models (not metrics), but the basic
    framework is there. a fair amount of cleanup is required as well.
    
    MAD competition (like Metamer) inherits from the Synthesis class and so
    makes use of much of what is there. more consolidation of functionality
    is needed here, as there was a lot of copy-and-paste
    
     - Synthesis now uses the self.names dictionary to determine what to
       target / match etc. this doesn't change how Metamer works (or
       honestly, probably any of the other synthesis methods) but is
       necessary for MAD
    
     - adds a base gradient descent optimizer choice, GD
    
     - scheduler is now optional
    
     - optimizer_step's pbar arg is now optional. if None, the progress bar
       is not updated. additional information can also be added to the pbar
       as kwargs
    billbrod committed Jan 24, 2020 · 3daa103
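
The Synthesis abstraction and optional pbar argument described in the commit above can be sketched roughly as follows. Attribute and method names here are illustrative, not the library's actual API; pbar is assumed to be something like a tqdm bar with a set_postfix method:

```python
import torch

class SynthesisSketch:
    """Rough sketch: the base class looks up which attributes to compare via a
    names dict, so a subclass (Metamer, MADCompetition, ...) can redirect it."""

    def __init__(self, base_signal, model):
        self.model = model
        self.base_signal = base_signal
        self.synthesized_signal = torch.rand_like(base_signal).requires_grad_()
        # subclasses can remap these entries to point at other attributes
        self.names = {'ref': 'base_signal', 'synth': 'synthesized_signal'}
        # basic gradient descent ('GD') optimizer choice
        self.optimizer = torch.optim.SGD([self.synthesized_signal], lr=.01)

    def objective_function(self, synth_rep, ref_rep):
        return torch.norm(synth_rep - ref_rep)

    def optimizer_step(self, pbar=None, **pbar_kwargs):
        self.optimizer.zero_grad()
        ref_rep = self.model(getattr(self, self.names['ref']))
        synth_rep = self.model(getattr(self, self.names['synth']))
        loss = self.objective_function(synth_rep, ref_rep)
        loss.backward()
        self.optimizer.step()
        if pbar is not None:  # pbar is optional; extra info is passed through as kwargs
            pbar.set_postfix(loss=loss.item(), **pbar_kwargs)
        return loss
```
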

Commits on Jan 27, 2020

  1. 53cfc1d

Commits on Jan 31, 2020

  1. 997ec90
  2. 541366b
  3. bugfix: remove the third arg to objective_function

    originally, I was planning on adding an arg to objective_function to
    turn normalizing on/off, but now we always do it if possible
    billbrod committed Jan 31, 2020 · 33f85d8
  4. 5f04dcc

Commits on Feb 7, 2020

  1. all targets supported

    with this, all targets are now supported, which requires an overhaul
    of how we handle getting and setting the relevant attributes.
    
    Synthesis goes back to knowing nothing about the attribute structure
    of its child classes, assuming it can just directly grab
    self.matched_representation (instead of going through self.names). it
    is the responsibility of the child class to make sure this is the
    correct attribute to modify, which we do here by modifying the getter
    and setter (and continuing to use a self._names attribute)
    
     - adds way more documentation to mad competition
    
     - moves representation_error to Synthesis
    
     - mad competition has own representation_error method, to handle the
       multiple synthesis targets / models
    
     - normalizing the loss is now an option, not always done
    
     - many MADCompetition attributes now have "all" variants, which store
       across targets
    billbrod committed Feb 7, 2020 · e9b0bc9
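
The getter/setter arrangement described in the commit above (the parent only ever touches self.matched_representation; the child routes it to per-target storage) might look roughly like this; the target names are illustrative, not taken from the code:

```python
class MADRoutingSketch:
    """Sketch: route matched_representation to whichever synthesis target is
    currently active, so the parent class never needs to know about targets."""

    def __init__(self):
        self.synthesis_target = 'model_1_min'
        # one slot per synthesis target (names here are made up for illustration)
        self._matched_representation_all = {'model_1_min': None, 'model_1_max': None,
                                            'model_2_min': None, 'model_2_max': None}

    @property
    def matched_representation(self):
        return self._matched_representation_all[self.synthesis_target]

    @matched_representation.setter
    def matched_representation(self, value):
        self._matched_representation_all[self.synthesis_target] = value
```
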
  2. moves normalized_mse

    it's now in Synthesis, mad_competition has its own version (like
    representation_error)
    billbrod committed Feb 7, 2020 · e74a045

Commits on Feb 14, 2020

  1. moves plotting and animate to Synthesis

    this moves the following functions from the Metamer class to
    Synthesis, so it can be used by other synthesis methods:
    plot_representation_error, plot_synthesis_status (was
    plot_metamer_status), animate
    
    other changes, to MADCompetition:
    
     - corrects name from saved_representation_gradient_{num} to
       saved_representation_{num}_gradient
    
     - now correctly handles the _all attributes, copying information back
       and forth between them and the local versions via helper functions
    
     - nu, learning_rate, and gradient are now lists, with _all versions
       that are dicts (like the other attributes)
    
     - adds _check_state() helper function which updates the synthesis
       target if necessary and returns the current state (used by the
       various plotting functions)
    billbrod committed Feb 14, 2020 · 9930280

Commits on Feb 21, 2020

  1. plotting and animating now work with mad!

    a variety of updates were necessary to make this happen:
    
     - mad competition has wrappers around many of the functions, which
       just make sure that the synthesis target is appropriately set. we
       make sure these have the args in the same order as the Synthesis
       class's version (with the new args tucked on the end)
    
     - mad_competition.representation_error's model arg can also be
       'both', which returns a dictionary containing both model's
       representation errors
    
     - correctly handle the copying attributes in *and out* of _all for
       tensors and lists
    
     - adds methods for MADCompetition.plot_synthesized_image,
       plot_synthesized_image_all (shows all synthesized images),
       plot_loss, plot_loss_all, plot_synthesis_status, and animate
    
     - in Synthesis, split out plot_synthesized_image and plot_loss from
       plot_synthesis_status
    
     - Synthesis.animate now takes two more args: plot_data_attr, a list
       which specifies the attributes to plot on the second subplot (so it
       can be more than one; thus we grab all artists), and rep_error_kwargs,
       which specifies additional args to pass to the
       self.representation_error() call
    
     - display.rescale_ylim works with an array-like or a dict. if a
       dictionary, we take the max over all of its values
    billbrod committed Feb 21, 2020 · c268f27
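
The dict handling for display.rescale_ylim mentioned in the last bullet above amounts to something like this sketch (the real function's details may differ):

```python
import numpy as np

def rescale_ylim(axes, data):
    """Set symmetric y-limits from the largest absolute value in data; if data
    is a dict, take the max across all of its values."""
    if isinstance(data, dict):
        y_max = max(np.abs(np.asarray(v)).max() for v in data.values())
    else:
        y_max = np.abs(np.asarray(data)).max()
    for ax in axes:
        ax.set_ylim((-y_max, y_max))
```
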

Commits on Feb 28, 2020

  1. adds norm_loss option

    for mad competition, you can set whether to normalize the two models'
    losses or leave them as they are. unsure which is best right now
    billbrod committed Feb 28, 2020 · 8a513fd
  2. display.plot_representation now uses vrange='indep0' for images

    because it is likely to have positive and negative values
    billbrod committed Feb 28, 2020 · 9c4f13c
  3. fixes linewrap in mssim

    billbrod committed Feb 28, 2020 · 27595ab
  4. mad competition now supports metrics as well as models

    the MAD competition paper, as originally written, compared SSIM and
    MSE, which are metrics, not models (they take two images and return a
    scalar distance between them). as our code was written, it only worked
    for models (which take an image and return a representation, and we
    use the L2-norm of the difference between representations for the
    distance between images).
    
    with this commit, we can now use metrics for MAD competition. to do that,
    we construct a 'dummy model' that just returns a copy of the image, and
    modify the function called for our objective function
    
    other changes
    
     - adds Identity() class in simulate/models/naive.py
    
     - changes the MSE class to an mse() function, since metrics should be
       functions now
    
     - we no longer do the whole randomizing of pixels before computing
       the loss in Synthesis._closure(). this really messes up those
       metrics that extract something meaningful from the image (i.e., all
       the ones we're interested in). so this is turned off by default,
       and only turned on when needed. right now, this is just when
       metamer has fraction_removed > 1 or loss_change_fraction <
       1. should try to generalize that
    
     - updates docstrings
    
     - adds loss_function attr, which is how we switch things around for
       the two models. objective_function calls this on x and y. (still
       use objective_function because it handles the normalizing)
    
     - adds _get_model_name, which first tries to grab the model's name
       attribute and, if it doesn't have one, then returns the class
       name. useful for the Identity class
    
     - raise a warning whenever the representation error is computed with
       a metric, since it will be meaningless
    billbrod committed Feb 28, 2020 · d28df98
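
The dummy-model trick described in the commit above: a metric takes two images and returns a scalar, so wrapping the images in an Identity "model" lets the model-based machinery treat the images themselves as the representations. A simplified illustration (the actual plenoptic wiring is more involved):

```python
import torch

class Identity(torch.nn.Module):
    """Dummy model: its 'representation' is just a copy of the input image."""
    def forward(self, img):
        return img.clone()

def mse(img1, img2):
    """A metric: two images in, one scalar distance out."""
    return torch.pow(img1 - img2, 2).mean()

model = Identity()            # stands in for a real model
loss_function = mse           # the metric becomes the loss on the 'representations'
img_a, img_b = torch.rand(1, 1, 32, 32), torch.rand(1, 1, 32, 32)
loss = loss_function(model(img_a), model(img_b))
```
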
  5. 18b282e
  6. bugfixes

     - renames MSE to mse in import statement
    
     - replaces plot_metamer_status with plot_synthesis_status
    billbrod committed Feb 28, 2020 · d48ec6b

Commits on Mar 13, 2020

  1. moves shared initialization code to Synthesis.py

    realized there was more stuff that can be done by the superclass
    billbrod committed Mar 13, 2020 · eee23aa
  2. moves metric support to Synthesis.py

    two big changes, for Synthesis, MADCompetition, Metamer:
    
     - metric support added. this just moves the bits that handle this
       from mad_competition.py to Synthesis.py
    
     - loss_function can be set to a custom function; by default it is the
       L2-norm of the difference
    
     - updates everyone's documentation and call signatures
    billbrod committed Mar 13, 2020 · 35bf51c
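
The default loss mentioned in the commit above is just the L2-norm of the representation difference, roughly (argument names are illustrative):

```python
import torch

def default_loss(synth_rep, ref_rep):
    """Default loss: L2-norm of the difference between representations."""
    return torch.norm(synth_rep - ref_rep, p=2)
```
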

Commits on Mar 20, 2020

  1. breaks synthesize into functions and moves some to Synthesis

    this commit breaks the components of synthesize() into functions,
    moves them to the abstract Synthesis class where possible, and thus
    makes things more readable
    
    as part of doing this, realized why matched_representation was
    sometimes a Parameter after saving: because I was making it one when a
    NaN was hit during loss. removed that and everything seems to still
    work
    
    still need to:
    
     - figure out how I want to handle coarse-to-fine and the randomizing
       stuff
    
     - standardize synthesize() call signature
    billbrod committed Mar 20, 2020 · 0a2f19e
  2. 3f2294f
  3. bugfix: typo

    billbrod committed Mar 20, 2020 · 3def646

Commits on Mar 27, 2020

  1. bugfix: mad_competition was messing up saved_representation

    accidentally had hard-coded (in _init_store_progress) that model_2 was
    the stable model, by setting self.saved_representation =
    list(self.saved_representation_2). fixes that, correctly sets
    synthesis_target now, and adds a warning to the docstring
    billbrod committed Mar 27, 2020 · 3abdc15

Commits on Apr 3, 2020

  1. adds coarse-to-fine to Synthesis, standardizes synthesize() call

    several related changes, all aimed at finishing up the
    standardization of synthesize():
    
     - adds _init_ctf_and_randomizer_() to Synthesis class (moving over
       the relevant part from the beginning of Metamer.synthesis()). this
       function sets up the coarse-to-fine optimization and possible
       randomizations (using some subset of representation when
       calculating gradients)
    
     - Synthesis._clamp_and_store() now returns a bool specifying if we
       stored or not this iteration.
    
     - Synthesis has a new function, _check_for_stabilization(), which checks
       whether loss has stabilized or not (and does so appropriately for
       coarse-to-fine optimization)
    
     - Synthesis.synthesize() now has args and documentation. it still
       raises an exception if you call it, but the idea is this provides a
       framework you can copy and you should keep the arguments in the
       order they are, putting new args in the beginning
    
     - MAD competition now works with coarse to fine. coarse_to_fine is
       one of those attributes that has values for each model. it is
       always false for the stable model, though it may be true for the
       target one. MAD competition has its own _init_ctf_and_randomizer
       which calls the super and sets the stable model's coarse_to_fine to
       False
    
     - bugfix: move MAD competition's first update_target within the
       synthesis loop
    
     - bugfix: MAD competition correctly updates other
       matched_representation when _check_nan_loss is True
    
     - standardizes synthesize() call signature for both
    billbrod committed Apr 3, 2020 · 525e020
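
The stabilization check mentioned in the commit above boils down to asking whether the loss has stopped changing; a minimal sketch, with the threshold and window parameters assumed rather than taken from the code:

```python
def check_for_stabilization(loss_history, loss_thresh=1e-4, loss_change_iter=50):
    """Return True if the loss has changed by less than loss_thresh over the
    last loss_change_iter iterations."""
    if len(loss_history) < loss_change_iter:
        return False
    return abs(loss_history[-loss_change_iter] - loss_history[-1]) < loss_thresh
```
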
  2. fixes synthesize_all

    hadn't corrected it since the last update
    billbrod committed Apr 3, 2020 · 4196c37
  3. changes device and dtype to DEVICE and DTYPE

    to make it clearer that they're global variables
    billbrod committed Apr 3, 2020 · 0e760d5
  4. send metric dummy-model to target_image device

    this should enable it to handle GPUs successfully
    billbrod committed Apr 3, 2020 · 5ad962a
  5. scope doesn't need to be modules for this

    actually the test files are not needed for test_mad
    billbrod committed Apr 3, 2020 · 241f743
  6. corrects load so it will work with MADCompetition

    this requires allowing for the possibility of multiple models
    billbrod committed Apr 3, 2020 · e4b4af4
  7. 70e7ac1
  8. adds tests file

    includes file that tests all the important things (I think)
    billbrod committed Apr 3, 2020 · 5f9b4ee

Commits on Apr 10, 2020

  1. fixes broken tests

    load call signature had changed
    billbrod committed Apr 10, 2020 · cf6c340
  2. d8c97a7
  3. adds new capabilities to mad_competition, updates docstrings, tests

    several changes here, all related to the merging of master into
    abstractions in the last commit
    
     - updates docstrings of Synthesis and Metamer to more accurately
       reflect current capabilities
    
     - changes Synthesis.synthesize() call signature and some of function
       for new abilities
    
     - MADCompetition now handles the new capabilities/changes correctly:
       better ability to resume synthesis, new synthesize() args
       clamp_each_iter and clip_grad_norm, default clamper now
       RangeClamper (instead of None)
    
     - adds tests for resuming synthesis (and the errors it should raise)
       for MADCompetition
    billbrod committed Apr 10, 2020 · 27264e8
  4. test_resume_exceptions bugfix

    gets it working
    billbrod committed Apr 10, 2020 · 510750c
  5. more bugfixes for test_mad

    resume can only work if store_progress was not False
    billbrod committed Apr 10, 2020 · ada09b7
  6. bugfix for synthesize_all

    billbrod committed Apr 10, 2020 · d577805
  7. 1a82a13
  8. adds metric/classes/NLP

    so I don't have to keep creating it
    billbrod committed Apr 10, 2020 · 28f82a9
  9. adds if_existing arg to MADCompetition.synthesize_all()

    this new arg controls what to do if one of the synthesis targets has
    been run before, with three choices
    
    and adds tests
    billbrod committed Apr 10, 2020 · b88c48a
  10. 9be2950

Commits on Apr 17, 2020

  1. some quality of life changes

     - seed can now be None, in which case we don't set it. intended use
       case is to avoid resetting when resuming synthesis
    
     - store_progress can now be None in order to maintain setting from
       previous call (when resuming) and will raise an Exception if it
       changes between synthesis calls
    
     - updates documentation for optimizer to reflect full range of
       options
    
     - animation() now handles multiple calls to synthesize()
    
     - adds po.tools.data.convert_float_to_im to convert a float that lies
       between 0 and 1 to another dtype (intended: np.uint8 or np.uint16)
    billbrod committed Apr 17, 2020 · 966ae37
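
The convert_float_to_im helper mentioned in the last bullet above maps a float array in [0, 1] onto the full range of an integer dtype; a sketch of the idea (the actual implementation may differ):

```python
import numpy as np

def convert_float_to_im(im, dtype=np.uint8):
    """Map a float array with values in [0, 1] to np.uint8 or np.uint16."""
    return (np.asarray(im) * np.iinfo(dtype).max).astype(dtype)
```
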
  2. adds Metamer tutorial

    billbrod committed Apr 17, 2020 · 6dfa24e

Commits on May 4, 2020

  1. adds simple example

    this might serve as our quick start page? demonstrates the basics of how
    to interact with Synthesis objects and the bare minimum of what a
    model requires
    billbrod committed May 4, 2020 · d29c29b

Commits on May 8, 2020

  1. fixes FrontEnd issue

    Front_End had a problem with torch 1.5. previously, torch would
    convert np.float64 to torch.float32; now, it converts them to
    torch.float64, so we need to explicitly set them to float32
    
    also cleans up a bit
    billbrod committed May 8, 2020 · 0c69693
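
The fix described in the commit above amounts to casting numpy-derived values to float32 explicitly rather than relying on torch's conversion, for example:

```python
import numpy as np
import torch

# a filter built with numpy defaults to float64 ...
kernel = np.outer(np.hanning(7), np.hanning(7))
# ... so cast explicitly instead of relying on torch's default conversion
weight = torch.as_tensor(kernel, dtype=torch.float32)
```
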
  2. 764f7bb
  3. 3999599
  4. adds more info on MAD-specific args/attributes

    and those that are more relevant to it
    billbrod committed May 8, 2020 · 18219cf
  5. starts Synthesis notebook

    attempts to be a tutorial for implementing a new synthesis method, but
    I'm having trouble with it
    billbrod committed May 8, 2020 · 2cfca2d

Commits on May 15, 2020

  1. 0246833

Commits on May 22, 2020

  1. adds tools/optim.py

    which will contain objective functions and the like
    billbrod committed May 22, 2020 · 0adc73a
  2. overhauls loss_function

    loss functions now work differently: they must all take at least the
    keyword arguments synth_rep, ref_rep, synth_img, ref_img and return
    some scalar. this allows us to compute loss with respect to things in
    the image, such as range or moments.
    
    this required a fair amount of work
    
    also:
    
     - adds tests
    
     - expands section of MADCompetition notebook explaining loss
       functions to include more details
    billbrod committed May 22, 2020 · 00cf4e3
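
An example of a loss function satisfying the contract described in the commit above: it takes (at least) synth_rep, ref_rep, synth_img, ref_img as keyword arguments and returns a scalar, and having the images available lets it penalize image properties such as range. The specific penalty here is illustrative, not the library's default:

```python
import torch

def l2_and_range_penalty(synth_rep, ref_rep, synth_img, ref_img, lmbda=.1):
    """L2 representation loss plus a penalty on synthesized pixels that leave
    the reference image's range (illustrative example)."""
    rep_loss = torch.norm(synth_rep - ref_rep, p=2)
    over = torch.clamp(synth_img - ref_img.max(), min=0).sum()
    under = torch.clamp(ref_img.min() - synth_img, min=0).sum()
    return rep_loss + lmbda * (over + under)
```
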
  3. bugfix: removes extra arg

    billbrod committed May 22, 2020 · b00627b

Commits on May 23, 2020

  1. corrects save() and load() for metrics and loss function

    previously, we weren't saving the loss function and were not able to save
    if we used a metric instead of a model, because saving functions is not
    supported by the default pickle. we now use dill, which does support
    this (added to setup.py as well)
    
    additionally:
    
     - loss_kwargs renamed to loss_function_kwargs to be more consistent
    
     - save now has a model_attr_names arg just like load, so will
       properly save multiple models
    
     - copies load() to Metamer, makes call signature simpler
    
     - load() call signature simplified for MADCompetition, updates
       docstring to be more MAD-specific
    
     - adds better save/load tests for Metamer and MADCompetition
    billbrod committed May 23, 2020 · a211da2
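
The pickle limitation mentioned above shows up with lambdas and locally defined functions (the usual form of a custom loss_function or metric); dill serializes those where the standard pickle cannot, e.g.:

```python
import dill

# standard pickle cannot serialize a lambda like this; dill can
loss_function = lambda synth_rep, ref_rep, synth_img, ref_img: (
    (synth_rep - ref_rep) ** 2).mean()

with open('loss_fn.pkl', 'wb') as f:
    dill.dump(loss_function, f)      # pickle.dump would raise a PicklingError here
with open('loss_fn.pkl', 'rb') as f:
    loaded_loss = dill.load(f)
```
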

Commits on May 25, 2020

  1. adds verbose flag

    so we can figure out the weird error
    billbrod committed May 25, 2020 · 12b8049
  2. adds small notes

    billbrod committed May 25, 2020 · bf669a8
  3. adds pytest-timeout

    to try and figure out why the one test is hanging
    billbrod committed May 25, 2020 · 052fe4c
  4. updates pytest-timeout

    billbrod committed May 25, 2020 · 9dca101

Commits on May 26, 2020

  1. 0b04cbc
  2. adds more print statements?

    billbrod committed May 26, 2020 · ab6692e
  3. adds more text

    billbrod committed May 26, 2020 · 30dd960
  4. c4e31b7
  5. fixes problem

    - the plotting in test_loss_func would cause a stall. it's not that
      important to test, so got rid of it
    
    - also goes back to the regular pytest command
    
    - removes pytest-timeout, because that wasn't actually helping
    billbrod committed May 26, 2020 · aef41d6
  6. try another way to fix the problem

    https://stackoverflow.com/a/35403128 suggests that I should enable
    xvfb and close the open images, so let's try that.
    billbrod committed May 26, 2020 · 29e85f7
  7. fe3bf00

Commits on Jun 12, 2020

  1. adds notebook showing simple MAD example

    with L1 and L2 norm, showing that it does the right thing, but that it
    might not be easy -- in this case, because those two are hard to
    separate
    billbrod committed Jun 12, 2020 · a5f2005

Commits on Jun 15, 2020

  1. adds choices for how to do coarse-to-fine optimization

    the standard way of doing coarse-to-fine (ctf) optimization is *not* what
    I had been doing before. before this commit, ctf computed the gradient
    with respect to the target scale individually, ignoring the
    others (not doing anything to hold them fixed). the standard way is to
    compute the gradient with respect to the target scale *and all coarser
    scales*. this commit allows you to toggle between those two choices,
    by setting coarse_to_fine to either 'together' or 'separate'.
    
    also updates MADCompetition.save to appropriately save the related
    attributes, and makes the tests for metamer and MAD more complete
    billbrod committed Jun 15, 2020 · 8d65038
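
The difference between the two coarse-to-fine modes described in the commit above can be sketched for a model whose representation is a dict keyed by scale (the names and representation structure here are illustrative):

```python
def coarse_to_fine_loss(synth_rep, ref_rep, scales, target_scale, mode='together'):
    """'separate': loss only on the target scale; 'together': loss on the
    target scale plus all coarser scales. `scales` is ordered coarsest first."""
    if mode == 'separate':
        use_scales = [target_scale]
    else:  # 'together'
        use_scales = scales[:scales.index(target_scale) + 1]
    return sum(((synth_rep[s] - ref_rep[s]) ** 2).sum() for s in use_scales)
```
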
  2. 321775d

Commits on Jun 26, 2020

  1. 8402a9b
  2. Merge branch 'abstractions' of github.com:LabForComputationalVision/plenoptic into abstractions
    billbrod committed Jun 26, 2020 · 6eed195

Commits on Jun 27, 2020

  1. 863c41c

Commits on Jul 11, 2020

  1. adds code to view external MAD results

    and a notebook showing how it's done
    billbrod committed Jul 11, 2020 · e7b5d40

Commits on Jul 13, 2020

  1. removes pandas import

    since it's not a requirement and this is the only place we use it. in
    the dataframe's place, we return a well-structured dictionary
    billbrod committed Jul 13, 2020 · fbaf91e

Commits on Jul 31, 2020

  1. 1b354e7

Commits on Aug 4, 2020

  1. updates to SSIM

    adds docstrings for _gaussian (renamed from gaussian), create_windows,
    ssim.
    
    Updates SSIM to allow for weighting that matches the output from MAD
    competition code
    billbrod committed Aug 4, 2020 · 663f455
  2. changes for SSIM and MAD

     - MSE now only averages across last two dimensions (height and width),
       as metrics should
    
     - SSIM:
    
       - removes unnecessary arguments (window and window_size no longer
         allowed, size_average because we always just average across last
         two dimensions)
    
       - Gaussian window is normalized in 2d, not 1d
    
       - Removes cs output
    
       - val_range renamed to dynamic_range and must always be explicitly
         set (None no longer allowed)
    
       - checks n_batches and handles them as you'd like
    
     - adds add_noise function, which adds noise such that the user specifies
       the target MSE (approximately correct, not exact) and we can handle
       multiple noise values and images correctly (always with independent
       seeds and broadcasting as appropriate)
    billbrod committed Aug 4, 2020 · df92b37
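
The add_noise behavior described in the last bullet above (noise scaled so the resulting MSE is approximately the requested value, with several noise levels handled at once) can be sketched like this; the exact signature and broadcasting rules are assumptions:

```python
import torch

def add_noise(img, noise_mse):
    """Add white noise scaled so its variance roughly equals each requested
    MSE value; results for the different levels are stacked along the batch
    dimension."""
    levels = torch.as_tensor(noise_mse, dtype=img.dtype).flatten()
    noisy = []
    for mse in levels:
        noise = torch.randn_like(img)
        noise = noise * torch.sqrt(mse) / noise.std()   # match the target variance
        noisy.append(img + noise)
    return torch.cat(noisy, dim=0)
```
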
  3. aa86d8b

Commits on Aug 5, 2020

  1. ca564a3
  2. renames min/max to best/worst

    because the mapping between min/max and best/worst is not the same for
    MSE and SSIM, this makes it clearer and sets the order for both to be
    Best, Worst, matching the paper
    billbrod committed Aug 5, 2020 · 9b7e2ea
  3. corrects add_noise

    need to average noise over the last two dimensions in order to get the
    variances matched exactly
    billbrod committed Aug 5, 2020 · b5d56cb
  4. removes msssim, SSIM, MSSSIM, restructures ssim

    with this commit, we remove msssim (because there are some tricky things
    that I don't have the time to figure out -- issue opened for it if
    people want to check it out) and SSIM (caching window no longer makes
    sense), and thus MSSSIM
    
    also restructures ssim so that we have a helper function which computes
    the ssim_map, contrast map, and weight. ssim just returns the mean over
    the image. adds new function, ssim_map, which just returns the map.
    contrast map will be used by msssim if we re-implement it
    billbrod committed Aug 5, 2020 · d8ff0d0
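
The restructuring described in the commit above (a helper that computes the per-pixel maps, ssim_map returning the map, ssim returning its mean) looks roughly like the following simplified sketch, which uses a uniform local window instead of the Gaussian one and omits the weighting:

```python
import torch
import torch.nn.functional as F

def _ssim_parts(img1, img2, dynamic_range=1.):
    """Helper: compute the SSIM map and contrast map for 4d (B, 1, H, W) inputs."""
    C1 = (0.01 * dynamic_range) ** 2
    C2 = (0.03 * dynamic_range) ** 2
    win = torch.ones(1, 1, 11, 11) / 121.            # uniform local averaging window
    mu1, mu2 = F.conv2d(img1, win), F.conv2d(img2, win)
    sigma1_sq = F.conv2d(img1 * img1, win) - mu1 ** 2
    sigma2_sq = F.conv2d(img2 * img2, win) - mu2 ** 2
    sigma12 = F.conv2d(img1 * img2, win) - mu1 * mu2
    contrast_map = (2 * sigma12 + C2) / (sigma1_sq + sigma2_sq + C2)
    ssim_map_ = (2 * mu1 * mu2 + C1) / (mu1 ** 2 + mu2 ** 2 + C1) * contrast_map
    return ssim_map_, contrast_map

def ssim_map(img1, img2, dynamic_range=1.):
    return _ssim_parts(img1, img2, dynamic_range)[0]

def ssim(img1, img2, dynamic_range=1.):
    # one value per image in the batch: mean over the spatial dimensions
    return ssim_map(img1, img2, dynamic_range).mean((-2, -1))
```
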

Commits on Aug 6, 2020

  1. adds SSIM tests

    adds tests for the new SSIM functionality, both to make sure it works as
    we'd like and that it matches MATLAB outputs. this involves downloading
    some new files off the OSF, which means the downloading function is
    updated to be a bit more general
    
    also:
    
     - updates MAD notebooks
    
     - bugfix for add_noise to make sure the sizing works out as we'd like
    billbrod committed Aug 6, 2020 · 13adf0f
  2. d3dd4e2
  3. bugfix for mad competition saving

    makes sure to save all the _all attributes
    billbrod committed Aug 6, 2020 · d7f5933
  4. corrects failing tests

    billbrod committed Aug 6, 2020 · 85f8c72
  5. 64c294d

Commits on Aug 7, 2020

  1. 58d4b00
  2. 77d2c72
  3. removes normalized_mse

    it doesn't actually make sense
    billbrod committed Aug 7, 2020 · 9196906
  4. 40bf306
  5. renames some Synthesis attributes

    renames:
    
     - matched_ -> synthesized_
    
     - _image -> _signal (except for plot_synthesized_image, which really is
       plotting a single image)
    
     - target_ -> base_
    billbrod committed Aug 7, 2020 · 599314a
  6. moves some attributes to hidden

    step, rep_warning, use_subset_for_gradient, and loss_sign, none of which
    should ever be set by the user, nor do they need to be seen
    billbrod committed Aug 7, 2020 · b7b7633
  7. Synthesis no longer a torch.nn.Module

    this wasn't getting us anything (the only time we called super() was
    during init, and that just sets some attributes we weren't using), it
    was setting up a bunch of attributes and methods we weren't using, and
    it made some things confusing (like inheriting the parameters of its
    attributes, e.g., model)
    billbrod committed Aug 7, 2020 · 3fc54e6
  8. saves coarse_to_fine

    forgot this one
    billbrod committed Aug 7, 2020 · bf5f5e0
  9. starts cleaning up docstring

    this starts removing some of the information from the docstrings of
    the Synthesis methods, since they're overwhelming. Stuff that's relevant
    is getting moved into a notebook, but much of it can just be deleted
    billbrod committed Aug 7, 2020 · f6d06ae

Commits on Aug 11, 2020

  1. Big update on documentation, reverts MAD init noise

    Moves a lot of information out of the docstrings of Synthesis,
    MADCompetition, and Metamer and into their tutorials. Also:
    
    - makes sure notebooks match new attribute names
    
    - goes back to old way of adding noise for initializing MAD (couldn't
      get the new way working with Simple_MAD, created issue)
    
    - adds info to Synthesis notebook with what the different methods and
      attributes are for
    
    - simplifies MADCompetition.synthesize_all to just use kwargs to pass
      everything to synthesize
    billbrod committed Aug 11, 2020 · aaba29d
  2. Update perceptual_distance.py

    small typos
    lyndond committed Aug 11, 2020 · 8eb2d3c
  3. 9ab9513
  4. 66b7414
  5. changes to docstrings that Lyndon pointed out

    largely, this is changing type from torch.tensor to torch.Tensor, which
    is the appropriate one
    billbrod committed Aug 11, 2020 · f41f728
  6. Merge branch 'abstractions' of github.com:LabForComputationalVision/plenoptic into abstractions
    billbrod committed Aug 11, 2020 · b45fbce
  7. ff4c32a
  8. updates docs

    - bugfix in non_linearities.py, need to use :meth: tag in See Also
      section
    
    - Adds some tutorials and example notebooks
    
    - updates the auto-generated rst files
    
    - changes some of the markdown headings of Eigendistortions and
      Original_MAD notebooks
    billbrod committed Aug 11, 2020 · f2ec3c1
  9. 3fe5f3d

Commits on Aug 13, 2020

  1. b6352b0
  2. cleans up tests

    - all tests now import DEVICE, DTYPE, DATA_DIR from test_plenoptic
    
    - makes sure the constants are in caps
    
    - DTYPE is now torch.float32
    
    - removes unnecessary imports
    billbrod committed Aug 13, 2020 · bd6792c
  3. deletes to_delete/ files

    billbrod committed Aug 13, 2020 · 953fa10
  4. c5698bf