remove pyramid_on_features from Menpo #444

Merged
merged 12 commits into menpo:master from jabooth:rm_pof on Sep 15, 2014

@jabooth
Member

jabooth commented Sep 13, 2014

pyramid_on_features was a redundant kwarg: to set it to True, one had to supply a single callable to feature in place of a list of callables, and a list was only allowed when pyramid_on_features was False.

Instead, this PR drops all that and introduces a simple rule:

  1. If a single features callable is supplied, we get the pyramid_on_features=True behaviour.
  2. If a list of features is supplied, the list must be n_levels long, and we get the pyramid_on_features=False behaviour.

This is on top of #443.
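The rule above can be sketched as a small validation helper. This is a hypothetical sketch, not Menpo's actual implementation; the function names and error message are illustrative, though the PR does mention a module-level pyramid_on_features(features) function along these lines.

```python
def pyramid_on_features(features):
    # Under the new rule, a single callable implies the old
    # pyramid_on_features=True behaviour (features computed once,
    # with the pyramid built on the feature image).
    return callable(features)


def check_features(features, n_levels):
    # Validate the features kwarg: either a single callable, or a
    # list of exactly n_levels callables (old pyramid_on_features=False).
    if callable(features):
        return features
    if (isinstance(features, list) and len(features) == n_levels and
            all(callable(f) for f in features)):
        return features
    raise ValueError('features must be a single callable or a '
                     'list of {} callables'.format(n_levels))
```

Under this sketch, `'features': igo` keeps the old True behaviour, while `'features': [igo] * 3` with `n_levels=3` reproduces the old False behaviour.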

@jabooth jabooth added the in progress label Sep 13, 2014

jabooth added some commits Sep 13, 2014

@@ -149,7 +149,6 @@ def aam_build_benchmark(training_images, training_options=None, verbose=False):
'n_levels': 3,
'downscale': 2,
'scaled_shape_models': True,
'pyramid_on_features': True,

@jabooth

jabooth Sep 14, 2014

Member

A note: removing lines that say pyramid_on_features=True is almost certainly safe, because if feature were not a single callable an error would already have been raised on master. We have to be a little more careful in the cases where pyramid_on_features=False, though: there, a features value that is a single callable needs to be changed to a list of length n_levels with the callable repeated.

@@ -99,14 +98,13 @@ def aam_fastest_alternating_bbox(training_db_path, fitting_db_path,
'convert_to_grey': True
}
training_options = {'group': 'PTS',
'features': igo,
'features': [igo] * 3,

@jabooth

jabooth Sep 14, 2014

Member

Here's an example about the previous comment.

@@ -481,7 +475,6 @@ def sdm_fastest_bbox(training_db_path, fitting_db_path,
'noise_std': 0.08,
'patch_shape': (16, 16),
'n_perturbations': 15,
'pyramid_on_features': False

@jabooth

jabooth Sep 14, 2014

Member

If pyramid_on_features is False we fall back to the default feature, so we may want to think about what that default is in light of this change. As this is SDM it sounds like we are fine here: the default is a no_op, so we don't actually care, right?

@@ -107,6 +93,10 @@ def n_levels(self):
        """
        return len(self.appearance_models)

    @property
    def pyramid_on_features(self):
        return pyramid_on_features(self.features)

@jabooth

jabooth Sep 14, 2014

Member

Note how we re-use the pyramid_on_features(features) function defined in fitmultilevel.base

@jalabort

Member

jalabort commented Sep 15, 2014

+1

jabooth added a commit that referenced this pull request Sep 15, 2014

Merge pull request #444 from jabooth/rm_pof
remove pyramid_on_features from Menpo

@jabooth jabooth merged commit 7205407 into menpo:master Sep 15, 2014

1 check passed

continuous-integration/travis-ci The Travis CI build passed

@jabooth jabooth removed the in progress label Sep 15, 2014

@jabooth jabooth deleted the jabooth:rm_pof branch Sep 15, 2014

@patricksnape

Contributor

patricksnape commented Sep 15, 2014

Here's my late +1 😢
