
Improve Parameters class #2490

Merged: 14 commits merged into gammapy:master on Oct 26, 2019

Conversation

@cdeil (Member) commented Oct 25, 2019

This PR contains fixes, improvements and cleanup for gammapy.modeling.Parameters, and related changes to Parameter, Model and some model __init__ methods.

  • Make Parameters._parameters completely private (no getter or setter)
  • Add Parameters.values Numpy array
  • Add Parameters.__len__
  • Add Parameters.from_stack and Parameters.__add__ (covariance stacking not done yet; see the usage sketch after this list)
  • Add Model._update_from_dict (temp solution, left TODO comment)
  • Remove broken Parameters.__str__
  • Remove Parameters.apply_autoscale flag (not a good place to offer this optimiser option)
  • Remove Parameters.covariance_to_table
  • Misc cleanup
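A minimal usage sketch of the new interface (assuming the existing Parameter(name, value) and Parameters([...]) constructors; the parameter names and the exact from_stack signature here are just illustrative):

from gammapy.modeling import Parameter, Parameters

pars_a = Parameters([Parameter("index", 2.3), Parameter("amplitude", 1e-12)])
pars_b = Parameters([Parameter("lon_0", 0.0), Parameter("lat_0", 0.0)])

len(pars_a)    # 2, via the new Parameters.__len__
pars_a.values  # Numpy array of the parameter values

# Stack parameters from several models (covariance stacking not done yet)
stacked = Parameters.from_stack([pars_a, pars_b])
also_stacked = pars_a + pars_b  # Parameters.__add__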
@cdeil cdeil added the cleanup label Oct 25, 2019
@cdeil cdeil added this to the 0.15 milestone Oct 25, 2019
@cdeil cdeil requested a review from adonath Oct 25, 2019
@cdeil cdeil self-assigned this Oct 25, 2019
@cdeil (Member, Author) commented Oct 25, 2019

@adonath - This is ready for final review.

@QRemy - I had to update the serialisation code and reference file in 78c3a07 - does the new version look OK or did I screw up?

@cdeil (Member, Author) commented Oct 25, 2019

Parameters.__str__ was broken: it called self.get_error where it should have called self.error. This went unnoticed because the call was wrapped in a try-except AttributeError, which is a bad coding pattern precisely because it hides breakage like this; in such cases an explicit if-else should be used.

error = "+/- {:.2f}".format(self.get_error(par))

So it never printed parameter error information. I think we never noticed because we never actually used Parameters.__str__; we always used Parameters.to_table instead.

I'll delete Parameters.__str__ for now. If we think it's useful, we can bring it back (with a test showing that it works).
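For illustration, the pattern I mean (a hypothetical sketch, not the actual old code; whether a covariance check is the right explicit condition is an assumption):

# Old pattern: a typo like `get_error` raises AttributeError, the except
# clause swallows it, and the error is silently never printed.
try:
    error = "+/- {:.2f}".format(self.get_error(par))
except AttributeError:
    error = ""

# Explicit alternative: decide via if-else on the actual condition,
# so a typo in the method name fails loudly instead of being hidden.
if self.covariance is not None:
    error = "+/- {:.2f}".format(self.error(par))
else:
    error = ""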

@adonath (Member) left a comment

Thanks @cdeil! I've left one minor inline comment. I only have a sparse internet connection at the moment, so please merge whenever you think it's ready...

Inline comment on gammapy/modeling/parameter.py (outdated, resolved)
@cdeil (Member, Author) commented Oct 25, 2019

I've done some more work here, starting to add a Numpy-array-based values interface.

I tried implementing the covariance stacking, but for some reason it makes some iminuit backend tests fail, which I don't understand yet. I've left TODO comments; I'll try to finish that tonight, or leave it for another PR next week.

@adonath - OK to remove _filter_unique_parameters from Parameters.__init__, and instead expose it as a method that is only called where needed? Presumably this is something that should happen in the optimiser interface? Any idea where best to put that call to generate the optimiser parameters?

@adonath (Member) commented Oct 25, 2019

@cdeil I'm not sure _filter_unique_parameters can easily be removed from __init__, because there are a few places where we rely on it: not only the optimiser, but also .to_table(), .to_dict() and .set_parameter_factors(). It can maybe be refactored, but right now I don't see the motivation for keeping multiple references to a single parameter in Parameters... why do you want to remove _filter_unique_parameters? I guess the filtering could be moved to .from_stack() instead, if you prefer that...

@cdeil (Member, Author) commented Oct 25, 2019

why do you want to remove _filter_unique_parameters?

Explicit is better than implicit.
I prefer initialisers that don't do non-obvious things.

There are other transformations, like removing fixed parameters, that need to happen when going to the optimiser. IMO those should happen in a method, not in Parameters.__init__ (roughly along the lines of the sketch below).
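Roughly what I have in mind (the method names are made up, just to illustrate explicit methods over implicit __init__ behaviour):

class Parameters:
    def unique_parameters(self):
        """Hypothetical: de-duplicate linked parameters, called only where needed."""
        unique = []
        for par in self._parameters:
            if par not in unique:
                unique.append(par)
        return self.__class__(unique)

    def free_parameters(self):
        """Hypothetical: drop frozen parameters before handing over to the optimiser."""
        return self.__class__([par for par in self._parameters if not par.frozen])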

I guess the filtering could be moved to .from_stack() instead, if you prefer this...

I'll try that tomorrow, and if I don't find a solution, merge as-is and postpone that.

@cdeil cdeil changed the title Clean up Parameters handling Improve Parameters class Oct 26, 2019
@cdeil (Member, Author) commented Oct 26, 2019

I've removed the Parameters.apply_autoscale flag. I don't think having this flag as state on the Parameters object is a great solution; if we want to expose it as an option, it should probably live on the Fit class. Note that I only removed the flag, the scaling functionality is still there for now and works exactly as before: if the user doesn't set parameter errors to indicate initial step sizes to the optimiser, they will be chosen using autoscale.
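For reference, the remaining autoscale behaviour is roughly this (a simplified sketch, not the exact implementation):

import numpy as np

def autoscale(par):
    # Split the value into a factor of order one and a power-of-ten scale,
    # so the optimiser starts with well-conditioned step sizes.
    value = par.value
    if value != 0:
        par.scale = 10 ** float(np.floor(np.log10(np.abs(value))))
        par.factor = value / par.scale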

I've also removed the Parameters.covariance_to_table property, changing the two notebooks where it was called to use to_table (which shows the errors) and the covariance and correlation properties (which are Numpy arrays).

There are several issues with storing a covariance matrix in a Table. It's not a symmetric data structure in rows and columns: there's a column with parameter names, so there's one more column than there are rows, making indexing error-prone. There's also the issue that each Table column has a single unit, but in the covariance matrix the rows have different units.

I realise it's difficult to expose an API and data structures that make working with the covariance matrix easy, and we should add some helpers and documentation back. E.g. we could offer an xarray.DataArray (http://xarray.pydata.org/en/stable/generated/xarray.DataArray.html), which has labeled axes and offers the full power of slicing and indexing by position or label (the label could be a parameter object or name), and has nice plotting and HTML printout and other things that would take hundreds of lines to re-implement in Gammapy. We should discuss soon, but I'm sure Table isn't the answer here, so I removed it for now.
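For example, something along these lines (just a sketch of the idea, not an agreed design; "parameters" is a Parameters object and the parameter names are made up):

import xarray as xr

names = [par.name for par in parameters]
covariance = xr.DataArray(
    parameters.covariance,
    coords={"par1": names, "par2": names},
    dims=("par1", "par2"),
)

# Label-based indexing, e.g. the covariance of two named parameters
covariance.sel(par1="index", par2="amplitude")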

@cdeil (Member, Author) commented Oct 26, 2019

I tried again to move the subsetting to unique parameters out of Parameters.__init__ and ran into complications with the flux point estimation and the spectral scale model, where we have a compound model and exchange a sub-model after the global Model and Parameters __init__.

@spectral_model.setter
def spectral_model(self, model):
    """`~gammapy.modeling.models.SpectralModel`"""
    self._spectral_model = model
    self._parameters = Parameters(
        self.spatial_model.parameters.parameters
        + self.spectral_model.parameters.parameters
    )

I think only creating Parameters in Model __init__ methods, and not allowing changes to the number of parameters after that, might be the way to go. If we don't impose that restriction, then fundamentally having the covariance stored on the Parameters object is bad, because it's very easy for it to get out of sync with the parameter list and produce errors (sometimes silently wrong results, sometimes confusing error messages from Numpy or Scipy, or NaNs).

@adonath - Let's discuss next week, to get a clear design concerning when Parameters objects are created (whether in Model __init__ and/or in sub-model setters) and where removing linked parameters should happen. I'm happy to implement the changes, but we have to agree on a design.

@adonath - Concerning https://docs.gammapy.org/0.14/api/gammapy.modeling.models.ScaleSpectralModel.html specifically: how about we change from this "wrapper class" design to a built-in "norm_scale" parameter that we add to all spectral models (it could be defined in the base class, with a default of 1, frozen, and without a unit)? I think that might be simpler, and it would mean we can move Parameter construction into model __init__.
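Roughly what I mean (a hypothetical sketch; nothing like this exists yet, and subclasses would still define evaluate):

from gammapy.modeling import Parameter

class SpectralModel:
    """Hypothetical base class: every spectral model carries a dimensionless,
    frozen norm_scale parameter with default value 1."""

    def __init__(self):
        self.norm_scale = Parameter("norm_scale", 1, unit="", frozen=True)

    def __call__(self, energy):
        # Scale the subclass evaluation, instead of wrapping the model
        # in a separate ScaleSpectralModel.
        return self.norm_scale.value * self.evaluate(energy)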

@cdeil cdeil merged commit ef74d35 into gammapy:master Oct 26, 2019
8 of 9 checks passed