Representation arithmetic with differentials #11470
Conversation
👋 Thank you for your draft pull request! Do you know that you can use
(force-pushed 73987cc to a6229e7)
(force-pushed a6229e7 to 50c1e27)
@nstarman - could you maybe have a look at this PR?
```python
# super() checks that the class is identical so can this even happen?
# (same class, different differentials ?)
```
It doesn't look like `BaseRepresentationOrDifferential` checks equality of differentials, so this check is necessary. Fixing the f-string does mean this line is counted as not being tested... Maybe add a quick test? Or out of scope.
```python
except Exception:
    return NotImplemented
```

```python
if self.differentials:
    for key, differential in self.differentials.items():
        diff_c = differential.represent_as(CartesianDifferential, base=self)
```
I was thinking about this for a while, but I think this is the correct route: don't assume that a subclass has defined `_scale_operation`, and default to routing through Cartesian.

My one suggestion, and apologies because it's a bit of a bigger change, is to not do the scaling here, but to delegate it to the method on the differential. Looking at `CartesianDifferential`, it inherits `_scale_operation` from `BaseDifferential`, which scales all components. However, ALL the other differentials also inherit the method... isn't this a bug? I would suggest moving the current `_scale_operation` from `BaseDifferential` to `CartesianDifferential` and implementing on `BaseDifferential` a method similar to the one here, where things are re-represented as a `CartesianDifferential` and then scaled appropriately. I think this would (a) fix a bug, and (b) mean that as `_scale_operation` methods are added to each differential, they would automatically be used here.
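A toy sketch of the routing-through-Cartesian idea under discussion (plain Python with hypothetical helper names, not astropy code): convert spherical components to Cartesian, apply the scale, and convert back. Angles survive a positive scaling unchanged while the radius picks up the factor.

```python
import math

def sph_to_cart(lon, lat, r):
    """Spherical (lon/lat in radians, radius r) -> Cartesian (x, y, z)."""
    return (r * math.cos(lat) * math.cos(lon),
            r * math.cos(lat) * math.sin(lon),
            r * math.sin(lat))

def cart_to_sph(x, y, z):
    """Cartesian (x, y, z) -> spherical (lon, lat, r)."""
    r = math.sqrt(x * x + y * y + z * z)
    return (math.atan2(y, x), math.asin(z / r), r)

def scale_via_cartesian(lon, lat, r, factor):
    # Route through Cartesian: re-represent, scale every component, go back.
    x, y, z = sph_to_cart(lon, lat, r)
    return cart_to_sph(factor * x, factor * y, factor * z)

# For a positive factor the angles come back unchanged; only r scales.
lon, lat, r = scale_via_cartesian(0.7, 0.2, 2.0, 10.0)
```

This mirrors the fallback behavior described in the PR: a representation with no specialized `_scale_operation` can still be scaled correctly by going through Cartesian.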
I'm coming back to this only now. And I remember that I struggled with this as well. In the end, it is, perhaps oddly, not a bug that `_scale_operation` just scales things for all differentials. E.g., consider

```python
sph_coord + sph_diff * 1 * u.yr
```

In this case, the proper motions in `sph_diff` should just be multiplied by the time directly - there is no need to transform them first.

But of course, this is not correct if one just does `sph_rep * 10` - then the angular components should not be affected. So, I went with your suggestion to move this (mostly) up to the differentials, but they need to know whether their base was scaled as well. It is all becoming a bit messy (maybe like your `.transform`) and perhaps there is a better solution...
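The distinction above can be restated with plain numbers (no astropy, units tracked only in the comments): multiplying a spherical *differential* by a time scales every angular rate directly, whereas scaling a spherical *representation* should leave the angles alone.

```python
# Differential components: proper motions in longitude and latitude.
d_lon = 0.5   # deg / yr
d_lat = -0.2  # deg / yr
dt = 10.0     # yr

# sph_diff * dt: every component just scales by dt - no Cartesian
# round-trip needed, since rate * time is already an angular offset.
offset_lon = d_lon * dt  # 5.0 deg
offset_lat = d_lat * dt  # -2.0 deg

# Contrast with sph_rep * 10 on the base representation: there, only
# the distance should pick up the factor, never lon and lat.
```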
That's a good point about unit multiplication. As you point out, there is a need to differentiate between types of scaling. What I'm currently leaning towards for `.transform` is a private argument to provide the scaled base.
Now I see the `scaled_base` argument. Agreed on the messiness of needing to track whether the base was scaled. At least it's a private method and can change.
```python
self._raise_if_has_differentials('division')
return self._dimensional_representation(lon=self.lon, lat=self.lat,
                                        distance=1. / other)

def _scale_operation(self, op, *args):
```
If the distance stays 1, I think it should stay a UnitSpherical. Correct me if I'm wrong, but it doesn't appear to do that in `SphericalRepresentation`. I do see a `# TODO`, is this what that refers to?
It only gets here for multiplication and division; I felt it was a bit too much to check whether `other` is identical to 1.
```python
def __neg__(self):
    self._raise_if_has_differentials('negation')
    return self.__class__(self.lon + 180. * u.deg, -self.lat, copy=False)
```

```python
if any(differential.base_representation is not self.__class__
```
When does this happen?
A `UnitSphericalRepresentation` can carry a `SphericalDifferential` or a `RadialDifferential`. Quite messy...

But clearly not tested - now done.
I think a short comment would be good.
```python
       for differential in self.differentials.values()):
    return super().__neg__()

result = self.__class__(self.lon + 180. * u.deg, -self.lat, copy=False)
```
Probably for a follow-up, but do we want to try to stay on the same branch cut? One thing I've been toying around with, to little success so far, is trying to keep a memory of the phase wrap. Useful for dynamics and, I'm sure, other fields.
I think the class will wrap it, but, yes, let's do it as follow-up.
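The wrapping being relied on here can be illustrated with a plain-Python stand-in (astropy's `Longitude` class handles this internally; a modulo is used here only to show the effect):

```python
# Negation flips the pointing direction by adding 180 deg to the
# longitude; without a wrap this can leave the [0, 360) range.
lon = 300.0                       # deg
neg_lon = (lon + 180.0) % 360.0   # 120.0 deg, wrapped back into [0, 360)
```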
```python
return self.__class__(self.lon + 180. * u.deg, -self.lat, self.distance,
                      copy=False)

def _scale_operation(self, op, *args):
    # TODO: expand special-casing to UnitSpherical and RadialDifferential.
```
e.g. reduce back to Radial or UnitSpherical, if appropriate ?
This will reduce back to `Radial` or `UnitSpherical`, and, following your suggestion, no longer goes through `Cartesian`, so there is not much of an effect any more.
Oops. Forgot to leave this comment when I pressed submit review...
Since this hasn't made the feature-freeze cutoff and is not a bug/docfix, I'm re-milestoning to 5.0.
(force-pushed 50c1e27 to 786b7d9)
```python
except Exception:
    return NotImplemented

if self.differentials:
```
As I noticed when working on the `.transform` stuff, isn't `self.differentials` always a dictionary, even if it's empty? In which case...

```python
for key, differential in self.differentials.items():
```

will just skip if there are no differentials. The loop doesn't need to be protected.
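A minimal demonstration of the point above: iterating over an empty dict is simply a no-op, so a truthiness guard before the loop is redundant.

```python
# Stand-in for self.differentials when nothing is attached.
differentials = {}

visited = []
for key, differential in differentials.items():
    visited.append(key)  # never reached when the dict is empty
```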
Good point! Fixed.
(force-pushed 786b7d9 to cd1bfdf)
Let me know how I can help.
@nstarman and @adrn - I think for this PR the main question is whether the addition of

@nstarman - one thing that may be helpful for @adrn is perhaps to resolve conversations where you feel we've reached a decision.
I think it's fine, largely because it is a private method. Obviously the
May make more sense to use the regular scale operation and then "sanitize" it, with any negative radius replaced by a relevant change in angles.
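The "sanitize" idea above can be sketched in plain Python (hypothetical helper name, degrees for simplicity): a negative radius is folded back into a positive one by pointing the angles at the antipode.

```python
def sanitize(lon_deg, lat_deg, r):
    """Fold a negative radius into the angles: flip to the antipode."""
    if r < 0:
        return ((lon_deg + 180.0) % 360.0,  # opposite longitude, wrapped
                -lat_deg,                    # mirrored latitude
                -r)                          # now positive
    return (lon_deg, lat_deg, r)

# A scaling that produced r = -2 becomes the same point with r = +2.
lon, lat, r = sanitize(40.0, -11.0, -2.0)
```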
Trying to ensure that radii remain positive, and taking particular care with UnitSphericalRepresentation.
(force-pushed cd1bfdf to b629e29)
```python
def __neg__(self):
    self._raise_if_has_differentials('negation')
    return self.__class__(self.lon + 180. * u.deg, -self.lat, copy=False)
```

```python
if any(differential.base_representation is not self.__class__
```
I think a short comment would be good.
```python
        Whether the base was scaled the same way. This affects whether
        differential components should be scaled. For instance, a differential
        in longitude should not be scaled if its spherical base is scaled
        in radius.
        """
        scaled_attrs = [op(getattr(self, c), *args) for c in self.components]
```
Elsewhere you use a generator expression, not a list comprehension. I think the newer pattern is better.

```diff
-scaled_attrs = [op(getattr(self, c), *args) for c in self.components]
+scaled_attrs = (op(getattr(self, c), *args) for c in self.components)
```
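A small illustration of the suggestion, with stand-in data: a list comprehension materializes all results up front, while a generator expression is lazy and consumed once. For a one-shot pass into a constructor, the generator avoids the intermediate list.

```python
components = ('d_lon', 'd_lat', 'd_distance')
values = {'d_lon': 1.0, 'd_lat': 2.0, 'd_distance': 3.0}

# List comprehension: built eagerly, reusable, allocates a list.
scaled_list = [2 * values[c] for c in components]

# Generator expression: lazy, single-use, no intermediate list.
scaled_gen = (2 * values[c] for c in components)
result = tuple(scaled_gen)  # consumed exactly once, here
```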
It looks good! But I would like @adrn to take a look as well. I'm still worried about the negative distances, but they don't seem to raise any errors in the tests, which seem comprehensive!
```python
@pytest.mark.parametrize('op,args', [
    (operator.neg, ()),
    (operator.mul, (10.,))])
def test_operation_unitspherical_with_rv_fails(op, args):
```
Is this the test for line 1667 in `representation.py`?
Sorry for the long delay here! I think this looks mostly great, but I had one question about the difference between `UnitSphericalRepresentation` and `UnitSphericalDifferential`. I see the following using your branch:

```python
>>> coord.UnitSphericalRepresentation(40*u.deg, -11*u.deg) * 15*u.pc
<SphericalRepresentation (lon, lat, distance) in (deg, deg, pc)
    (40., -11., 15.)>
>>> coord.UnitSphericalDifferential(40*u.deg, -11*u.deg) * 15*u.km/u.s
<UnitSphericalDifferential (d_lon, d_lat) in deg km / s
    (600., -165.)>
```

I believe the TODO on L2051 points this out? But is there any reason to leave the asymmetry in the behavior here, or is it worth implementing the special-casing now?
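The asymmetry observed above, restated with plain numbers (not astropy code, units in comments only): scaling the unit-spherical *representation* keeps the angles and attaches a distance, while scaling the unit-spherical *differential* currently just multiplies its angular components.

```python
# UnitSphericalRepresentation(40 deg, -11 deg) * 15 pc:
# the angles are kept and the factor becomes the distance.
lon, lat = 40.0, -11.0
rep_scaled = (lon, lat, 15.0)  # -> SphericalRepresentation-like (lon, lat, distance)

# UnitSphericalDifferential(40 deg, -11 deg) * 15 km/s:
# every angular component is multiplied by the factor directly.
d_lon, d_lat = 40.0, -11.0
factor = 15.0
diff_scaled = (d_lon * factor, d_lat * factor)  # (600.0, -165.0)
```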
The asymmetry here is because
@adrn - OK to merge this?
OK, that makes sense. Thanks @mhvk! Feel free to merge...
Note: built on top of #11469, so that should be dealt with first! (Hence making this a draft, even though it is ready otherwise.) (edit: #11469 is merged and this is rebased)

With this PR, differentials are taken along for multiplication, division, and negation:
For all other representations, the operations will continue to be equivalent to transforming to cartesian, applying the operation, and transforming back.
fixes #10987
p.s. No problem if this does not get in 4.3.