
SquaredL2Loss should inherit from WeightedSquaredL2Loss #94

Closed · tbalke opened this issue Nov 15, 2021 · 9 comments · Fixed by #112

tbalke (Contributor) commented Nov 15, 2021:

SquaredL2Loss = WeightedSquaredL2Loss with W=None; however, internally W will still become the Identity operator.

Hence, should we have the following inheritance?

class SquaredL2Loss(WeightedSquaredL2Loss):
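
Concretely, the subclass could reduce to a constructor that just delegates. A minimal sketch (the constructor signature is assumed from the calls further below, not taken from scico):

class SquaredL2Loss(WeightedSquaredL2Loss):
    def __init__(self, y, A=None, **kwargs):
        # W=None already maps to the Identity operator inside
        # WeightedSquaredL2Loss, so no extra logic is needed here.
        super().__init__(y, A=A, W=None, **kwargs)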

Pros and cons:

  • (+) less code and more code reuse, since SquaredL2Loss is trivially small
  • (-) may create some performance overhead (?)

Someone with more understanding of the inner workings of scico (@lukepfister) can perhaps comment?

Otherwise, the performance concern can easily be tested by constructing

f_unw = SquaredL2Loss(y, A=A)
f_wei = WeightedSquaredL2Loss(y, A=A, W=None)

and benchmarking the eval and prox methods of each.
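
For example, a rough timing harness along these lines (the shapes and the Identity forward operator are placeholders; f.prox could be timed the same way):

import timeit

import jax.numpy as jnp
from scico import linop
from scico.loss import SquaredL2Loss, WeightedSquaredL2Loss

y = jnp.zeros((512, 512))
A = linop.Identity(y.shape)  # placeholder forward operator
x = jnp.ones((512, 512))

f_unw = SquaredL2Loss(y, A=A)
f_wei = WeightedSquaredL2Loss(y, A=A, W=None)

for f in (f_unw, f_wei):
    # block_until_ready forces JAX's async dispatch to finish before timing
    t = timeit.timeit(lambda: f(x).block_until_ready(), number=100)
    print(type(f).__name__, t)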

lukepfister (Contributor) commented:

I don't think it would break anything. If jitted, there would be no performance change.

bwohlberg (Collaborator) commented:

I expect that the performance change would be small in any event, but are you really confident that the jit analysis would be able to identify a multiplication by a variable that happens to have been assigned as a unit matrix?

Independent of this, is it possible to re-arrange the representation a bit so that we can set W = 1.0 (i.e. a scalar) in the case of no weighting?
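
For illustration, the weighted term itself needs nothing more than broadcasting, so a scalar W = 1.0 would work if the loss were written directly in terms of array operations (a sketch, not the scico implementation):

import jax.numpy as jnp

def weighted_sq_l2(r, W=1.0, scale=0.5):
    # W may be a scalar (no weighting) or an array of per-element weights;
    # jnp broadcasting handles both cases, with no Identity operator involved.
    return scale * jnp.sum(W * jnp.abs(r) ** 2)

r = jnp.array([1.0, -2.0, 3.0])
print(weighted_sq_l2(r))                                 # unweighted
print(weighted_sq_l2(r, W=jnp.array([1.0, 0.0, 1.0])))   # weighted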

lukepfister (Contributor) commented:

IIRC the default isn't an IdentityMatrix, but rather the Identity LinOp; its eval is a no-op.

lukepfister (Contributor) commented:

cf.

scico/scico/loss.py, lines 214 to 215 in 4684909:

if W is None:
    self.W = linop.Identity(y.shape)

tbalke (Contributor, author) commented Nov 16, 2021:

My only concern would be that the diagonal is explicitly set to snp.ones:

super().__init__(diagonal=snp.ones(input_shape, dtype=input_dtype), **kwargs)

But could this be a scalar 1.0?

Then the shape of W would differ from the shape of W.diagonal. Not sure whether that is a problem.

lukepfister (Contributor) commented:

The shape of W and W.diagonal should be the same; we can't stop a user from doing something like W.diagonal = x where x.shape != W.shape, but that's just shooting themselves in the foot. The Diagonal class really treats the .diagonal attribute as a private thing, but Python doesn't have private attributes, so there you go.

As for identity: yes, it is true that Identity(shape).diagonal is a matrix of ones, but look at the eval:

scico/scico/linop/_linop.py, lines 213 to 217 in 4684909:

def _eval(self, x: Union[JaxArray, BlockArray]) -> Union[JaxArray, BlockArray]:
    return x

def __rmatmul__(self, x: Union[JaxArray, BlockArray]) -> Union[JaxArray, BlockArray]:
    return x

The .diagonal never gets used.

In earlier times, Identity did not inherit from Diagonal, and this wasn't a problem.

tbalke (Contributor, author) commented Nov 16, 2021:

> In earlier times, Identity did not inherit from Diagonal and this wasn't a problem

What, then, is the reasoning for inheriting from Diagonal? It seems that in other contexts (beyond the issue at hand) it may also be wasteful to store the ones.

In other words, if SquaredL2Loss inherits from WeightedSquaredL2Loss and is slightly inefficient, that is not really because of the inheritance, but because the Identity operator itself has some inefficiency?

tbalke (Contributor, author) commented Nov 16, 2021:

Also, this means that evaluating I(x) is very different from computing I.diagonal * x, if I is the Identity operator.
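
A minimal check of that difference (assuming the Identity LinOp quoted above; both expressions give the same values, but only the second materializes and multiplies by an array of ones):

import jax.numpy as jnp
from scico import linop

I = linop.Identity((4,))
x = jnp.arange(4.0)

y1 = I(x)            # no-op _eval: just returns x
y2 = I.diagonal * x  # elementwise multiply by an array of ones

assert jnp.allclose(y1, y2)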

lukepfister (Contributor) commented:

I suppose you could relax W.diagonal to be something that broadcasts up to input_shape.
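
In other words, something like this would suffice for the unweighted case (a sketch of the proposed relaxation, not current scico behavior):

import jax.numpy as jnp

diagonal = jnp.float32(1.0)  # scalar standing in for snp.ones(input_shape)
x = jnp.ones((4, 4))

# The scalar broadcasts up to input_shape, giving identity behavior
# without storing an explicit array of ones.
assert jnp.allclose(diagonal * x, x)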

bwohlberg changed the title from "SquaredL2Loss inherit from WeightedSquaredL2Loss" to "SquaredL2Loss should inherit from WeightedSquaredL2Loss" on Nov 23, 2021.