
Add device property to Kernels, add unit tests #2234

Merged: 3 commits merged into master from kernel_device on Jan 4, 2023

Conversation

Balandat
Collaborator

This is kinda useful at times.
@Balandat requested a review from dme65 on December 23, 2022 at 23:02
Comment on lines 266 to 270
if self.has_lengthscale:
return self.lengthscale.device
else:
for param in self.parameters():
return param.device
Collaborator

  • What if self.lengthscale and parameters are on different devices? Can they be? Can other tensor attributes of a Kernel be on different devices?
  • How about self.active_dims (when not None), since all Kernels should have that?

Balandat
Collaborator Author

Good point - this is modeled after the code in the dtype property below. I guess the standing assumption here has been that all the parameters (of which lengthscale is one) are on the same device / have the same dtype. I can simplify this and raise an error if this assumption is violated.

Balandat
Collaborator Author

I retained the behavior for has_lengthscale = True for now so as not to cause any unforeseen BC issues, but I'm now raising an error if devices / dtypes are heterogeneous across the parameters.
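
For reference, a minimal sketch (not the exact merged code) of what that behavior could look like; it assumes the Kernel attributes discussed above (has_lengthscale, lengthscale, parameters()):

import torch

class KernelSketch(torch.nn.Module):
    """Illustrative stand-in for gpytorch.kernels.Kernel, not the real class."""

    has_lengthscale = False  # concrete kernels override this

    @property
    def device(self):
        # Backwards-compatible fast path: defer to the lengthscale's device.
        if self.has_lengthscale:
            return self.lengthscale.device
        # Otherwise require all parameters to agree on a single device and
        # raise a clear error when they do not.
        devices = {param.device for param in self.parameters()}
        if len(devices) > 1:
            raise RuntimeError(f"Kernel parameters are on multiple devices: {devices}")
        return next(iter(devices)) if devices else None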

@esantorella
Collaborator

What's the use case for this? It seems potentially misleading without checks that all of the kernel's attribute tensors are on the same device. Maybe that's why there's no torch.nn.Module.device. Or if that isn't an issue, maybe this should be a method of the superclass (gpytorch Module)?

@Balandat
Collaborator Author

Balandat commented Jan 4, 2023

The use case is basically if we need to conveniently get "the" device/dtype of a kernel object (e.g. for creating new tensors in computations in other places).

It seems potentially misleading without checks that all of the kernel's attribute tensors are on the same device

That is true; I can raise a clear error if this is called on something that has heterogeneous devices/dtypes.

Maybe that's why there's no torch.nn.Module.device. Or if that isn't an issue, maybe this should be a method of the superclass (gpytorch Module)?

So since modules can nest other modules, it's true that generally we can't be sure (nor should we assume) that everything lives on the same device or has the same dtype. I think so far this has largely been the implicit assumption though, so let me raise the errors as mentioned above. I think the discussion as to whether this should live on the general Module level is one we can have separately from this specific PR.
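
For concreteness, a hypothetical example of the kind of downstream usage described above (the kernel choice and tensor shape here are made up; dtype is the existing Kernel property mentioned earlier):

import torch
from gpytorch.kernels import RBFKernel

kernel = RBFKernel()
# Create a new tensor on whatever device/dtype the kernel lives on, without
# having to thread that information through from the caller.
buffer = torch.zeros(10, device=kernel.device, dtype=kernel.dtype)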

@Balandat Balandat merged commit e80e5cd into master Jan 4, 2023
@Balandat Balandat deleted the kernel_device branch January 4, 2023 21:50
facebook-github-bot pushed a commit to pytorch/botorch that referenced this pull request Jan 5, 2023
Summary:
## Motivation

As of cornellius-gp/gpytorch#2234, the parent class of BoTorch kernels now has a property "device." This means that if a subclass tries to set `self.device`, it will error. This is why the BoTorch CI is currently breaking: https://github.com/pytorch/botorch/actions/runs/3841992968/jobs/6542850176

Pull Request resolved: #1611

Test Plan: Tests should pass

Reviewed By: saitcakmak, Balandat

Differential Revision: D42354199

Pulled By: esantorella

fbshipit-source-id: c53e5b508dd75f4116870cd30ab90d11cd3eb573
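
The breakage described in the commit message is ordinary Python behavior: assigning to an instance attribute that a parent class exposes as a read-only property raises an AttributeError. A minimal illustration, unrelated to the actual BoTorch classes:

class Parent:
    @property
    def device(self):
        return "cpu"

class Child(Parent):
    def __init__(self):
        # Raises AttributeError: 'device' is a read-only property on Parent,
        # so plain attribute assignment is rejected.
        self.device = "cuda"

Child()  # AttributeError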