
Fix bug in setting training data for ExactGP if training inputs are None #565

Merged
merged 3 commits into from Mar 12, 2019

Conversation

neighthan
Contributor

It appears valid for a GP to have its training inputs/labels set to None. However, if you call set_train_data when the current training inputs are None, you get an error at for input, t_input in zip(inputs, self.train_inputs), because None is not iterable and so cannot be zipped. This PR moves the if strict check up so that, when strict == False, the entire checking section is skipped and it no longer matters whether the current training data is None.

When strict == True, the error messages have been improved a bit: they now report both the expected attribute (e.g. shape) and the value actually found. To make the error more informative, I also ensured that if the training inputs/targets are None, their shape etc. are reported as None, so you still get the expected RuntimeError rather than a harder-to-diagnose error at the zip call.

This passed flake8 cleanly, and I believe the only test failures I saw were CUDA out-of-memory errors.
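The control-flow change described above can be sketched as follows. This is a simplified stand-in, not gpytorch's actual set_train_data; the names set_train_data_check and FakeTensor are hypothetical, and real tensors and the full attribute checks are replaced with a minimal shape comparison:

```python
from collections import namedtuple

# Hypothetical stand-in for a tensor: only the .shape attribute matters here.
FakeTensor = namedtuple("FakeTensor", ["shape"])


def set_train_data_check(new_inputs, train_inputs, strict=True):
    """Sketch of the fixed control flow (not gpytorch's real code).

    The fix: return before iterating over train_inputs when strict is
    False, so train_inputs=None no longer raises "NoneType is not
    iterable" at the zip call.
    """
    if not strict:
        return  # skip all checks; None training data is fine

    # If the current training data is None, treat each expected shape as
    # None so the mismatch surfaces as a readable RuntimeError.
    expected_inputs = (
        train_inputs if train_inputs is not None else [None] * len(new_inputs)
    )
    for new, old in zip(new_inputs, expected_inputs):
        expected = getattr(old, "shape", None)
        found = getattr(new, "shape", None)
        if expected != found:
            raise RuntimeError(
                "Cannot modify shape of inputs "
                "(expected {}, found {})".format(expected, found)
            )
```

With strict=False the call succeeds even when the stored training inputs are None; with strict=True and None training data, the check now raises a RuntimeError naming both shapes instead of a bare TypeError from zip.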

Move the check about whether strict mode is enabled up so that if strict is False, set_train_data won't fail even if the current training data is None.
@neighthan
Contributor Author

I also added some documentation for the arguments of set_train_data (since I had to read the source to figure out what strict did when I needed it). I didn't specify shapes for inputs and targets - should these be the same as in get_fantasy_model? E.g.

"""
- :attr:`inputs` (Tensor `m x d` or `b x m x d`)
- :attr:`targets` (Tensor `m` or `b x m`)
"""

@jacobrgardner
Member

This looks great! Thanks @neighthan

@jacobrgardner jacobrgardner merged commit 7de4874 into cornellius-gp:master Mar 12, 2019