Make get_infeasible_cost return a cost value for each outcome. #1191
@@ -320,7 +320,7 @@ def test_constrained_mc_objective(self):
                obj=constrained_obj,
                constraints=[feasible_con, infeasible_con],
                samples=samples,
-               infeasible_cost=0.0,
+               infeasible_cost=torch.tensor([0.0], device=self.device, dtype=dtype),
            )
            self.assertTrue(torch.equal(obj(samples), constrained_obj))
            # one feasible, one infeasible, infeasible_cost

Review comment: There are something like 6 tests in there. I only changed two of them to use tensors, so both float and tensor are being tested.
@@ -342,7 +342,7 @@ def test_constrained_mc_objective(self):
            obj = ConstrainedMCObjective(
                objective=generic_obj,
                constraints=[feasible_con, infeasible_con],
-               infeasible_cost=5.0,
+               infeasible_cost=torch.tensor([5.0], device=self.device, dtype=dtype),
            )
            samples = torch.randn(4, 3, 2, device=self.device, dtype=dtype)
            constrained_obj = generic_obj(samples)
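The tests above pass a tensor-valued `infeasible_cost` to `ConstrainedMCObjective`. As a rough, hypothetical sketch of the weighting that objective performs (not BoTorch's actual implementation — `apply_feasibility_weighting`, the `eta` temperature, and the sample values are all illustrative), the idea is to shift the objective so it is non-negative, multiply by a smooth feasibility indicator, and shift back:

```python
import torch

def apply_feasibility_weighting(obj, constraint_vals, infeasible_cost, eta=1e-3):
    """Hypothetical sketch: shift objective up by the infeasible cost M,
    weight by a sigmoid feasibility indicator, then shift back down.
    Infeasible samples are pushed toward -M."""
    M = infeasible_cost
    # Convention: constraint satisfied when constraint_vals <= 0,
    # so the sigmoid weight is ~1 when feasible and ~0 when infeasible.
    weight = torch.sigmoid(-constraint_vals / eta)
    return (obj + M) * weight - M

obj = torch.tensor([1.0, 2.0])
con = torch.tensor([-10.0, 10.0])  # first sample feasible, second infeasible
out = apply_feasibility_weighting(obj, con, infeasible_cost=torch.tensor(5.0))
# feasible sample keeps its objective value; infeasible one is driven to -M
```

With a small `eta` the weighting is nearly a hard indicator, so the feasible sample returns its original objective value and the infeasible one returns approximately `-infeasible_cost`.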
So if I read this correctly, this computes the minimum across all batch dimensions? Is that what we want here? The previous implementation also did this, so it's probably fine to get this in as is, but we may want to think about this some more if the batch dims correspond to different data and we want to extract them individually (this may be relevant for fully bayesian models - @dme65).
Yeah, that is an interesting thought. I guess it would depend on how different we expect each model in the batch to be. Since they're all trained on the same set of inputs, I'd presume they'd be relatively similar, so one cost value should work for all. This would also affect the fantasized models, so maybe if we have two fantasy models, one on a large fantasy observation and the other on a small one, the cost value for one may be too conservative for the other? One could also argue that the 6 sigma we use here is too conservative to begin with. Anyways, I'm happy to revisit this later if we think it is worth exploring in more detail.
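For context, the per-outcome cost being discussed can be sketched roughly as follows. This is a hypothetical illustration, not the library's `get_infeasible_cost`: it takes a conservative `mean - 6 * sigma` lower bound (the "6 sigma" rule mentioned above), reduces with a minimum across all leading batch dimensions while keeping the outcome dimension, and clamps so the cost is non-negative per outcome:

```python
import torch

def per_outcome_infeasible_cost(mean: torch.Tensor, sigma: torch.Tensor) -> torch.Tensor:
    """Hypothetical sketch of a per-outcome infeasible cost."""
    # Conservative lower bound per point and outcome (the "6 sigma" rule).
    lb = mean - 6 * sigma
    # Reduce across ALL leading (batch/sample) dims, keeping the outcome dim --
    # this is the min-over-batch-dimensions behavior questioned above.
    lb_min = lb.reshape(-1, lb.shape[-1]).min(dim=0).values
    # Cost is the magnitude of the worst negative bound, per outcome.
    return (-lb_min).clamp_min(0.0)

# Two batch points, two outcomes.
mean = torch.tensor([[1.0, -2.0], [0.5, -3.0]])
sigma = torch.tensor([[0.1, 0.1], [0.2, 0.2]])
cost = per_outcome_infeasible_cost(mean, sigma)
```

Because the minimum is taken over all batch dimensions jointly, a single conservative batch element (e.g. one fantasy model with a pessimistic posterior) determines the cost for every element sharing that outcome, which is exactly the trade-off raised in the comments above.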