[MRG+1] Implement greedy A-optimal acquisition function for pure exploration #432

Merged 2 commits on Jul 28, 2017
+15 −1
@@ -105,10 +105,12 @@ def gaussian_lcb(X, model, kappa=1.96, return_grad=False):
It should have a ``return_std`` parameter that returns the standard
deviation.
- * `kappa`: [float, default 1.96]:
+ * `kappa`: [float, default 1.96 or 'inf']:
Controls how much of the variance in the predicted values should be
taken into account. If set to be very high, then we are favouring
exploration over exploitation and vice versa.
+ If set to 'inf', the acquisition function will only use the variance,
+ which is useful in a pure exploration setting.
Useless if ``method`` is set to "LCB".
* `return_grad`: [boolean, optional]:
@@ -132,10 +134,14 @@ def gaussian_lcb(X, model, kappa=1.96, return_grad=False):
X, return_std=True, return_mean_grad=True,
return_std_grad=True)
+ if kappa == "inf":
+ return -std, -std_grad
return mu - kappa * std, mu_grad - kappa * std_grad
else:
mu, std = model.predict(X, return_std=True)
+ if kappa == "inf":
+ return -std
return mu - kappa * std
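The branch above can be sketched outside the library. Below is a minimal, self-contained rendition of the LCB logic in this diff, using a hypothetical `ConstSurrogate` stand-in model (mirroring the constant-prediction surrogate used in the tests further down: mean 0, standard deviation 1); it is an illustration of the `kappa='inf'` behaviour, not skopt's actual implementation.

```python
import numpy as np

class ConstSurrogate:
    """Stand-in model (assumption): predicts mean 0, std 1 for every point."""
    def predict(self, X, return_std=True):
        return np.zeros(len(X)), np.ones(len(X))

def gaussian_lcb(X, model, kappa=1.96):
    """Lower confidence bound: mu - kappa * std.

    With kappa='inf', only -std is returned, so minimising the
    acquisition value drives the search to high-variance regions
    (pure exploration), as in this PR.
    """
    mu, std = model.predict(X, return_std=True)
    if kappa == "inf":
        return -std
    return mu - kappa * std

X = 10 * np.ones((4, 2))
print(gaussian_lcb(X, ConstSurrogate(), kappa="inf"))  # -> [-1. -1. -1. -1.]
print(gaussian_lcb(X, ConstSurrogate(), kappa=1.96))   # -> [-1.96 -1.96 -1.96 -1.96]
```

Minimising `-std` is equivalent to maximising the predicted standard deviation, which is why this variant suits pure-exploration settings.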
@@ -71,6 +71,14 @@ def test_acquisition_pi_correctness():
@pytest.mark.fast_test
+def test_acquisition_variance_correctness():
+ # check that it works with a vector as well
+ X = 10 * np.ones((4, 2))
+ var = gaussian_lcb(X, ConstSurrogate(), kappa='inf')
+ assert_array_almost_equal(var, [-1.0] * 4)
+
+
+@pytest.mark.fast_test
def test_acquisition_lcb_correctness():
# check that it works with a vector as well
X = 10 * np.ones((4, 2))