
Commit

UPDATE: docs for ts
jungtaekkim committed Apr 26, 2020
1 parent 2139667 commit c89febf
Showing 6 changed files with 1,153 additions and 4 deletions.
Binary file added docs/_static/examples/ts_gp_prior.pdf
Binary file not shown.
1,014 changes: 1,014 additions & 0 deletions docs/_static/examples/ts_gp_prior.svg
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -24,7 +24,7 @@
author = 'Jungtaek Kim and Seungjin Choi'

# The short X.Y version
-version = '0.4.0'
+version = '0.4.1'
# The full version, including alpha/beta/rc tags
release = '{} alpha'.format(version)

136 changes: 136 additions & 0 deletions docs/example/ts_gp_prior.rst
@@ -0,0 +1,136 @@
Optimizing a sampled function via Thompson sampling
===================================================

This example demonstrates how to optimize a function sampled from a Gaussian process prior via Thompson sampling.
First of all, import the packages we need, including **bayeso**.

.. code-block:: python

    import numpy as np

    from bayeso import gp
    from bayeso import covariance
    from bayeso.utils import utils_covariance
    from bayeso.utils import utils_plotting

Declare the parameters that control this example, define a zero-mean prior over uniformly spaced inputs, and compute the corresponding covariance matrix.

.. code-block:: python

    num_points = 1000
    str_cov = 'se'
    int_init = 1
    int_iter = 50
    int_ts = 100

    list_Y_min = []

    X = np.expand_dims(np.linspace(-5, 5, num_points), axis=1)
    mu = np.zeros(num_points)
    hyps = utils_covariance.get_hyps(str_cov, 1)
    Sigma = covariance.cov_main(str_cov, X, X, hyps, True)
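
As a quick sanity check (not part of the original example), we can print the shapes of the quantities that define the prior: ``X`` contains the one-dimensional inputs, ``mu`` is the zero mean vector, and ``Sigma`` is the covariance matrix over all inputs.

.. code-block:: python

    # Optional shape check: X is (num_points, 1), mu is (num_points, ),
    # and Sigma is (num_points, num_points).
    print(X.shape)
    print(mu.shape)
    print(Sigma.shape)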

Optimize the function sampled from the Gaussian process prior.
At each iteration, we query the point that minimizes a function sampled from the Gaussian process posterior.

.. code-block:: python

    for ind_ts in range(0, int_ts):
        print('TS:', ind_ts + 1, 'iteration')

        Y = gp.sample_functions(mu, Sigma, num_samples=1)[0]

        ind_init = np.argmin(Y)
        bx_min = X[ind_init]
        y_min = Y[ind_init]

        ind_random = np.random.choice(num_points)
        X_ = np.expand_dims(X[ind_random], axis=0)
        Y_ = np.expand_dims(np.expand_dims(Y[ind_random], axis=0), axis=1)

        for ind_iter in range(0, int_iter):
            print(ind_iter + 1, 'iteration')

            mu_, sigma_, Sigma_ = gp.predict_optimized(X_, Y_, X, str_cov=str_cov)
            ind_ = np.argmin(gp.sample_functions(np.squeeze(mu_, axis=1), Sigma_, num_samples=1)[0])

            X_ = np.concatenate([X_, [X[ind_]]], axis=0)
            Y_ = np.concatenate([Y_, [[Y[ind_]]]], axis=0)

        list_Y_min.append(Y_ - y_min)

    Ys = np.array(list_Y_min)
    Ys = np.squeeze(Ys, axis=2)
    print(Ys.shape)
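
Each row of ``Ys`` stores the gap ``Y_ - y_min`` between the queried values and the global minimum of that round's sampled function, so every entry is non-negative. As an optional check (not in the original example), the running minimum along each row is non-increasing and corresponds to the minimum regret plotted below.

.. code-block:: python

    # Optional check: running minimum of the gap to the global minimum.
    # regrets has shape (int_ts, int_iter + 1) and is non-increasing along axis 1.
    regrets = np.minimum.accumulate(Ys, axis=1)
    print(np.mean(regrets, axis=0)[:5])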

Plot the result obtained from the code block above.

.. code-block:: python

    utils_plotting.plot_minimum(np.array([Ys]), ['TS'], 1, True,
                                is_tex=True, range_shade=1.0,
                                str_x_axis=r'\textrm{Iteration}',
                                str_y_axis=r'\textrm{Minimum regret}')

.. image:: ../_static/examples/ts_gp_prior.*
    :width: 320
    :align: center
    :alt: ts_gp_prior

Full code:

.. code-block:: python

    import numpy as np

    from bayeso import gp
    from bayeso import covariance
    from bayeso.utils import utils_covariance
    from bayeso.utils import utils_plotting

    num_points = 1000
    str_cov = 'se'
    int_init = 1
    int_iter = 50
    int_ts = 100

    list_Y_min = []

    X = np.expand_dims(np.linspace(-5, 5, num_points), axis=1)
    mu = np.zeros(num_points)
    hyps = utils_covariance.get_hyps(str_cov, 1)
    Sigma = covariance.cov_main(str_cov, X, X, hyps, True)

    for ind_ts in range(0, int_ts):
        print('TS:', ind_ts + 1, 'iteration')

        Y = gp.sample_functions(mu, Sigma, num_samples=1)[0]

        ind_init = np.argmin(Y)
        bx_min = X[ind_init]
        y_min = Y[ind_init]

        ind_random = np.random.choice(num_points)
        X_ = np.expand_dims(X[ind_random], axis=0)
        Y_ = np.expand_dims(np.expand_dims(Y[ind_random], axis=0), axis=1)

        for ind_iter in range(0, int_iter):
            print(ind_iter + 1, 'iteration')

            mu_, sigma_, Sigma_ = gp.predict_optimized(X_, Y_, X, str_cov=str_cov)
            ind_ = np.argmin(gp.sample_functions(np.squeeze(mu_, axis=1), Sigma_, num_samples=1)[0])

            X_ = np.concatenate([X_, [X[ind_]]], axis=0)
            Y_ = np.concatenate([Y_, [[Y[ind_]]]], axis=0)

        list_Y_min.append(Y_ - y_min)

    Ys = np.array(list_Y_min)
    Ys = np.squeeze(Ys, axis=2)
    print(Ys.shape)

    utils_plotting.plot_minimum(np.array([Ys]), ['TS'], 1, True,
                                is_tex=True, range_shade=1.0,
                                str_x_axis=r'\textrm{Iteration}',
                                str_y_axis=r'\textrm{Minimum regret}')
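
The run above is stochastic. If you want to reproduce a particular figure, one option (not shown in the original example, and assuming the sampling routines rely on NumPy's global random state) is to fix the random seed before the Thompson sampling loop.

.. code-block:: python

    # Optional, hypothetical addition: fix the seed for reproducibility
    # before the Thompson sampling loop.
    np.random.seed(42)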
1 change: 1 addition & 0 deletions docs/index.rst
@@ -33,6 +33,7 @@ The code can be found in `our GitHub repository <https://github.com/jungtaekkim/
:caption: Example:

example/gp
+example/ts_gp_prior
example/branin
example/hpo

4 changes: 1 addition & 3 deletions examples/99_notebooks/example_ts_gp_prior.ipynb
@@ -9,8 +9,6 @@
"%matplotlib inline\n",
"\n",
"import numpy as np\n",
"import os\n",
"import matplotlib.pyplot as plt\n",
"\n",
"from bayeso import gp\n",
"from bayeso import covariance\n",
@@ -80,7 +78,7 @@
"outputs": [],
"source": [
"utils_plotting.plot_minimum(np.array([Ys]), ['TS'], 1, True,\n",
" is_tex=True,\n",
" is_tex=True, range_shade=1.0,\n",
" str_x_axis=r'\\textrm{Iteration}',\n",
" str_y_axis=r'\\textrm{Minimum regret}')"
]
