
code for bayesian_optimization.gif #18

Closed
junpenglao opened this issue Mar 22, 2016 · 12 comments

Comments

@junpenglao

Hi there,
Could you please provide the code and the data for generating bayesian_optimization.gif? It would be nice to have more information about this.

@fmfn
Member

fmfn commented Mar 22, 2016

I didn't write the code thinking about sharing it, therefore it is messy and definitely not worth being a part of this repo.

I can tell that the idea is exactly the same as the one carried out in this example notebook, but for a two-dimensional target.

If you really want it I can send you the code by email.

@junpenglao
Author

I think this is actually a really nice demo - if you send me the code I can try to tidy it up and create a notebook.
My email is Junpeng.lao@unifr.ch

@SalemAmeen

Could you please send it to me as well, s.a.ameen@edu.salford.ac.uk

@fmfn
Member

fmfn commented Mar 23, 2016

No problem, I'll do it later today.

@davidenitti

I'm also interested in the plotting. Can you post the plotting code here (if you don't want to put it in the repo)?

@fmfn
Member

fmfn commented Mar 24, 2016

There you go. If you copy and paste this into jupyter it should work, triple line skips are cell breaks.

ps: I said it wasn't pretty.

from bayes_opt import BayesianOptimization
import numpy as np

import matplotlib.pyplot as plt
from matplotlib import cm
from matplotlib import mlab
from matplotlib import gridspec
%matplotlib inline



def unique_rows(a):
    """
    A function to trim repeated rows that may appear when optimizing.
    This is necessary to keep the sklearn GP object from breaking

    :param a: array to trim repeated rows from

    :return: mask of unique rows
    """

    # Sort array and keep track of where things should go back to
    order = np.lexsort(a.T)
    reorder = np.argsort(order)

    a = a[order]
    diff = np.diff(a, axis=0)
    ui = np.ones(len(a), 'bool')
    ui[1:] = (diff != 0).any(axis=1)

    return ui[reorder]



def target(x, y):
    a = np.exp(-( (x - 2)**2/0.7 + (y - 4)**2/1.2) + (x - 2)*(y - 4)/1.6 )
    b = np.exp(-( (x - 4)**2/3 + (y - 2)**2/2.) )
    c = np.exp(-( (x - 4)**2/0.5 + (y - 4)**2/0.5) + (x - 4)*(y - 4)/0.5 )
    d = np.sin(3.1415 * x)
    e = np.exp(-( (x - 5.5)**2/0.5 + (y - 5.5)**2/.5) )
    return 2*a + b - c + 0.17 * d + 2*e



n = 1e5
x = y = np.linspace(0, 6, 300)
X, Y = np.meshgrid(x, y)
x = X.ravel()
y = Y.ravel()
X = np.vstack([x, y]).T[:, [1, 0]]
z = target(x, y)



print(max(z))
print(min(z))



fig, axis = plt.subplots(1, 1, figsize=(14, 10))
gridsize=150

im = axis.hexbin(x, y, C=z, gridsize=gridsize, cmap=cm.jet, bins=None, vmin=-0.9, vmax=2.1)
axis.axis([x.min(), x.max(), y.min(), y.max()])

cb = fig.colorbar(im, )
cb.set_label('Value')



def posterior(bo, X):
    ur = unique_rows(bo.X)
    bo.gp.fit(bo.X[ur], bo.Y[ur])
    mu, sigma2 = bo.gp.predict(X, eval_MSE=True)
    return mu, np.sqrt(sigma2), bo.util.utility(X, bo.gp, bo.Y.max())

def plot_2d(name=None):

    mu, s, ut = posterior(bo, X)

    fig, ax = plt.subplots(2, 2, figsize=(14, 10))
    gridsize=150

    # fig.suptitle('Bayesian Optimization in Action', fontdict={'size':30})

    # GP regression output
    ax[0][0].set_title('Gaussian Process Predicted Mean', fontdict={'size':15})
    im00 = ax[0][0].hexbin(x, y, C=mu, gridsize=gridsize, cmap=cm.jet, bins=None, vmin=-0.9, vmax=2.1)
    ax[0][0].axis([x.min(), x.max(), y.min(), y.max()])
    ax[0][0].plot(bo.X[:, 1], bo.X[:, 0], 'D', markersize=4, color='k', label='Observations')

    ax[0][1].set_title('Target Function', fontdict={'size':15})
    im10 = ax[0][1].hexbin(x, y, C=z, gridsize=gridsize, cmap=cm.jet, bins=None, vmin=-0.9, vmax=2.1)
    ax[0][1].axis([x.min(), x.max(), y.min(), y.max()])
    ax[0][1].plot(bo.X[:, 1], bo.X[:, 0], 'D', markersize=4, color='k')


    ax[1][0].set_title('Gaussian Process Variance', fontdict={'size':15})
    im01 = ax[1][0].hexbin(x, y, C=s, gridsize=gridsize, cmap=cm.jet, bins=None, vmin=0, vmax=1)
    ax[1][0].axis([x.min(), x.max(), y.min(), y.max()])

    ax[1][1].set_title('Acquisition Function', fontdict={'size':15})
    im11 = ax[1][1].hexbin(x, y, C=ut, gridsize=gridsize, cmap=cm.jet, bins=None, vmin=0, vmax=8)

    # Mark the acquisition maximum with crosshairs (grid index/50 maps back to the [0, 6] axes)
    max_row, max_col = np.where(ut.reshape((300, 300)) == ut.max())

    ax[1][1].plot([max_col/50., max_col/50.], [0, 6], 'k-', lw=2)
    ax[1][1].plot([0, 6], [max_row/50., max_row/50.], 'k-', lw=2)

    ax[1][1].axis([x.min(), x.max(), y.min(), y.max()])

    for im, axis in zip([im00, im10, im01, im11], ax.flatten()):
        cb = fig.colorbar(im, ax=axis)
        # cb.set_label('Value')

    if name is None:
        name = '_'

    plt.tight_layout()

    # Save or show figure?
    # fig.savefig('bo_eg_' + name + '.png')
    plt.show()
    plt.close(fig)



bo = BayesianOptimization(target, {'x': (0, 6), 'y': (0, 6)})



gp_params = {'corr': 'absolute_exponential', 'nugget': 1e-9}
bo.maximize(init_points=5, n_iter=0, acq='ucb', kappa=10, **gp_params)



plot_2d("{:03}".format(len(bo.X)))



# Turn interactive plotting off
plt.ioff()

for i in range(50):
    bo.maximize(init_points=0, n_iter=1, acq='ucb', kappa=10, **gp_params)
    plot_2d("{:03}".format(len(bo.X)))
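
For anyone puzzled by the `unique_rows` helper: the lexsort-based mask can be sanity-checked on a small array with a duplicate row (a minimal sketch using only numpy; the probe array is made up for illustration):

```python
import numpy as np

def unique_rows(a):
    # Sort rows lexicographically and remember how to undo the sort
    order = np.lexsort(a.T)
    reorder = np.argsort(order)
    a = a[order]
    # A row is "new" if it differs from its predecessor in the sorted view
    diff = np.diff(a, axis=0)
    ui = np.ones(len(a), 'bool')
    ui[1:] = (diff != 0).any(axis=1)
    # Map the mask back to the original row order
    return ui[reorder]

pts = np.array([[1., 2.],
                [3., 4.],
                [1., 2.],   # duplicate of the first row
                [5., 6.]])
mask = unique_rows(pts)
print(mask)        # [ True  True False  True]
print(pts[mask])   # the three distinct rows, original order preserved
```

Because `np.lexsort` is stable, the first occurrence of each duplicate is the one that survives.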

@junpenglao
Author

Thanks a lot (I don't think it's messy at all ;-) )

@flavio-martinelli

flavio-martinelli commented May 17, 2020

This example code no longer works with the new version; attributes such as bo.X and bo.Y no longer exist in the BayesianOptimization class.

Is there any quick change to get this code working again? Thank you, that'd be great!

@cornerfarmer

This code works for me with the newest version:

from bayes_opt import BayesianOptimization, UtilityFunction
import numpy as np

import matplotlib.pyplot as plt
from matplotlib import cm
from matplotlib import mlab
from matplotlib import gridspec


def unique_rows(a):
    """
    A function to trim repeated rows that may appear when optimizing.
    This is necessary to keep the sklearn GP object from breaking

    :param a: array to trim repeated rows from

    :return: mask of unique rows
    """

    # Sort array and keep track of where things should go back to
    order = np.lexsort(a.T)
    reorder = np.argsort(order)

    a = a[order]
    diff = np.diff(a, axis=0)
    ui = np.ones(len(a), 'bool')
    ui[1:] = (diff != 0).any(axis=1)

    return ui[reorder]



def target(x, y):
    a = np.exp(-( (x - 2)**2/0.7 + (y - 4)**2/1.2) + (x - 2)*(y - 4)/1.6 )
    b = np.exp(-( (x - 4)**2/3 + (y - 2)**2/2.) )
    c = np.exp(-( (x - 4)**2/0.5 + (y - 4)**2/0.5) + (x - 4)*(y - 4)/0.5 )
    d = np.sin(3.1415 * x)
    e = np.exp(-( (x - 5.5)**2/0.5 + (y - 5.5)**2/.5) )
    return 2*a + b - c + 0.17 * d + 2*e



n = 1e5
x = y = np.linspace(0, 6, 300)
X, Y = np.meshgrid(x, y)
x = X.ravel()
y = Y.ravel()
X = np.vstack([x, y]).T[:, [1, 0]]
z = target(y, x)



print(max(z))
print(min(z))



fig, axis = plt.subplots(1, 1, figsize=(14, 10))
gridsize=150

im = axis.hexbin(x, y, C=z, gridsize=gridsize, cmap=cm.jet, bins=None, vmin=-0.9, vmax=2.1)
axis.axis([x.min(), x.max(), y.min(), y.max()])

cb = fig.colorbar(im, )
cb.set_label('Value')

util = UtilityFunction(kind='ucb',
                       kappa=10,
                       xi=0.0,
                       kappa_decay=1,
                       kappa_decay_delay=0)


def posterior(bo, X):
    ur = unique_rows(bo._space.params)
    bo._gp.fit(bo._space.params[ur], bo._space.target[ur])
    mu, sigma2 = bo._gp.predict(X, return_std=True)
    return mu, np.sqrt(sigma2), util.utility(X, bo._gp, bo._space.target.max())

def plot_2d(name=None):

    mu, s, ut = posterior(bo, X)
    #self._space.params, self._space.target
    fig, ax = plt.subplots(2, 2, figsize=(14, 10))
    gridsize=150

    # fig.suptitle('Bayesian Optimization in Action', fontdict={'size':30})

    # GP regression output
    ax[0][0].set_title('Gaussian Process Predicted Mean', fontdict={'size':15})
    im00 = ax[0][0].hexbin(x, y, C=mu, gridsize=gridsize, cmap=cm.jet, bins=None, vmin=-0.9, vmax=2.1)
    ax[0][0].axis([x.min(), x.max(), y.min(), y.max()])
    ax[0][0].plot(bo._space.params[:, 1], bo._space.params[:, 0], 'D', markersize=4, color='k', label='Observations')

    ax[0][1].set_title('Target Function', fontdict={'size':15})
    im10 = ax[0][1].hexbin(x, y, C=z, gridsize=gridsize, cmap=cm.jet, bins=None, vmin=-0.9, vmax=2.1)
    ax[0][1].axis([x.min(), x.max(), y.min(), y.max()])
    ax[0][1].plot(bo._space.params[:, 1], bo._space.params[:, 0], 'D', markersize=4, color='k')


    ax[1][0].set_title('Gaussian Process Variance', fontdict={'size':15})
    im01 = ax[1][0].hexbin(x, y, C=s, gridsize=gridsize, cmap=cm.jet, bins=None, vmin=0, vmax=1)
    ax[1][0].axis([x.min(), x.max(), y.min(), y.max()])

    ax[1][1].set_title('Acquisition Function', fontdict={'size':15})
    im11 = ax[1][1].hexbin(x, y, C=ut, gridsize=gridsize, cmap=cm.jet, bins=None, vmin=0, vmax=8)

    # Mark the acquisition maximum with crosshairs (grid index/50 maps back to the [0, 6] axes)
    max_row, max_col = np.where(ut.reshape((300, 300)) == ut.max())

    ax[1][1].plot([max_col/50., max_col/50.], [0, 6], 'k-', lw=2)
    ax[1][1].plot([0, 6], [max_row/50., max_row/50.], 'k-', lw=2)

    ax[1][1].axis([x.min(), x.max(), y.min(), y.max()])

    for im, axis in zip([im00, im10, im01, im11], ax.flatten()):
        cb = fig.colorbar(im, ax=axis)
        # cb.set_label('Value')

    if name is None:
        name = '_'

    plt.tight_layout()

    # Save or show figure?
    # fig.savefig('bo_eg_' + name + '.png')
    plt.show()
    plt.close(fig)



bo = BayesianOptimization(target, {'x': (0, 6), 'y': (0, 6)})


bo.maximize(init_points=5, n_iter=0, acq='ucb', kappa=10)



plot_2d("{:03}".format(len(bo._space.params)))



# Turn interactive plotting off
plt.ioff()

for i in range(50):
    bo.maximize(init_points=0, n_iter=1, acq='ucb', kappa=10)
    plot_2d("{:03}".format(len(bo._space.params)))

@Benjizhang

Benjizhang commented Jun 13, 2022

Hi, it should be noted that there is a bug in line 58 (if you paste the code into an IDE): it is not correct to write z = target(y, x); it should be z = target(x, y) (note the order of x and y when calling the target function).
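
Indeed, the target function is not symmetric in its arguments, so swapping x and y changes the plotted surface. A quick check (probe points chosen arbitrarily, numpy only):

```python
import numpy as np

def target(x, y):
    a = np.exp(-((x - 2)**2/0.7 + (y - 4)**2/1.2) + (x - 2)*(y - 4)/1.6)
    b = np.exp(-((x - 4)**2/3 + (y - 2)**2/2.))
    c = np.exp(-((x - 4)**2/0.5 + (y - 4)**2/0.5) + (x - 4)*(y - 4)/0.5)
    d = np.sin(3.1415 * x)
    e = np.exp(-((x - 5.5)**2/0.5 + (y - 5.5)**2/.5))
    return 2*a + b - c + 0.17*d + 2*e

# Different Gaussian bumps dominate depending on the argument order:
print(target(2.0, 4.0))   # ~2.04 (centred on the first bump)
print(target(4.0, 2.0))   # ~1.00 (centred on the second bump)
```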

@GiacomoDG96

Hi, the code no longer works and gives back the following error:

Exception:
Passing acquisition function parameters or gaussian process parameters to maximize
is no longer supported. Instead,please use the "set_gp_params" method to set
the gp params, and pass an instance of bayes_opt.util.UtilityFunction
using the acquisition_function argument

@till-m
Member

till-m commented Apr 10, 2024

Hi @GiacomoDG96,

please note that we're not maintaining this code.
That being said, replacing

bo = BayesianOptimization(target, {'x': (0, 6), 'y': (0, 6)})


bo.maximize(init_points=5, n_iter=0, acq='ucb', kappa=10)


plot_2d("{:03}".format(len(bo._space.params)))



# Turn interactive plotting off
plt.ioff()

for i in range(50):
    bo.maximize(init_points=0, n_iter=1, acq='ucb', kappa=10)
    plot_2d("{:03}".format(len(bo._space.params)))

with

from bayes_opt import UtilityFunction

utility = UtilityFunction(kind="ucb", kappa=10, xi=0.0)
bo = BayesianOptimization(target, {'x': (0, 6), 'y': (0, 6)})

bo.maximize(init_points=5, n_iter=0, acquisition_function=utility)

plot_2d("{:03}".format(len(bo._space.params)))

# Turn interactive plotting off
plt.ioff()

for i in range(50):
    bo.maximize(init_points=0, n_iter=1, acquisition_function=utility)
    plot_2d("{:03}".format(len(bo._space.params)))

should work, or at least get you close to running code.

@till-m till-m mentioned this issue Apr 17, 2024