This repository has been archived by the owner on Feb 28, 2024. It is now read-only.

ValueError: iterator is too large #773

Open
smdubarry opened this issue Jun 11, 2019 · 1 comment

Comments


smdubarry commented Jun 11, 2019

Optimizing in a 10-dimensional space, after telling the optimizer about 460 or so points, I am getting this error:

>>> res = opt.tell((948.15646, -617.33538, -597.3366, -234.28106, -311.52753, -61.03256, 56.78082, -27.60636, -304.24363, 833.75898), -7.97135426448522)
Traceback (most recent call last):
  File "<pyshell#2>", line 1, in <module>
    res = opt.tell((948.15646, -617.33538, -597.3366, -234.28106, -311.52753, -61.03256, 56.78082, -27.60636, -304.24363, 833.75898), -7.97135426448522)
  File "C:\Users\smdubarry\AppData\Local\Programs\Python\Python37-32\lib\site-packages\skopt\optimizer\optimizer.py", line 443, in tell
    return self._tell(x, y, fit=fit)
  File "C:\Users\smdubarry\AppData\Local\Programs\Python\Python37-32\lib\site-packages\skopt\optimizer\optimizer.py", line 502, in _tell
    acq_func_kwargs=self.acq_func_kwargs)
  File "C:\Users\smdubarry\AppData\Local\Programs\Python\Python37-32\lib\site-packages\skopt\acquisition.py", line 50, in _gaussian_acquisition
    func_and_grad = gaussian_ei(X, model, y_opt, xi, return_grad)
  File "C:\Users\smdubarry\AppData\Local\Programs\Python\Python37-32\lib\site-packages\skopt\acquisition.py", line 276, in gaussian_ei
    mu, std = model.predict(X, return_std=True)
  File "C:\Users\smdubarry\AppData\Local\Programs\Python\Python37-32\lib\site-packages\skopt\learning\gaussian_process\gpr.py", line 324, in predict
    y_var -= np.einsum("ki,kj,ij->k", K_trans, K_trans, K_inv)
  File "C:\Users\smdubarry\AppData\Local\Programs\Python\Python37-32\lib\site-packages\numpy\core\einsumfunc.py", line 1346, in einsum
    return c_einsum(*operands, **kwargs)
ValueError: iterator is too large

Initial optimizer is:

opt = Optimizer(
    dimensions=[(-1000.0, 1000.0)] * 10,
    acq_func="EI",
    n_random_starts=9,
)

Am I just trying to do too much?
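For reference, the line that fails computes the predictive-variance correction inside the GP's `predict`. The three-operand einsum iterates over an n_candidates × n_train × n_train space, which grows quadratically with the number of told points, and the reporter's paths show a 32-bit Python (`Python37-32`), where NumPy's internal iterator limit is much smaller. A minimal NumPy-only sketch of that computation (the sizes here are assumptions for illustration), together with an equivalent matrix-product form that avoids the large iterator:

```python
import numpy as np

# Hypothetical sizes: the einsum in the traceback contracts
# K_trans (n_candidates x n_train) with K_inv (n_train x n_train).
n_candidates, n_train = 100, 460

rng = np.random.default_rng(0)
K_trans = rng.standard_normal((n_candidates, n_train))
K_inv = rng.standard_normal((n_train, n_train))

# The failing line from skopt's gpr.py: for each query point k,
# compute K_trans[k] @ K_inv @ K_trans[k] (subtracted from the
# prior variance in the real code).
y_var_correction = np.einsum("ki,kj,ij->k", K_trans, K_trans, K_inv)

# Equivalent result via plain matrix products, which only ever
# materialises 2-D intermediates.
equivalent = np.sum((K_trans @ K_inv) * K_trans, axis=1)
```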

@chnzhangrui

Hi,

Yes, this is the Gaussian process struggling at that scale. Fitting a Gaussian process gets more expensive as the number of known points increases: it updates the covariance matrix based on all past measurements, so every `tell` gets slower as more measurements accumulate.
Personally, I would suggest dividing your ten-dimensional space into orthogonal subspaces. If you know that some hyperparameters don't rely on the others, you can optimise them separately, each with fewer iterations.
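A minimal sketch of that subspace idea, using a plain random search in place of skopt so it runs anywhere (the separable objective and the 5-D/5-D split are hypothetical, purely to illustrate optimising independent coordinate groups separately):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical separable objective: the first 5 coordinates do not
# interact with the last 5, so f(x) = f1(x[:5]) + f2(x[5:]).
def f1(x):
    return float(np.sum((x - 100.0) ** 2))

def f2(x):
    return float(np.sum((x + 200.0) ** 2))

def random_search(objective, dim, n_iter=2000, low=-1000.0, high=1000.0):
    """Minimise `objective` over [low, high]^dim by uniform sampling."""
    best_x, best_y = None, np.inf
    for _ in range(n_iter):
        x = rng.uniform(low, high, size=dim)
        y = objective(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

# Optimise each 5-D subspace on its own, then concatenate: each
# search explores a 5-D space instead of the full 10-D one.
x_a, y_a = random_search(f1, dim=5)
x_b, y_b = random_search(f2, dim=5)
x_full = np.concatenate([x_a, x_b])
total = y_a + y_b  # equals f(x_full) because the objective is separable
```

With skopt, the same pattern would mean running one Optimizer per subspace; each model then only ever sees its own subspace's points, keeping the covariance matrices small.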
