Error from SMT --> ValueError: setting an array element with a sequence. #541
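For context, this `ValueError` is raised by NumPy whenever a sequence is written into a slot that expects a single scalar, which can happen inside a library when a hyperparameter array has an unexpected shape. A minimal, SMT-independent illustration of the error itself:

```python
import numpy as np

# Assigning a sequence into a single scalar element of a numeric
# array triggers the ValueError reported in this issue's title.
a = np.zeros(3)
try:
    a[0] = [1.0, 2.0]  # a length-2 sequence cannot fill one scalar slot
except ValueError as exc:
    print(exc)
```

This does not pinpoint where SMT raises it, but it shows the class of shape mismatch to look for (e.g. `theta0` versus `n_comp`).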
Hi. Thank you for reporting. I think we've got an issue here. How do you call KPLSK? In 2.4, we changed the default internal optimizer from COBYLA to TNC. The latter uses gradients; I guess that the error should go away if you switch back to COBYLA by using hyper_opt='Cobyla'.
Hi, thanks for getting back, here’s how I’m calling it:
self.surrogate = KPLSK(print_global=False,
                       n_comp=num_params,
                       theta0=t0s,
                       print_prediction=False,
                       corr='squar_exp')
Here I'm setting the number of principal components to the number of input parameters; theta0 has the same dimension as well.
So I do see a bunch of COBYLA failures along with this message:

fmin_cobyla failed but the best value is retained
Optimization failed. Try increasing the ``nugget``

That is what gets printed before the error. So it sounds like I should go back to TNC, correct?
Val.
Answers below.
FYI, I will be out tomorrow and all of next week, so my responses will be delayed.
Val.
1. The point of using KPLS or KPLSK is to choose n_comp < num_params to get actual dimension reduction; otherwise you'd be better off using KRG. What is the value of num_params? What is the shape of your training data (n_samples, n_dim)? What is the shape/value of t0s?
ANS: I figured n_comp is the number of principal components to retain; the default was just to set it to num_params (I will change this), but you're right, we should change it. num_params is the size of the domain, which can vary from 2 to 30. theta0 has the same dimension as the number of params. We're currently using discrete value indexes, so this defaults to 1.0.
2. What version of SMT worked for you before?
ANS: 2.0.1
3. Did you try increasing the nugget as suggested (option nugget=1e-8) when you test with COBYLA?
ANS: I did try, but I still saw errors; I will re-run to verify.
4. If you go back to TNC, you get an error, correct? At the moment I cannot reproduce the error you've got, so without an actual example that reproduces it, it is difficult to help you further.
ANS: If I set hyper_opt to TNC, I still see:
Optimization failed. Try increasing the ``nugget``
fmin_cobyla failed but the best value is retained
But it doesn’t crash.
KPLSK(print_global=False,
      hyper_opt='TNC',
      n_comp=num_params,
      theta0=t0s,
      print_prediction=False,
      corr='squar_exp')
(From the original issue report:) After upgrading to version 2.4.0, I'm seeing the following error:

ValueError: setting an array element with a sequence.

Is this a real issue, or am I incorrectly using KPLS?