Currently the parameter configuration is set by assigning lists of variable indices (self.integer, self.continuous) on the problem definition. This is a very cumbersome interface, especially when you have 15 parameters, each with a different range and a different type; it is very easy to make a mistake.

Furthermore, if the type of a parameter is left unassigned, there is no error or warning. Example:
self.integer = [0, 1, 2]
self.continuous = [4, 5]
Here the fourth parameter (index 3) is left without a type (integer or continuous), but no error or warning is produced and the algorithm simply treats it as continuous.
It would also be useful to be able to assign a name to each parameter.
I suggest adopting a JSON configuration file, similar to what other optimization algorithms use. Example:
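(The original example is not reproduced here; the following is only a hypothetical sketch, with made-up field names and ranges, of what such a file could look like.)

{
  "parameters": [
    {"name": "x0", "type": "integer",    "range": [0, 10]},
    {"name": "x1", "type": "integer",    "range": [1, 5]},
    {"name": "x2", "type": "continuous", "range": [0.0, 1.0]}
  ]
}

Every parameter would then carry its name, type, and range in one place, and a loader could refuse the file if any of the fields is missing.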
Most importantly, produce an error, or at least a warning, if not all of the configuration properties are set.

Thanks

There is a method for sanity-checking your optimization problem setup: it is named validate and is available in test_problems.py. I agree that this should be checked automatically by the strategy, and I'm working on adding some input checking to the strategies.
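As an illustration of the kind of automatic check being discussed, here is a minimal sketch (not the library's actual validate method). It assumes the problem object exposes the integer and continuous index lists from the snippet above, plus a dim attribute holding the number of parameters (a hypothetical name used only for this sketch):

def check_problem(problem):
    # Collect the declared variable types.
    integer = set(getattr(problem, "integer", []))
    continuous = set(getattr(problem, "continuous", []))

    # `dim` is assumed to hold the total number of parameters (hypothetical name).
    all_indices = set(range(problem.dim))

    # A parameter must not be declared both integer and continuous.
    overlap = integer & continuous
    if overlap:
        raise ValueError("Declared both integer and continuous: %s" % sorted(overlap))

    # Every parameter must have exactly one declared type.
    missing = all_indices - (integer | continuous)
    if missing:
        raise ValueError("No type declared for parameters: %s" % sorted(missing))

    # No type declarations for indices that do not exist.
    extra = (integer | continuous) - all_indices
    if extra:
        raise ValueError("Type declared for unknown parameter indices: %s" % sorted(extra))

With the example above (six parameters, dim = 6), this would raise an error pointing at index 3 instead of silently treating it as continuous.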