Changes from version 1.6.0beta to 1.7.0beta
A Python tool was added that facilitates the creation of structured data. If you find yourself pondering over indices and ordering far too often, and writing code with magic numbers and interleaved slices, consider adopting the structure tool: tutorial, example.
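To illustrate the kind of bookkeeping the structure tool removes, here is a minimal sketch in plain Python (deliberately *not* the CasADi structure API; all names such as `layout` and `offsets` are hypothetical): a flat vector read with magic-number slices versus named fields whose offsets are computed once.

```python
# Without a structure: magic numbers and interleaved slices.
# Where does "position" start and end? Only the author knows.
flat = [1.0, 2.0, 3.0, 10.0, 20.0, 0.5]
position = flat[0:3]
velocity = flat[3:5]
mass     = flat[5:6]

# With a declared structure: named fields, offsets computed once.
layout = [("position", 3), ("velocity", 2), ("mass", 1)]
offsets, start = {}, 0
for name, size in layout:
    offsets[name] = slice(start, start + size)
    start += size

assert flat[offsets["position"]] == [1.0, 2.0, 3.0]
assert flat[offsets["velocity"]] == [10.0, 20.0]
```

Reordering or resizing a field now means editing `layout` in one place, instead of hunting down every slice index; the CasADi tool applies the same idea to symbolic variables.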
It is now required that the MX inputs to MXFunctions be symbolic primitives. Mappings of symbolics (such as those obtained with vertcat) are no longer allowed. The structure tool from the bullet above makes it easy to do this correctly.
The input/output signature of NLPSolver and derived classes has changed, see #566. You can use the conversion script located at `misc/update_nlp_solver_signature.sh` to update your existing code. To run it, issue

```shell
sh <path_to_casadi>/misc/update_nlp_solver_signature.sh
```

in the root of the directory that needs to be updated.
Input/output scheme entries are now referenced by string names instead of enum constants:

```python
f = MXFunction(daeIn(x=x,p=p),daeOut(ode=f))
f.init()
f.setInput(x_value,"x") # instead of "f.setInput(x_value,DAE_X)"
f.setInput(p_value,"p") # instead of "f.setInput(p_value,DAE_P)"
```
Although the old syntax will continue to work, changing to the new syntax will hopefully make your code more readable. You can use the conversion script located at `misc/update_ioscheme_names.sh` to update your existing code. To run it, issue

```shell
sh <path_to_casadi>/misc/update_ioscheme_names.sh
```

in the root of the directory that needs to be updated. Note that the signature of the NLP solver will only be updated if the script `update_nlp_solver_signature.sh` above has already been applied.
- There is now a new, preferred way of passing NLPs to the NLP solvers. You are strongly advised to pass both the objective function and the constraints together in one function:
```python
my_x = ...  # expressions for the variables
my_p = ...  # expressions for parameters (if any)
my_f = ...  # expression for the objective function
my_g = ...  # expression for the constraint function (if any)
nlp = MXFunction(nlpIn(x=my_x,p=my_p),nlpOut(f=my_f,g=my_g))
solver = IpoptSolver(nlp)
```
The old syntax is still valid, but internally it will simply be reformulated into the form above, which may be suboptimal.
User-provided objective gradients, Lagrangian Hessians and constraint Jacobians are no longer passed via the constructor, but via options, e.g. `solver.setOption("hess_lag",h)`. Typically, however, you will want to rely on the automatic generation of Jacobians and Hessians.
The default options for the NLP solvers have changed: they now match the defaults of the corresponding underlying solvers. In particular, IPOPT's default for "hessian_approximation" is "exact", so the IPOPT interface will now use an exact Hessian unless you explicitly set "hessian_approximation" to "limited-memory". The option "generate_hessian" is also gone, and setting it should trigger a warning. The logic is now that if the NLP solver (e.g. IPOPT) requests an exact Hessian, one will be generated automatically.
- Major update to the WORHP NLP solver interface