Merge pull request #611 from vaerksted/master
fix typos
Vaibhavdixit02 committed Oct 13, 2023
2 parents f0c4446 + d0a18d7 commit 6d625af
Showing 4 changed files with 7 additions and 7 deletions.
2 changes: 1 addition & 1 deletion docs/src/optimization_packages/metaheuristics.md
@@ -38,7 +38,7 @@ Each optimizer sets default settings based on the optimization problem, but spec

Additionally, `Metaheuristics` common settings which would be defined by [`Metaheuristics.Options`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Options) can be simply passed as special keyword arguments to `solve` without the need to use the `Metaheuristics.Options` struct.

-Lastly, information about the optimization problem such as the true optimum is set via [`Metaheuristics.Information`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Information) and passed as part of the optimizer struct to `solve` e.g., `solve(prob, ECA(information=Metaheuristics.Inoformation(f_optimum = 0.0)))`
+Lastly, information about the optimization problem such as the true optimum is set via [`Metaheuristics.Information`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Information) and passed as part of the optimizer struct to `solve` e.g., `solve(prob, ECA(information=Metaheuristics.Information(f_optimum = 0.0)))`

The currently available algorithms and their parameters are listed [here](https://jmejia8.github.io/Metaheuristics.jl/stable/algorithms/).
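
For illustration, a minimal sketch of the usage the corrected line describes (the objective, bounds, and iteration count are placeholders, not part of this commit, and assume a bounded two-variable problem):

```julia
using Optimization, OptimizationMetaheuristics

# Placeholder objective; any f(x, p) works.
rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
prob = OptimizationProblem(rosenbrock, zeros(2); lb = [-1.0, -1.0], ub = [2.0, 2.0])

# Problem information goes through the optimizer struct, while common
# `Metaheuristics.Options` settings pass as plain keyword arguments to `solve`.
sol = solve(prob, ECA(information = Metaheuristics.Information(f_optimum = 0.0));
            maxiters = 100)
```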

2 changes: 1 addition & 1 deletion docs/src/optimization_packages/optim.md
@@ -196,7 +196,7 @@ Gradient-based optimizers are optimizers which utilize the gradient information
* `precondprep = (P, x) -> nothing`
- [`Optim.BFGS()`](https://julianlsolvers.github.io/Optim.jl/stable/#algo/lbfgs/): **Broyden-Fletcher-Goldfarb-Shanno algorithm**

- + `solve(problem, BFGS(alpaguess, linesearch, initial_invH, initial_stepnorm, manifold))`
+ + `solve(problem, BFGS(alphaguess, linesearch, initial_invH, initial_stepnorm, manifold))`

+ `alphaguess` computes the initial step length (for more information, consult [this source](https://github.com/JuliaNLSolvers/LineSearches.jl) and [this example](https://julianlsolvers.github.io/LineSearches.jl/latest/examples/generated/optim_initialstep.html))

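As a hedged sketch of the `BFGS` keywords named in the corrected line above (the objective, starting point, AD backend, and line-search choices are illustrative assumptions, not part of this commit):

```julia
using Optimization, OptimizationOptimJL, LineSearches

rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2))

# `alphaguess` and `linesearch` come from LineSearches.jl, as the docs note.
sol = solve(prob,
            BFGS(alphaguess = LineSearches.InitialStatic(),
                 linesearch = LineSearches.BackTracking(),
                 initial_stepnorm = 0.01))
```
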
8 changes: 4 additions & 4 deletions lib/OptimizationMOI/src/nlp.jl
@@ -189,7 +189,7 @@ function MOI.eval_objective_gradient(evaluator::MOIOptimizationNLPEvaluator, G,
if evaluator.f.grad === nothing
error("Use OptimizationFunction to pass the objective gradient or " *
"automatically generate it with one of the autodiff backends." *
-"If you are using the ModelingToolkit sybolic interface, pass the `grad` kwarg set to `true` in `OptimizationProblem`.")
+"If you are using the ModelingToolkit symbolic interface, pass the `grad` kwarg set to `true` in `OptimizationProblem`.")
end
evaluator.f.grad(G, x)
return
@@ -213,7 +213,7 @@ function MOI.eval_constraint_jacobian(evaluator::MOIOptimizationNLPEvaluator, j,
elseif evaluator.f.cons_j === nothing
error("Use OptimizationFunction to pass the constraints' jacobian or " *
"automatically generate i with one of the autodiff backends." *
-"If you are using the ModelingToolkit sybolic interface, pass the `cons_j` kwarg set to `true` in `OptimizationProblem`.")
+"If you are using the ModelingToolkit symbolic interface, pass the `cons_j` kwarg set to `true` in `OptimizationProblem`.")
end
evaluator.f.cons_j(evaluator.J, x)
if evaluator.J isa SparseMatrixCSC
@@ -276,7 +276,7 @@ function MOI.eval_hessian_lagrangian(evaluator::MOIOptimizationNLPEvaluator{T},
if evaluator.f.hess === nothing
error("Use OptimizationFunction to pass the objective hessian or " *
"automatically generate it with one of the autodiff backends." *
-"If you are using the ModelingToolkit sybolic interface, pass the `hess` kwarg set to `true` in `OptimizationProblem`.")
+"If you are using the ModelingToolkit symbolic interface, pass the `hess` kwarg set to `true` in `OptimizationProblem`.")
end
fill!(h, zero(T))
k = 0
@@ -303,7 +303,7 @@ function MOI.eval_hessian_lagrangian(evaluator::MOIOptimizationNLPEvaluator{T},
if evaluator.f.cons_h === nothing
error("Use OptimizationFunction to pass the constraints' hessian or " *
"automatically generate it with one of the autodiff backends." *
-"If you are using the ModelingToolkit sybolic interface, pass the `cons_h` kwarg set to `true` in `OptimizationProblem`.")
+"If you are using the ModelingToolkit symbolic interface, pass the `cons_h` kwarg set to `true` in `OptimizationProblem`.")
end
evaluator.f.cons_h(evaluator.cons_H, x)
for (μi, Hi) in zip(μ, evaluator.cons_H)
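
All four error branches touched above point to the same remedy: construct the `OptimizationFunction` with an AD backend (or via the ModelingToolkit kwargs) so that `grad`, `cons_j`, `hess`, and `cons_h` exist. A hedged sketch of that pattern; the constraint, bounds, and the choice of Ipopt as the MOI solver are illustrative assumptions:

```julia
using Optimization, OptimizationMOI, Ipopt

rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
cons(res, x, p) = (res .= [x[1]^2 + x[2]^2])  # illustrative single constraint

# The AD backend supplies grad, hess, cons_j, and cons_h, so none of the
# error branches above are hit when the MOI evaluator asks for derivatives.
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = cons)
prob = OptimizationProblem(optf, [0.5, 0.5]; lcons = [-Inf], ucons = [1.0])
sol = solve(prob, Ipopt.Optimizer())
```
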
2 changes: 1 addition & 1 deletion src/adtypes.jl
@@ -101,7 +101,7 @@ OptimizationFunction(f, AutoModelingToolkit(); kwargs...)
This uses the [ModelingToolkit.jl](https://github.com/SciML/ModelingToolkit.jl)
package's `modelingtookitize` functionality to generate the derivatives and other fields of an `OptimizationFunction`.
This backend creates the symbolic expressions for the objective and its derivatives as well as
-the constraints and their derivatives. Through `structural_simplify`, it enforces symplifications
+the constraints and their derivatives. Through `structural_simplify`, it enforces simplifications
that can reduce the number of operations needed to compute the derivatives of the constraints. This automatically
generates the expression graphs that some solver interfaces through OptimizationMOI like
[AmplNLWriter.jl](https://github.com/jump-dev/AmplNLWriter.jl) require.
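
A hedged sketch of the `AutoModelingToolkit` usage this docstring documents; the objective, parameters, and the Newton solver are placeholders, and whether `ModelingToolkit` must be loaded explicitly depends on the Optimization.jl version:

```julia
using Optimization, ModelingToolkit, OptimizationOptimJL

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

# The objective (and any constraints) are traced symbolically; the simplified
# derivative expressions are what symbolic-expression solver interfaces consume.
optf = OptimizationFunction(rosenbrock, Optimization.AutoModelingToolkit())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])
sol = solve(prob, Newton())  # uses the symbolically generated gradient and Hessian
```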
