9 changes: 9 additions & 0 deletions docs/src/tutorials/intro.md
@@ -44,6 +44,15 @@ The solution from the original solver can always be obtained via `original`:
sol.original
```

## Defining the objective function

Optimization.jl assumes that your objective function takes two arguments, `objective(x, p)`:

 1. The optimization variables `x`.
 2. Other parameters `p`, such as hyperparameters of the cost function.

If you have no “other parameters”, you can safely disregard this argument. If your objective function was defined by someone else and only takes `x`, you can wrap it in an anonymous function that simply discards the extra argument:
```julia
obj = (x, p) -> objective(x) # Pass this function into OptimizationFunction
```
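
For context, here is a minimal sketch of the two-argument form in a complete workflow, using the Rosenbrock function as the objective. The solver and AD backend chosen here are illustrative assumptions, not requirements of this section:

```julia
using Optimization, OptimizationOptimJL

# Objective in (x, p) form: p carries the Rosenbrock constants.
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

x0 = zeros(2)    # initial guess for the optimization variables
p = [1.0, 100.0] # the "other parameters", forwarded to the objective

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, x0, p)
sol = solve(prob, BFGS())
```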

## Controlling Gradient Calculations (Automatic Differentiation)

Notice that both of the above methods were derivative-free methods, and thus no