diff --git a/tutorials/introduction-to-nlpmodelsjump/index.jmd b/tutorials/introduction-to-nlpmodelsjump/index.jmd
index 7053808..3939a0f 100644
--- a/tutorials/introduction-to-nlpmodelsjump/index.jmd
+++ b/tutorials/introduction-to-nlpmodelsjump/index.jmd
@@ -12,7 +12,7 @@ here only the documention specific to NLPModelsJuMP.

 `MathOptNLPModel` is a simple yet efficient model. It uses JuMP to define the
 problem, and can be accessed through the NLPModels API.
-An advantage of `MathOptNLPModel` over simpler models such as [`ADNLPModels`](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl) is that
+An advantage of `MathOptNLPModel` over models such as [`ADNLPModels`](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl) is that
 they provide sparse derivates.

 Let's define the famous Rosenbrock function