From 2cb53511e7f7ebae6f096aa0fcab145a2eb62978 Mon Sep 17 00:00:00 2001
From: tmigot
Date: Fri, 12 Aug 2022 08:27:48 -0400
Subject: [PATCH] Try trigger deploy

---
 tutorials/introduction-to-nlpmodelsjump/index.jmd | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tutorials/introduction-to-nlpmodelsjump/index.jmd b/tutorials/introduction-to-nlpmodelsjump/index.jmd
index 7053808..3939a0f 100644
--- a/tutorials/introduction-to-nlpmodelsjump/index.jmd
+++ b/tutorials/introduction-to-nlpmodelsjump/index.jmd
@@ -12,7 +12,7 @@ here only the documention specific to NLPModelsJuMP.
 `MathOptNLPModel` is a simple yet efficient model. It uses JuMP to define the
 problem, and can be accessed through the NLPModels API.
 
-An advantage of `MathOptNLPModel` over simpler models such as [`ADNLPModels`](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl) is that
+An advantage of `MathOptNLPModel` over models such as [`ADNLPModels`](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl) is that
 they provide sparse derivates.
 
 Let's define the famous Rosenbrock function
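For reference, the tutorial section touched by this hunk builds toward wrapping a JuMP model in the NLPModels API. Below is a minimal sketch of that usage, assuming the standard Rosenbrock formulation and an illustrative starting point of (-1.2, 1.0); the variable names are not taken from the patch itself.

```julia
using JuMP, NLPModels, NLPModelsJuMP

# Rosenbrock: f(x) = (x1 - 1)^2 + 100 * (x2 - x1^2)^2
model = Model()
@variable(model, x[i = 1:2], start = [-1.2; 1.0][i])  # illustrative starting point
@NLobjective(model, Min, (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2)

# Wrap the JuMP model so it can be queried through the NLPModels API
nlp = MathOptNLPModel(model)

obj(nlp, nlp.meta.x0)   # objective value at the starting point
grad(nlp, nlp.meta.x0)  # gradient at the starting point
```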