
Commit fff9002

mcreel and Michael Creel authored
fix for ML example (#956)
* fix for ML example
* ForwardDiff
* fix ML example

Co-authored-by: Michael Creel <michael.creel@uab.es>
1 parent 331a26f commit fff9002


2 files changed: +23, -25 lines changed


Project.toml

Lines changed: 1 addition & 0 deletions
@@ -5,6 +5,7 @@ version = "1.5.0"
 [deps]
 Compat = "34da2185-b29b-5c13-b0c7-acf172513d20"
 FillArrays = "1a297f60-69ca-5386-bcde-b61e274b549b"
+ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
 LineSearches = "d3d80556-e9d4-5f37-9878-2ab0fcc64255"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 LoopVectorization = "bdcacae8-1622-11e9-2a5c-532679323890"
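
Not shown in the diff: to reproduce this change locally, the new ForwardDiff dependency also has to be installed in the environment the example is run from. A minimal sketch, assuming a stock Julia REPL (the resolved version will depend on your registry):

using Pkg
Pkg.add("ForwardDiff")   # records the name and UUID in Project.toml under [deps]
Pkg.status()             # confirm ForwardDiff now appears among the dependencies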

docs/src/examples/maxlikenlm.jl

Lines changed: 22 additions & 25 deletions
@@ -21,6 +21,7 @@
 
 using Optim, NLSolversBase
 using LinearAlgebra: diag
+using ForwardDiff
 
 #md # !!! tip
 #md #     Add Optim with the following command at the Julia command prompt:
@@ -74,7 +75,7 @@ x = [ 1.0 0.156651 # X matrix of explanatory variables plus constant
       1.0  0.594971
       1.0  0.427909]
 
-ε = [0.5539830489065279   # Error variance
+ε = [0.5539830489065279   # Errors
     -0.7981494315544392
      0.12994853889935182
      0.23315434715658184
@@ -186,11 +187,6 @@ parameters = Optim.minimizer(opt)
 # Fieldnames for all of the quantities can be obtained with the following command:
 # fieldnames(opt)
 #
-# Since we paramaterized our likelihood to use the exponentiated
-# value, we need to exponentiate it to get back to our original log
-# scale:
-
-parameters[nvar+1] = exp(parameters[nvar+1])
 
 # In order to obtain the correct Hessian matrix, we have to "push" the
 # actual parameter values that maximizes the likelihood function since
@@ -199,30 +195,31 @@ parameters[nvar+1] = exp(parameters[nvar+1])
 
 numerical_hessian = hessian!(func,parameters)
 
-# We can now invert our Hessian matrix to obtain the variance-covariance matrix:
-
-var_cov_matrix = inv(numerical_hessian)
-
-# In this example, we are only interested in the statistical
-# significance of the coefficient estimates so we obtain those with
-# the following command:
+# Let's find the estimated value of σ, rather than log σ, and its standard error.
+# To do this, we will use the Delta Method: https://en.wikipedia.org/wiki/Delta_method
 
-β = parameters[1:nvar]
-@test β ≈ [2.83664, 3.05345] atol=1e-5 #src
+# this function exponentiates log σ
+function transform(parameters)
+    parameters[end] = exp(parameters[end])
+    parameters
+end
 
-# We now need to obtain those elements of the variance-covariance
-# matrix needed to obtain our t-statistics, and we can do this with
-# the following commands:
+# get the Jacobian of the transformation
+J = ForwardDiff.jacobian(transform, parameters)'
+parameters = transform(parameters)
 
-temp = diag(var_cov_matrix)
-temp1 = temp[1:nvar]
+# We can now invert our Hessian matrix and use the Delta Method
+# to obtain the variance-covariance matrix:
+var_cov_matrix = J*inv(numerical_hessian)*J'
 
-# The t-statistics are formed by dividing element-by-element the
-# coefficients by their standard errors, or the square root of the
-# diagonal elements of the variance-covariance matrix:
+# test the estimated parameters and t-stats for correctness
+@test parameters ≈ [2.83664, 3.05345, 0.37218] atol=1e-5 #src
+t_stats = parameters./sqrt.(diag(var_cov_matrix))
+@test t_stats ≈ [48.02655, 45.51568, 8.94427] atol=1e-4 #src
 
-t_stats = β./sqrt.(temp1)
-@test t_stats ≈ [12.31968, 11.67560] atol=1e-4 #src
+# see the results
+println("parameter estimates: ", parameters)
+println("t-statistics: ", t_stats)
 
 # From here, one may examine other statistics of interest using the
 # output from the optimization routine.
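
For readers who have not seen the Delta Method before, the sketch below restates the step the new code performs in a self-contained form. The vector `estimates` and the Hessian `H` are hypothetical stand-ins for the example's optimizer output and `hessian!(func, parameters)`, and `g` is a non-mutating variant of the commit's `transform`; the point is only to show how the Jacobian of the transformation propagates the variance.

using ForwardDiff
using LinearAlgebra: diag

# hypothetical optimizer output: [β₁, β₂, log σ] (not the example's real numbers)
estimates = [2.8, 3.1, -1.0]

# hypothetical Hessian of the negative log-likelihood at the optimum
H = [290.0   5.0   0.0;
       5.0 310.0   0.0;
       0.0   0.0  80.0]

# transformation g(θ) = (β₁, β₂, exp(log σ)); written without mutation so that
# ForwardDiff can differentiate it with dual numbers
g(θ) = [θ[1], θ[2], exp(θ[3])]

J = ForwardDiff.jacobian(g, estimates)   # J[i,j] = ∂gᵢ/∂θⱼ
var_cov = J * inv(H) * J'                # Delta Method: Var(g(θ̂)) ≈ J * Var(θ̂) * J'
t_stats = g(estimates) ./ sqrt.(diag(var_cov))

println("transformed estimates: ", g(estimates))
println("t-statistics: ", t_stats)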
