Estimates and Integration with lme4 package #13
Thanks for reporting this!

Regarding issue 1

I can't reproduce this behavior. In the example below, the results are exactly the same. Generally, smaller differences can occur between nlme and lme4, but these are usually very small. Can you send me a specific example that I can use to reproduce this (e.g., via email)?

```r
library(mitml)
library(lme4)
library(nlme)

data(studentratings)

# impute
imp <- panImpute(formula = ReadDis + ReadAchiev ~ 1 + (1|ID), data = studentratings, seed = 1234)
implist <- mitmlComplete(imp)

# fit models
fit1 <- with(implist, lmer(ReadDis ~ ReadAchiev + (1|ID)))
fit2 <- with(implist, lme(fixed = ReadDis ~ ReadAchiev, random = ~ 1 | ID,
                          data = data.frame(ReadDis, ReadAchiev, ID)))

# results as in output
testEstimates(fit1)
#             Estimate Std.Error t.value      df P(>|t|)   RIV   FMI
# (Intercept)    3.549     0.144  24.563 626.039   0.000 0.136 0.123
# ReadAchiev    -0.002     0.000  -7.105 998.386   0.000 0.105 0.097

testEstimates(fit2)
#             Estimate Std.Error t.value      df P(>|t|)   RIV   FMI
# (Intercept)    3.549     0.144  24.563 626.039   0.000 0.136 0.123
# ReadAchiev    -0.002     0.000  -7.105 998.385   0.000 0.105 0.097

# results with higher precision
testEstimates(fit1)$estimates
#                 Estimate    Std.Error   t.value       df      P(>|t|)       RIV        FMI
# (Intercept)  3.548747357 0.1444768667 24.562738 626.0392 0.000000e+00 0.1362350 0.12269860
# ReadAchiev  -0.001957248 0.0002754753 -7.104983 998.3855 2.286393e-12 0.1049052 0.09675261

testEstimates(fit2)$estimates
#                 Estimate    Std.Error   t.value       df      P(>|t|)       RIV        FMI
# (Intercept)  3.548747357 0.1444768668 24.562738 626.0392 0.000000e+00 0.1362350 0.12269860
# ReadAchiev  -0.001957248 0.0002754753 -7.104983 998.3855 2.286393e-12 0.1049052 0.09675261

# R-squared
multilevelR2(fit1)
#        RB1        RB2         SB        MVP
# 0.06090104 0.19417320 0.08411631 0.06834806

multilevelR2(fit2)
#        RB1        RB2         SB        MVP
# 0.06090105 0.19417318 0.08411631 0.06834806
```

Regarding issue 2

The error message indicates that the class attribute of the fitted models is changed from nlme to something else. This can happen, for example, when using the lmerTest package, which overrides many functions in lme4. Do you use any additional packages that may cause this?

At the present time, lmerTest is not supported by mitml. Therefore, the only workaround is to fit the models without loading lmerTest.
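As a quick diagnostic for issue 2, one can inspect the class attribute of a fitted model before pooling. The sketch below uses a mock S3 object rather than a real fit, and the class name `lmerModLmerTest` reflects lmerTest's convention; treat this as an illustration of checking `class()`, not as mitml's internal check.

```r
# Hedged sketch: mitml decides how to pool based on the class of the fitted
# models, so a package that re-classes the fit (e.g., lmerTest) can break
# this. A mock object stands in for a real model fit here; with real data
# one would inspect, e.g., class() of the first element of a list of fits.
fit <- structure(list(), class = "lmerModLmerTest")  # class used by lmerTest
class(fit)
# [1] "lmerModLmerTest"
```

If the printed class is not the plain lme4 or nlme class, that points to another package having wrapped the fit.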
Hello,

I'm not sure if this will get you everything you need, but I am copying my code below in case it helps. I have tried both the lme4 and nlme packages. The fixed-effects estimates I get from both packages are the same, but the pseudo-R² results differ depending on which package was used to fit the model.

```r
H1 <- lmer(PAA_groupmc ~ Velocity.difference + (1|mydata$ID), data = mydata)
summary(H1)

library(sjPlot)
tab_model(H1, show.se = TRUE, show.ci = TRUE)
# Fixed effects:
#                      Estimate Std. Error t value
# (Intercept)          0.049907   0.034028   1.467
# Velocity.difference  0.015533   0.001554   9.997

# try using nlme to see if I get the same results with mitml
library(nlme)
h1.1 <- lme(PAA_groupmc ~ Velocity.difference, data = mydata, random = ~ 1 | ID, method = "REML")
summary(h1.1)
# Fixed effects: PAA_groupmc ~ Velocity.difference
#                          Value  Std.Error  DF  t-value p-value
# (Intercept)         0.04990731 0.03402762 772 1.466670  0.1429
# Velocity.difference 0.01553256 0.00155377 772 9.996694  0.0000

# calculate R-squared
library(mitml)
multilevelR2(H1)
#       RB1       RB2        SB       MVP
# 0.1004472 0.1004472 0.1004472 0.1013596

mitml:::.getRsquared(h1.1, print = c("RB1", "RB2", "SB", "MVP"), method = "nlme")
#        RB1         RB2         SB        MVP
# 0.10044723 -0.05885972 0.10044723 0.10135962
```

All results match between the two commands, with the exception of RB2.
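For context on why RB2 alone might diverge: these measures compare the variance components of a null (intercept-only) model and the full model. The sketch below uses made-up variance components (not values from this thread) and the Raudenbush–Bryk and Snijders–Bosker formulas as commonly stated; MVP is omitted because it additionally depends on the variance of the fitted values.

```r
# Hedged sketch of the pseudo-R^2 formulas compared in this thread, with
# made-up variance components. sigma2_* is the level-1 residual variance,
# tau2_* the random-intercept variance; "_0" = null model, "_1" = full model.
sigma2_0 <- 0.90; tau2_0 <- 0.30
sigma2_1 <- 0.75; tau2_1 <- 0.20

rb1 <- 1 - sigma2_1 / sigma2_0                        # Raudenbush-Bryk, level 1
rb2 <- 1 - tau2_1 / tau2_0                            # Raudenbush-Bryk, level 2
sb  <- 1 - (sigma2_1 + tau2_1) / (sigma2_0 + tau2_0)  # Snijders-Bosker (total)

round(c(RB1 = rb1, RB2 = rb2, SB = sb), 4)
#    RB1    RB2     SB
# 0.1667 0.3333 0.2083
```

Because RB2 depends only on the random-intercept variance, any difference in how that one component is extracted from an lme4 fit versus an nlme fit would show up in RB2 alone, which could explain the pattern above; a negative RB2 simply means the estimated intercept variance was larger in the full model than in the null model.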
Thanks for the additional information. Unfortunately, I still can't reproduce this behavior; for example, with the […]

I would like to investigate this further, but I need a reproducible example that shows this behavior. Could you provide me with (1) a reproducible example with both data and code, and (2) the output of your […]?

```r
library(mitml)
library(lme4)
library(nlme)

data(studentratings)
studentratings <- na.omit(studentratings[, c("ID", "ReadDis", "ReadAchiev")])

# fit models
fit1 <- lmer(ReadDis ~ ReadAchiev + (1|ID), data = studentratings)
fit1.1 <- lmer(ReadDis ~ ReadAchiev + (1|studentratings$ID), data = studentratings)
fit2 <- lme(fixed = ReadDis ~ ReadAchiev, random = ~ 1|ID, data = studentratings)

# R-squared
multilevelR2(fit1)
#        RB1        RB2         SB        MVP
# 0.06224511 0.21019639 0.08648722 0.07231612

multilevelR2(fit1.1)
#        RB1        RB2         SB        MVP
# 0.06224511 0.21019639 0.08648722 0.07231612

mitml:::.getRsquared(fit2, print = c("RB1", "RB2", "SB", "MVP"), method = "nlme")
#        RB1        RB2         SB        MVP
# 0.06224532 0.21019331 0.08648691 0.07231607
```
Closing this for now, since the problem is still not reproducible and there has been no further response.
I have been using the mitml package to calculate variance explained for a set of multilevel models, but noticed a few issues: […]