Errors with powerSim #93

Closed
gfonzo opened this issue Jan 13, 2018 · 15 comments

@gfonzo

gfonzo commented Jan 13, 2018

Hi,

I've been attempting to use this package to run a power analysis on a linear mixed model with several interactions and have been running into errors.

Here's the setup: I have about 36 individuals w/ PTSD in a longitudinal study (pre/post psychotherapy) who have a brain connectivity measure and a verbal memory grouping (impaired or intact) at baseline, and I'm trying to test the power of the combination of baseline verbal memory and brain connectivity to predict treatment outcome (measured on a continuous scale).

Here's the data frame w/ the relevant variables:

ID Group Time CAPS fMRIsignalc VM
1 B005 PE PreTx 0.70 -0.034933333
2 B009 PE PreTx 0.76 0.152566667
3 B011 PE PreTx 0.64 -0.083333333
4 B012 PE PreTx 0.58 0.065166667
5 B014 PE PreTx 0.92 -0.124133333
6 B023 PE PreTx 0.70 0.150566667
7 B025 PE PreTx 0.96 -0.044733333
8 B047 PE PreTx 0.55 0.065666667 Intact
9 B059 PE PreTx 1.01 0.125266667 Impaired
10 B061 PE PreTx 0.85 -0.174833333 Intact
11 B072 PE PreTx 0.90 -0.186933333 Intact
12 B075 PE PreTx 0.82 -0.055933333 Impaired
13 B076 PE PreTx 0.75 -0.068533333 Intact
14 B080 PE PreTx 0.56 0.094666667 Intact
15 B081 PE PreTx 0.69 0.058366667 Intact
16 B082 PE PreTx 0.59 -0.129833333 Intact
17 B084 PE PreTx 0.46 -0.080933333 Impaired
18 B085 PE PreTx 0.47 0.021866667 Impaired
19 B087 PE PreTx 0.67 -0.146333333 Intact
20 B088 PE PreTx 0.68 0.091366667 Intact
21 B090 PE PreTx 0.76 0.015866667 Intact
22 B091 PE PreTx 0.53 -0.041033333 Impaired
23 B094 PE PreTx 0.52 -0.070633333 Intact
24 B097 PE PreTx 0.61 -0.069733333 Intact
25 B100 PE PreTx 0.56 0.166766667 Intact
26 B102 PE PreTx 0.55 0.058466667 Impaired
27 B105 PE PreTx 0.54 0.137666667 Intact
28 B106 PE PreTx 0.43 -0.064333333 Impaired
29 B107 PE PreTx 0.71 -0.013933333 Intact
30 B108 PE PreTx 0.72 0.037266667 Intact
31 B110 PE PreTx 0.57 -0.015733333 Impaired
32 B114 PE PreTx 0.70 -0.118733333 Impaired
33 B117 PE PreTx 0.38 0.223366667 Intact
34 B120 PE PreTx 0.70 -0.146233333 Impaired
35 B121 PE PreTx 0.53 -0.005733333 Intact
36 B122 PE PreTx 0.81 0.211666667 Intact
37 B005 PE PostTx NA -0.034933333
38 B009 PE PostTx NA 0.152566667
39 B011 PE PostTx 0.37 -0.083333333
40 B012 PE PostTx 0.08 0.065166667
41 B014 PE PostTx 0.33 -0.124133333
42 B023 PE PostTx 0.35 0.150566667
43 B025 PE PostTx 0.51 -0.044733333
44 B047 PE PostTx 0.15 0.065666667 Intact
45 B059 PE PostTx 0.25 0.125266667 Impaired
46 B061 PE PostTx 0.37 -0.174833333 Intact
47 B072 PE PostTx NA -0.186933333 Intact
48 B075 PE PostTx 0.71 -0.055933333 Impaired
49 B076 PE PostTx 0.26 -0.068533333 Intact
50 B080 PE PostTx NA 0.094666667 Intact
51 B081 PE PostTx NA 0.058366667 Intact
52 B082 PE PostTx 0.19 -0.129833333 Intact
53 B084 PE PostTx 0.45 -0.080933333 Impaired
54 B085 PE PostTx 0.04 0.021866667 Impaired
55 B087 PE PostTx 0.02 -0.146333333 Intact
56 B088 PE PostTx 0.13 0.091366667 Intact
57 B090 PE PostTx 0.34 0.015866667 Intact
58 B091 PE PostTx 0.61 -0.041033333 Impaired
59 B094 PE PostTx 0.17 -0.070633333 Intact
60 B097 PE PostTx NA -0.069733333 Intact
61 B100 PE PostTx 0.18 0.166766667 Intact
62 B102 PE PostTx NA 0.058466667 Impaired
63 B105 PE PostTx 0.34 0.137666667 Intact
64 B106 PE PostTx NA -0.064333333 Impaired
65 B107 PE PostTx NA -0.013933333 Intact
66 B108 PE PostTx 0.07 0.037266667 Intact
67 B110 PE PostTx NA -0.015733333 Impaired
68 B114 PE PostTx 0.84 -0.118733333 Impaired
69 B117 PE PostTx 0.00 0.223366667 Intact
70 B120 PE PostTx NA -0.146233333 Impaired
71 B121 PE PostTx 0.25 -0.005733333 Intact
72 B122 PE PostTx 0.39 0.211666667 Intact

Some missing values for the memory grouping (not everyone had the memory measure) and for the outcome (CAPS) due to dropout.

Here's the model:

mylme <- lmer(CAPS ~ Time + fMRIsignalc + fMRIsignalc:Time + VM + VM:fMRIsignalc + VM:Time + VM:Time:fMRIsignalc + (1|ID), data = test, na.action = na.omit)

The latter 3-way interaction specifies the prediction of change over time in CAPS as a function of VM grouping and fMRI connectivity at baseline.

Here's the summary:

Linear mixed model fit by REML ['lmerMod']
Formula: CAPS ~ Time + fMRIsignalc + fMRIsignalc:Time + VM + VM:fMRIsignalc +
VM:Time + VM:Time:fMRIsignalc + (1 | ID)
Data: test

REML criterion at convergence: -60

Scaled residuals:
Min 1Q Median 3Q Max
-1.3403 -0.5163 0.1083 0.4483 1.2355

Random effects:
Groups Name Variance Std.Dev.
ID (Intercept) 0.014627 0.12094
Residual 0.007428 0.08618
Number of obs: 61, groups: ID, 36

Fixed effects:
Estimate Std. Error t value
(Intercept) 0.32631 0.06235 5.234
TimePreTx 0.43082 0.05334 8.076
fMRIsignalc -0.29646 0.60979 -0.486
VMImpaired 0.05926 0.08532 0.695
VMIntact -0.11314 0.07294 -1.551
TimePreTx:fMRIsignalc -0.19507 0.52501 -0.372
fMRIsignalc:VMImpaired -2.65551 0.92599 -2.868
fMRIsignalc:VMIntact 0.18295 0.67886 0.269
TimePreTx:VMImpaired -0.17148 0.07347 -2.334
TimePreTx:VMIntact 0.01349 0.06242 0.216
TimePreTx:fMRIsignalc:VMImpaired 3.80581 0.80170 4.747
TimePreTx:fMRIsignalc:VMIntact -0.16539 0.58246 -0.284

Correlation of Fixed Effects:
(Intr) TmPrTx fMRIsg VMImpr VMIntc TmPT:MRI
TimePreTx -0.533
fMRIsignalc 0.007 -0.078
VMImpaired -0.731 0.389 -0.005
VMIntact -0.855 0.455 -0.006 0.625
TmPrTx:fMRI -0.077 0.049 -0.551 0.056 0.066
fMRIsgnlc:VMIm -0.005 0.051 -0.659 0.170 0.004 0.363
fMRIsgnlc:VMIn -0.006 0.070 -0.898 0.005 -0.031 0.495
TmPrTx:VMIm 0.387 -0.726 0.056 -0.552 -0.330 -0.036
TmPrTx:VMIn 0.455 -0.855 0.066 -0.333 -0.533 -0.042
TmPrTx:fMRIsgnlc:VMIm 0.051 -0.032 0.361 -0.118 -0.043 -0.655
TmPrTx:fMRIsgnlc:VMIn 0.070 -0.044 0.497 -0.051 -0.032 -0.901
fMRIsgnlc:VMIm fMRIsgnlc:VMIn TmPrTx:VMIm TmPrTx:VMIn
TimePreTx
fMRIsignalc
VMImpaired
VMIntact
TmPrTx:fMRI
fMRIsgnlc:VMIm
fMRIsgnlc:VMIn 0.592
TmPrTx:VMIm -0.119 -0.051
TmPrTx:VMIn -0.044 -0.032 0.620
TmPrTx:fMRIsgnlc:VMIm -0.567 -0.324 0.183 0.028
TmPrTx:fMRIsgnlc:VMIn -0.327 -0.541 0.032 -0.003
TmPrTx:fMRIsgnlc:VMIm
TimePreTx
fMRIsignalc
VMImpaired
VMIntact
TmPrTx:fMRI
fMRIsgnlc:VMIm
fMRIsgnlc:VMIn
TmPrTx:VMIm
TmPrTx:VMIn
TmPrTx:fMRIsgnlc:VMIm
TmPrTx:fMRIsgnlc:VMIn 0.590

I set the following options:

simrOptions(carTestType="III")
simrOptions(lmerTestDdf="lme4")
simrOptions(lmerTestType=3)

And I'm interested in seeing the estimated power of the effect for TimePreTx:fMRIsignalc:VMImpaired. So I run:

powerSim(mylme, test = fixed("TimePreTx:fMRIsignalc:VMImpaired"))

And here's the outcome:

Power for predictor 'TimePreTx:fMRIsignalc:VMImpaired', (95% confidence interval):
0.00% ( 0.00, 0.37)

Test: Kenward Roger (package pbkrtest)
Effect size for TimePreTx:fMRIsignalc:VMImpaired is 3.8

Based on 1000 simulations, (0 warnings, 1000 errors)
alpha = 0.05, nrow = 72

Time elapsed: 0 h 1 m 43 s

As you can see, I get an error for every simulation.

I've also tried running this model with the categorical variables coded numerically (0 and 1 for pre and post tx; -0.5 and 0.5 for VM group) and I still get the same errors.

Any help you can provide would be much appreciated!

Thanks,
Greg

@pitakakariki
Owner

I'd start with removing the NAs before fitting the model; otherwise you can get problems in the simulation step, e.g.:

test2 <- na.omit(test)

You might also need the simpler form test = fixed("Time:fMRIsignalc:VM").

If there are any further errors you can use lastResult()$errors to get the error messages (which may or may not be helpful).
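
A rough sketch of the whole workflow (untested, and assuming your data frame is called test; Time * fMRIsignalc * VM expands to the same main effects and interactions as your original formula):

library(lme4)
library(simr)

test2 <- na.omit(test)   # drop rows with missing CAPS or VM before fitting
mylme2 <- lmer(CAPS ~ Time * fMRIsignalc * VM + (1 | ID), data = test2)
powerSim(mylme2, test = fixed("Time:fMRIsignalc:VM"), nsim = 100)
lastResult()$errors      # inspect any per-simulation error messages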

@gfonzo
Author

gfonzo commented Jan 15, 2018

Thanks Peter. I tried the steps you recommend. Removed the cases w/ NA's and used the simpler form of the fixed effect. I still get the same error at every simulation.

Using the call for lastResult()$errors, here is what I receive:

1000 Testing 1000 scope is not a subset of term labels

The error is the same for all 1000 steps.

@pitakakariki
Owner

That error has something to do with the fixed effect naming.

If the simpler form isn't working, try the earlier form again, but be careful to make it the same as in the new model (the NA removal changes which factor labels are the default).
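
You can check the exact labels on the refitted model before passing one to fixed(), e.g. (using the refitted model from the sketch above):

names(fixef(mylme2))   # coefficient names exactly as simr expects them

and then copy the relevant three-way interaction label from that list.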

@gfonzo
Author

gfonzo commented Jan 23, 2018

Thanks! You're right...the NA removal did change the factor labels. I used the new labels this time and it was able to run without errors. Thanks again for your help!

@pitakakariki
Owner

No problem. I'm going to leave the issue open until I've worked out better error messages for this scenario.

@Johnzav888

Johnzav888 commented Mar 9, 2018

Hi pitakakariki,

I get exactly the same error. I do not have any NA's!

My model output is :

model1 <- lmer(log_volume ~ days*Treatment + (days|ID), mydata ) 
Linear mixed model fit by REML ['lmerMod']
Formula: log_volume ~ days * Treatment + (days | ID)
   Data: mydata

REML criterion at convergence: 239.2

Scaled residuals: 
    Min      1Q  Median      3Q     Max 
-3.5629 -0.6089  0.0891  0.5703  2.1860 

Random effects:
 Groups   Name        Variance  Std.Dev.  Corr 
 ID       (Intercept) 2.993e-01 5.471e-01      
          days        4.967e-09 7.048e-05 -1.00
 Residual             3.803e-01 6.167e-01      
Number of obs: 109, groups:  ID, 11

Fixed effects:
                        Estimate Std. Error t value
(Intercept)              2.84977    0.35433   8.043
days                     0.21317    0.01802  11.832
Treatmentanti-PD1       -0.11757    0.45427  -0.259
days:Treatmentanti-PD1  -0.07630    0.02023  -3.771

Correlation of Fixed Effects:
            (Intr) days   Tr-PD1
days        -0.670              
Trtmntn-PD1 -0.780  0.522       
dys:Trt-PD1  0.596 -0.890 -0.625

And I fit the powerSim:

powerSim(fit=model1,test = fixed("days:Treatmentanti-PD1"), nsim=200)
Power for predictor 'days:Treatmentanti-PD1', (95% confidence interval):
       0.00% ( 0.00,  1.83)

Test: Kenward Roger (package pbkrtest)
      Effect size for days:Treatmentanti-PD1 is -0.076

Based on 200 simulations, (3 warnings, 200 errors)
alpha = 0.05, nrow = 109

Time elapsed: 0 h 0 m 14 s

nb: result might be an observed power calculation

And the error :

200 Testing   200 scope is not a subset of term labels

Any ideas?

Thanks,
John
 

@pitakakariki
Owner

It looks like the automated test selection is guessing the wrong kind of test. Try this instead:

fixed("days:Treatmentanti-PD1", "z")

@Camzcamz

Camzcamz commented Mar 27, 2019

Hi,

I think I have a similar error. My model is:

model1 <- lmerTest::lmer(Brain_Stem_log ~ Group*Total_Brain_Vol_log+ (1|SITE_ID2), data = Abide_Clean_Seg)

Brain_Stem_log and Total_Brain_Vol_log are continuous.
Group (2 levels) and Site ID (17 levels) are categorical.

I am interested in looking at power to detect Group and Total_Brain_Vol_log effects as well as the Group*Total_Brain_Vol_log interaction. I would like to use the Satterthwaite degrees of freedom since I used them to estimate p values in my model summary. However, one of my power simulations doesn't work, and I don't know which one to report with my goal in mind.

For Power for GroupASD:Total_Brain_Vol_log
This doesn't work:

> simrOptions(lmerTestDdf="Satterthwaite")
> simrOptions(carTestType="III")
> powerSim(model1, test = fixed("GroupASD:Total_Brain_Vol_log", method = "anova"))
Power for predictor 'GroupASD:Total_Brain_Vol_log', (95% confidence interval):
       0.00% ( 0.00,  0.37)

Test: Type-3 F-test with Satterthwaite degrees of freedom (package lmerTest)
      Effect size for GroupASD:Total_Brain_Vol_log is 0.013

Based on 1000 simulations, (10 warnings, 1000 errors)
alpha = 0.05, nrow = 654

Time elapsed: 0 h 3 m 50 s

nb: result might be an observed power calculation
Warning message:
In observedPowerWarning(sim) :
  This appears to be an "observed power" calculation

But these do:

> powerSim(model1, test = fixed("GroupASD:Total_Brain_Vol_log", "z"))
Power for predictor 'GroupASD:Total_Brain_Vol_log', (95% confidence interval):
       6.10% ( 4.70,  7.77)

Test: z-test
      Effect size for GroupASD:Total_Brain_Vol_log is 0.013

Based on 1000 simulations, (21 warnings, 0 errors)
alpha = 0.05, nrow = 654

Time elapsed: 0 h 2 m 42 s

nb: result might be an observed power calculation
Warning message:
In observedPowerWarning(sim) :
  This appears to be an "observed power" calculation
> powerSim(model1, test = fixed("GroupASD:Total_Brain_Vol_log", "sa"))
Simulating: |==============                                                                       |
singular fit
Power for predictor 'GroupASD:Total_Brain_Vol_log', (95% confidence interval):
       6.20% ( 4.79,  7.88)

Test: Satterthwait (package lmerTest)
      Effect size for GroupASD:Total_Brain_Vol_log is 0.013

Based on 1000 simulations, (6 warnings, 0 errors)
alpha = 0.05, nrow = 654

Time elapsed: 0 h 4 m 32 s

nb: result might be an observed power calculation
Warning message:
In observedPowerWarning(sim) :
  This appears to be an "observed power" calculation

I know the last two give similar results, but (i) I don't understand why the first one doesn't work, and (ii) I don't know which one I should be using.

For Power for my Main Effects
These work for my main effects (e.g. Total_Brain_Vol_log), but which one do I use?

> simrOptions(lmerTestDdf="Satterthwaite")
> simrOptions(carTestType="III")
> powerSim(model1, test = fixed("Total_Brain_Vol_log", method = "anova"))
Power for predictor 'Total_Brain_Vol_log', (95% confidence interval):
      100.0% (99.63, 100.0)

Test: Type-3 F-test with Satterthwaite degrees of freedom (package lmerTest)
      Effect size for Total_Brain_Vol_log is 0.71

Based on 1000 simulations, (1000 warnings, 0 errors)
alpha = 0.05, nrow = 654

Time elapsed: 0 h 3 m 50 s

nb: result might be an observed power calculation
Warning message:
In observedPowerWarning(sim) :
  This appears to be an "observed power" calculation
> powerSim(model1, alpha = 0.045, test = fixed("Total_Brain_Vol_log", "z"))
Power for predictor 'Total_Brain_Vol_log', (95% confidence interval):
      100.0% (99.63, 100.0)

Test: z-test
      Effect size for Total_Brain_Vol_log is 0.71

Based on 1000 simulations, (19 warnings, 0 errors)
alpha = 0.045, nrow = 654

Time elapsed: 0 h 2 m 44 s

nb: result might be an observed power calculation
Warning message:
In observedPowerWarning(sim) :
  This appears to be an "observed power" calculation
> powerSim(model1, test = fixed("Total_Brain_Vol_log", "sa"))
Power for predictor 'Total_Brain_Vol_log', (95% confidence interval):
      100.0% (99.63, 100.0)

Test: Satterthwait (package lmerTest)
      Effect size for Total_Brain_Vol_log is 0.71

Based on 1000 simulations, (8 warnings, 0 errors)
alpha = 0.05, nrow = 654

Time elapsed: 0 h 4 m 9 s

nb: result might be an observed power calculation
Warning message:
In observedPowerWarning(sim) :
  This appears to be an "observed power" calculation

Thank you so much for your time and input.
Kind regards,
Camille

@pitakakariki
Owner

pitakakariki commented Apr 5, 2019

I think anova is testing the whole categorical variable, so you want "Group" not "GroupASD". With a z-test you're just testing one level, so you need "GroupASD" to specify the level.

Which one you use depends on which test you intend to apply to your data after you run the study.
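
Roughly (an untested sketch using your model):

# whole-variable test: pass the variable name with the "anova" method
powerSim(model1, test = fixed("Group:Total_Brain_Vol_log", "anova"))

# single-coefficient test: pass the level-specific label from fixef(model1)
powerSim(model1, test = fixed("GroupASD:Total_Brain_Vol_log", "z"))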

@Camzcamz

Camzcamz commented Apr 5, 2019

Hi,
Thanks for your response. I get the same results when I use anova on Group:Total_Brain_Vol_log as when I use sa on GroupControl:Total_Brain_Vol_log, BUT I can't fix an effect size for Group:Total_Brain_Vol_log, only for GroupControl:Total_Brain_Vol_log. Why?

> model1 <- lmerTest::lmer(Brain_Stem_log ~ Group*Total_Brain_Vol_log+ (1|SITE_ID2), data = Abide_Clean_Seg)
> summary(model1)

Linear mixed model fit by REML. t-tests use Satterthwaite's method ['lmerModLmerTest']
Formula: Brain_Stem_log ~ Group * Total_Brain_Vol_log + (1 | SITE_ID2)
   Data: Abide_Clean_Seg

REML criterion at convergence: 1317.8

Scaled residuals: 
    Min      1Q  Median      3Q     Max 
-3.9639 -0.6379  0.0076  0.6756  3.2056 

Random effects:
 Groups   Name        Variance Std.Dev.
 SITE_ID2 (Intercept) 0.08534  0.2921  
 Residual             0.41063  0.6408  
Number of obs: 654, groups:  SITE_ID2, 15

Fixed effects:
                                  Estimate Std. Error        df t value Pr(>|t|)    
(Intercept)                       -0.03358    0.08590  16.25207  -0.391    0.701    
GroupControl                       0.04221    0.05095 639.24867   0.828    0.408    
Total_Brain_Vol_log                0.72630    0.04274 646.32402  16.993   <2e-16 ***
GroupControl:Total_Brain_Vol_log  -0.01349    0.05882 639.10586  -0.229    0.819    
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
> fixef(model1)["Group:Total_Brain_Vol_log"] 
<NA>
NA

> fixef(model1)["GroupControl:Total_Brain_Vol_log"] 
GroupControl:Total_Brain_Vol_log 
                     -0.01349125

> fixef(model1)["GroupControl:Total_Brain_Vol_log"] <- 0.2

The two tests below both work and give the same response, but since I want to control the effect size of my fixed effect, should I use the "sa" method?

> sim_sa <- powerSim(model1, test = fixed("GroupControl:Total_Brain_Vol_log", "sa"))
> sim_sa
Power for predictor 'GroupControl:Total_Brain_Vol_log', (95% confidence interval):
      92.20% (90.36, 93.79)

Test: Satterthwait (package lmerTest)
      Effect size for GroupControl:Total_Brain_Vol_log is 0.20

Based on 1000 simulations, (21 warnings, 0 errors)
alpha = 0.05, nrow = 654
> sim_anova_group <- powerSim(model1, test = fixed("Group:Total_Brain_Vol_log", "anova"))
> sim_anova_group
Power for predictor 'Group:Total_Brain_Vol_log', (95% confidence interval):
      92.20% (90.36, 93.79)

Test: Type-2 F-test with Satterthwaite degrees of freedom (package lmerTest)

Based on 1000 simulations, (15 warnings, 0 errors)
alpha = 0.05, nrow = 654

Time elapsed: 0 h 5 m 39 s

Thank you very much for your time,

Camille

@pitakakariki
Owner

Fixing the effect size has to be done per level, no matter which test is eventually run.

I suspect you're getting the same results because you only have two levels? In that case the tests would be equivalent.

@Camzcamz

Camzcamz commented Apr 9, 2019

Yes, I only have two levels!
Thank you so much for your help. I have one last question: how do you fix the effect size per level? I can only seem to input the effect-size difference between my 2 levels (ASD and Controls) with this function:

> Trial <- lmer(TotalGrayVol ~ AGE_AT_SCAN*DX_GROUP + (1|SITE_ID2), Abide_LMEM_Rep_Clean_Scaled_data_long)

> summary(Trial)
Linear mixed model fit by REML. t-tests use Satterthwaite's method ['lmerModLmerTest']
Formula: TotalGrayVol ~ AGE_AT_SCAN * DX_GROUP + (1 | SITE_ID2)
   Data: Abide_LMEM_Rep_Clean_Scaled_data_long

REML criterion at convergence: 3325.6

Scaled residuals: 
    Min      1Q  Median      3Q     Max 
-3.0286 -0.7056  0.0155  0.6869  3.1784 

Random effects:
 Groups   Name        Variance Std.Dev.
 SITE_ID2 (Intercept) 0.5434   0.7372  
 Residual             0.7049   0.8396  
Number of obs: 1308, groups:  SITE_ID2, 15

Fixed effects:
                              Estimate Std. Error         df t value Pr(>|t|)    
(Intercept)                    0.05624    0.19418   14.20840   0.290   0.7763    
AGE_AT_SCAN                   -0.17405    0.03960 1302.20618  -4.395  1.2e-05 ***
DX_GROUPControl               -0.04216    0.04732 1291.17497  -0.891   0.3731    
AGE_AT_SCAN:DX_GROUPControl   -0.08474    0.04780 1291.97945  -1.773   0.0765 .  
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Correlation of Fixed Effects:
             (Intr) AGE_AT_SCAN DX_GRO
AGE_AT_SCAN  -0.005                   
DX_GROUPCnt  -0.133 -0.043            
AGE_AT_SCAN:  0.000 -0.676       0.013
> fixef(Trial)["DX_GROUPControl"]
DX_GROUPControl 
    -0.04215871 
> fixef(Trial)["DX_GROUPControl"] <- 0.2
> fixef(Trial)["DX_GROUPControl"]
DX_GROUPControl 
            0.2 

but when I try the ASD level, I get nothing:

> fixef(Trial)["DX_GROUPASD"]
<NA> 
  NA 

@pitakakariki
Owner

If you have k levels then you have k-1 effect sizes, usually differences between the current level and the base level. If there's only two levels there's only one difference to specify.
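
For example, with a hypothetical three-level factor Group (levels A, B, and C, with A as the base level; these names are placeholders, not from your model), there would be two differences to fix:

fixef(model)["GroupB"] <- 0.2   # difference of level B from base level A
fixef(model)["GroupC"] <- 0.4   # difference of level C from base level A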

@Camzcamz

Hi,
I apologize for posting this here, but I cannot find any information on using simr to detect the power to observe a three-way interaction between two continuous variables and one categorical variable (2 levels only), with a random intercept structure (+ (1|SITE_ID)), where SITE_ID is a categorical variable with 15 levels. Any material, or redirection towards some discussion groups, would be greatly appreciated.
Thank you very much

@pitakakariki
Owner

Not aware of any discussions of this, but you can probably get away with following one of the two-variable interaction examples.
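
A rough sketch of what that might look like (all names are placeholders for illustration, assuming continuous x1 and x2, a two-level factor g, and SITE_ID as the grouping factor; check names(fixef(m)) first for the exact three-way label):

m <- lmer(y ~ x1 * x2 * g + (1 | SITE_ID), data = mydata)
fixef(m)["x1:x2:gLevel2"] <- 0.1   # set the three-way effect size of interest
powerSim(m, test = fixed("x1:x2:gLevel2", "z"), nsim = 200)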
