
incorrect effect size for within-subject design #389

Closed
bennettpjb opened this issue Oct 17, 2021 · 2 comments
Labels
bug 🐜 Something isn't working

Comments


bennettpjb commented Oct 17, 2021

I may have discovered a bug in effectsize (version 0.4.5). I was analyzing an experiment from my lab that used a balanced, 2x2x2 within-subjects factorial design and noticed that measures of association strength, particularly partial eta squared, returned by nice (in the afex package) were much larger than the ones returned by eta_squared (in the effectsize package). I replicate the problem here, using new data sets simulated with the Superpower package.

Essentially, I found that the same values of partial eta squared are returned by nice and eta_squared when the correlation between within-subject variables is zero, but different values are returned when the correlation is greater than zero. Also, this difference occurs in simulated data from a 3-way within-subjects design, but not a 2-way within-subjects design. I am not sure what is going on, but I can't think of a legitimate statistical reason why the behaviour of the routines should diverge in a 3-way design but not a 2-way design.

Here is a chunk of R code that reproduces the problem:

# ----------------------------
library(Superpower)
library(afex)
library(effectsize)
set.seed(2123)
# create data set:
aov.3way.design <- ANOVA_design(design = "2w*2w*2w",
                                n=20,
                                mu=0.025*c(-1,-1,-1,-1,1,1,1,1),
                                sd=0.15,
                                r=0.6,
                                plot=FALSE,
                                labelnames=c("A","a1","a2","B","b1","b2","C","c1","c2"))
aov.exact.01b <- ANOVA_exact(aov.3way.design,verbose=FALSE)
dat1b <- aov.exact.01b$dataframe
dat1b$y <- dat1b$y+rnorm(n=dim(dat1b)[1],0,0.01)

# list partial eta_squared:
aov.01b <- aov_car(y~1+Error(subject/(A*B*C)),data=dat1b)
nice(aov.01b,es="pes") # pes for A is approx 0.37
eta_squared(aov.01b) # pes for A is approx 0.04
# ----------------------------

The values returned by nice correspond to the values calculated from the information in the ANOVA table; the value returned by eta_squared is too low. Note that the problem does not occur if the data are created with r=0. Also, the problem does not occur (with r=0 or r>0) if the data use a 2x2 design instead of a 2x2x2 design.
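For reference, the partial eta squared that nice prints can be recovered by hand from the ANOVA table: for a single-df effect, pes = SS_effect / (SS_effect + SS_error), which in terms of the F ratio reduces to F·df_num / (F·df_num + df_den). Plugging in the F for factor A (11.23 on 1 and 19 df, per the nice output):

```
pes(A) = SS_A / (SS_A + SS_error)
       = F * df_num / (F * df_num + df_den)
       = 11.23 * 1 / (11.23 * 1 + 19)
       ≈ 0.37
```

This matches nice's .371, so the ANOVA table itself is consistent and the discrepancy lies in how eta_squared partitions the error terms.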

mattansb added a commit that referenced this issue Oct 18, 2021
@mattansb
Member

@bennettpjb wow - what an incredible catch! 🎣
Thanks for bringing this to our attention - it is now fixed:

library(Superpower)
library(afex)
library(effectsize)
set.seed(2123)
# create data set:
aov.3way.design <- ANOVA_design(design = "2w*2w*2w",
                                n=20,
                                mu=0.025*c(-1,-1,-1,-1,1,1,1,1),
                                sd=0.15,
                                r=0.6,
                                plot=FALSE,
                                labelnames=c("A","a1","a2","B","b1","b2","C","c1","c2"))
aov.exact.01b <- ANOVA_exact(aov.3way.design,verbose=FALSE)
dat1b <- aov.exact.01b$dataframe
dat1b$y <- dat1b$y+rnorm(n=dim(dat1b)[1],0,0.01)

# list partial eta_squared:
aov.01b <- aov_car(y~1+Error(subject/(A*B*C)),data=dat1b)
nice(aov.01b,es="pes") # pes for A is approx 0.37
#> Anova Table (Type 3 tests)
#> 
#> Response: y
#>   Effect    df  MSE        F   pes p.value
#> 1      A 1, 19 0.01 11.23 **  .371    .003
#> 2      B 1, 19 0.01     0.00 <.001    .954
#> 3      C 1, 19 0.01     0.03  .002    .858
#> 4    A:B 1, 19 0.01     0.02  .001    .882
#> 5    A:C 1, 19 0.01     0.01 <.001    .908
#> 6    B:C 1, 19 0.01     0.00 <.001    .993
#> 7  A:B:C 1, 19 0.01     0.05  .003    .825
#> ---
#> Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '+' 0.1 ' ' 1
eta_squared(aov.01b) # previously approx 0.04; now matches nice at 0.37
#> # Effect Size for ANOVA (Type III)
#> 
#> Parameter | Eta2 (partial) |       95% CI
#> -----------------------------------------
#> A         |           0.37 | [0.10, 1.00]
#> A:C       |       7.20e-04 | [0.00, 1.00]
#> C         |       1.72e-03 | [0.00, 1.00]
#> B         |       1.79e-04 | [0.00, 1.00]
#> A:B       |       1.19e-03 | [0.00, 1.00]
#> B:C       |       4.16e-06 | [0.00, 1.00]
#> A:B:C     |       2.64e-03 | [0.00, 1.00]
#> 
#> - One-sided CIs: upper bound fixed at (1).

Created on 2021-10-18 by the reprex package (v2.0.1)

(Oddly enough, this was all because I had only tested the package with up to 2 within-subject factors. It should now support any number of within-subject factors.)

@mattansb mattansb added the bug 🐜 Something isn't working label Oct 18, 2021
@bennettpjb
Author

Wow, that was fast.
Thanks for the quick fix.

mattansb added a commit that referenced this issue Oct 19, 2021