double-checking p-values from Dunn test #50
Comments
Isn't this a duplicate of #20?
Yikes, sorry about that! I should have commented there. But I think the emphasis here is a bit different: I am no longer concerned about why the p-values differ when the z-values are identical across software, but rather what the default p-value output should be. At any rate, I will close the issue and leave it to your best judgment! Thanks for considering.
The default of the dunn.test package is to report one-sided p-values. I think we should keep performing a two-sided Dunn test by default in rstatix, like SPSS and GraphPad. But obviously I should update the description section of rstatix::dunn_test() to mention this discrepancy with the dunn.test package. So, let's keep this issue open until the doc is updated.
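The factor-of-two relationship discussed above can be sketched numerically. This is a minimal Python illustration (not from the thread, and not the R packages themselves): for the same z statistic, a two-sided p-value is simply twice the one-sided tail probability, capped at 1, which is why identical z-values across the two packages yield p-values that differ by a factor of two.

```python
import math

def one_sided_p(z: float) -> float:
    """P(Z >= |z|) for a standard normal: the one-sided tail probability."""
    return 0.5 * math.erfc(abs(z) / math.sqrt(2))

def two_sided_p(z: float) -> float:
    """P(|Z| >= |z|): the two-sided p-value, twice the one-sided tail."""
    return min(1.0, 2 * one_sided_p(z))

# Same z statistic, two conventions for reporting p:
z = 1.96
print(round(one_sided_p(z), 4))  # ~0.025
print(round(two_sided_p(z), 4))  # ~0.05
```

So comparing outputs between the packages is just a matter of doubling (or halving) the reported p-values before any multiple-comparison adjustment.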
doc updated now, thanks |
The p-values from dunn.test and rstatix don't match up, and I am not sure why there is this discrepancy. I also checked the same with popular GUI software like jamovi, and their p-values are the same as the ones output by dunn.test. So I thought I would raise this issue.

Created on 2020-05-28 by the reprex package (v0.3.0.9001)