Description
Cross-validation
Purpose
Cross-validation is useful for simulating replicability (see Koul, Becchio, & Cavallo, 2018).
Use-case
For testing the robustness of a model across your sample.
Is your feature request related to a problem?
no
Is your feature request related to a JASP module?
Unrelated, ANOVA, Machine Learning, Mixed Models, Regression, SEM, T-Tests, Other
Describe the solution you would like
It would be nice if JASP could implement cross-validation approaches such as k-fold cross-validation, holdout cross-validation, leave-one-subject-out cross-validation, etc.
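To illustrate what such a procedure involves, here is a minimal k-fold cross-validation sketch in plain Python (scikit-learn and R, mentioned below, offer full-featured versions). The data and the "model" (a simple mean predictor scored by mean squared error) are placeholders, not anything from JASP:

```python
# Minimal k-fold cross-validation sketch (plain Python, no dependencies).
# Illustrative only: the "model" is a mean predictor and the score is MSE;
# a real analysis would substitute its own model and metric.

def k_fold_indices(n, k):
    """Split indices 0..n-1 into k roughly equal, non-overlapping folds."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(y, k=5):
    """Return the held-out MSE for each fold, fitting a mean predictor."""
    folds = k_fold_indices(len(y), k)
    scores = []
    for test_idx in folds:
        held_out = set(test_idx)
        train = [y[i] for i in range(len(y)) if i not in held_out]
        prediction = sum(train) / len(train)   # "fit" on the training folds
        mse = sum((y[i] - prediction) ** 2 for i in test_idx) / len(test_idx)
        scores.append(mse)                     # evaluate on the held-out fold
    return scores

scores = cross_validate([2.0, 4.0, 6.0, 8.0, 10.0, 12.0], k=3)
```

Averaging the per-fold scores estimates how the model would perform on new data from the same population, which is what makes cross-validation a proxy for replicability. Holdout CV is the special case of a single train/test split; leave-one-subject-out replaces the index folds with one fold per subject.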
Describe alternatives that you have considered
SPSS does not have these cross-validation procedures, but R does. Cross-validation procedures are also available in scikit-learn, which requires Python knowledge.
Additional context
Koul, A., Becchio, C., & Cavallo, A. (2018). Cross-validation approaches for replicability in psychology. Frontiers in Psychology, 9, 1117.
"Recent years have seen a rising concern over the reproducibility of psychological science...Are there measures that one could adopt in such cases to ensure a high level of reproducibility despite the impossibility of reproducing the original study? In this opinion article, we propose the incorporation of cross-validation techniques in single research studies as a strategy to address this issue. In section Simulating Replicability via Cross-Validation Techniques, we introduce the concept of cross-validation and how this technique can be utilized for establishing replicability."
Thank you
This would indeed be interesting, and it seems to be useful in contexts far beyond the "typical" machine learning one:
de Rooij, M., & Weeda, W. (2020). Cross-Validation: A Method Every Psychologist Should Know. Advances in Methods and Practices in Psychological Science, 3(2), 248–263. https://doi.org/10.1177/2515245919898466
de Rooij, M., Karch, J. D., Fokkema, M., Bakk, Z., Pratiwi, B. C., & Kelderman, H. (2023). SEM-based out-of-sample predictions. Structural Equation Modeling: A Multidisciplinary Journal, 30(1), 132–148. https://doi.org/10.1080/10705511.2022.2061494