Very slow estimation of correlation matrix #679
Comments
It is indeed quite slow. I did the computations for you on my own JASP development build: let me know if you can open it. I also added the Bayesian equivalent to the file for your convenience; those computations finish relatively quickly, it seems. This is definitely not ideal, and it looks like some (permutation-based?) computations are being done. @Kucharssim is that correct? If so, can we avoid doing that by default?
Thank you. Actually, I don't need these computations myself, as I usually use R, but my students are given this dataset for their assignment to perform exploratory factor analysis (and I would like them to report the correlation matrix).
Yes, this is likely caused by calculating CIs for Kendall's tau, which are currently computed by mistake even when they are not requested. I'll fix this.
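For intuition about why an accidental CI computation hurts so much: a single Kendall's tau is cheap, but a resampling-based CI repeats it hundreds of times per variable pair, and a 28-variable matrix has 378 pairs. A minimal sketch (the bootstrap here is an illustrative assumption, not JASP's actual implementation):

```python
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(1)
n = 1000
x = rng.integers(1, 6, n)  # simulated ordinal items, like the issue's data
y = rng.integers(1, 6, n)

# One tau per pair: a single O(n log n) statistic -- fast.
tau = kendalltau(x, y)[0]

# A bootstrap CI multiplies that cost by B per pair (hypothetical B):
B = 200
idx = rng.integers(0, n, size=(B, n))
boot = np.array([kendalltau(x[i], y[i])[0] for i in idx])
lo, hi = np.percentile(boot, [2.5, 97.5])
```

With 378 pairs and B = 200, that is ~75,600 tau evaluations instead of 378, which matches the reported slowdown when the CIs run unrequested.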
I have a not-so-big dataset (N=1000). If I run correlations over all 28 items (vah.a1-vah.b28), the estimation is extremely slow and the analysis soon fails. It is slow even if I run it over just three or four variables. When the dataset is loaded, these variables are set as ordinal; I tried setting them as continuous, but the estimation is slow too. On the other hand, running e.g. factor analysis (which also needs an estimate of the covariance matrix) is quick, so the problem is only in the correlation matrix. I tried Pearson and Spearman correlations; both are similarly slow.
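For reference, a plain correlation matrix at this scale is essentially instant outside JASP. A sketch with simulated stand-in data (the column names and 1-5 response scale are assumptions, not the actual dataset):

```python
import numpy as np
import pandas as pd

# Simulated stand-in: N = 1000 respondents, 28 ordinal items.
rng = np.random.default_rng(0)
data = pd.DataFrame(
    rng.integers(1, 6, size=(1000, 28)),
    columns=[f"vah_{i}" for i in range(1, 29)],  # hypothetical names
)

pearson = data.corr(method="pearson")    # 28 x 28 matrix
spearman = data.corr(method="spearman")  # rank-based, also fast at this size
```

Both calls are vectorized over the whole matrix, which is why the slowness points at extra per-pair work (such as unrequested CIs) rather than the correlations themselves.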
Steps to reproduce: