Many thanks for providing a very useful tool! I have a quick question about your rwa package. Statistical software such as SPSS produces VIF values to indicate multicollinearity in the data. Even when those values are acceptable (e.g. VIF < 10), do you think rwa is superior to multiple regression analysis for detecting important predictor variables?
Hi @mizumot, thanks for your question! This is somewhat beyond my expertise, but I will try my best to answer. In the ideal situation where predictors are uncorrelated, there is no difference between interpreting the predictors' correlations with the outcome and interpreting their relative importance. When there is multicollinearity, methods such as Shapley values or Relative Weights are more appropriate. I do not know of a threshold at which it becomes 'correct' to use one method rather than another; in practical terms, I would always apply multiple methods (e.g. random-forest importance measures, correlations, dominance analysis, or the methods in the vip package) and explore the implications.
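As a side note on the VIF values mentioned above: they can be computed outside SPSS as a quick multicollinearity screen before deciding which importance method to use. A minimal sketch in Python with statsmodels (the toy predictors `x1`, `x2`, `x3` are made up for illustration; `x2` is deliberately built to be collinear with `x1`):

```python
import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Toy data: x2 is strongly correlated with x1, so both should show
# inflated VIFs, while the independent x3 stays near 1.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + rng.normal(scale=0.3, size=200)
x3 = rng.normal(size=200)
X = pd.DataFrame({"x1": x1, "x2": x2, "x3": x3})

# statsmodels expects an intercept column; the VIF of predictor i is
# 1 / (1 - R^2) from regressing it on the remaining predictors.
X_ = X.assign(const=1.0)
vifs = {col: variance_inflation_factor(X_.values, i)
        for i, col in enumerate(X.columns)}
print(vifs)
```

If the VIFs come back high for some predictors (common cut-offs are 5 or 10), that is exactly the situation where a relative-importance method such as rwa adds value over raw regression coefficients.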