
Second set of results, covering basically everything from the "7 to 10" folder. I was experimenting with the scoring mechanisms at the time, so results may be faulty. Only one complete cycle was run for MLR (3cv32hp), while three (6hp5cv, 32hp7x5cv, 3cv32hp) were run for caret. Against caret's highest performance, MLR surpasses caret on only two generators and is surpassed on two as well. MLR completely solves "ifs C2+C3 & C4+C5 on C1" and representations of missing data as an outlier. a * b * c, whether stated simply or as a two-layer ifs statement, remains unsolved, as do the needles-in-a-haystack generators; see 7 through 10 compare. The "Random" score of 12.5% comes from a fault that also inflates the other scores. When MLR is run on purely random data with 25 test data points, it produces 4x more scores above 4% than caret does. MLR's best learners are cubist (8), three forms of SVM (6), Kriging (km, 5), model-based recursive partitioning trees with fitted models at each leaf (mob, 3), and brnn, gamboost and glmnet (2 each). A blue dot at the highest possible value means the algorithm solved at least one problem as well as or better than any other algorithm. Caret's results again, for comparison:

Best hyper-parameter and cross-validation settings for learners

Some learners need different amounts of hyper-parameter tuning and cross validation. Partly from 7 through 10 compare and partly from guesswork, the following learner hpcv combinations appear to work best (a leading - marks the worst; a sketch of how these budgets translate into code follows the list).

ll 6hp5cv: earth, BstLm, qrnn, -pcaNNet

lhw 3cv32hp: Rborist, pcaNNet, ?SBC; -fail: glm.nb, gamboost, ctree2, glmboost, leapSeq, ctree, svmLinear2

hh 32hp7x5cv: gbm, krlsPoly, kknn, xgbLinear, RRF, cubist, rlm, -SBC; -fail: bagearthgcv, gcvearth, lmStepAIC, glmStepAIC, bridge, lm, glm, bayesglm, blassoAveraged, treebag, rpart1SE
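
For concreteness, here is a rough sketch of how two of these budgets could be expressed with caret. The mapping of the hp counts onto caret's `tuneLength`, the synthetic a * b * c data frame `dat`, and the target name `y` are assumptions for illustration, not the exact setup used for these runs.

```r
library(caret)

# Placeholder data: the a * b * c generator mentioned above, 200 rows (an assumption).
set.seed(1)
dat <- data.frame(a = runif(200), b = runif(200), c = runif(200))
dat$y <- dat$a * dat$b * dat$c + rnorm(200, sd = 0.02)

# "hh" 32hp7x5cv: roughly 32 hyper-parameter candidates, 7-times repeated 5-fold CV.
# tuneLength = 32 only approximates a 32-candidate budget; caret expands it into a grid.
ctrl_hh <- trainControl(method = "repeatedcv", number = 5, repeats = 7)
fit_gbm <- train(y ~ ., data = dat, method = "gbm",
                 trControl = ctrl_hh, tuneLength = 32, verbose = FALSE)

# "ll" 6hp5cv: 6 candidates, plain 5-fold CV.
ctrl_ll <- trainControl(method = "cv", number = 5)
fit_earth <- train(y ~ ., data = dat, method = "earth",
                   trControl = ctrl_ll, tuneLength = 6)
```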

Some learners benefit from specific pre-processing (a caret sketch follows the list below).

cns: msaenet, gam, bam, svmLinear2, BstLm, gbm

asis: svmLinear3, relaxo, superpc, xgbTree, Rborist

range: avNNet, nnet, pcaNNet, glm.nb, ppr

Correlation between Caret's variable importance measures.

The diagonal shows the correlation between different pre-processings; it is often lower than the neighboring entries. A sketch of computing one such cell follows.
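
A minimal sketch of one diagonal cell, assuming the matrix correlates a learner's caret `varImp` scores under two pre-processing choices (again reusing the placeholder `dat`):

```r
library(caret)

ctrl <- trainControl(method = "cv", number = 5)

# Same learner, two pre-processings.
fit_asis <- train(y ~ ., data = dat, method = "gbm",
                  preProcess = NULL, trControl = ctrl, verbose = FALSE)
fit_cns  <- train(y ~ ., data = dat, method = "gbm",
                  preProcess = c("center", "scale"),
                  trControl = ctrl, verbose = FALSE)

# Correlate the two importance vectors, aligned by predictor name.
imp_asis <- varImp(fit_asis)$importance
imp_cns  <- varImp(fit_cns)$importance
vars <- rownames(imp_asis)
cor(imp_asis[vars, "Overall"], imp_cns[vars, "Overall"], method = "spearman")
```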
