From 219c3787083477b07dab60911776d7c788095bd1 Mon Sep 17 00:00:00 2001
From: Kelly Sovacool
Date: Thu, 17 Nov 2022 12:17:27 -0500
Subject: [PATCH 1/3] Fix feature importance plot

Reversed order of features. Also improved explanation and added blurb
about Snakemake to the top

---
 docs/dev/articles/parallel.html               | 32 ++++++++++++++----
 .../figure-html/customize_perf_plot-1.png     | Bin 26617 -> 26275 bytes
 .../figure-html/feat_imp_plot-1.png           | Bin 43880 -> 40629 bytes
 .../figure-html/plot_perf-1.png               | Bin 29892 -> 29523 bytes
 vignettes/parallel.Rmd                        | 31 +++++++++++++----
 5 files changed, 50 insertions(+), 13 deletions(-)

diff --git a/docs/dev/articles/parallel.html b/docs/dev/articles/parallel.html
index 52cbddab..a880d6bc 100644
--- a/docs/dev/articles/parallel.html
+++ b/docs/dev/articles/parallel.html
@@ -97,6 +97,13 @@

 Kelly L.

+In this tutorial, we show how you can speed up pre-processing, model
+training, and feature importance steps for individual runs, as well as
+how to train multiple models in parallel within R. However, we highly
+recommend using a workflow manager such as Snakemake rather than
+parallelizing within a single R session. Jump to the section Parallelizing with Snakemake
+below if you're interested in skipping right to our best
+recommendation.

 library(mikropml)
 library(dplyr)
@@ -183,7 +190,7 @@ Call run_ml()

 future.apply package to run_ml() in parallel, but you can accomplish
 the same thing with parallel versions of the purrr::map() functions
 using the furrr package
-(e.g. furrr::future_map_dfr()).
+(e.g. furrr::future_map_dfr()).
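For concreteness, here is a minimal sketch of that pattern (editorial, not
part of the patch): it assumes mikropml's bundled otu_mini_bin example
dataset and a glmnet model, as elsewhere in the vignette.

# Run run_ml() across several seeds in parallel with future.apply.
library(mikropml)
library(future.apply)
plan(multisession, workers = 2)  # parallel backend from the future package
results <- future_lapply(
  seq(100, 102),
  function(seed) {
    run_ml(otu_mini_bin, "glmnet",
           seed = seed,
           find_feature_importance = TRUE)  # needed for the section below
  },
  future.seed = TRUE  # parallel-safe random number streams
)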

Extract the performance results and combine into one dataframe for all seeds:
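One way to do that (a sketch, assuming `results` is the list of run_ml()
outputs from the snippet above; each result carries a one-row performance
data frame and, with find_feature_importance = TRUE, a feature importance
data frame):

# Stack the per-seed results into single data frames.
perf_df <- dplyr::bind_rows(lapply(results, function(result) result$performance))
feat_df <- dplyr::bind_rows(lapply(results, function(result) result$feature_importance))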

@@ -287,13 +294,25 @@ Performance

-feature importance
+Feature importance
+

+The perf_metric_diff from the feature importance data
+frame contains the differences between the performance on the actual
+test data and the performance on the permuted test data
+(i.e. test minus permuted). If a
+feature is important for model performance, we expect
+perf_metric_diff to be positive. In other words, the
+features that resulted in the largest decrease in
+performance when permuted are the most important features.
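A toy illustration of that sign convention (made-up numbers, not real model
output):

perf_actual <- 0.80    # e.g. AUC on the intact test set
perf_permuted <- 0.65  # AUC after permuting one feature's values
perf_actual - perf_permuted  # 0.15: positive, so permuting hurt
                             # performance and the feature is important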

+

+You can select the top n most important features for your models and
+plot them like so:

-top_feats <- feat_df %>%
+top_n <- 5
+top_feats <- feat_df %>%
   group_by(method, names) %>%
   summarize(median_diff = median(perf_metric_diff)) %>%
-  slice_min(order_by = median_diff, n = 5)
+  filter(median_diff > 0) %>% 
+  slice_max(order_by = median_diff, n = top_n)
 #> `summarise()` has grouped output by 'method'. You can override using the
 #> `.groups` argument.
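Read in one piece, the pipeline as it stands after this patch (a sketch;
feat_df is the combined feature importance data frame built above):

top_n <- 5
top_feats <- feat_df %>%
  group_by(method, names) %>%
  summarize(median_diff = median(perf_metric_diff)) %>%
  filter(median_diff > 0) %>%                   # drop unhelpful features
  slice_max(order_by = median_diff, n = top_n)  # largest performance drops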
 
@@ -302,11 +321,10 @@ feature importance

   mutate(features = factor(names, levels = rev(unique(top_feats$names)))) %>%
   ggplot(aes(x = perf_metric_diff, y = features, color = method)) +
   geom_boxplot() +
-  facet_wrap(~method) +
   theme_bw()

-
-The features that resulted in the largest decrease
-in performance when permuted are the most importance features.
+
+See the docs for get_feature_importance() for more
+details on how these values are computed.
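Putting the pieces together, the post-patch plotting code might read as
follows (a sketch: the subsetting of feat_df to the top features happens
upstream in the vignette and is not visible in this hunk).

library(ggplot2)
feat_df %>%
  filter(names %in% top_feats$names) %>%  # assumed upstream step, not shown
  mutate(features = factor(names, levels = rev(unique(top_feats$names)))) %>%
  ggplot(aes(x = perf_metric_diff, y = features, color = method)) +
  geom_boxplot() +
  theme_bw()

Because ggplot2 draws the last factor level at the top of a discrete
y-axis, reversing the levels puts the most important feature at the top,
which is the "reversed order of features" fix this commit describes.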

diff --git a/docs/dev/articles/parallel_files/figure-html/customize_perf_plot-1.png b/docs/dev/articles/parallel_files/figure-html/customize_perf_plot-1.png
index ba39d8c1d03900686b6ae97b6e9ee1fc14a74464..a6dface729fffc70d488fb5b3dadca108012b4bc 100644
GIT binary patch
[binary PNG delta omitted: 26617 -> 26275 bytes]

diff --git a/docs/dev/articles/parallel_files/figure-html/feat_imp_plot-1.png b/docs/dev/articles/parallel_files/figure-html/feat_imp_plot-1.png
GIT binary patch
[binary PNG delta omitted, truncated: 43880 -> 40629 bytes]
z?_Onv#A3F7DL;}xa)H4f1MZ(E_BY~6)$j{V!Nn^b^8x8%H1Q!VA$JwQo~|ohIuyw~ zaEZZ80eNawbP2SCwxeaOP<2MYnP>sYTXI4|-X;cw5M&8IV)5urEZv=~Bh`Q;cXOXL zbJzdldC&hN+~R*8Tn$;_NA<2FhOBf`d=~XC{p@?+ zp9K#OjXijrGdbylbn$FgM@Jaq_rqp@j7Duyw+13lY#N{#K5Uz9mTk_l1iVR;l1lF_*Yg411{- zMcgaP$?g4bp0g!rJUgt0@(?(Fx+(hGM4cq+tL#7|QV{>QqQUjb-wbObTCcn6bqiS>vMtD+zY9NmIx#R)K+2f>CqwdLMKMHV;GY>jFAoJhu-%v+ zOzmt3uza9%g>5H(iXh^b8o{Ze+CkHrsd+=v*u5jwZEGht7vu!^P=_luhxhlqr`Te< zF9OQByA?7_>GOku=x~<6-VmlH6AjTogz~3oSf|ZOBA-4q8uBJ8J4dgW2NK{4*pVO41W26y z1BCxlmml#TAfV9?)V{CG*%Y}rMEtMWL-RWZnTy0$uz2(jV-PDrIYxB$WYi#dD@aI< zy#dks+4X}@O-xQs8m58{gTvt{WWa3^iYaYQ2lS^Mtsd=~H?TSiTNXhU5Qpv?#tt0W z@z6JLY`NNGcYJB3=4l-RX>PO~9r=V@+R(TYR;DAwKpRiGR>Z&0kTK-CEvF z4=R9pAL~7p5q;VDUKKW{`#x zAX*4QX08wlf0SV=XhR2y#tq#`RCVGi@S&~;&)E#ZCD3pqMb2_Oqru4rxfl`2+ZhI` zbw&8a2gIaRN&GQ)~xjLvGaG+9(>QZm# zr))tz+T?x`4~TrLjOD|hz| zT=$bdBmBj|HXZ0jSw$Bnck0hIn7BXU1A~HB9{XeHX%b^DDE1c}C z?$02mv*3E2A)mi%xr^9#cjErO29QdvU`xO}rx+NEF(~u1HVVeTq$P7TGA?t{ z9j7|4LHr>ZV#Xx%=Z9pX53FEjz`WqAc-Ud=vmY~XWW;&d!&Ye72M>3ic)(W zot{Kujv-DIEd=(jNgnt5u<8EMt5T5zy}g%z5ZNdF&JF-2kD*lWZ2iiiBLQd2?alS_ zm2Ul|@-{#I!Hw?H14lxEge{en;_Yi@r6Fnz;i;RNnjO5|+8|1G5Z|R4@H-mQ%}5)^ z#VcOixD%-@6Vt9*aq)wotpnPe5Ra=LCiZ^G#|MvR5|i+wAJ0WVm54Y4ERY_%Ag!%H|dk&q5BhrNc3ge&A?C^!;wDg-32 z9-96+h`GkXWPvv3Tw`uz1KKD))BzGXAhJQGhBz&-k%wWQs+@YN6(y@l7Gn52J|nNO zZf9KwR6y+U2o`lF)kN6vl(o;RUZ6Z84l4xo-D1u4;TQvg(4ma~Z6GJ;G1JRVlx5b= z830K*c$v}YCqee#^b~vMaV15g0eXp$$IKlQ!q1Qma#v5~+ zo$)zxE(VRQXZ5lBy6Xl8&I;S?CH;$ues}ucnCMgFlrxZ*>bLNQw8rjH0zYG|lHz~y z&@BXQOcvnQ4-6pZ+)%wTknED4ui59?Yw`%hY9tNj(Hin!ZP-J@Eck;6w)t6bJ6n-d6q$(+v{Z7%_K4cH zS$*#9I#fEW|0s%~qzm5E7*-vpym-ogR{epk^4Y<>=DgAUYCp zLsA(q>=4f1{jC;}o@Gh?5oF^5^XvzgxcqxCQ#+{PAUJ@!Aeyi>V;RN{KY1kNi!0c~ zr}vysDwo0d!92q zLl+v4JpDLS6{yH~OZ`6ggUbace|y#PN`w3N6Q?I4IVltc6+7!ghAwDLQr`HaTp1_TyW?WwfcId*m2MZ zs&W{+h;{pQ76~_XdHG;gVC_O%2acXj-l9^X^!HsB64K~MQ|kfpn+hJdH5e}Dt(aND z?H(f^`>)H8aEYC(-to`TEtChn>RZOg+FB964L6U2y9WqSsMZG(Q6E2l`eX$)kx*zE z3Zy#e`ST;@=H?9!rCC5(rh4<=c~jSN-4p2BooG!b8{E`{4v#qGA zx>~;EVLyAEo<0p;Vhc{h#^zF6u@A*TH%FU;lQR=!@Trcp(7*ori>K0Y7tWzjYie#l5Eb}_+h{xAEDJx_ zSl19X8XD)6zBU}XSG`5|hueXQ-2g7i=|ws|<5-}ga}Ju$?cnXGC@!6ir&n}-j-z5j zTbquu@^f};Q1zz1y;Hb*II{*;N>{e7FuEPIu5>88G-YpV?manh=67BFVp_L@>Fk!z zK3xIJ7rw5fh{vglKOBn7_@FV~S98SZ%Wt{x@jO-g-3{SvDAY}Cyx*#>%4@-^&d$yfp7VDTo9-vTiL?)2?!78r4@x~sZftlZ za!IS(dqo*nS6<_EIr-zqD?tC@E0Dcm)-J`8$0552rKmbYojv)PMv0>49}_ z#t;lrRp$MDG(4P~^01{(i`u;g7i)YrEFvNW!b{iS443Bh>wgnU?6V+&J3aC3T^c^J zX|x^k&)G08)0IoDnr?1I8x(S0k8yzE{{8z$38744svbB)z{&|6XmN3Il3w-tSj?fk zVGmqHWMt}Dx#0Ax4-aJO}Qe!OlZ4^$j>q;5ojVn6Uc%`STChK2UHGaDLAFF<|rm)1`!=pq) zL!$@|(Flzj`^<{qp*wklHGGnlH5c}t&^TtMHwhANlK{DJaHvaC1j04_BDBnx6OHiR z*7Yw~%wGr?R%n5h)deUa4cN~lupdgiYq`N__keVVL?yuUFvx?af$tn84&XRsAPNV%bigA< zLsPRW*HF^j%q+dE?2c-hCRC)`fT~BGE?!_G@^6z`dVZw#YiEWg2Rc7LU+9Qju*n3Zy_VO-`=R!4086{% zvuZawGV*2CV;U@gH?%BtV2Ea{T0kbR zKe7PHKtSQDG7XHgx~7i4Dl90lIS~x;#$u$Cc_D`m!`S1byfk1mNWF5#xTG?sH@2<#XNbuayXHS2l(jt54STo82j@eXJ86W2 zy+E4w1)Lng+xrOWkG5wnoaiMHy}Z0gL;?IVmOVoul=W$7Xx{q(I1^PlPS0;JA-xSd zP>H8}2NKviCOgQvV$1aH{(CkM$*kEE!Xl{+<%-wU7fzc$eepsYU`rS5GY)oktC26S zczAe(3YmY_#u_)sb1ZpPE$FR^Mgq@p%WAynjWvWXFy9^hgqtPJm6z~2}jsC8Z#=Uz#80%%Dv^K5M zlb~K?ShmHFzF>pB2OvlQvl*dd^etZ6XBmz&XxXqF`*|seq6NfROFPZ3Om_!$|+V_F6qe|sNtS_Hdn8$$c2)+ZNASK6$8ms4e~u=xQp zbSnH|iBg*`MS;rs2b5S#iSjO+F3@ze)YTJ#6!Y@)s{*ao^7gJ^{u4W}B*7y)7 z-Qrhk-z{&HSwLkY5hD(kEiEl^G~gngCbumQRZABJ-})FHNP$R5M8{+&L=)e_s+)kU zEhV1XWW4z9D&Tkds7&5H%%xAZTjcndM za@k|%wFjBRm$7;83(%f0c35_V=h&aa7?|h|W`z>A3OWr+=SR$dgU+>U57@nWe&pPQ 
zc3QsTQCnMkl8K2Ityk_cYE2~rUQ8YB=1%>sGJb!oK-tjVg!&Zh{v@TP6Rvw~OLuC@-D^|V8F;v`V$MI0ViMHCVPYA8y;^Go*O~h6{ z6HM^_wKy;LtjrDQL)t%_9g*`+miPcybszciNBO{<(Dw{>@)074eqC1kN}d9sm&a;p zX(faFwqZByQoalko10h(%v?4=d+#=cXpx8j&X@WEAP4LX8Q`tkE^&7iYkby6qm-}! zh4JZ?O2G31`58rcCgPk&>*J zV*^(Q5z89@-0>SzS*N8uoyxcQ^h-M7&~1ixjvjhrg=J;?l65;fc%Z7-uuEayoUmK>Oz^A|7X8|;96t^tzHYxYTS!j@L)jvl=Y>)Y+ae)*`{k_DxCE7JjL$A>Yl4#Z5uTV{RBqjCX#Li%6 zxvOPRaBwC(Okj#ooV5)Ht9ZD$6d>)Z7UWLLBB3RTnOim*hCZI%K7J>;?O^6Ie(pOu73^@3-+@7zD5*V-cj1B=?9%R8;%> z_wNYrG4dKcX-_Osfs&dvzcK#ynDBw4baXv%jQU+xUKP?9UNwzk*^xWvAtwh-sRrl%Wn|17h3-(9=<;0GOqQ0DyZw1`bIU`lRL(Gq3Jn3mKkKPsHSk8eMzV_>2!{&O%<>f^6Z%`0l$ zx$_!0ku=lJ~s;mN;Kig3gy*^ZAK$d;`8OjLhkQ#<&9rt zj!pde^U5=uI|Ieeos)xr+`H9TqK-)n{E{Yi86s^wA|hSQE^zhUx0L!S+zz`jifB}h z$qH?kmi8}vJ&U<{c1-GaIh|QU@x5|x(_OO%6?-&we)u61^NF3s8Vp^JRthkuQ5S&io-f^tllbNp9G;e zU9eZW3m60k8i6$b>~UA(OciTl;^Xca!YE?iK? znlV(BC`;f0rUgX(94HjZ9m~DPF_cxB9j#wtm*JKG{SA%9hK05Kqt>9Nm^Js3cBkW# z*_W4!ZC?Oe6_nEboH*1_v+_YtDsTiSF4nNgY=|W}@FPc5(`X?=QQ)`$YfH4S0&Jzz zGQMU1a^1)8H71@t3(YD8;a+~!RRm2y39Ep!m!obH09I*eJAr*52-~rmLn-pSgv39) zA($8!%@5?U1EIdhm$a+`jA|u#3QO{<#z{P-^%umj;=H{w!+3=1s@p;#!A-l zI840E&ZaA}sUf290ufsPLuA1O^XV0-NAVlFfvTA4I9LVH#Uk?8K*CIB?8#FAVsZ&i zrK>~>joKN|{u+kXgD_V(3^~$-0hAMO0BR429ysa_t@!|8ROqHiEn*e5n!<+D(% zAp)BLiihUW_362n|JwN%u1r*Q$=OuS70_uo`Df<)1r}xad|dxH14VT6%hK*e8Dx z5yqdS8fdla8*bZqhrWUvaN}Pgg|Px5Dm08m59E#WfX47jNo^MSX=-TvvyS~-5Wp|Q z<64F4k!FKm`|-?j#+D)YUuG9p!>OOQm%j3JFhHs9+c|%xwyoOuH%I5UFa@FxJku#r z>uF~?dEl8ei+0 z+|a^@|4@kpAGUBRJzT3Xl+836$_**3v#CpjT5;-GiXPj~_eD*`gCcSV{xPp|Q4vbE zCM@<}ODXYEt>B28#Ha^upP^z1aGXZa>3|yD=l>kbaT>uJumzT4O&ECi`L%)j!NI3Q zFk_X0lY$K4JzLzfsVxWp4vQrPAfh#pRHT1@dm9Aw2O0UEr;@)e#;gtsCRm zZ!krw{w5Ix}|lf!D8JPYvGsHD)|~LrRza4hph{2P$~OBSR2H zB!X8*+b8?&+rQw{nVep=KmOy#k8Xg~rXjA~0~E5@bKxF>I5!|Aoe0_ysQ$O#$%nK9 z{b~vU)W;Kh-+%>BYnLHrAf>425{r!Q9Z+Hj74BJZfX)j+kuqW11Va6HCxi(hDht5F zR4C9VPn@tqdOz%EH`w+zZSm6nxLs!$p3L9_oOB{J#z6xF9zwBgS^ylYhH>tOgMwzI zME9YZ`@%04U=(3$ZQ+m{&xH$}ka+KUd*d)z?p_UaU@9@0`-0R zZ2TQ%QR-e}zZ@9GzzKqApK-1AW8_ET*yk=$~_(E&?a&Zv$$&^^9I1viAE| z=-KYz(`G=$1YIvNR@E}4&KWlV%a%|bP>rB+uhahqVh|q@*uXOc@93NIeMBs!>sQ?XQvi`z6fAO&ci3eIevI{^EOMU$Kciju8K7&#Nj~pogodHfoghxfupYVhJauyAb3VIa& z^5q#2EoPS>C<(g0kLBK9fThFWvrvK1zSSgC@l-z>K(d`cyI>8jfN7Twdk^;N#>RBs z3OwCw3IPL)uK`3R-n)0N2tzh1A8;_Yu*d|(2H6lmH4~v6s8T89DB^540L4yNu(Q4S za$r1gi*>&@sRg|;4^V^=)FR-tBq=ADW1%CpL{X65;w0SL;YkdaFmnc@V`GgVfC0M! 
zYd#IpwGg-qzX4>%Ud;&JNsci%2^M*Q1m>Ob>w*K0?w;3f87pyW1#|U-P>}^b8v$aJf#C@ASSjZ* z`P*Pc-a)qm_HrL^@9_%7Om(86H&%e!lqqEaY;GCYIBaTV>@J^y)!y|kR3@{x_YNdb zzs86II9G;p*a>^P*rfIV$YeRqcM!@5agxfgE3+Orus8Cc=1;;9HNqo*=FY8vezQAZ zkb1o2yXHh@xOnm68$pu?;8@bY!|;qDB^nK~Eqg@EhrN9BCpSM9CPj1I%+3U56`k4c zO&?zZ5#30wMV=3E*&GqRd*Bq~GCYtQw7ntwne z0v!}G5r8PTMMQ3k#>^HW&-a3_sSCA!W1^Ns?&&Yx*PWg7;aK)Q;6H`|TPvfjrS?Qb zPs&;>t9?XA3C6JkQ4D8CYUL>B2~oItjRr}o+6p}1FWl!*HOMXS*vN|g3p&59= zxq~D*zixLpWUI*ICFW~y)_8=3EQgP}xhb$Hx1g#aZ3Q#?4Ae<{V~_9=x#(z+ED>+F zA|Y)Sn(KtLMgAT3buK^3g+)71Hd$Hc3PZRIq@h3MkX?B|odl>}-kd!SqPo^v%TPX| zkwY5zWIRyvs>P)xAl|P!UTD682RcUTYQHUdTsqd|J(Ok>qCd?LeBpj>60Nh}O3Ksf zRs5!+d(igcS#+zh3_G`D=RTn+1?r04_)~R(~IJ-og?rwT#Gelo z9UOAmZ~MsUt5w5U9G%IF1{;`;Smq^CrYHY`FW)U!{-|f;X0qS?mlGa5*h|)bbxT-^ zKPxpcF;ThoSW&iU!zgyfLs!6>?Y+?RTG^)FYBj8xwP)2}-PZazI0r};Iw)+@dvzuQ zkmzme{?*y0L7lT_ITlj5Ciq&yhSwX<@*a9U-Ic{P*)zy{`0(LkN8<)REqI1rLUdx9 zy}9V~!@mJIhmwEW*5o@7SF2Bt|F}J;v%w@QBvjD8C6)JkxRG#Ag4Fa^?{50l&zgDT z^dp4+`Y6o=H+nw3WayDouwe7vhia8mD}@C9)GE(w;&_(TOA(JH9x1($Tezh3w+VxY zw{JpI+2H0gYKLqYTUMPhMMouGY5mPQ4dq>cbHK*WZ+{(|Y95}88v6F z=DHhau0;jv_!}tJEHv2-!Eqg}UQT}g+{R(~8#iuy=4LkCm%n~}P0HUUFeV{^jGwh# z?`pT1KHOh??e6s8U(5QG`=spR zg;UQcCu3dSPBlD!k+ED>{0446p{Gqet*_QhOE6|&F#GITA)61Ur9afodL(R9nIbec zg31Z+wY3D7HfO&3XP<7PIH^=eXp#wTb`a}r!QGK5vC)x)zydmh#qEUWv2hIL4hxkpwepPcHuz(Y zjakH9K2d3$ZpT~RGBPnqVy1lrFMO~VRdeOZ3x3_VsTBb;-tkHhA<9~2kn!Iv!~Au{ z&hA2hT~A=ssD`TQi?{FI_1!5A23&9*M2*asU+QZdd zZSElFLyy8eInKi+B^8^{akVDwWRBsqPljBrAX3QVTd=~}(ts+%bCJdw9T?-eS)$iQ7^BwzJ{EbBE zXV-ehWFo%nrW20G|7_8T4o^UF2b2M@=c4^!6~2ER#@F2P|0(Y~prT6GY|GqjqHQAv zFq8=u6a_>C3~eYuKqN?35ow7^&S29@w*j>j5s4}wK|r!(%#u+;kui{?5+&B#UupN9 znKgIKn|Wcq&};5m_x5PjsdLW%|KGRw{`NMU78Dl|e{5qa1U$@~v;!64(9^5{!;Jzf zu;85#b!9;F2*YW0K7_UG&1?m%ZhK*giC2;_N4T>-OApu^?FC-ZjmDci0jFBLhK#c; zm^GI*-(00>$geu#%+dA{Cy;wohrnl`B|f9C-T9M`|F>&%OMQ7V_E%IL>|pZksfyM2 znE6}@+kNKOk27empAh}n2xpZDi!iW+vmf?!OJ!MDTJ}TC*n%0Pn|NfUF?zD*f<6Z+s_@1`A_KUSu6C)X!2-h_(oP2X#R>HVu^ju*k7|sPmH~;3dc&yqUEaADkZ3 zkm?&y`3i;lCR<@?=~}};U{G@GI#=$PMKj(EjLlFsG7;V@fSj6$dYYal-?hk;IFf7mj^*SKPEsN5QH% zXgB(FR9vWn-(vUy1=BiWY;4kMt6!TXw_^~9UD65^`4+ZX{;-~cAMGOB-amDpKlP&O z7L*8Rii}W8r$YUb2-ZPV>O*5=V=J(la&%6SI2$)eN@}cMzuw^0=iqK)l@hl~Lu9#! 
zMeG;JB=|3`@XiYXhOCLfM_K3S8I@_a=AhqxB`F1*YSAYeEV@X{DZ6{y+A?oX2 zRxX1#GZiJhDmV-j>L|Tmr}^y0qM)wYwC2eX6c4FAxf>!sdwqM*yEeyGnrLP^rPSFs zFM~dgg*ON4(s$@e67f`t^5?0mk_>fo*?I@!&xJ zYXt+GgKXP!^X128KHqG#;~-p=RkXD`#0D;UuC9V5xy$%`MURTMdYpIU%$C#}@?hqa zhg54+pPHrGTzM)A-GfP-+Fp~mCfR_)mthcC!>Afpu-4< z7=~~uuP$|mn~9M6JhqpcWhbt2`OiO3+)P}2@}?N~%c%7A>(_s3QT|P{^>F!U?26$O z&%@;rG*C2*gS8*OZA$8`#p{<2{-L71QBHeOW#FBqk9TOdXR3Vh=w`9^m_5-*svp>H z%rPsA@B;S}_3`7$aRG6SKV+v$qLC`PRW-Lp?Y3d3=iNW>Zi5X;Ff0-v&=Ke+T0k~f zAXKGgvHAyyQURGg-;>^G6wPv|vbhdzwP$`Nk?NUa}Kup(9@_=2{& zj5O#O_I~{axh&7w{^5FWzxpKGu~iGt=F^p;n6PhysWu9mv0<*o^}++Rf0w1 zAtFUmK~2x+=f_A}NN07jF}i&Ap*2UIIl263_Zo$16BqY|m_QK!V_SB9`^cZRnRu_w z>+i4H<^3fReO`4^Z~JxX>&x!#okpZLb_o$%7nEPA;OE914u5U(b{kZIACzEje#>7P zulXw^>V>qfWwPVLs`TfH=&lfeCb57Tc?E#o`2nswqsmgjuefQ`-mde1M7 z#-J^v24s!=9lWQ7XA+G@-UFdHXv`_U-`f7|*D;hX`-r+H@4%s9WE6p?4H4bjU4LW5 zi2-LO#GsYLZr2YOq6MXijQ$9=7nU+ke2u^8)?NN5HM1SDA+*psvdF?@cJ}W^-@d_Z z`N?>%R2}2G->+%Dxpm+*+@IQK+2ULy{ z%W$;)3(Nw7J%^^l_@ERn1{tZ_?GOb!~^06;&tNjGQN2g~FqgIz=AFSp8*xZSz$fP2pF5sC#)5N zLdRp!>G|(a1tvte{=bGxwNzA?tT_K3q37^DU(Q0)>zE?rt!?oqvlU&b=wBbs2(!c| z|nI5t?**{641NU^~=kJxc?6cU}xRl z{dWQ2cJIUW$$ETURgp6M)eT<_*v$8r*H4IH9uAbDA<$HKKISfkKY3n1hErK=Y*^Tn zta}<*tatb|JiPxe?&G&-p6NST?|$@_)3~s2LiE%`jX!+c>ml7Bp-yg8^xnlis*ME=3>zepT`J#q73tv+3VbG`AG_z^4Gf zlgw%(pvT#R9$pQ5EreUqKn=hlyr$BwL>^_f*2TjVAA*RDCBmqQA+0Haf$<~Ydv=q^W*Yx2Uo>mX>|th6-e9%e znFF5XVcaFtT+8=?r@3C7tvLUl_!fg%b%2?>ry2qAEhl9aU?Z4jtlPFNLB1M&36Iwf z2AHS?!N%Y3K>SD7DHqNQ-}4ikK7FFjMER2d1z^%HBq5;=0bK^x;L6Cvk4q)DZBv0G z^DUS0AwwdFpi$521JrQm$aezZZ27jh>cp=Srju;O_IdENr#Nd&NRug?jfNMyhZegJ z#jBrq%_dV+8U*;F%5?rq>Nz=MbL2k%`|VRYhL-?}wOln>8h?#27Gqmm~S|~s6|x%$Y?ho?gH1We)&m_dcH0k)a*jCf60<8Mvpd1G?m@m|jD z>Z;Gm(ww{dh-7H~`wUpaQq@P>4|dQ?SFY^T55!?F$G$j||KxtwuG+bCR*VP=csn-R zvI7H|U{_87VOg-CiO2cw*{dO9Hn&Ce+ojzElkjuP&u!aqpHL%(g8~5F=}pueUYan7 z+u|y7V;=bU`$rL!dso&qlO)W5xUem^G3B1w$F25j9z1v1(&E&Qv%I2o-!J$LCgJ!e zkwa2SfT^1_;(me8m$}%vkH2J~swJs1;Zf2UwXaJ;c@O^M(Np7}aRsQU;LoiLRHTgX zOK5Khv~n401ft9ihMLL3$aP%NUH&7n^57N9PQ7Ar&P}KYSS- zuBB-ULvd$beU!_`h3LjB7|s(!1*9VUfz(RS`H{S?t``mj?XOtSEAtMWL}T^8sl&pE zlh2DYe|zSp!wrkC=rbm3-BeT)UXVoswg#jShN}3kq-7|yWDtm4x?B+RN&B>B_6F8} zhHk9f_>F;QJdOG4ona9-{c`Ea+TpDUGEx@u(;fHj+0zHdAFpcnypyp9+4!dQoR=}Q zKM{RFN_HUIhMz3H-HTn>PsY*~rs|zqaChs?uQ&{a7IuB?h&Ylx0+9s$1 zn5EV$7F_nbc<*eaYSz^sYg~?w%($GeN<@)u#kwWA!o*|tGI{=mUg{6rY#|lIV^}O{ z5o}v!-TLV-`QmoYu}$omtxk?%GVR5}$E`MgD4fd#@=rtO%S)rInnmtpCH4sblrRSvU)LVVde*&+ig_22M_+CEv%mNUM&(k zu@}!4mj1ZcQCtvYnb6S+C+w z(ufkvz5boaW6TfxBJ>8Z0m{@1Xm_V~$3m$`2OK6TV_HA|^{LbE#1+({`$=CC_1E9b zKXE8~1OjZ|EgQf6DL2jKI8H8}@JlN2DSR&bH(lsvSv$utcHF!usmB@XpwJ9}ew{}pT8|%(D z6FbQ8-%5EWlT)a77@9d)cp7epfzk@^HAnY>nvsP=kA(|3qDV?EZvo$-M1WN?n43cf~UkgLd6x zUz#f--bn=PLm`VL!(UsDEVEuPVvz12?yLOq(>#OtAp_LT@AdGVg{)T}0#&ye-wo&d zFx2?GHo?cLepRS0_SNi9<2sJp2D5i(dD=TW`N^2M{;BtRGQ%&R;N*I1DYpwj<@z{d zo0+yv+`_8FM;bncotpT@c8g^$g1wQS#s%jx&N}QZlv4_OdB5;&zt_B{Q;owGZMB== z*ZXK;@(r=@ro44e;s^5lW8d=4$3y%2+P_(%)OZ@7mz0a;p7iKT(XOw?6~~__+6yQ1 zhfH7;9zXkhvQG7KwJj@FoSZ14;Ni9~Te8UE1e7O(;X}KYWUU|xc4BTp^UIN$cCMV| zY7g#i+o8|(@zKEV{ApM0ieGHEZ@vBNx0m7ntLcRQ%%Fx4vv|s#)myh7h5Gj$wqing zu)!z(`gRWrA$8Q(7)OZT_08HG_fkx`d14*0$C%>-d+eM-u?_yDX{WB$2w)r-g@`U|XYVoVpChe)KPSn`d^%MrkD*q1)C>wF(b#89oDmX!5?m+&Ynm zy5CMph8wpme4g3AV0SH>XyV1+X*GiT16LvnIboy}>-!9Hmw97LyUFz|h!IUctoKBo#hB=yOx zci8B1FQkRw>-A9b1T)V7AtS^=<>#OH7H>PKoD(-CJUkpe2!6@+a2WBz#1CALhL_Uu z0F+Yo$dMa^6_7DQc@LreT68J$_<92Au3h86b`DKhAIQyhA|fK|hW!d59>e2bi6^0g zo`)4O{QVh(W$u{!+2L6&kfI3e%7!NOz(QBAMUj&^+9StWoyYg2*35hfdU+rx-P~Ev 
zuM?~2xBE=yZJT4b(abZS-#=}v^6>D8#GnH^81L7#eSw$eK^6c+482)O(UcKrw9d3z&t_J4bZdP1oIR-QV|!Tj!o$GlAHv@0Kk4g7L!%pI{9vQU^QQqb zTs~T)81?z3?;QMn&G@a(G48c}+)%&+SRC}e$rFVkyxXzLlaeW{Ov@XTPu@P?JvCD>bPpf zil^OTtgP_~C*vy%E5w=1;PSyQji8>&0DaTiEip6szPhuyZ7*i(0+2%zgC4ZurKB=V z%(=_#tc*Xvwn9rh65@GO&#O28Cz;x6hP1huC#D1Xotm@`3BY6??(ihH~XXowz$E8ckzx?t`$jJS4Kfk3Ul|{RGUO40zj=*X`mxijotC#AX zpU3D2D7<)lBf|FXLb&Ee@LCIJMNxQ1bkpJB#qeQ&atdY`&h& zk-CLK7;sXAbm5!dCtp1%-#4E1Y4~uJ!@DfG#V;@X-tt*YT{ZmL%a#67h6jr3C!IcA zC#k;OU!lAH53Ga4=I-;$p<&tZd~>?&h>|3>k5Rv=5n+W+udRSyvinHpl3EmJkjG>p z$0GVmKSZd2adVgUe&=#Te?r;HW`f6mR6Z!Fs-||EAl4TzUL5jhQ)VqUPt@En%E*m9 zmOr$;IuHoE+m9RBk8bP9<|_{`_8NMh4E49wDs~z)->fMpF64&X*5V#+>2zUP~1s)(hGwd?HE^nKIGS2LWb%UX-1 z`#=-4gJ~%P%;V+f_XwI}QGL7e;=FMFT316zxsL=dI1_|S9AQV<0yX)%U5(j368RtdbOAkeaE7H8kMaf9$U0mqIlBq)Lid4pvp z+EyBTau2=s8iXj51=;);dx3Aa$8D%hC3}NcNB&OzS2Jk|;xW-=Z(Lj)gg0dnij%+z z(1)6diOB_h*fgktAW?>7@6zSV8Lu}jxQyJPM>r7hDCH|FD`^@Ehi@4UJ79ce&2~8M z%aI$|j~jdj$Ptu6G`>}qc1Fd!S2GQ@cs(gnXa6?Y;Fr=%;he)5F? z9RC>gE6U54!tK*JARZ%EI9ZL1j8ZJ2M~FF_|MjcgRRL*D9QM&x%Mw$HX!;oQ?T_Ad ze*Z9>Y5%3c+H7v}VD}N5TT?PWb75`YrQlH`|2u2q55J$C^V=oK4}Z9iGIM86earas z{gr9+pHA8C(C4-L7rwYBLX*aq?t3bCOqcJc-E;kqFPg@$^!t{d_uPtnzx>A)H!i~= z-pV$0aAcZ!GAmVzWou(-d!fiM>QUlR)uN+C(Jm1-Wwsob>Hn*uL+Qc)S0Y*eL*pM6 zox~*!CDTUP7=`8seBDmc8`M>Tz(#n*Ke5+|DRE8CTs=e2?o3*vR-V6?+0P866BBjc z&yUg1F@G1^AW>#Em9guk&OqPR3id7;B$?CzGH>?xwx%C|yH&PjHxpO~Hj5#LektK}(m=(9x7uWbDm6z|16l4TR zxE|FDysR5cx3lGnEGqKKb zYiP8%`J$8qhsw-;V5C3eJ=fvS?^FAI#;ZflhL5q&ulD`3k>R zjKDT~CC*sCSLC}7g{K*dexCW3-q`;&uk3%n(bWGymhnGbI1!OHlcwfT>JE7~%H7_W zL(vsSkKUwIL?7gfdP)yeK=vk6X10=ULAF9==fpZ>LugxvI!2wbNkG<{NWh^KgsY2GJ~`U%bhfA6S={Rj`g#Yy^aC2j9ZM1n^sw@1^nN>}8#X{^OiiQmr}88RHWg-bKS%3k?10y(2%AX`Ci^{?RyN0>RLrLU9VUe^ z#hrY4i77Je1q90qJw1=7;GoQfb9Jds7p2dlm!}|(0nKjAkoiI{3H!Y$$0|rjZo)T` zVrJoQwErTXU;s=OG0N5ml@oqeE=AeGux8J#$||yMfk(LOo9eoOB$;p13sYa?(rhlI zkrOoiv;khm)~++KCTA%hjqj}-gfgGa zOiWBXV`9=_?Gt_08bXu3neyJz=WLh2wliGOIl|GmGYoV4%`^1A?g{LbFs-7@NW2v` z9_=-wh3!x_hh}7C=$W3B!;1#@o^DRPTYxxsN;W=wI3wn>x=xl~#p4t1Te57GIY}G6 z^9H{7m-bhd)YND<-kz6-34@!bviVMT3MGRsaOwXLbuDM6Hs>7T-a<|WnsSS_7B#h& zCv}P(o5Gd;rCLnok%?*E7rvbddF6v%9QoY(nM=y`9XPR14XX<)M>!E=8EYOqj4l$W zj&((R)6VQ=!)KCzu#>3&;(Pb5%@b|tq1Yx{PMfVpVZDV#nMX3-tw!dY39jqBTX`Yc zl0OQ1>vvyh(k-PNuc3xsJ9H%F6bsUxmxsS&$+mPoy+bh<7*YiTq|Ly}rgNO8` z1{66iHJt0})g`LY85z#m)phlcM1_VbZOnU9cc#BxXtyunUi?;WrVYpWZDLa5rkJW% zU!ITaP`Y!H;e`s%u}?eZ4Vxz~x__cK&2+@_ehTOSW)4PtH`IkT#4KEn=Txne6di5$ z#+yayNKF{R$)o)PvC`&g22`oJJ&o6@d8!2)DA>gq;9@+yD9p^+J#CrF z4I7iOO-DJ)p6TqiFvkyMr#o@lltl1A{II`mJ0M_xN@3@h7czOKMHCVU@yehi7q8`6 zBwcTX#D7_IJgy`y14;$KV*i(;Wn{wy^8}aLIf{zFvAV~TTtNHJ=O1qLiaM0=C`rEbd zfh8O1I`O)}zXXdA*wAVG39XtR9&cSBuw|v`-oEB1^>${~(YoU%iN0G-b(R!g=iYH> ziPqB7-+Qs&BEK_Lprz>C`2dJ>N~Qah_U?6WKkuIat0G)QSCNd3Jd&1*_WU&T@h>gF znU;sOE0K6Z-p^}ghgdF1nrR01R)0?vq~9TSdh22Hs@8st<-j0shL(K$@MaTJ)5F-< zWA42GQgxFFWT)u5NN+vTc*r*x)RF}TSj=NhIGgz?GLu;+3_Yzvz{J*f8d(%0(pi!KVNs?wZbE^B3) zeBLbnya1oLo*s7E8Ly{@HEbCYJN$fua?$Zc-IrgQE>=65drNx1N*ce`(wqEG3Or=3 zq#yUD$~c<59zUl$Wh}YcuTlV(g0R{L6jW<#VU~wb`x}%Gk(4BEZufl2aAI`bD|gW4 zty`S2jLBG!9l$5l-UEu2$(k+MaD9bds9cJ6mq1`*dwtWV7XCCCETA^ITD#^wa3Ko{ zh{Byl+_zrE>1-Hgvndyd)r08bvmUO6S!gquEDSG(3-opYiPg(v6JalZ4cG}@=UnW}36EGa7>*Lw@ zgC}MKG_)!*KXWFB!Y?RXjeN%_f>1y|$L0bX-Q8j1_S=d*oep6YIMRAOWJm9fD3p^I zCZ$WJzD}&Ifo4h>L2*=O$c}%xR#I0NfPEPQNlm}a5I)GJXtt`4YnSXzk=ILg`MI&# zlkURf_6~7<1X3X{!2(%PJZpOjc!TuvNs$4zn$mP6?Y5uuJ_CE%!AaGpivs-d6Dep( zkxk(PF^;wEaj<3~cf*SNHE7@BorZcw!Ex7}Bqy=x^qyRD+awoe9Oh{+N8ygR2B(G7 ziCnq>#@^Rv^FOZB`?zCK34Th^pPLasb2 z#;z8Y^t^m%K;^3G<4?uW?*vvDD$^&$*pWKIO=eqZX#=+-xhSX<^BE$150voaP%6*P 
z-NA6H4g9|)OoKO44sR=c0@BE4J~6A=KgCBxNgo&=c)GK? zO06@rulnnW=N8>Jj&3#h`q>ECLArCCqCmE9elgnd(tA38*P5K(&Q2fh=w<51(@#Dj zL$c$4RH_KuX32z|v77y?$$_qB1N?9PHAYuUh9qKrUl*Q$;`^QF-G=# z-upW9gy7xB>V@@&?}DN~C3pzt6+{MhZ*d;~;BorvY|n>|pDZJLpB@MdvnpY@b^br> zP>eOOXd~Q~F?gHGY=NXrS0h^Dx+q&yA(yJM#6@&g8cU-TZ3;>9`bopo-0a_o=v1% zGCaZsM=D-ZZv*#hvTe*IUshB%t64bV;_beC)P}YmlLQ)+9asZd8UL1(FBbH}OOZTN zPj=GSTn{A=G+LxMRspDvLwC4Q`^jb~4feIQwM{?T1pmi3C{}VdQGlipgiVfSSz`lq zG{7fUf9m59x-V|y6;ZF=XW%Z<#mL{;yaG~_f-kQH1HtPdxi+2&q1lW6h~eWQh`zNBKGvC1?+2hcBz=RC6CHRh?EN`!06d6vJ2+92ctwn za$dnoHjrS9oQqWL%K} zJTZL8k(ov3`6+a*EZy0EV7}kKq4C5F)W=?zg;T@QYL?m7+`y^P2#B;Zu?`261Ukn< zsK^Er@Eg{`qTNTv*@1HGX9~*fME?U4GLOQN(I|prk>0dv(}i{PAP{QRXlm}LMH!0# z0dPY?!o$~7E)Zobu4RqGV!>I;^;>e0VW;>6bM+^Vlp+krkD9I?h>>$K^vh;uAv)`3 z`p#{85q~l$3SZMo(2axKJP40=>_}R4_qb)otj@>07$VTV4jbz$D(;|x-offYp}FvW zK8%)cGS~W@-E$4463l1Oil)_%MRSJeZtZIPpV>k(d6v`6>RE45j3!I+X2mP)(_Zc^ z2fh%1B7m6#S5ov&oH&YE98ADH0}xg@)Zqv;n#D$&1*`mlRC)C$sYYcHEA-Z|qTwN@ z>)qJ{kYYQ_0>NJWDfw6>7Jv!{B(s1?VA0i@*39v(m#jr8`_`i&<7aE+OpTcDQuO6c z3_|ZW@^uf_dd%V^9u%6&7z^V|QN~)g^9)A5Dzpth(g1nyrKP+P*tM`?~%h zCMMQ-Hs#{Qi_FTC`89_ylcUc*1-{x@xTLHO9m?z>(du^yQHMBbL?hMe3~esx*OQ72 zIFB?+oF7`5LucSkctUb)7W$HeeUD#d_lYg|VDhV9F1#yHF?KHTd zNwp5VipxJ9cLM8((nq0%@Pa8zck$kGc9!Q_ipEe4UZ&-mR%z{VR{eU7@9o>(S!Dt+ zzr8LJpGfI3R8(7lfqEDky2dcT{xs#oo7vi3rZ3VPYf~TbDH zD#|QbvLvlua1ynusi~=0%IQ^y*Cznl=@Rh=nxt{#o}O+v-dftT@X>n9JA?5wD3u)l z3qaE&8_VfYfU&UL`NfQG1&$0pM`y~O>9qY(v<0F@4rCS1$9Qr5*bG6}c>fYSLEuVP z7Hv71CNZ%?p$)F{@y~@u_)EoJ$l>(4)*HP~Br0cNpmX&7_G&Mg$DFU}hECm_In~N~ zM{mA;Fw7k)wU#ap{4`M-k2pL{ubjh46R2R}+wo{?1D!kWfZ_}+DOOA?;IBj_-Aj>9 zSuk(0MBHAtk`GMPRWt-4o7L{KzDB3CR+vdtfla{ynNlIW%}f4sRgOINEGUi!*6ca@efrPwlYHJzh&xR1wapwtp z9O829{6JY*8UaCQ7hxcO^`#EW2CZT98y*>%2R{ohTC?QFo}yg{J55ViYcW|AjbQqv zr1PG3=!Ti&4QpjvA6}&^Q!+>^Sd`8Kz^L9x0ioCnjItSsX_sZ?QAKGnas+p{(vF^& zGnSnM^GHJ~o1m7z$;6;@KE*mx&QIXi7i*K&Nx(aYJm=}t#yyiG#^1x~iZ!)Jqa9`1|^!8Z^K9I0uJJO1=H$_K!~tW7xr z1jb{+u!wS~jh!LUI!pf41p24ImXBhLS9xwW9N-Sib~>%Qjf&#%V7uw8^RcUX zzwLU-)l<@Q@^*Jn5EZ)Dv%4Dk5Fw^*TWg(B7#S5+0EAi;c$YvgMIZHPG{d;u0#4Zt z=w2&7*vw26jd3W8q$^wY-L77H>JfBb=ddH0_DLM}MWFC`Fry4;W5F)*>Sk%n|-3 zHvb5GhDM5o!M?U3N)4m4Rw|u>Gf|bYx3B4b6s>Q83{)ed8j-lIgbg!M>VO=0MD%+mPi!5jn14lH3>4}s!9l?Nt zS;|6Q+9n;r0M3}gF1E^494E~?)Ri8s#P%c_?G&2~|l$lMOcPPBZ`o;N)X%t!j+l)0V zUX&-oyUe|`Cg;d8^dH_Q0zLa@6=QFT_44|3Xn!@0^<51m#T#B?pqde>^~Sc^}c$u=l9$y`7xCT1;$=D|0`q zpN`O>j|;OZ44o+~2Zv{wOS1dP@Kdr&f7S7-TevRMXh81u{)<14Ox(LWv1q>E+Q(v? 
zBa9f65LmiI+pJp71Bjh))r#giJ=gcn&VgOwjnvd-T!oX%`<*d9S%j&i8bg2j^jQ)$ zkDr?c*$GwhP195l8Juz{rfTe6?2UWRGP9MGI$c_Jo{viLoVY7H?|LyhS{y}V$YU*B z|Im+eWP=gLhsmpxHOh0^t%C> z6uNB#%p0U&YMat?F_Ql^>YT^@yV zEAFI4#o5l~7gL#I&jj2Z7tyLxEg`T26y zR_F5{BW>tP!~V^cPEMaW0a>*$Q?dJ&mSgS1L5kA%q>DB!_;rfsJ$8)z53VY8k#k@K zrA@Mf!6ASZ%@8iJj(d#OaCc-kdTx0(D(aQ8h!50omo+2PPEn%a$wq7cAZxBcnrX;w6jQ zv;ZWq8|3tc-z`wQqJuKh3}&~ylfEM^kK$lDlg@wq7cbwxdvjBNunSL(zUhYcwo_P3 zJ$X!6s(6?p3YnCYNcg}N1gfG1qCwEDpev}@2pq~c>>)*f?^@}3NB^T=@N%uJd$hK2 zGyI*pwv53-H6JJrdV-5giZ&U3^q_rx6c%Z9*YH%$dbIyt=`e5YitV7=g_$i8Z1iu4 zNJgK~h@{*io9f-MDsh6d6QU1w&zM)Q)Z*=VwPSi(kM`5d>|LsVzmhEP5;(FOZ$b+8+7O%)$R=N(u*Qn0Ii@uN(+b9f#p z(wMt&8tdUz>OgkYc>N54kLCdGI$b39N~FI)WC zb$LH#JXgXms1Q%f0)w;7c)L5sHGW0v3?1jAS^lD1D+u(K#pY(_S)78iK`s0J;`F(nBFv<~mjP=8o0XqE25wKehifbG=uG z$5^Mkl$K|$y4XmF{;IuyI-kE6RMkBF+t%f#F8)`&%l|E}BPi(mP|=is8dUdR(+};M zKuH6<YkE+R$bBW2xYgvFCytG6&>vVP-JD7JHYmFVKpU7Az zVEcI@V_rTE>Y8QQStLLVKhu6e=NXJ3J81D3N}EZ9 zFxU=893ktfv0`O!0YxCbR;K%S?HqCl={NKj2eG{9F=+3q#y^Iei{WzZ47i9 z5fwjQ`t`4izM5?u3>v@5i-% z*MUqMpr(?9bcAF`o#ka{I#CF@<^ikh!Ft;PwNfq`s@~de7YS6T9r5ygkw-@nlR|K6pwJq6i@|^%!Y;P-Jf7?X&1r%`A1sYSk9%SRKL7=tY?X02T4! z*2(=iP7d4pS6AQG$|}PC*kAby>g@)MjpqQS_MfwA|0(=J0{&qpwXJ3EP6{fLgokTf zz01#0Y>W<1&u9IzTV=!mRtywPXNlB@Q_$Kt-)?QE7lTsf(t>gGmF!=EKs$)mJnglm z^T$8eilP_Z^lFtUY6>6`TQR$@4U5YZ45~UV3$((!#e>)3=3=(R85vvj!%>!|Q#3Js zk>jmSO;@z=BwC^{X03!6Bu86uBZH&+cfv4767Wboz~fC#uFX>QIkuU-{-gEW z5!Vu+$Y@$kw{QuIhhJ3y0@GXzR$bf4hT^&Z*y~+Ms*sRfGcB&(o{oEyv5DhLk}f)T zla3uglwwnXuQ%rF4WRBGD&me70k-JZbTK5gTnZZPMU#iuu>37jqRmwu}KNJB3AOmf9N1TZwpq z=-~-{E_B2Qx59X@EO)Ey@Y`$jiV@yG>&%ypT~hpSZlMMb-1vJ;4OK)jL>-G>V#mW$2DFj|@(++b2itXC(O4a~IC zLKe4dK480>O!#E%jcFu$3TF#52kg|Apn?7-AZof*)zu_Vg{wj+$AEAtKM+*&wbtU8 zp@XmD5Jx991E{x^=Z>~8W)2)ss-&HJ9}mK!qBDLJb9!mdi|Oph5VnNjNV8as{o$;t zoyQ14AK2mCyUP~OJ45)W`XJ}EG~&>0vZ5Q3sshfN<n?Le)ANxJd%Jn zk+=yx=G4;6Iwz={6nlD->VbU@Yj+9a-QJL-z+haGg8}nnFZbCye)a9;4w!SrSv~Ne z4Rx%7wUG~#9(&Rc;>l`4Ua5Nv0INOb7}Oy*aUe3o{R~VhTdjwiY!l2j{N8bxcbiUu z#A$@cQkuoanliDhB#s2jcL;i*%|QdEQ6M}W^FNqgsL#p!uodu;^Zr?k&(ps4$prP!RzY<(*q+(7wCi2{@nw)I+(hepd_#3=Nias(#b;h$TMLrbM_{I?Spu!%`7W%I?c4P1Q5vJ zEqTm!wwe}Zee#rI2alXEX7$K&o4N9FjZf11%)-ng(IZ@0A|cF5TC$NOo#8PtMGyk2 z-msPs7pE9TRucDp5_-XemF<%mK^0V5X$>JU_g!;36iYVKj2{L#7!CFfrZ-F1_9k05 zHv&`=#Vw}OncmV1n%Wz^YSi4tm z8mxx1uq{MRDklC6#yoG{U8RV3&_iYV4%kRh`gk`q#%f%X>Uz>;kFnD?)W|R?GPlgI zmIx+YtS;=IO|&;%(9)ghRiUQC_l&|*MBh()Y%0g8PXT+Q2a69!9jbN@vmqmF=pOIt z7E0Z2d+`@yCaBA2cOVa_r=q; zZ=EfimU$Q#wipoc=2YX*K8ZJ%{&0*F2pH_$$(j98#15%vMXfL}OfJK&(mU4eDof}R z8~0x={VM6RWpIG5v6pP9#l-@+gIwYkcPxu*%l7V0>eLKn3qyBI0s|IX1nNG@LdHeW znMWrg4piGN%(7mJS%@dF-fNoidr^kl#36i7AS&XWtR8Fv2H1s$dc3pzP;&6(130uu zDcP(qIC6~Qr%lVtP@9sh3|Ur37l`Qkp@ih&{{+|YIlZ4m?W3^WftO0ZoAzTbXhheq zjFRpaUh~sL^?dk!E)RhRXvAM?E&KJoJ82K7crt+kn2K885(TME#CD(;kPT~}$_QFS z>!p?dbJsWE4-%jO{`)*gFDR%jq88Ae&;u2rPL&}LQba=TiL!5u9h<-in@Vff?AaTw zGJ}yx45e9rPrLAG@I>(I)=^7#0PK#B#C zt{@<6nic6vm5x$29i&P>zk55GcfRX=zw@2%eD9n;&NW<<0dcoyJ!{?fudU_fvI%Sl@rA;c~OD$^Ax+WBB}sS_R38ojYGWNj)FB{&J{kME#y^s?i~x zn__ot)@~9Vd63zYZWyo639^o<27 zP7Ly&)@uLjipE9{GeQ5}_gNRG-QDa(7wzN9N7}~fC2B3AkGB8ao+@|xfx&P))VhSe zZ{2!+4SkP_T1?l$@cH>Gmj(307l;3k9}pMYyy!ts7Kf91=FM$4{{^GU(Q4cH_>#R< z9&kAF)fQ=Gs}}5^|L{`jQhc=q9@e|>%hy6uQc}^etb;#~U26$hQc+zUI61H|3w%_WgaN_P(W z7Jau&;QFr9FYn$u^};$kqkwmh6a3WD^b%{_f79#g=_xW(ny!14mS*zwm(Sw`Tq3=u zx^JF)xlXsaLDFrs>%`e7OUy0w()}mKA8Y;?tFha8pwUoZ`GYNbdV0HEhFY10KQCEc z(cEmvz08Q3-x^%JPR`*B`|h1Pn@70i$s^zI_pYk14?VwVS&jROld@A){dcP^q;Mx+ zjdoYs4`g!hB_<|nb7M8O%MNCbs3mIIkG?s?y|}^VRbobS=!dofvzTzFqC5VIp~Vt; z9*F|W=1+#TXMPmM3Zw&VxpLPf1377-wnZd*Lxb< 
zP44(g7X|0dbX-3iJ@bA|-m9m-F0U;*M=!+4c@wj6u->l0IzyQC(ocX>k{%Qk)Q8by zu@4El8Ygc2^Zj}@S#l%XZORMq>4m1>7HCax#83W8b1rP-_{4O*0Y&+N$ER^U5&Op9 zo~}FXuTazVSjN5e{IcQp09A44hL;;`Z|02GTc&v_Ff5fbPYg^xj*X4g=1*G8aEp2P z@La6_tn7-a3zuF#n~GOm(k!z*@NkZtP2r;DTH|YCgcnQ%Y8&ddwKZE6UEdb6^VCzj zjDEaVo%j4~%X@9cas5S0SG@9Bu&762DP#0Vhn;JzME1f2(ZIve&6^pbt;G_y6>r{L zW3H37di%x1x;AD=zNHpYF>qf5YDDj(ns<~CXf9yW>QF1sIyL60c6 z4?HZ~-{0>T@fl;v-!3y-qjikC^Ef}|$C%&(odrksWEZkczLt}hr>fF1AZg!O!YmBZ zx5<3@X;O~{fQZ)jjAd^~lAf8Wr4w()4!n}1yA=h?;Qx^7+C!rF4|=Hrrd zd^OsaE3Dgja({ZAX;_-w--qKfJS=~~vb(Xd+nk$S&1^C|f;6@VYQ$HU+}O3*XsRNZ z%@m1ndB2-qqqqO>^XSJot@vs;{vSmz33*{kU@0l^3%lI0NDIB=C%2tt_u<7Fv**8k zsn8(k-uLjP=XhPQ!eH+7Kv-&-oM^O~d5Cq6f{mujtM$hv#Kgqb?Q=QHHh%K+62XCm zjNap!C(2&mQVw7W2nd8Id3&M$P2H)fsadgc?}O&kjN&&Mg9~~R~mpq=$TcJ%2fUKYo0n1byHC-RIf#ov}pt-w*irIQ)M*A2dI$uqSc4 z*)Hd1xC?u3-MY2O*VmWDPD)C0aCWYB{g%1S zC>^WC^sW;$as4(*$->$?apJT5FLW;a%dphH5Jdd<0lt5K$p0?r{4W^E&>n^SdTICZ zzD9>imCT7|cS-NLnXBuiwl5QKIR5hLx`2xpGY0A~&t-vJv?twUT0HErjLOSnj)jLV zm&iW*>$L5|7ppTzD~@PS&hYBLXa!(XrK(c6WZB-$!1LTRw7tFjN;79|IQ{9LAfsEa zk)5`7?REoOSFYQ;U)N~4&BR4)pr^Mcj)vsx$7S%u`uL89hAuy~%}-uDbv8C8qDZ>IYBSm|zNBi+77kixGzBr&$vgG!ZUFL4 z!7z%j1YO&&85-R|l`xvFuT81e(ifYML*;SpPpeI7a%vcx`ou=!<5U+n0}2!fdGBNK z9_Ab=?A)M;H_XW`bIzM?+HvCXPA-6~ozUBKpsc_$-nBfC9=j~%!Yz-CWSU!99X)cS z*lt4iD+427enW${4+TQy3KNY1i}?5L+lXfz_qe=TfKeo&4A>Y*HO&EY}XO3=@~%bH3kkp zvfDql`5V3Sk~AwjIy@DgOEY7PK7&l-S>-K9Z~T$$H5Omz$_>|c@#73+i2|eO45Z^4 zBrk2Wdu371|3F$wMkXdRGZVOGe;!cgqpCHA3u*vk)ds)x`KD`?x1`Qj>Mb7w`@iWf zHM%&5InnWAjZw*_OB+)v~pm% z!+V|Z9IF=feZw2z_}jQK3l=S9vO3GcIZr)Ti=Vi`tUGhZ7uVEQ;IGQw=sEgEv}y3S z&3>=GJhbs)NiU7&G1-pmk1{gMda7dSMHN3>R^2hNahZ_E*@OOqIgQrOF7XfQ`Yk+Z zusyg~A<|M_Ufz=jlV84k8DX*xz#bTuM|pMQ_-c$iza0|S^}3asDtQI1Wgugvgs$SL zVzG=9iS~I|a<0RLBB}l94W^YUG6r{K`4vC59u1e1Qr-9A`4(cbpy0`q65- z=FgYjj}|`f{Neozg(GbYMw+&Gu=_$=ZS7#COXst60tZ8v_S{iT_`^UvcqBa|N38I7 zq1)s6i};y^Ukv&gTx+n+Qz+DL$>D6L7ROw-+xXS`NFM%+!YDBbK?X2ai5xp&ij}2L zRApb`?@h3l!BD7|Si4#Xm8-PZ&T}e7& z_P(0TUR)^juS#pyn<8Rr&H`x;%dXpQY-*;*i-RX-jQW{Ky1e1CQAuAgF0W11Ru zyuTt*cIuKqP{1*{Zn0z~6wsS)J+bDNmihzZAS^Ql-ZO^|`fbdN%exy50B}DTzg!`> zZ*O65%fr|I1tETRA^3dO8Efl)UH*+ad83yvC5Szo9d9t3`S<~h?r+41h=m6mI2GQy zBp9lI>Ja;w^S!pVHrv=Wd>X$y%v<=j&C*47B$5{udxWSuu6#Csgmf`TtjFUPC>HSo`m>dz_E0UoCc+ z?ZDr%vwF)ssEbbR4BrX{qbqQMx_7|vol0m*O73b|(hawvNTKirouNVdf1r)szvsq} z2j+)P7QatA8d|G0xa*6-ioB9CO9KNIdqle7aarZpExCyohW!?ljtMS`DqltF+7|=A z9u}Ve$kpyqI2ieU_Jn=#)y5O8OLUg)_*9SnGtn~PgE}`AX(82No|ioF^{=%%-L=XK zRQwg#7nSmK3e0~`cCRe0l=`U{z?Jd!Rq(a_^XiHBOo}9W6b?jI%y8U$6#n*3tbR7f zQh3NlO`I#RNWbRk_YbBFdrn;Eq7Er6-Jka@-T%=|Ld|D&=G-qiL#dgHO;(ursm!x8 z6R-O^`7?~&sAcq%4R_DwU4X_gtKryL+&$HlgG*y<-3cYpXD%^K%6D-7WKK*}WRKVA zm-wB|WYNXom{eEPMjCbtxroU#82an(^+l@UIK)$^VY`85lYT4u4j~GA~P5k<~fu=Km&UMYfYi&Y`W@e*6NV zMI7hkiM9BQ`P&Kc`xtiDdlfj?K0e33>pry9F=8nr%KIyqQwqn|%k7A^PLH2hJE~qC zl6-fHy9Tp^u|&g_yRuN;`98hIY-s-S-t!ef`Y|)H=Xk@@J0PE;LyED_-*|jt>W`Ru zdAqSthW3kZC;qowR zHCjw5c_l@(n6OHgI|)MR*vuGRPB(l16Kg}8qBSyg5I??d=Ckh{Uu-xi!Z>j1@~7VN zKi9GM8S~XFsy-BB_GWS(bg(OVwOHK5;_N54RFxu`>wdH;;?MBV50Al2Iq4}f^wcdX z%jMkeyhLD;-t+Sr4-{A_O+KA+s#q)gztp+SL;U|)Juh0H{XlW;-mTHTGO-h1VAkC$`r0}ouWdwNH;%Khro=}RgyMOqzpSU5VHA!XLjdU6(O=k0xNx4A<~gvT127Z;JQ+lD|A(FT=6eC)>O04>$ks zl+tv8CO2dDCrJLM>TZ$yy>gmf=`((%n=Lb)G9l8|dWK$;9VOxNGSDcE*zE;MbLSMN z-uRmiwLVD`TvB5c9v+)3m)fjOWu`nCiy3o4qI%DeBEXUC%+RxunUV73vytUq*@P}y`$MiLoMae$I$yL*J%xYiTClTJ|2+o-sUS~3ITcY z)geVTD_~1B%!Mb>C&alebNwv@xI2MLi#4?k0xOTnC$2vI#4TFR_)NY3%=bA3TeH8@ zp5MtUK4^p$k>ZOr*A!crXj5zC?a6MRZkeAu+%~Xbwm5I*2-_IwJ#Fphcnl+| zJeJiOF6U^$HttQ#Fzh%j-QqFAEPP~MX|d}odcgij^#I}ZOINJ@0}~jEZMCA)EfJ$d 
zUpHUYw+cEJW{dIfipia7yb`j{(urz1Lq`Myr}Xg&e)no3(qn>47+Sh23kShe*gxJm*ik4FS+Y{T?+^LGY)JzHgW{%% zj$orQ8a^7jBltSD_uVggSE>DprlIci4b=)Uj(;tsd;1p2(v9a1IlVn~Y2We}Hg`K|*KQk%KakH}4sNyByKBMJ4q5CS{K!Jgk==cfU{|o3;C==fCY6t5TP3 z^;yKz$|h$Ay<1XeJHu=TE^{_mKlb+$`+N7S(kfjv2(jT0(d6viZ7+TGO&iwa4*lBb)|;q{5|miLn7W_j zJn#pa*^!PWPPE>zlqkJ|cPw53+Qm8+Wz<{r6;~cLQPebVT-UdQDbco=(!3WP)9$`B;I(9s&JXRcH z7NL9PlgvMLH*|6HHuZb$;OqwLH*By6(Knr&naq55aYInu;Jq-LqQ*>nvbDCb#IuK< z?!y}U=b4)`Gh7^yEqE48$%2VZlt%9Q%pVrF`qJGhN?b7Wt=@kk0`_ z5MiMtJS-|8je(T{pQHLGCOzLrF=os83EV~(c6(oH`LQ%TgXq&qSv_&g_>2TzXWs9v zg|&{CyoaTo)tgj&?T}z*g}f_#C2>w)%1yhXZ7eoqpq$m4dEIbSRwZq3_k&Z@D4fcC%eVk+S;}6VZrd*)BcHP=TplS<>%)H=MCfk zes6Au?s)Uy`6Z>9cO^GvCAAG4HA(g`N$1}ny~8ZR)ieUGi^oV<%TZ-zl~0aQm%4@1 zvca!N?#1*W#}JKg$@104V9@5~<}9qjL`-XC1e-^=suI&)!UG7wZ<5w;X z_4RN+BP`OUf~IZYBK4m9mY(4AK3=yqT*lUr$ZSWbWm4hOUvFW6jI? z^iW3C)}eIB3EXh!GAs_GE0w&#|6~1>#&FT{ZEAF#Qc;>)qtRa38$U%5}S$tlfYT z|Ma_oy!km68};&^eiw1}>E|qVZxSbIbquj7#$Tnr$e!-cFj@%#f^K{!>?@0NyrS&m zl^nZT1fy#PCzjieTU|cgJdt}6(9gezjT}SF=Z| znpykiQg!?M4LDDvwFOn~dqLw)|H zW4n6nzE-^K@T<*|9v_EFLZ#xHd9^QpzPsme_Y1Dkyi2~1hK#qqC-;0_oNjCXz(5&! z)M2){vdOtqJ^BOlPnSzD2^U#P@p{qdF*CEGdCvpvyICDIu~(-#a{P;vF)b7Z9=F*w zIo2kQpF>g9T#-2C5xpRoydV#-;&;TS9f=`h%?IHn8c%Fsrg#qM% zuE4Zbnk-ZuKF{wF^^XN8^?7!!p}RG^-9B>qL{j!}5r&;v`0kQW1ln}RGNq5->FMhS zFt4nUD1(3#X6w>| z9`ZLj5St01_0tU4zh24Z(<2hqpIjF3c z1l_ip-8^@&c`V79q-7Fl?O%Vltb@QJgybOE&gfvn!}j*}z^js{UtZ0~eyKFqrgSee zQx^WhM~m`_v1IQYuAW5BXvG&3`j5{1Sng%dEi>qr74jH*f`)2#Nx(_xonnrwQH?WZ zT4C?}HLt14$j+QW&q1~1LEt*s>AuteCOSz7K;-HjI)5-_yYB|rA*WGe!Rv}$Bc0IO zl~>9#p-@|jXLa2^8mXuN0g^^~H^jD^9>axoXP@znpvM`_8I_Pv|4OC(34@e zR903N-Lbp)1(t|m;v@o2Bc--plE7y8A8Fp*b4SMAMIoBw&sd3u1LMuP61|+zCC%+) zX)c2(k|dgrP08dm5wcV{&?Gz1~S3$L~$4m`b3? z*>U|GDPI~-;9n86MupnAG5gG#QBU1S5FPO!w+9BfS|bRtPRYxIaz-cYOQqI=CK~PI zM|rJU$5W=+opRF+r=G;ocAc|up`-=Sx6GLNsz07eof7EO0Gp*^<}e?8`s+W zl@tR+Ft6Gya+-OH$E>#Fc!Gn{h{kMEOPy}2{rqrn-cj~O`;uLj4iWsGUCJyr!cP*D zEyxN{89ZWbG`ulU!sFd??g0{J~F3t~~0y3w9;D z3W&UAPW8l(_}Y5hG1G^Nkc6ATEQEWNr^gMKved$y_M>_0TFYq;Y#)<1RaKfDox%OL zOnUtO_Y=6KmV2g$6A~v(JT(LIr^5$5L`FJGkAM;9e(<$5hF-4QAId{%30cB2`YQ$G z9DK{iewFap1hb+FfeiwFL=-s0p zsn;`x5@Iq^DpL;A^QQYirijo?>oWKB!~rSx=<_b&n$GDI#W+?gVXOcqikZ9RIX7N! z47@1d%xYU8l+-Y`CTH{~wLAG7^#2Zm+()ffQE^0~^c(FH*T`{0 zI$yqY3I}wFXtRu=1;g2^^i`594diWejFy-vKrxB4dd-fP(OjOy~ zm_MR9FNpy$bvtjJwVd|UZK+}o5#rzl_3r*1q&BzTIE^+>?-_#N+2M))zP^$JYYYH_ zQmO<}Z65*9QlEy`ghuFY_vXMFVOLMSrs_9vZSDK6NaSM4W|jC43=CWqRFGp)y^r?u zo}X@sWM)neaJ1dR$t7c@;Ahn5dM~3I8T1*NOzd{a$jPIT1`RxJRwF{=GIERGb1PnX z5~wccsW4w$wFLn3O$P@DV-(Oja1j>EvX&cLYrkQo!Wo zqxbc#}&WaA$V-Lnl!+X*&>{=7dv z6QbH@Qvb+Y)Z=*lCB@NOCtz$AH>Iby1dp*$3Gv2}4^J1=ysxq~2QA6$z86Mxg2aGz zQWhF3<+`G-qAP~Pj3ky+#i|Eq3e5PXzy0Y-M6zO)1+Q%D7;HY|S3=2F!oAc@NplI2 zG&99b!&8qEAB8_Cfyx!{kV`?qm`E=-cKaX7D|HifgZAC%-z+rw!^IWrhJ&WPKU^^? 
zX_;+m=^#r~wN-_E>eTlrEOZf$FwvX9* z>%Iv+l#r=1H>$%n<}4`9w5^hc3*j^N-T!USF(GAqe0+yt_G2!W&pR~ep8l3CtWZUV zGwb<3p3ON}e0$JSv^v=E2ewA;Xop5|sRG3`)R1V9TE9dyK;FfBJT+3-b+*v4Sbya9 zYMAh4klr8=lEaT!hZV({Eo%}yrW;zyD4IOk5zHn07CoXcQX+}?MMm>onVAWLnc-KdORK01y8#JG)I_@cxVL;NCxDC{y!&O|CaIJ3=J- zj5L2d@ZF{hi&PdL*tOkn*;VDL*1zmL_{Q${^}^{QTYg-na?H|Fuiu63P_ZTbv`D($ z)&850&VMib(&e|ki|@P{SR2!|=D;n@uSNdI^to-}?S1yX&(|r-E%bLh9_+2J6?V^= z)-5;O9^aGEGMno@7_P3Wkk&t0D&4SyD2Y9!oKB=pi-F2QBNW_|9yUX6V-pGMJDc&9 z0$tea&*SE@j7`luIonD-GM0zwez(j{L?`_B9iPeDwZCE(n?CZ=09t8bvN)FWHMO;7 zd9&k3*wDI`TP9SaG|&V_2B4)04t{F> zg64|T*&3x`obdypp=$2Wq>6j2Wp^I&mv`--j*VrxS*MKWbMTKd&=e)w zyJI5D{@A3v{amPYvCO*rc%Y}JC+Bp~r6lNpoEPa!5cEQ0;6l^1ozPKHQMq?a%-i<{F=~_kV>g&c^$bEx@V_pVt{SO0 zIaQly=p;n)ogo4!DK7`0r^!xr-6>A;B({*2mX`34vcquw{Zn2Z1LvR|R-NZf1e_pS zl@_m`#@}>2Sthi@(h50$5*dS<3Lc+pt5-au6CmL9_Eu9}XFSbYvp2U7nD`2y3@Jh% z2$C8H3Md?_H8OG%qE1CEKINq%G6SUH6s>Pa)X$*EmdV69(%;7CtNH8En#f2JVkBy_QdeJ}Q#34vJ9PqCpW{yb zkaHjO8L&_mn@McON7{S#WbVfo68PNESx3+Z0;R@@MjiBBuUYdJajZ@-${iekFw|3l zQoZr<@eKxLTuS(S{LogMG&|uvKVf2f@!b6E1Q*3^t7RWJSA=BwzSKTNo9quSvnM;l zOnjSwo~)4>U%3&s}l8SHo{FG`vvI%&P?u2lCc&kX0m zW^4J?!EGd8;cdKwrPA+AG!YI%3EXCBb!dQNoBiP+h0ju^_o4)*yjZP&U22@)`=c@e z-;|f*U2itiGcnGA*@K4XUVLL>dqx(krVotH#23j7=Z2=OE{-$=rr^IOzWsRLd-t}i zF?2aq=h=t{Oo9QNsCjFhjv@W{bWKrG)3xaV3N$TRxi>1|A;~|~hf}6?KIBe+S6%Hr z``#qOrfIihPZd4gq&Bzp=jBDsU6Y`WD6la~fl^M$5|x<8gljno31kiq0PM=5h2XZo znH~TLk3`RWnD7vkS`7_H+o>P0F{7{JBk?EX)v^;F3giqnBOd1p&!=d;JhU-DqpXC7 zpoWuA_`Qx}XayWKFkvzO0(UnnAAY3F}5vq+3=W#IW4nEyjP;nR0?WQw4MV_ zM{sQ$#|cAA#GQgvSS=WJqL8!G6Gq*Gx1Yu(9g9_Uoo{+%`D^KGwMy$Ii|@>~Yvy~; zalM0Rp2cCqhwwH5@0kzLkawbKQ%nV9DNY4+GVN)d4cJz4tQ+fa$87ohq>{EgX$;#1 z(DI`RqM3|&V-$BPr z0L90BS%^NV)Hxo3d>M6N<#x_E85S$6lK>z%<9Mt2n7%BE99>Ep!%DzG6Jjr+(3-$G7{kX!+n7yvs9~Mt)I@V>2<(B=4Yr7}DJo zOaG0sDavWf_||JU^$&V*>gCb83SAldN2O8v7KilS`HJBsy-j1UtFBFLbhOf#4tzr+{9VGOY-XTbx8DLa;#w zyj#jsrdUQ|caTY~0pu54M}J0(p;Q0#U|ybbnh{kt%lX0i**rXJtVaLSvlKBSf(_gO zc?IHfX3RKtf~O1o#5e<>vdyDBx}sCpEo&`}S7-?9`Yj zXPkTyrzWNO$#Q|L8qyEz(vrdDQhFiwI(4}LY2E9qkHp-P*Kh}^5a&3_<{jBF0VC3> zOHVa)Pi$X3Syh)TB=jEL@2e*fs?KqOE~Yo=aC8Nrz=^+4{48V_`(BSd22m(Vq800W ze;lqB)&&M3xi3q;|B?Eq(FDE2eMT!WbZQbl66Db3)`&%5QtO2}DW~BM=x9nVAvV=ndqhPkH+!VFJY4^AZ^_iHZzWw|qRa=@v6gkgW~4ew zmWpx%dOIQ={5Hz&CgPNe(ijsLw;d(lhIrWJRU*9j9Bpn%S6v~`;i>#SQgwf2zZ_`d zi0F<>-LKZ=?b}H$mDWFr0hv>B4~-dFh9kNpCvCD+^S#D5*zja z_|sp`1*3wKvRAa;o$7JU^@CtLb&dg~B@+nIt9c6v9V-CR#C%R4{}NvJv-F0>-1}Nv zYihJx-Kn1ttV!*_tGs7S+1`XyJ6J$LysFlBydQA>UZ3wOsFf2bmJ_|TnA2I_^K&M( z&FJJsNszpnZuNeMmhjvbIay+cqIWGKz@+%p_35<6aHo1q3kAp-uhRsv;YPo|zP>)a z!)&#L;Z5Anggs7Jyj{op(UpQ`ze&axy6dymZ7MQNkCbbtU{HJ28G+YM_ZKG_Y!Aew zb!t3UplLxlME6{56mYsf-qZ7li=HhuZFxP%ePZBs+FVUWZgu+NicqLwcl+v5K?Sk| zkrtKUj3aGad8@vO<2?A<)7jDJ#atBd876Ne!~oKuKGkbJna1Pe6+Z}M)?mNa{kNVI zFbR_I&C+OEug)lK_HBV;&AaJ=W%Ee};jn`f`|3SWTX22Y;Z&D~+1&Dbld~%3a7u7t z-A`A<s>3^f%|YqmbaEq}JDCJEpbKw5^Rz+0eUZ@){g$R_mLOI0o2| z5?tTYS2h+E&p4V%UO-*^}G@9!UlC9aE+wjJ+>6-f! 
z*k^dLL;iwT9DQa=VTh7AyEPftd1f@4oJy~^h+kPf2M1XVZLejfw8f!I4|i39^d4xo z7M9kZE$zJ%S5bd7#Na|l*!4u;ix)4h%p-xAN8Z}kSif*#Y4nV(x6;eZkrKZ}pBp>g$JzWi;-#&7CxD#_S=E8h=$c_s4u}@v5El zC{~&^Jhv#-KZ~~<73HF{vUVt>WZRZLJrw(`{$F!pAcw$>LAYD(@^HUiZm*fUj!Q7Y0%9Oq>*l2&0Ja4DGFQC}gmwk95uw6d>(;BzaV74|1H0W?Az%anBXG*!^w8q^(-_4k>nIaG8mkIumVzoKk-@-SDLgCyE|S3 zXliO&mm}z%&KFDA5}Vt0vWDlT2f47O4f_{%jE#+D4i|}1>fCYQ*5-sK%N@S49P)C& z5BbY;0bix9183>=C0jX?%Rnb>>3> z?f2;e9a;wyXB-;=oi0>El4a_e-#dpsWQRv~z&I--vun6mA|@+KPWj@BLI}LOAud$U zMW;)DVoa`aadBz}2G=B#p1H-KBnPxdxDT*SpFTY`)*B4z;a-zULDyPIm-9(KEgd{m zu)>g`_-cjWNzaq`)3X)yflY_tqNLKcKk>YoVKS*ppx}j*Fo`Vh$)4U`rO`C_d6N#`{L~7$RK^MOAqP#${xEL%9n$MDb~er%4<5u5g+(0 zDE5c^hQzSTPv{|xK&__gPnpLDPTLi$oeJ5)nb4~2VPJQXf_hU<*cFhf z4_;QF4NoIHc|rY+&&0z5!6O(U)3TeH6(i`mCGp9*p(K3Lte`o8DJFJd8Gn+oJnua> zT>pJZHqCKt_*G4t%>-H}BY}cGFzk7Z4x<&u1g<-T;QAj#IO@>9MXPyM?1o!vY;X7J zoS9Q|Kc^)dePX({_|9 zq;eC~eJ^#Em0xdiKAaI3cdX}kU(Ev-o43sVnFWy6-GJ_GS{@OOB-*~bsp{lFLsxqltju(|GBi zH*)DrRuT+p4eexE<$;<+eG=TQz&uG~f=Z}!Z6t`*WwD(l*yT?E2R>q7=7UVt`okUNAuGKILFP(cMZ(uT_^ zPldu`I?$MDITj(Oe_&xV6?fKfXIO`Xp#lgKvk;8V=yDLf_sI(+Q9zYmsSqTdI8*-| z?%Oe71dEBTsh#L#lfmAT!RcNFz8f-L*6tAEYyq}cPJS{oIbx5nl?l)d&&mK|C5|0} zG#$|--mVrfBzw?j$B&+l=@>U`wY6fm-@YP#IJV@)?PN;TkPC14=jU*QD~G-5iK9BZ zx;y|`wPd~U|6a_TMC%(fNOG7!RE$kdR%M?c8foU+dmFR@P@(+!6WfE-atv;M zDtb>gjsPGtF93tS5t%htWQIhWD=%T)x)*JWxto+EPSV3MwEk=?)p&sh%w+J9J@Zh5 zZ!#x)wDa`-{?YEFydRF0T5@9Ax$o^m7Mt{LWKM|vksAJcTK@p`qmb8Kdf` z`U@&h5ulR+;&;@#!QF_#X3@fwb|sUH7tMG(U9H3AQR45>P_Q~0ji22c+uUo8*2oH?j{J5 z_6-jye}t1QNONkiue_&Hj6hpB#S&>J(1_Ap!Tb;GM~@!ejgF4CK9Skmwxtlv?+*|m zx`f!o#L{gsH`$oR-i7{Nqjx;W%yM)*(3#jt*oaMxyAWw&B?l|quLx{t`F-@wmw}mI z&6h)3x(UTzgmkBIKt)ZziT(x|s>eJlpJqyS#UozWs>^CnTA~!0=-6VxD|(`N7p*b@ zg6f|(QA#BQtOe7O#AaSh09&iVXg?P1+ikEUOUW3(ezH?0FT{1EGde9zio7U_($M5} ztoqUVG|>SCG^L~LQP2(e#(4-h75DU`4Xd(|ury+^X;XW5ZL$%)I&|@@RPiU#MJC)Sh-Tv;q-N-(2*+yx{LROEH=N0V;8GfzIZ<6n`pbjKEU`LqPG+fcmGy!<;2105uvHZ0k zNhmBf2|eG$8hzfo=fVUXh(QY8_z3PA^hXt}5TkeERMQ-8QDm|(CC#|O{^dF|CT~|_ zI+yfHwy^xj)!!2W*Y;yiFp>joY zvS+`gIdH#r?r>nB`1mlF{8&euztEe>6(3V@H@+01kkvQCUmg)4dqlrx--D;bnP5AL zu~J|HJ(7RsG&z+C{qT(YKwNxM5eS4m*gyy9co`sAFqRv+5Eqv6^;~Zzm4#V|w9li8 z5!4z_>gY(Ah*Ga77bVayxpX)gY(#W6(!yE>=qztl8Gugq}>w*9^4PGCJl2 z5MD!DI|eg~1uZ!toFtE&(;hMpI@b_7@$-d)?&gnf=(0LAF1?sUHvk# z$m&q$vx|Jiy~%kIkYT=sK2Zu?>>6dS3R4*4m=Z?9F8#m7s>gI9r+X}Ahx^RQp=^)N z0o=!oB3G>ymP&oI2`(N1D$=^$lhlFe`~|`^pkdaD&Cd>5hX^lD9Q6PyV_!4=Xv`%qws z3Q!;q_P)1nZ!Xo>lBPJ`z*6T1 zMnFB7#jd_BN4LheK&9G=`bbr=cs=Ib*Fy;3%$+7aSk}V}cYOQ|-Y_*2)-dpCqNXn{ z!U}3X&z0l>qp9TiP-gt%rybZDUrLpW=MQpD*~6}KO!&!PU!s`iBqe4lNPDX}V^pW5#lf6F| z@bn$0S3HQhJsCm=N!<;*m(#A=?WZZOIztyj(g%g;(T5C42*V9*&dMt?I z8Q#+w-bdITFq;gz-(utf%7NV`(GSBgap_o)_=L)^9r~f;$O@L=02Mo&GeYpnayq?M zR};RpqthFJ9%Pwf7DmfhU=x7lOgkg$D^P&Sc590A9x z>EOvWIO+0OU5-6No01{qg-3GRr8(KxK@QU~FSBgl>on7M7OO6-N8&?9*nc_hdpN97&Cw zwF=~`(b@wXZ3Q)o#fGM(1mVJ3J{{18Oh}sZ)O(2(kOGJ6OTjj2;;OO$`g9bG$AEog zFB#%;D~zq<|C2-PYzme_*CJ5)~rSaK^{+>Ii=Nr1b`<2e2 zq2^KV=h#~#M)4_D1&JfYO0_#`6$P4H5f~mxOZF6lg7gD46R@wD?b``*q0tksK3_Hl_&~b_U_XUG^W27{tSj9)+;lFme90h!70?eqD4QMOU%zK7 z2GYn5LSDx&tB%oGMxfN;7%n+7LhMM+L7Lk-Y(Z1t_LT3H(biKjt(e8D4<*IL_4KJ* zT}Qmv;Tzc`2uNG$&~{;LScX1NI6QDKj~pkwIUhE9I5IRe$>y|hWOpjdO-q{@dU~_^ zH7OTp$9444=LiH5bi<#CN~#+f&EHB^c>mjGIZim=MJbXDumQA7uBLTK&9VbV7i3tg zvAtj7%IajtQdu!E*&ZC3#x`&te&~hk68m?zeb#adCG3)Jf&fTxzRJqK~BZOCq%VC)Yog82Ootu6*ZtULE~fs`c9^?f&#!9zHRZmV9WOVmPpZw+b>-Ir|5^|711tmW+A}x z3yQgYBHIkBvzTuGdO zZa%K9ol(Sm_U*?9EBLfx;$Wl5ak_Atz%@_-8CqzFaHLc`{8aV17h=M(1)f?HVI+iE z2Ks`eKSLiNwx^x3!0H#Yf(GelC$K99pFQjGX2x5Ws+t;{n4!pGAGosG1Q8U2#xEc2 
zvN&_*2N0pGEWRj#Zzgq4y6%@pDm%sKQ!E+gkijR$$B^EOSr6qof*D^@Ay$AOs60o2 zhS-Ns4Sbpk7;6|ZT+nPN@dH*0qElM8c{>e!_>SQbpYG31m-m-V%3sZbhE|S$I{2a{ z;oAj+l>LEAq)6Ex3K&w=hNWum+(&Z)8^pu_w^ZQ3(^d)1f4pwS?_c-Vh@o;Grzd5@ zdLrVcj-r6gGn+9W0Qxwr10Y#w1i(2hXMj{U)F9C}Jjo}2DNlRnD%Z;FR*|@ zC<~P?kO$4N;b{{kBQVBg_kX;zy_&Zt>C@AK8=#KwhLuPgtTORFG)FoqH$dr63?reh zNlZvyQW&5q(256;AZ}?QTVl|(xPu@x@{$<-lbbb+j3S|k39~41eDB^pNNNmeLFmcd zl`*PpW6~0oyuH>qFSfe6e*OB2JBR#}Y!;n4^y2S37Nb(IZs}`D^|l3+Kc(z3Gjggb zh8{|)HANc;+JHfo?d5$EhR}mD%OT1U&}n!GR3tk2^W$^^!qrPxZhR~rvk65d%TEd? zfA!+{0@6>{)EI$mIKworgQJ4s0lcGa1|Tx-DBNr&_BmJ(>&A|qGff=c`$Uz>Ey<>! z7aiHcq@hYdT_*VrOV}(0>x~Xz;o-%O7$AZyXWmxM{b7z@xY%NKGAWr5iGbNd>lzz2 zj{KlAM9L|p$fO{1O@{{W%02Ss5}$D3VNswA{o9#3_apgEOsq2M6*%PL#g#S5j(2bs zymIN2befMeRz5==Z))yH=x<>Wl&$t7B*Viby>{DqFRpkpd|`9?x7G0y>YAEq6&0Vv zO$WbYpS#q$u=zjtp(4f5k!Neucbt^@=XnV}yhaigmF0NwTCkDVq;hSYHH8IEu*u<( z{8OxTJeVBF;3?W zF&=!l9~XBRDVDD$PDS1hN#oOWbfBTPDXcI>zMPG3=apqLc_vl zMBEiuI9%??<7e~=EwDY0m?iD*#nYt!_6cFCunEKCWl@nBR{H<|8%#@a#>;d#lxFZ{ z2#62XD|+$9ck9*#5;QXZZ>~|el{Xpx{StIG-@nHq>d(3V-#8ZcZ48?Bo4Essk_x|@ zk{z^DggP5eFZyBS-c}zCZRZ{35$Mx&))7e2w=kidh_F(}0_`V8!AYhwgDCS{T{8o} zHpntLCtcZ-CO+el*@Z7$=+rIfmLhOc>4;BvabmiZxNB!05ocxCbmVl<4pcbk`mr@o zOv1riB_Mhl(Z+6IhMxj!bAUq8V0EpV_~m(7)7M{aIGRtAFM=z3A*Yd9O8bjpTbR`( z=w`w#qn+zq5v561F0|OBhc(1aKWWq*h(R8MR|6;H8hY2!8#_16e-N6#Mtd^A{q?8D z`#BYsG1vtKJVM+m7HURS18+gTJzLXC0ifMvwSh|!TdPFR($Lg20Yxj$fLw~7J1CS$ z$Fl15LiIWwv_qG-Nt(7pxQ+LP;Pt>6B}Kv=)W#uLFblyq@`^0!94t~Fo*K~LMZK54 z;WI(GCw4m+giePJAO)oVJZ2#U(O&A6?;APFZif+NIq?zwp3Ow?I8xg!7jh99P|#_# zz1rxN7?$Q2K662?$B=iP6qJspfg2z(vaDf55&s^VWTK1AU-Co1LBG44mrLXY9vO0# zOCK&Bgv=PgglGi`-`2Tg`1KgYWKQQ0uTu60$7y1*DIpcjH7gAbWD*%oM%a)-$N&VY z^K-MQQ0Z|j>@J+qRd}h!3l3(z>0i8YD8vPFq>ws$U8mI;l>#ZIx_ZNN?ghs2&qSM@ zIgY(SaKy#?mcG@Q6kyaCZTLQ)gS~=GU6W_2-;7}lS+p}KO;HuShPN{cyKPuxb|SQr zJaWU+Z$FAyYY7{O;Lr+7AXaO}vRNCp2+9N);3NY1sr1RG7x886~0PGqJH2t`Q} z3Pt<@<=@9w4XHG(sn%~=?^HB4Hdc!<;varDWBg#>(L{8Swzp}sr_LZDlllI4(P4l$ z6I>Yb#z(Y1-`k8$Va!69(IYVH6^Sz9?hlhbLFXiae&&uP+5#BZOupX91DR19aghxO zSTG9#x3uI<-#x{tD*$#0%4syZNb-o|w5;#ZvPLKfLMJkCj8yg_dr4;vCl7mIxOs;` z!C@gR2pUa6S;RO9Ut)FHatfesq#cv|V?m7;SVqHFdlB<3VhbPnJNm{sWn#q-0DIDH zcP{j}4%Flkz+a&n$zKxF>*Eqau?R^a^p$D8!K8!d?6E=O zQMjcwDMzL7tm#;%YQH)hr$-0LrMV|$W=<+ATlpN;{h*u}6to(x(&HDc6&euQn=YLf z-_-c)Snbj;htSYXDes4I3jspqH2q})LV-G)90kh+ZJ21ZZ9ia(N)h4)_O!mCsrEK< zhlQTIO$_9erSay!^bN7UAfge@gkw0fy@1Sp689ilmXr1WQcqQ3OoN;+oI}N>t!TRM z2C&VAw)?bSS3d@n-MhU2>qp;~jzy@^;l*`Ns$hN6hJJ2P2$bF)5*lkvJlXt=wwhT3N+9DECOd(Q~tr$=rIx3@P z*no$gW;8X;UBAd&;DSSU?UDb_aJNEkm3O|hre^i{3A`cf=P)K>29~rJ9TM25eIrsA zpWq<9SE#xT<6IJ<@S;w*`r3hyke116qN9^9HP_k^bUuvwLaLW>oy_Y0#KeNEo#b%L zB7pwewZum;oCZxxU;FX>#Mj(GQ6V~UkanF5Li7aDlpu_a;Lv}suK&H&_3st6 z@PDQdWH~x63hxJ)E~mCj|8N1xP;`Pk*L-0uG74T>?`cEf|>|? 
zN~*yX`GwL4(d)Z$_yu&%N2f1@9IO8yC`;pfOY`i@8qL~M4DVy^ob-nGzpOMHzx3<< z0IR5jB3oXbT5*vbPR)4+o_sZ)hcOEnP z%d4M{oc-JC{A&kxFIcmsxba}EbyL&Wl5P{n#D=J*oS0b;!!)k*fK2(+Z8Oh-%)9{F zuj&c}ZU-htr}9rC{TqQymE}=0UsE2x@>#V`X2~FO2NBq{!2Nl-_^aWLV*GOmA-A{E zzhpv3gdF&>opy?>M6S^^^`InA&B?!fHSFgJcuFkx4uh=G$7wj!#AEc1kP4f!Wv8Vr z_we?bF>bzkyw$}rO^*(AVip4GDD=r9Goi}1YDl*}YWnAg2>8$$Fa2KrFa?RI<~XL%jliIxOVN@GY*SSE^d3@AFP)c#1t753SSlYm910F z8B-qU+`F#rwz-w9ttz_@K=ZjhM(PhHHifPa(7h9QsxHN3^0|oOuD*JJD*(ctjEW*Q`6HMZhm&-xB7+#TvJWg+Weg_4?K~7 zJoW_fY&+Pypj+LK?VbAT1DO!l%WjmcPh3HcCysEsWUyiF+T%kD`Ofvltct6S^EX?t z|H4KEw<^cAXMTIJ_o|n8Ld$VYXQ*->NuBjKzkD=loCiW0&|B%}gTK0Zg3a zVC$#fkmv0!ZC!sII~ggP^grC;Z7!4-f;+Ge%%r=&zcfF8zeSp!(?DYgaYI03xDO&@ z|5tNo8r0MkhT*GJoGM5uPK8vZI*~<-fFdd}>bN#2f*Tf6wt^snMo^2OK{Ra{sYw+O zssd6F6xjr2v5~DaE)hWxOhIjkV-Q>lRZJ*?vF~@G{n`HOlwTQ=++Dwg$Jrfr?O&iG8%JUVCrC6hRkfvHSyu5qcf!w zY8A5*rlW={m5o}v6gX1@<~B)Ou+=3UE16v!7$KaGof3Ozzv{7z5o7eIkiA)P|9GCFVaqjO%%X zyg?J+UKtUXoy{=W;V9V*==*EVkTe@_NM#0$lSfYTD8~HbGH*M$q+qL}=Jl%5Lx&C_ zN;;8vCnhF_gROu#Jy?#`L;ly&ZO$w;Lx1%chtK04H#;(!wniy(!uL>Es-HmnjfFIUYslYDOy;;^+oh4mdGRXJ=s;O6Z>R|zMKcjC7c+2IKguDZI= zz`&sWV!(1%M~cjao4dO^0!nW(Z8i73H-8bK`-L9+lb2~!4hjyK2_W= z3%i69nqGR{-yZ_-NZSYW_4;d_PC3^Ic9XHm5TI}cKtPjl)&d{7Zqva%j0ZROzZ7+(wkc4Xy4Jk7m3u%s#e& zkh0cg;yQKpRjTHA7J8qc5}LM5rBV%nQ>njoV%BcJo==d1S*|4bn8hVRLl^E;75T^` zk!cKprbJr?b}9I|5}JqYo8cGk#JxddOTiac=3rT@4SY}`x}*E;L0`V#+nI#skok&6 zTWA3tuJ@8+@4^WRL*{XVTB{}7L!?MAh5b_W<|Q(N6RymjIyjGUaHOZBrgsU%f@=gZ zDADA^kY!T}IHW*WF7sPFpU_jbdAQFkP14lE)!QUCieF(}Z_Bhw8DrWCu zcvR*hSxF=dh*vtm#*AMu$qj_}Ydoh22UBdxuQiQ$r{^4qb&mx^UqSwG0i%iAl{Cn_lH?vKc z4wQ4zPBF@%tb%xyk)D$n)UMbIiBV2fRTX-DoK*Krzz83e?CHK zALTL6hBT)0`YQt^IX(W(`wQxLUvDqFf3a`X%pZ7UbWyTIjWJgbD@B;3 zQDwY|Wzj)#R{YOKdif~-tT!F*tqfy?L9j_<_^uMu3-gz0fZMdm!7^Qj*qYbdwq%oK zO!u_?W3ghX6$Y93##!}~O~2q38i=vnH~(HKVn&O_Q@1dYsJLU=5M+Jl>R=znWDa5c zb1T9F124SB&(F{Iz{_iol>|}TeX&J2aV7DYT=Gu*mfuCzm9V6`1jen!f-0S^0=gzN z9@+RLqQlsTZkvU2iK&S@X2l6{P zuRm2+I2m0(6`HVgeG^X5boNEvc#qM9{bSdxBLp=OaPAVs8d%^V_rV0p+3KM!*5S91 zz;B?wO;GIj!rt!go3JNv)&z*)l|_sa85STqvZk4c+mSATm`X3*7&+E{>HA3J1^xQd zUJ6Mxb|HmEof(X-k7u#3PffnIZNuf0cCf+Z^Z8c*5J!TSh4OwLb$x)Gsic!UIWJ5Z zQH(y~Mj?w!^6jTG@(6Ux8yW4?g#&j}tCQ1tQMnZr^X=?Za$G^&k!9le4A=Hv)nG z2L4_bHuTAcK=_V8`suiycicpe&%Jn$fVH)@Xz}pVZ=Q#K_1)W3U*3FUyYctD1HXND z^Zvu#2i_*u=o~m(tDjo1pPqGpz0QL#QiF-XR-(KiVh_FxJooM4yXFQPj^~{b&X22$ z>l3_p;QKAnBrPpQbk#tyt9iJa3v=%qqECZzB26z%PjnsL+5Jmn4$Alk0%2v})O!5+ z&E1ME`19=9^^*AW*zvDQ@F!uT(0}_4t!KyA83iaSE02zirO)lI_)bqWLdTG`)b@p+ zudhy;o`ceM!d3y7Ww&QKX^&GYb z7#<$pDyX7FYNFF4=jZ2nwZ9--RkyabJ{h_X&XQM{Ji(S zz_}p=|MImr=Y`E0{TuOx{rj)%!k_p5cfTR^x(UzOGtJ3oY9kCSqQ`T}RqE^O15^Tg zOMHq(lVe?(r7YWb4^Qlob2i(i@bUiXxTA+XJPx`HtSE2eJiVkwghrCi`1k1#Xf(yd z)!CHh&d!(vy2-QhQz|oc(((hhr1KsY+AS}9D(e4bOVit2e?cbma>gSw3A@T)Hm)dq zYjx&`oSfal)O!-8prGJsVPTv14}{%HynKApu7hugBz1N5Z^cdY27V8&OifA>QoFG& z3)aErt*Kq4gAp2?ot;u-dS;v-qsZRLX|m0yX2`Wcky!%EWN*OiyQ25k`CP}d3T>wc zt2-`N@z1ndhYd~FN?MYuIFmVB6}-8`a%IeL=67xXA4cNbv~25RTA4H50jwVQzIqG4 z$p})ou+7J}4-Q0VbXK%qP9%zT zn6hV)6O3brAG0)^7^brH(M109aH(>&sL>mgS>3&1aQ7uJ z*Vp{mOMcz4{Ff@$b@+~F89s9w2$9n=HSID@5VUe4U%dFSlw6pRQESLtXT|S3;PsV+ zaUL-~K3*o_%iXsOTBU+Jr)e{1BDrpD-12Rvd*Z8`yF^H`ofjX;s4(h1`vO=KWo*X5 zZUc!ZL!ZE42YY8{#)^`mL{xKcuf@Rc1goiwj*e!+Q%i2UQ?ErvMsl7(h`0B`sMKb- z2m}=vZ@1w(62-A-D3a{|=UjeO;?mUKey^}}(uriSdg$pU0q^ffQi010GAf^^vHXkP zokwTis)ZAEQ?$&TP03_3qi7^y>mDg$mt%hYzsxqitv`4})7!VVVNpsKcDriA0u_zW zux(6;mkj)LlSB#EQm`-Dls?BgiUSzDx?dmuI_pMinV+1*HqzsS$sRcEj%_ z2)Wy2WMyp!-`pWlLPJBvOrEB4f~BUW=Jjp-CgX>LcV0}LBT(r*J@JEc?d+3XF69Bj z+6i&6nFnEzHQ^aOJ8Yu%`hI-S0V~{q*PiY7dAuySKPq}};`N$ZPyMjXf{@A&e;VK5 
zHT8U{miK=w6a4pc$v?jVSJ?kVR{q~Tk`b`MlIUu=zL8?aU#}Gfo1|uBXa)oXP#!}H zgHi)R4}mL8C++U_>(_}SZEfvh8jY;9{m^dMnMY(~T79=#ZRCX_C;;jo>eO(fvdXS? z$0irtb`!k!T`4>U-~O$HnGtNL#1dJGU^(xkBy3TEcy3?C#}(*F*QOZmW6u`wm@07G0TETB=;d@73*K$sCS#(Mr=BTUo4&mUer8ZCAGY zns`C`4^bArqYjCJczaehTzdTh!_juGulLwJ)=8T!n}4$}Z19e{K($k0PvjPfXWwyi zyR2w37z~T7@vM(#k|k5mMZ+w6+&19HOCr|o@UHzg6`%ZkyLEGScU+O%$gpYOk^$6n z5w()#N{Q_55d)uu2WmHlMN}87MOj5lV;OGEXXJ(sS}xW|c`=2n1=kj;TH4wqz3ezJ zrlM2Uir!z!X3(&tEuHT?7%qIK=IsMDUD1fVcPIIVG%c5Op!N-%&rh`Z^x+lRJzibF zhbY(Mt{bYzcy{{Uq4PzP+Nvw=1s%40mpND=wHWT#g*I70tbS4@KLpRs z1p}V;wSD~f@r_~mQmZ4i)!<->7K9!sK&g&}J)SFN2ownwW4!;kQmB(Q_`68LDGKLp z-FBlr1y~D|=gh=2{Y7r>l5CJ4iOYY!senqQX1Q`c7fiK3mRU)IXhS4Hq>G7-HGJ0C z^l=Gl!P#+MEIzS(+3VeZWe*e=n_1C{Ue1)qP;RfS^sU9iYlE+LS=J@TI>!XecZl2& z4D^<5x4NotF(OyKSg)~fRyfv?^7{^vd2(Q|%%|5kG-PFE>6gAH?DiiUANNyrzIgEs zj2rnV_aURKk0<4jl3McVXiAR%(7_I$k$6MjP#UB7plv}42WRp>BOiWBuON-w4 zVS?3Jhl>}DhPUJp#x;t}vgf|6(AfH0`5v*5G?(R?`I^}gNwuJb7lP9tZb}Z{R@-Jg z#Q#jPbXJEl7`$QIn<~}SPmU#@UPJxwGuLV;?J?GJI#ES=B*D))yKHGp|Mf+jnBdmK z->qlf{$7nj-14oM(WnYcgBHc{>ARBwDr;aO3Z;)mb%xxU=4F2-8I(bUzF z7n+=$jFxw^P1a6K2n(37KB!bxxk)hY(?C_urLK?f9{EK*A{a#h9P(3b^Yrvw8dD}@ zs?E*KmDWl@QSR>PN%#K2>QJ5@UuHSiPyOb_%Rk!L*+FVCSjc*z8KvFsS}V?WXTY92 zpshToSJ30x(?DcFvFTXe#-Z#B8GhAPrmXZ)dr_*;jyEr>Hl&`VN?IzlvWUw_& zuFH5|F##CUF!laLc*!dN7FZsi3lhL$xAw|Qk>C;DSJoao^YrP{+3Zc*GA+4Ys?j4m zVaTO@rUwDi39)*-EKN*J^J^t&gKyP%>IDErp6)hJhw_G$_po7Q>IIJo$+OpIq_ubz zRv3P)8@5vPVe6BKF!Ik6-N2*tRzwY--70I(S+A*W?u`D84zKu22xk@H zPxMWU{%dn>!NylP@y#lC`~R;gkoZ zD6(>O@VnVqKL_O>-+TBkq~7&vw-0H|Cs|vsI$WGFx%cG~BdH-y!oe)2Zj;_|1B1f_R5a+wHl7!}%0YOcmxx17@Or6uGCA;d7gEIIZ-M#Vd5Y@{TPINCXrqx(CD@!TqaGjY|#rt+7& zIpxW8>qDpXzm-Ux`x0gc3)!vC{F9?T9}B4oz>_Y$QloQj=RaN&)3;@S1*SH=j(t2j zs7uBtZEV(ltTfaK<3RZ49$d8IzQN*5#^`cq6buA`Fnbf%*=~Q0B)#jvi+z)Ot0oso z4#GFL6=5ghU%2~EizJ;m_X^d8fJGE ztS!PuA^?y>`^ERjsc60q$(h@IjjlAAd_+a|AgSr{oA;x=jyG{s4lR6Ha#mAWIW1_i z6=+v__%{ajq$1Y9lDuS!4zt-LXwLE3WzmojoFM|?;|8HG9Du;cB}-_pWaPB?Qr202 zq=u7~e|(^S{Yc8B^qQsUzLRv{Z{WRFGGCRP-RI$Exx>pi=dOZc)6_)s0p`yusssaV z*8Gm9{P(Y68@}gj@#cjENb0|4lD6EJw>9a`Bb`oL8t=9jRtC{C8lpkIpCnb&T0D0X zHYY8I@>9Gf&cebYaLHWRMc|fyhKQ=_W5TVl+HXU^=p57`l5giZi|hduYd2Y z(obhi)#U0d-|~5v=T={rN#G(F{O1t`mX8sSo~ztAzM9EmP2USnSmgRWH{GhqH{HG@(WaX3RTZQB=yyTL!s{x&S02mS!^u5ZCMv@iEErd|LV%FV5;-@@b^B|p$9hvY5*|B0pH4@{V2;H)m3Y5s4l+;>7=CdQN>b5 zzI24)NQ>`guC$tmCxEtqFaDU4&eaRJ;3#5)grL>M2*%8)c8~(e^#!-wAQW(!!$)&_ z$9$%rxCe|MgylkySu@T()aU;`$t0}uI~nq)4}adEp7@l?^3;nCk2Z6^px&02yNEcv zx-VBrv8hj=URYgTlpjldT&NwD;06iK0QuEj~F34-XG}2L}KIj2^FsIC=TCh)`Lv}qOX(S~^~08HbCxq>89BpwwGy@Z zhUO8?T8deCofM_Os=HN8AdWUWwYRFCf~*=&s92&03ex8G6LLFY^`VpHO<;#TNhp9DRbHlHqfl$S~7uysE7e^8s zp*W5LOf*r@<$NHjmPC0JrlL^3GU+n2Jk!V$kc0WTwfnuPFRo;Qfx14U{|x1@_3OdW zl*ER0oA%x&j*gC2y9BYjdMD-j{M}r-dq4@2f~QBs+{(KNKW0^tR>NPS>6BM3jDYjUZ#^bR!rNW?+$EZm&jv#O zT823{YdLqbb+f+d=U5y(mhRgh05pm#;27Jc$D~>(_VS2oB7Ccilao`cz=qxTi~F9a zK`Iv?@?u14RJUcqR|i)nIcOdMUCjgrdXX^zYTyjOf450$8!Ct>#YJ{80rl4WvT=(m@D0Kw31D-S zKE2!l%LhjRhESuwba}cqyhPwq7;q@Swr08aZQplt5ay||5N3>64baRWJy}Sn7!@6L#T5fOMuZCppse?n|(sc1f7sBW9GivFfjL^F`P*R*>;iM6a+O39ekE3@c#iMGACh}@sYYwCkNm{Bav;VR^kLV3N zesnUVCM;-qz#nLbhOENg7@R~DT;}_N)+S&?_b9kuM5c|+ELX)P2aA_NOiA!{@a*wu zgRk0M(ecM$UI2U&&aM}(hFu<$xVAhZKYY)+30NzX5P=(7&|s;uC-P@p*5+OQwp_pF z)(n`sDp(gM@8;U7Z)lSkFmLS4PCfU?6&ShU4!gv9xw20oEv@<yjM2rX*T{ zT|-S2mJ^r)?8)R>ijZa#ya;5LgC(crwyMCT`UoiW_b+bb)M|3(Vyn?OaS9~fWFQD5 zG={`kJ)o4)D*-LRuUCfD3fiUj&;38$w5*e9GYbcyFK7O8#tsBo1VXA#zmtngBPhFZ zBZ)y}yeXV5_nrRc64ixD9^V0mh{~tm_X5oA`1z)|j-jDp$e+7ML;tIiHp`#@%Aj8# z+d@luuEoKOb2nCRGit=CnDr=wkHfXnUypJwbSh#!^SAB$C`7gEThspb+A5owNCZVV 
z>u;|C2X~KH#3&w5zwC=QNzh5Ep!Ef^hx)JbG#p5YqX2c7!yCo6e@A5(8vW zWOjzlR5GH>L>4f1U+B>vfKasuiUpWO17B1$P5-{aCky*jhFptSBS3Oh-#wsfFy0PD z*1x_(>>cM4$L92hw)OhV4C@a(#TTX`6%-VjdV0*q4=*`FabuuHRoe6lse3Kb@1+2u4~x(%Q+ZyHwBEmkp7O zgEHlA`l8J4Gk9l@vakDaL%iuYCj|7bi5XLtVgv8$*r_)a?C((z%%tx)P7&qmg$s3q z2wb!{9E&NiY-knu+#e~A^Yimf_Tl&+TXXj)74Qmg5jVU2LHT*em2`(IEFfDNp&Do# z8PHOkd@6dH6EP1>L7H2A&lB`9gf>tW)`)Bt)*nzQsKZFkdxMx+ z(bOEiM6-R&xHhsB6v|lh%QUfXsI@N{o?cDNc<_A>vr}4GaiV3-t-?}o$PxxCCy0C@ zAYer*CyZZFmNfnGQJ92R-LGHmpb$a8zFD#5>@mNsar+3yaFS^6GUX10It zqpNzgEWM`O_jQ@-L! zDh@y4?=1BLPm`{AgA8zZ$cWOWZ&<)nCt~o4UjYEmsAdrr_d(h#TNrC?2qa^dS&)Gi9t&$w%s84M8hy zK|(&t%bduyVhCUWD7=wd(hYB5N8Cm51g44xtSEr;6Wam_k8uwf-!o^ihP8#HVHDQ~+pI>Zq=dNZLon4O&!n_0p{)(y9@SZAja&Vtx0zq&Bs)@JTCv`Ka0xD3Y)6ud4AQWMj+ zjV2YaH0xNlI1HW;(>$xnU*J0~R8xd}p!k{|k_?guPZ7O(j=f}kc99u*sVuWGo!2`GgyPy#*~=eZ zo=#AbM>#FKXgCH8R+r6haVB~L9qP{nNcy69Pt&u_+9Ue}7f+qMwfhcaR;b!;!S=LV zYKoeEA-H;oU^!wP{M0-aVN8><1FUm#uuVj!A3*h^{!9 z(hyhiNyh^M0NzcMZl8e|WcmIHOhUMbwid-7B(cfnTL7VprfUv1LuG^^H6_kfgEFg< zn88bb&CXNF@a;DT1_GThD5g*YF&_~EykIE{lR}lI-`s*V62XqtNK#tpzY3+s;&}P$ zxJ^O)<=1;ilr54am#%4wbc4eR)(4_?kjh$9_5=ZjO-e}-!BP+}JP@V|XsL%8=v8E6 z(H2RaE6dE`P51cFk#EOAdb6X=YsH4bKe9n^;TUDGF=*9ol@+um2R1DuTXxgP#emhBL~S*-zUdEzWpi1dw*e8|X41a1 zAM0Y}Br)=UF|1G~Cg22}&dw`Kha{mI6o^3&%u@yfvv~2`B?U!a)rqpTwo&*kvp3mS z0MG@o+H$4UGV7fW)~?o}^1@&^s&6&4jL!*L887XE6z>WZp%6*y==)z);=Mq^-~b=k zKz1qdo*L6j{P`(~fxfFlma9UBW|vP$l+HH&{XSM;-NwUB;9|h^hN8i2(J<# zvgkW9+f7!r=9A=1L-)$NhBVGCHU_ESO$bGqQBa`euq>#-k@7yh=CcFsccNL|1$Gay z8Uk9pPFf3l2?nDLxGWGTA3u3=IblzW+4=G=KKcT$on+ZQ!= z4k9w#M=TMeoOwtyXAs8x)QFj}xte1<3s&_fM}0?MJO!hwxP7O<@G?t#XiynR6{HKvX_@v&Cs-z zYGjO*jmEc<7TxR_fEF0*NECpD9uaK|&55gbiM7h#I6IX~j>ls1L?aqj0e>#kidfhW zhe?!!IVDmkb`F){M*sNmrjA5ZRFU<32x635yfUH?4j@NrOB8>6NX*H3(6tN9HKkP` zhTVkST)XlSK=s0>fp8*7&PKxvEXa0{<(XxE5p9T?!-=T#N#*2BvTOnXr}=f-rwEZ$ z=F+se+o_!5)hb{bRB)9k)A6;mgwei^%mIgC-4lcuqX z6h${Ts6}oob8U84xna4u;(#$o^2M}qZm-Fu)-c7xM1L#Yq# z3dHJ6<9wh*>$1+sc0AUWHixtg0&7EGQpS3YbO+IdIevrR(+CA#T;gawodMSFfm{;dknS z^$32jEMWRH`WZaygGvw+WZ^C9!0V7&15>96z+@^U1da3d3E-X(PZ>$lgiW;p26brY}5AzpYQ5x zZzh;h5!AIFoimM*bFpv+;Ti)z`ng|+@I_tSrm%m9h?W^+)IS5+m?7s)L65*;@{pCm z(;CCUb4w1OtNH*%HGR`^`3Fc&4d6wQwrhyP+zFqpdrS(?1I(6GRu)KY@S+>r4MCpO zreN6;5MwiXwL@k00guhg-?qCHt>LOzp6pw7LbZ6FtCxtN0%r{hBh~7bG~Jn6%T+0q zcVQt-Ki~=2pSaYL;wzY3AVp>e7=w7M|IB6{`9syeB6qtFG0bunyhAuPRN%67mv;)!F z1`hVhiiMHIGfzAxYbGy8nfkO$78M+SaA8>O!}oPfp2HoMfBY!AS>Ra1hON@$tW7*S z-Vn&aLziBRdzgJ_h3VGZ;YdyG%W*3g0F1-xIlQna*~(ma&#jzp0F*IfPEeF_;= zv87K#2TB)DQXVgwMkVW+ws~fjm8pQWCBBt+qyC*eC5REi#X5?zOJyWQAAsX7%FLAn z02wo4oO=@~EhncDN(t9jI}Hb6XT1rL>%E8W5u4UnB_CG@2L~yz)8_p$s=>*^6~(R7 zgNaQ`g`{}T)h0itduLI{A$?-MIjlL#ptT^|NT{Z89Crn|rZry$8{=}uiuQ{C)x!d$s0;=npX1#P{R-Hv;^KWe% z)XJ6GY5WgQ@R^e$>n^V3E)UCAn)ujFfF6JJ)xI0LX!8X-|E(am4C$P%yMp*{RbGYF` zMOz?JtHH|~iC_UX7E5%zK32{pf~sDfT z3-1R#)-p0O(MQsjWqvM0P)JogPvI2XK;@dy+hKVPBckaD0HuK62RO$XUK&OhF#Lj+;ZJ#Jq7LJE}z8w zp7&Y>{2oPXtL$3dt|ZMUGw=7;Y*?4UoRKrzH@1m;_o43qgYKuzBN14aVBX8a4?*_Q z^{sn!=ny)p(0o))?b_^e*sz?HrKW(K~~+P=;_!&m4wuq=Ot7M>ktcz z2$=%Q^1#LCyUO4`_Nj-~!JHb>+wlT>;8u;TM*xUqWEZDTsAwsSKHf>s%*oDn_L5J; z`3ImWL7P`yX;zIW-SZl7+-;Er?}wRBaAXkc<*}b3Zv#UXA>-Mo%&G%Fx<0)fHtvYJ z>e9K601p2K>9&Sf(Z1>N@`Ro0LE3y<**vx2%x!~|eXO1{ay?I-sy?`4##-#^TQ#7! 
[GIT binary patch data omitted for:
 docs/dev/articles/parallel_files/figure-html/customize_perf_plot-1.png
 docs/dev/articles/parallel_files/figure-html/feat_imp_plot-1.png
 docs/dev/articles/parallel_files/figure-html/plot_perf-1.png]

diff --git a/vignettes/parallel.Rmd b/vignettes/parallel.Rmd
--- a/vignettes/parallel.Rmd
+++ b/vignettes/parallel.Rmd
@@ ... @@
+top_n <- 5
 top_feats <- feat_df %>%
   group_by(method, names) %>%
   summarize(median_diff = median(perf_metric_diff)) %>%
-  slice_min(order_by = median_diff, n = 5)
+  filter(median_diff > 0) %>% 
+  slice_max(order_by = median_diff, n = top_n)
 
 feat_df %>%
   right_join(top_feats, by = c("method", "names")) %>%
   mutate(features = factor(names, levels = rev(unique(top_feats$names)))) %>%
   ggplot(aes(x = perf_metric_diff, y = features, color = method)) +
   geom_boxplot() +
-  facet_wrap(~method) +
   theme_bw()
 ```
-The features that resulted in the largest **decrease** in performance when
-permuted are the most importance features.
-
+See the docs for `get_feature_importance()` for more details on how these values
+are computed.
 
 ## Live progress updates
From c34ce8e63f96a5b09824091b22c73e0e9eac68f1 Mon Sep 17 00:00:00 2001
From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com>
Date: Thu, 17 Nov 2022 17:35:11 +0000
Subject: [PATCH 2/3] =?UTF-8?q?=F0=9F=8E=A8=20Style=20R=20code?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 vignettes/parallel.Rmd | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/vignettes/parallel.Rmd b/vignettes/parallel.Rmd
index 536cc30e..76160ecf 100644
--- a/vignettes/parallel.Rmd
+++ b/vignettes/parallel.Rmd
@@ -182,7 +182,7 @@ top_n <- 5
 top_feats <- feat_df %>%
   group_by(method, names) %>%
   summarize(median_diff = median(perf_metric_diff)) %>%
-  filter(median_diff > 0) %>% 
+  filter(median_diff > 0) %>%
   slice_max(order_by = median_diff, n = top_n)
 
 feat_df %>%
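This whitespace-only commit just strips a trailing space after `%>%` in the
vignette. It comes from a bot, and an automated formatter such as the styler
package produces exactly this kind of fix. As an illustration only (the actual
CI tooling behind the bot commit is an assumption, not shown in this patch):

```r
# illustrative: one way to auto-format an Rmd file with styler
styler::style_file("vignettes/parallel.Rmd")
```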

From b2a192ce5c17c4b522f453231f64838476be5470 Mon Sep 17 00:00:00 2001
From: "github-actions[bot]" <41898282+github-actions[bot]@users.noreply.github.com>
Date: Thu, 17 Nov 2022 18:03:37 +0000
Subject: [PATCH 3/3] =?UTF-8?q?=F0=9F=93=91=20Build=20docs=20site?=
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

---
 docs/dev/articles/parallel.html            | 4 ++--
 docs/dev/pkgdown.yml                       | 2 +-
 docs/dev/reference/get_perf_metric_fn.html | 6 +++---
 docs/dev/search.json                       | 2 +-
 4 files changed, 7 insertions(+), 7 deletions(-)

diff --git a/docs/dev/articles/parallel.html b/docs/dev/articles/parallel.html
index a880d6bc..78f3ad98 100644
--- a/docs/dev/articles/parallel.html
+++ b/docs/dev/articles/parallel.html
@@ -190,7 +190,7 @@ Call run_ml()
 future.apply package to
 run_ml() in parallel, but you can accomplish the same thing with
 parallel versions of the purrr::map() functions using the furrr package
-(e.g. furrr::future_map_dfr()).
+(e.g. furrr::future_map_dfr()).
 
 Extract the performance results and combine into one dataframe for
 all seeds:
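The furrr alternative mentioned in this hunk mirrors the vignette's
future.apply approach. A rough sketch of what it could look like (assuming the
preprocessed `otu_data_preproc` data frame used elsewhere in the vignette;
illustrative, not part of this patch):

```r
library(mikropml)

future::plan(future::multisession, workers = 2)

# future_map_dfr() calls run_ml() for each seed in parallel and
# row-binds the resulting performance data frames into one
perf_df <- furrr::future_map_dfr(
  seq(100, 102),
  function(seed) {
    run_ml(otu_data_preproc, "glmnet", seed = seed)[["performance"]]
  },
  .options = furrr::furrr_options(seed = TRUE) # parallel-safe random seeds
)
```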

@@ -311,7 +311,7 @@ Feature importance
 top_feats <- feat_df %>%
   group_by(method, names) %>%
   summarize(median_diff = median(perf_metric_diff)) %>%
-  filter(median_diff > 0) %>% 
+  filter(median_diff > 0) %>%
   slice_max(order_by = median_diff, n = top_n)
 #> `summarise()` has grouped output by 'method'. You can override using the
 #> `.groups` argument.

diff --git a/docs/dev/pkgdown.yml b/docs/dev/pkgdown.yml
index 6b0d0384..4e6d68c8 100644
--- a/docs/dev/pkgdown.yml
+++ b/docs/dev/pkgdown.yml
@@ -7,7 +7,7 @@ articles:
   parallel: parallel.html
   preprocess: preprocess.html
   tuning: tuning.html
-last_built: 2022-11-04T17:10Z
+last_built: 2022-11-17T17:37Z
 urls:
   reference: http://www.schlosslab.org/mikropml/reference
   article: http://www.schlosslab.org/mikropml/articles

diff --git a/docs/dev/reference/get_perf_metric_fn.html b/docs/dev/reference/get_perf_metric_fn.html
index 013b24b3..ed7941cb 100644
--- a/docs/dev/reference/get_perf_metric_fn.html
+++ b/docs/dev/reference/get_perf_metric_fn.html
@@ -93,7 +93,7 @@ Examples
 #> data$obs <- factor(data$obs, levels = lev)
 #> postResample(data[, "pred"], data[, "obs"])
 #> }
-#> <bytecode: 0x7ff721e00db0>
+#> <bytecode: 0x7fa811f33db0>
 #> <environment: namespace:caret>
 get_perf_metric_fn("binary")
 #> function (data, lev = NULL, model = NULL)
@@ -151,7 +151,7 @@ Examples
 #> stats <- stats[c(stat_list)]
 #> return(stats)
 #> }
-#> <bytecode: 0x7ff736412c60>
+#> <bytecode: 0x7fa810d49360>
 #> <environment: namespace:caret>
 get_perf_metric_fn("multiclass")
 #> function (data, lev = NULL, model = NULL)
@@ -209,7 +209,7 @@ Examples
 #> stats <- stats[c(stat_list)]
 #> return(stats)
 #> }
-#> <bytecode: 0x7ff736412c60>
+#> <bytecode: 0x7fa810d49360>
 #> <environment: namespace:caret>

diff --git a/docs/dev/search.json b/docs/dev/search.json index 4de45925..505480e9 100644 --- a/docs/dev/search.json +++ b/docs/dev/search.json @@ -1 +1 @@ -[{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":null,"dir":"","previous_headings":"","what":"Contributor Covenant Code of Conduct","title":"Contributor Covenant Code of Conduct","text":"document adapted Tidyverse Code Conduct.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"our-pledge","dir":"","previous_headings":"","what":"Our Pledge","title":"Contributor Covenant Code of Conduct","text":"members, contributors, leaders pledge make participation community harassment-free experience everyone, regardless age, body size, visible invisible disability, ethnicity, sex characteristics, gender identity expression, level experience, education, socio-economic status, nationality, personal appearance, race, religion, sexual identity orientation. pledge act interact ways contribute open, welcoming, diverse, inclusive, healthy community.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"our-standards","dir":"","previous_headings":"","what":"Our Standards","title":"Contributor Covenant Code of Conduct","text":"Examples behavior contributes positive environment community include: Demonstrating empathy kindness toward people respectful differing opinions, viewpoints, experiences Giving gracefully accepting constructive feedback Accepting responsibility apologizing affected mistakes, learning experience Focusing best just us individuals, overall community Examples unacceptable behavior include: use sexualized language imagery, sexual attention advances kind Trolling, insulting derogatory comments, personal political attacks Public private harassment Publishing others’ private information, physical email address, without explicit permission conduct reasonably considered inappropriate professional setting","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement-responsibilities","dir":"","previous_headings":"","what":"Enforcement Responsibilities","title":"Contributor Covenant Code of Conduct","text":"Community leaders responsible clarifying enforcing standards acceptable behavior take appropriate fair corrective action response behavior deem inappropriate, threatening, offensive, harmful. Community leaders right responsibility remove, edit, reject comments, commits, code, wiki edits, issues, contributions aligned Code Conduct, communicate reasons moderation decisions appropriate.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"scope","dir":"","previous_headings":"","what":"Scope","title":"Contributor Covenant Code of Conduct","text":"Code Conduct applies within community spaces, also applies individual officially representing community public spaces. Examples representing community include using official e-mail address, posting via official social media account, acting appointed representative online offline event.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement","dir":"","previous_headings":"","what":"Enforcement","title":"Contributor Covenant Code of Conduct","text":"Instances abusive, harassing, otherwise unacceptable behavior may reported community leaders responsible enforcement [INSERT CONTACT METHOD]. complaints reviewed investigated promptly fairly. 
community leaders obligated respect privacy security reporter incident.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement-guidelines","dir":"","previous_headings":"","what":"Enforcement Guidelines","title":"Contributor Covenant Code of Conduct","text":"Community leaders follow Community Impact Guidelines determining consequences action deem violation Code Conduct:","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"id_1-correction","dir":"","previous_headings":"Enforcement Guidelines","what":"1. Correction","title":"Contributor Covenant Code of Conduct","text":"Community Impact: Use inappropriate language behavior deemed unprofessional unwelcome community. Consequence: private, written warning community leaders, providing clarity around nature violation explanation behavior inappropriate. public apology may requested.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"id_2-warning","dir":"","previous_headings":"Enforcement Guidelines","what":"2. Warning","title":"Contributor Covenant Code of Conduct","text":"Community Impact: violation single incident series actions. Consequence: warning consequences continued behavior. interaction people involved, including unsolicited interaction enforcing Code Conduct, specified period time. includes avoiding interactions community spaces well external channels like social media. Violating terms may lead temporary permanent ban.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"id_3-temporary-ban","dir":"","previous_headings":"Enforcement Guidelines","what":"3. Temporary Ban","title":"Contributor Covenant Code of Conduct","text":"Community Impact: serious violation community standards, including sustained inappropriate behavior. Consequence: temporary ban sort interaction public communication community specified period time. public private interaction people involved, including unsolicited interaction enforcing Code Conduct, allowed period. Violating terms may lead permanent ban.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"id_4-permanent-ban","dir":"","previous_headings":"Enforcement Guidelines","what":"4. Permanent Ban","title":"Contributor Covenant Code of Conduct","text":"Community Impact: Demonstrating pattern violation community standards, including sustained inappropriate behavior, harassment individual, aggression toward disparagement classes individuals. Consequence: permanent ban sort public interaction within community.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"attribution","dir":"","previous_headings":"","what":"Attribution","title":"Contributor Covenant Code of Conduct","text":"Code Conduct adapted Contributor Covenant, version 2.0, available https://www.contributor-covenant.org/version/2/0/ code_of_conduct.html. Community Impact Guidelines inspired Mozilla’s code conduct enforcement ladder. answers common questions code conduct, see FAQ https://www.contributor-covenant.org/faq. 
Translations available https:// www.contributor-covenant.org/translations.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":null,"dir":"","previous_headings":"","what":"Contributing to mikropml","title":"Contributing to mikropml","text":"document adapted Tidyverse Contributing guide.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"fixing-typos","dir":"","previous_headings":"","what":"Fixing typos","title":"Contributing to mikropml","text":"can fix typos, spelling mistakes, grammatical errors documentation directly using GitHub web interface, long changes made source file. generally means ’ll need edit roxygen2 comments .R, .Rd file. can find .R file generates .Rd reading comment first line.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"bigger-changes","dir":"","previous_headings":"","what":"Bigger changes","title":"Contributing to mikropml","text":"want make bigger change, ’s good idea first file issue make sure someone team agrees ’s needed. ’ve found bug, please file issue illustrates bug minimal reprex (also help write unit test, needed).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"pull-request-process","dir":"","previous_headings":"Bigger changes","what":"Pull request process","title":"Contributing to mikropml","text":"Fork package clone onto computer. haven’t done , recommend using usethis::create_from_github(\"SchlossLab/mikropml\", fork = TRUE). Install development dependences devtools::install_dev_deps(), make sure package passes R CMD check running devtools::check(). R CMD check doesn’t pass cleanly, ’s good idea ask help continuing. Create Git branch pull request (PR). recommend using usethis::pr_init(\"brief-description--change\"). Make changes, commit git, create PR running usethis::pr_push(), following prompts browser. title PR briefly describe change. body PR contain Fixes #issue-number. user-facing changes, add bullet top NEWS.md (.e. just first header). Follow style described https://style.tidyverse.org/news.html.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"code-style","dir":"","previous_headings":"Bigger changes","what":"Code style","title":"Contributing to mikropml","text":"New code follow tidyverse style guide. can use styler package apply styles, please don’t restyle code nothing PR. use roxygen2, Markdown syntax, documentation. use testthat unit tests. Contributions test cases included easier accept.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"code-of-conduct","dir":"","previous_headings":"","what":"Code of Conduct","title":"Contributing to mikropml","text":"Please note mikropml project released Contributor Code Conduct. contributing project agree abide terms.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/LICENSE.html","id":null,"dir":"","previous_headings":"","what":"MIT License","title":"MIT License","text":"Copyright (c) 2019-2021 Begüm D. Topçuoğlu, Zena Lapp, Kelly L. Sovacool, Evan Snitkin, Jenna Wiens, Patrick D. 
Schloss Permission hereby granted, free charge, person obtaining copy software associated documentation files (“Software”), deal Software without restriction, including without limitation rights use, copy, modify, merge, publish, distribute, sublicense, /sell copies Software, permit persons Software furnished , subject following conditions: copyright notice permission notice shall included copies substantial portions Software. SOFTWARE PROVIDED “”, WITHOUT WARRANTY KIND, EXPRESS IMPLIED, INCLUDING LIMITED WARRANTIES MERCHANTABILITY, FITNESS PARTICULAR PURPOSE NONINFRINGEMENT. EVENT SHALL AUTHORS COPYRIGHT HOLDERS LIABLE CLAIM, DAMAGES LIABILITY, WHETHER ACTION CONTRACT, TORT OTHERWISE, ARISING , CONNECTION SOFTWARE USE DEALINGS SOFTWARE.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":null,"dir":"","previous_headings":"","what":"Getting help with mikropml","title":"Getting help with mikropml","text":"Thanks using mikropml! filing issue, places explore pieces put together make process smooth possible.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"make-a-reprex","dir":"","previous_headings":"","what":"Make a reprex","title":"Getting help with mikropml","text":"Start making minimal reproducible example using reprex package. haven’t heard used reprex , ’re treat! Seriously, reprex make R-question-asking endeavors easier (pretty insane ROI five ten minutes ’ll take learn ’s ). additional reprex pointers, check Get help! section tidyverse site.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"where-to-ask","dir":"","previous_headings":"","what":"Where to ask?","title":"Getting help with mikropml","text":"Armed reprex, next step figure ask. ’s question: start community.rstudio.com, /StackOverflow. people answer questions. ’s bug: ’re right place, file issue. ’re sure: let community help figure ! problem bug feature request, can easily return report . opening new issue, sure search issues pull requests make sure bug hasn’t reported /already fixed development version. default, search pre-populated :issue :open. can edit qualifiers (e.g. :pr, :closed) needed. example, ’d simply remove :open search issues repo, open closed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"what-happens-next","dir":"","previous_headings":"","what":"What happens next?","title":"Getting help with mikropml","text":"efficient possible, development tidyverse packages tends bursty, shouldn’t worry don’t get immediate response. Typically don’t look repo sufficient quantity issues accumulates, ’s burst intense activity focus efforts. makes development efficient avoids expensive context switching problems, cost taking longer get back . process makes good reprex particularly important might multiple months initial report start working . can’t reproduce bug, can’t fix !","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"its-running-so-slow","dir":"Articles","previous_headings":"","what":"It’s running so slow!","title":"Introduction to mikropml","text":"Since assume lot won’t read entire vignette, ’m going say beginning. run_ml() function running super slow, consider parallelizing. 
See vignette(\"parallel\") examples.","code":""},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-input-data","dir":"Articles","previous_headings":"Understanding the inputs","what":"The input data","title":"Introduction to mikropml","text":"input data run_ml() dataframe row sample observation. One column (assumed first) outcome interest, columns features. package otu_mini_bin small example dataset mikropml. , dx outcome column (normal cancer), 10 features (Otu00001 Otu00010). 2 outcomes, performing binary classification majority examples . bottom, also briefly provide examples multi-class continuous outcomes. ’ll see, run way binary classification! feature columns amount Operational Taxonomic Unit (OTU) microbiome samples patients cancer without cancer. goal predict dx, stands diagnosis. diagnosis can cancer based individual’s microbiome. need understand exactly means, ’re interested can read original paper (Topçuoğlu et al. 2020). real machine learning applications ’ll need use features, purposes vignette ’ll stick example dataset everything runs faster.","code":"# install.packages(\"devtools\") # devtools::install_github(\"SchlossLab/mikropml\") library(mikropml) head(otu_mini_bin) #> dx Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 Otu00006 Otu00007 #> 1 normal 350 268 213 1 208 230 70 #> 2 normal 568 1320 13 293 671 103 48 #> 3 normal 151 756 802 556 145 271 57 #> 4 normal 299 30 1018 0 25 99 75 #> 5 normal 1409 174 0 3 2 1136 296 #> 6 normal 167 712 213 4 332 534 139 #> Otu00008 Otu00009 Otu00010 #> 1 230 235 64 #> 2 204 119 115 #> 3 176 37 710 #> 4 78 255 197 #> 5 1 537 533 #> 6 251 155 122"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-methods-we-support","dir":"Articles","previous_headings":"Understanding the inputs","what":"The methods we support","title":"Introduction to mikropml","text":"methods use supported great ML wrapper package caret, use train machine learning models. methods tested (backend packages) : Logistic/multiclass/linear regression (\"glmnet\") Random forest (\"rf\") Decision tree (\"rpart2\") Support vector machine radial basis kernel (\"svmRadial\") xgboost (\"xgbTree\") documentation methods, well many others, can look available models (see list tag). vetted models used caret, function general enough others might work. can’t promise can help models, feel free [start new discussion GitHub]https://github.com/SchlossLab/mikropml/discussions) questions models might able help. first focus glmnet, default implementation L2-regularized logistic regression. cover examples towards end.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"before-running-ml","dir":"Articles","previous_headings":"","what":"Before running ML","title":"Introduction to mikropml","text":"execute run_ml(), consider preprocessing data, either preprocess_data() function. can learn preprocessing vignette: vignette(\"preprocess\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-simplest-way-to-run_ml","dir":"Articles","previous_headings":"","what":"The simplest way to run_ml()","title":"Introduction to mikropml","text":"mentioned , minimal input dataset (dataset) machine learning model want use (method). may also want provide: outcome column name. default run_ml() pick first column, ’s best practice specify column name explicitly. seed results reproducible, get results see (.e train/test split). 
Say want use logistic regression, method use glmnet. , run ML pipeline : ’ll notice things: takes little run. parameters use. message stating ‘dx’ used outcome column. want, ’s nice sanity check! warning. Don’t worry warning right now - just means hyperparameters aren’t good fit - ’re interested learning , see vignette(\"tuning\"). Now, let’s dig output bit. results list 4 things: trained_model trained model caret. bunch info won’t get , can learn caret::train() documentation. test_data partition dataset used testing. machine learning, ’s always important held-test dataset used training stage. pipeline using run_ml() split data training testing sets. training data used build model (e.g. tune hyperparameters, learn data) test data used evaluate well model performs. performance dataframe (mainly) performance metrics (1 column cross-validation performance metric, several test performance metrics, 2 columns end ML method seed): using logistic regression binary classification, area receiver-operator characteristic curve (AUC) useful metric evaluate model performance. , ’s default use mikropml. However, crucial evaluate model performance using multiple metrics. can find information performance metrics use package. cv_metric_AUC AUC cross-validation folds training data. gives us sense well model performs training data. columns performance metrics test data — data wasn’t used build model. , can see AUC test data much 0.5, suggesting model predict much better chance, model overfit cross-validation AUC (cv_metric_AUC, measured training) much higher testing AUC. isn’t surprising since ’re using features example dataset, don’t discouraged. default option also provides number performance metrics might interested , including area precision-recall curve (prAUC). last columns results$performance method seed (set one) help combining results multiple runs (see vignette(\"parallel\")). feature_importance information feature importance values find_feature_importance = TRUE (default FALSE). 
Since used defaults, ’s nothing :","code":"results <- run_ml(otu_mini_bin, \"glmnet\", outcome_colname = \"dx\", seed = 2019 ) names(results) #> [1] \"trained_model\" \"test_data\" \"performance\" #> [4] \"feature_importance\" names(results$trained_model) #> [1] \"method\" \"modelInfo\" \"modelType\" \"results\" \"pred\" #> [6] \"bestTune\" \"call\" \"dots\" \"metric\" \"control\" #> [11] \"finalModel\" \"preProcess\" \"trainingData\" \"ptype\" \"resample\" #> [16] \"resampledCM\" \"perfNames\" \"maximize\" \"yLimits\" \"times\" #> [21] \"levels\" head(results$test_data) #> dx Otu00009 Otu00005 Otu00010 Otu00001 Otu00008 Otu00004 Otu00003 #> 9 normal 119 142 248 256 363 112 871 #> 14 normal 60 209 70 86 96 1 123 #> 16 cancer 205 5 180 1668 95 22 3 #> 17 normal 188 356 107 381 1035 915 315 #> 27 normal 4 21 161 7 1 27 8 #> 30 normal 13 166 5 31 33 5 58 #> Otu00002 Otu00007 Otu00006 #> 9 995 0 137 #> 14 426 54 40 #> 16 20 590 570 #> 17 357 253 341 #> 27 25 322 5 #> 30 179 6 30 results$performance #> # A tibble: 1 × 17 #> cv_metric_AUC logLoss AUC prAUC Accuracy Kappa F1 Sensi…¹ Speci…² Pos_P…³ #> #> 1 0.622 0.684 0.647 0.606 0.590 0.179 0.6 0.6 0.579 0.6 #> # … with 7 more variables: Neg_Pred_Value , Precision , Recall , #> # Detection_Rate , Balanced_Accuracy , method , seed , #> # and abbreviated variable names ¹​Sensitivity, ²​Specificity, ³​Pos_Pred_Value results$feature_importance #> [1] \"Skipped feature importance\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"customizing-parameters","dir":"Articles","previous_headings":"","what":"Customizing parameters","title":"Introduction to mikropml","text":"arguments allow change execute run_ml(). ’ve chosen reasonable defaults , encourage change think something else better data.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"changing-kfold-cv_times-and-training_frac","dir":"Articles","previous_headings":"Customizing parameters","what":"Changing kfold, cv_times, and training_frac","title":"Introduction to mikropml","text":"kfold: number folds run cross-validation (default: 5). cv_times: number times run repeated cross-validation (default: 100). training_frac: fraction data training set (default: 0.8). rest data used testing. ’s example change default parameters: might noticed one ran faster — ’s reduced kfold cv_times. okay testing things may even necessary smaller datasets. general may better larger numbers parameters; think defaults good starting point (Topçuoğlu et al. 2020).","code":"results_custom <- run_ml(otu_mini_bin, \"glmnet\", kfold = 2, cv_times = 5, training_frac = 0.5, seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... #> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. 
#> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"custom-training-indices","dir":"Articles","previous_headings":"Customizing parameters > Changing kfold, cv_times, and training_frac","what":"Custom training indices","title":"Introduction to mikropml","text":"training_frac fraction 0 1, random sample observations dataset chosen training set satisfy training_frac. However, cases might wish control exactly observations training set. can instead assign training_frac vector indices correspond rows dataset go training set (remaining sequences go testing set). ’s example ~80% data training set:","code":"n_obs <- otu_mini_bin %>% nrow() training_size <- 0.8 * n_obs training_rows <- sample(n_obs, training_size) results_custom_train <- run_ml(otu_mini_bin, \"glmnet\", kfold = 2, cv_times = 5, training_frac = training_rows, seed = 2019 ) #> Using 'dx' as the outcome column. #> Using the custom training set indices provided by `training_frac`. #> The fraction of data in the training set will be 0.8 #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"changing-the-performance-metric","dir":"Articles","previous_headings":"Customizing parameters","what":"Changing the performance metric","title":"Introduction to mikropml","text":"two arguments allow change performance metric use model evaluation, performance metrics calculate using test data. perf_metric_function function used calculate performance metrics. default classification caret::multiClassSummary() default regression caret::defaultSummary(). ’d suggest changing unless really know ’re . perf_metric_name column name output perf_metric_function. chose reasonable defaults (AUC binary, logLoss multiclass, RMSE continuous), default functions calculate bunch different performance metrics, can choose different one ’d like. default performance metrics available classification : default performance metrics available regression : ’s example using prAUC instead AUC: ’ll see cross-validation metric prAUC, instead default AUC:","code":"#> [1] \"logLoss\" \"AUC\" \"prAUC\" #> [4] \"Accuracy\" \"Kappa\" \"Mean_F1\" #> [7] \"Mean_Sensitivity\" \"Mean_Specificity\" \"Mean_Pos_Pred_Value\" #> [10] \"Mean_Neg_Pred_Value\" \"Mean_Precision\" \"Mean_Recall\" #> [13] \"Mean_Detection_Rate\" \"Mean_Balanced_Accuracy\" #> [1] \"RMSE\" \"Rsquared\" \"MAE\" results_pr <- run_ml(otu_mini_bin, \"glmnet\", cv_times = 5, perf_metric_name = \"prAUC\", seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... #> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. #> Training complete. 
results_pr$performance #> # A tibble: 1 × 17 #> cv_metric_p…¹ logLoss AUC prAUC Accur…² Kappa F1 Sensi…³ Speci…⁴ Pos_P…⁵ #> #> 1 0.577 0.691 0.663 0.605 0.538 0.0539 0.690 1 0.0526 0.526 #> # … with 7 more variables: Neg_Pred_Value , Precision , Recall , #> # Detection_Rate , Balanced_Accuracy , method , seed , #> # and abbreviated variable names ¹​cv_metric_prAUC, ²​Accuracy, ³​Sensitivity, #> # ⁴​Specificity, ⁵​Pos_Pred_Value"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"using-groups","dir":"Articles","previous_headings":"Customizing parameters","what":"Using groups","title":"Introduction to mikropml","text":"optional groups vector groups keep together splitting data train test sets cross-validation. Sometimes ’s important split data based grouping instead just randomly. allows control similarities within groups don’t want skew predictions (.e. batch effects). example, biological data may samples collected multiple hospitals, might like keep observations hospital partition. ’s example split data train/test sets based groups: one difference run_ml() report much data training set run code chunk. can little finicky depending many samples groups . won’t exactly specify training_frac, since include one group either training set test set.","code":"# make random groups set.seed(2019) grps <- sample(LETTERS[1:8], nrow(otu_mini_bin), replace = TRUE) results_grp <- run_ml(otu_mini_bin, \"glmnet\", cv_times = 2, training_frac = 0.8, groups = grps, seed = 2019 ) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.795 #> Groups in the training set: A B D F G H #> Groups in the testing set: C E #> Groups will be kept together in CV partitions #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"controlling-how-groups-are-assigned-to-partitions","dir":"Articles","previous_headings":"Customizing parameters > Using groups","what":"Controlling how groups are assigned to partitions","title":"Introduction to mikropml","text":"use groups parameter , default run_ml() assume want observations group placed partition train/test split. makes sense want use groups control batch effects. However, cases might prefer control exactly groups end partition, might even okay observations group assigned different partitions. example, say want groups B used training, C D testing, don’t preference happens groups. can give group_partitions parameter named list specify groups go training set go testing set. case, observations & B used training, C & D used testing, remaining groups randomly assigned one satisfy training_frac closely possible. another scenario, maybe want groups F used training, also want allow observations selected training F used testing: need even control , take look setting custom training indices. might also prefer provide train control scheme cross_val parameter run_ml().","code":"results_grp_part <- run_ml(otu_mini_bin, \"glmnet\", cv_times = 2, training_frac = 0.8, groups = grps, group_partitions = list( train = c(\"A\", \"B\"), test = c(\"C\", \"D\") ), seed = 2019 ) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.785 #> Groups in the training set: A B E F G H #> Groups in the testing set: C D #> Groups will not be kept together in CV partitions because the number of groups in the training set is not larger than `kfold` #> Training the model... #> Training complete. 
results_grp_trainA <- run_ml(otu_mini_bin, \"glmnet\", cv_times = 2, kfold = 2, training_frac = 0.5, groups = grps, group_partitions = list( train = c(\"A\", \"B\", \"C\", \"D\", \"E\", \"F\"), test = c(\"A\", \"B\", \"C\", \"D\", \"E\", \"F\", \"G\", \"H\") ), seed = 2019 ) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.5 #> Groups in the training set: A B C D E F #> Groups in the testing set: A B C D E F G H #> Groups will be kept together in CV partitions #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"more-arguments","dir":"Articles","previous_headings":"Customizing parameters","what":"More arguments","title":"Introduction to mikropml","text":"ML methods take optional arguments, ntree randomForest-based models case weights. additional arguments give run_ml() forwarded along caret::train() can leverage options.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"case-weights","dir":"Articles","previous_headings":"Customizing parameters > More arguments","what":"Case weights","title":"Introduction to mikropml","text":"want use case weights, also need use custom indices training data (.e. perform partition run_ml() ). ’s one way weights calculated proportion class data set, ~70% data training set: See caret docs list models accept case weights.","code":"library(dplyr) case_weights_dat <- otu_mini_bin %>% count(dx) %>% mutate(p = n / sum(n)) %>% select(dx, p) %>% right_join(otu_mini_bin, by = \"dx\") %>% select(-starts_with(\"Otu\")) %>% mutate( in_train = sample(c(TRUE, FALSE), size = nrow(otu_mini_bin), replace = TRUE, prob = c(0.70, 0.30) ), row_num = row_number() ) %>% filter(in_train) head(case_weights_dat) #> dx p in_train row_num #> 1 cancer 0.49 TRUE 3 #> 2 cancer 0.49 TRUE 4 #> 3 cancer 0.49 TRUE 5 #> 4 cancer 0.49 TRUE 6 #> 5 cancer 0.49 TRUE 8 #> 6 cancer 0.49 TRUE 9 nrow(case_weights_dat) / nrow(otu_mini_bin) #> [1] 0.75 results_weighted <- run_ml(otu_mini_bin, \"glmnet\", outcome_colname = \"dx\", seed = 2019, training_frac = case_weights_dat %>% pull(row_num), weights = case_weights_dat %>% pull(p) )"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"finding-feature-importance","dir":"Articles","previous_headings":"","what":"Finding feature importance","title":"Introduction to mikropml","text":"find features contributing predictive power, can use find_feature_importance = TRUE. use permutation importance determine feature importance described (Topçuoğlu et al. 2020). Briefly, permutes features individually (correlated ones together) evaluates much performance metric decreases. performance decreases feature randomly shuffled, important feature . default FALSE takes run useful want know features important predicting outcome. Let’s look feature importance results: Now, can check feature importances: several columns: perf_metric: performance value permuted feature. perf_metric_diff: difference performance actual permuted data (.e. test performance minus permuted performance). Features larger perf_metric_diff important. pvalue: probability obtaining actual performance value null hypothesis. names: feature permuted. method: ML method used. perf_metric_name: performance metric used. seed: seed (set). can see , differences negligible (close zero), makes sense since model isn’t great. 
’re interested feature importance, ’s especially useful run multiple different train/test splits, shown example snakemake workflow. can also choose permute correlated features together using corr_thresh (default: 1). features correlation threshold permuted together; .e. perfectly correlated features permuted together using default value. can see features permuted together names column. 3 features permuted together (doesn’t really make sense, ’s just example). previously executed run_ml() without feature importance now wish find feature importance fact, see example code get_feature_importance() documentation. get_feature_importance() can show live progress bar, see vignette(\"parallel\") examples.","code":"results_imp <- run_ml(otu_mini_bin, \"rf\", outcome_colname = \"dx\", find_feature_importance = TRUE, seed = 2019 ) results_imp$feature_importance #> perf_metric perf_metric_diff pvalue names method perf_metric_name #> 1 0.5459125 0.0003375 0.51485149 Otu00001 rf AUC #> 2 0.5682625 -0.0220125 0.73267327 Otu00002 rf AUC #> 3 0.5482875 -0.0020375 0.56435644 Otu00003 rf AUC #> 4 0.6314375 -0.0851875 1.00000000 Otu00004 rf AUC #> 5 0.4991750 0.0470750 0.08910891 Otu00005 rf AUC #> 6 0.5364875 0.0097625 0.28712871 Otu00006 rf AUC #> 7 0.5382875 0.0079625 0.39603960 Otu00007 rf AUC #> 8 0.5160500 0.0302000 0.09900990 Otu00008 rf AUC #> 9 0.5293375 0.0169125 0.17821782 Otu00009 rf AUC #> 10 0.4976500 0.0486000 0.12871287 Otu00010 rf AUC #> seed #> 1 2019 #> 2 2019 #> 3 2019 #> 4 2019 #> 5 2019 #> 6 2019 #> 7 2019 #> 8 2019 #> 9 2019 #> 10 2019 results_imp_corr <- run_ml(otu_mini_bin, \"glmnet\", cv_times = 5, find_feature_importance = TRUE, corr_thresh = 0.2, seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... #> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. #> Training complete. #> Finding feature importance... #> Feature importance complete. results_imp_corr$feature_importance #> perf_metric perf_metric_diff pvalue #> 1 0.4941842 0.1531842 0.05940594 #> names #> 1 Otu00001|Otu00002|Otu00003|Otu00004|Otu00005|Otu00006|Otu00007|Otu00008|Otu00009|Otu00010 #> method perf_metric_name seed #> 1 glmnet AUC 2019"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"tuning-hyperparameters-using-the-hyperparameter-argument","dir":"Articles","previous_headings":"","what":"Tuning hyperparameters (using the hyperparameter argument)","title":"Introduction to mikropml","text":"important, whole vignette . bottom line provide default hyperparameters can start , ’s important tune hyperparameters. information default hyperparameters , tune hyperparameters, see vignette(\"tuning\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"other-models","dir":"Articles","previous_headings":"","what":"Other models","title":"Introduction to mikropml","text":"examples train evaluate models. 
output similar, won’t go details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"random-forest","dir":"Articles","previous_headings":"Other models","what":"Random forest","title":"Introduction to mikropml","text":"rf engine takes optional argument ntree: number trees use random forest. can’t tuned using rf package implementation random forest. Please refer caret documentation interested packages random forest implementations.","code":"results_rf <- run_ml(otu_mini_bin, \"rf\", cv_times = 5, seed = 2019 ) results_rf_nt <- run_ml(otu_mini_bin, \"rf\", cv_times = 5, ntree = 1000, seed = 2019 )"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"decision-tree","dir":"Articles","previous_headings":"Other models","what":"Decision tree","title":"Introduction to mikropml","text":"","code":"results_dt <- run_ml(otu_mini_bin, \"rpart2\", cv_times = 5, seed = 2019 )"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"svm","dir":"Articles","previous_headings":"Other models","what":"SVM","title":"Introduction to mikropml","text":"get message “maximum number iterations reached”, see issue caret.","code":"results_svm <- run_ml(otu_mini_bin, \"svmRadial\", cv_times = 5, seed = 2019 )"},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"multiclass-data","dir":"Articles","previous_headings":"Other data","what":"Multiclass data","title":"Introduction to mikropml","text":"provide otu_mini_multi multiclass outcome (three outcomes): ’s example running multiclass data: performance metrics slightly different, format everything else :","code":"otu_mini_multi %>% dplyr::pull(\"dx\") %>% unique() #> [1] \"adenoma\" \"carcinoma\" \"normal\" results_multi <- run_ml(otu_mini_multi, outcome_colname = \"dx\", seed = 2019 ) results_multi$performance #> # A tibble: 1 × 17 #> cv_metric…¹ logLoss AUC prAUC Accur…² Kappa Mean_F1 Mean_…³ Mean_…⁴ Mean_…⁵ #> #> 1 1.07 1.11 0.506 0.353 0.382 0.0449 NA 0.360 0.682 NaN #> # … with 7 more variables: Mean_Neg_Pred_Value , Mean_Precision , #> # Mean_Recall , Mean_Detection_Rate , Mean_Balanced_Accuracy , #> # method , seed , and abbreviated variable names #> # ¹​cv_metric_logLoss, ²​Accuracy, ³​Mean_Sensitivity, ⁴​Mean_Specificity, #> # ⁵​Mean_Pos_Pred_Value"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"continuous-data","dir":"Articles","previous_headings":"Other data","what":"Continuous data","title":"Introduction to mikropml","text":"’s example running continuous data, outcome column numerical: , performance metrics slightly different, format rest :","code":"results_cont <- run_ml(otu_mini_bin[, 2:11], \"glmnet\", outcome_colname = \"Otu00001\", seed = 2019 ) results_cont$performance #> # A tibble: 1 × 6 #> cv_metric_RMSE RMSE Rsquared MAE method seed #> #> 1 622. 731. 0.0893 472. glmnet 2019"},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"summary","dir":"Articles","previous_headings":"","what":"Summary","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Machine learning (ML) classification prediction based set features used make decisions healthcare, economics, criminal justice . However, implementing ML pipeline including preprocessing, model selection, evaluation can time-consuming, confusing, difficult. 
, present mikropml (pronounced “meek-ROPE em el”), easy--use R package implements ML pipelines using regression, support vector machines, decision trees, random forest, gradient-boosted trees. package available GitHub, CRAN, conda.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"statement-of-need","dir":"Articles","previous_headings":"","what":"Statement of need","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"applications machine learning (ML) require reproducible steps data pre-processing, cross-validation, testing, model evaluation, often interpretation model makes particular predictions. Performing steps important, failure implement can result incorrect misleading results (Teschendorff 2019; Wiens et al. 2019). Supervised ML widely used recognize patterns large datasets make predictions outcomes interest. Several packages including caret (Kuhn 2008) tidymodels (Kuhn, Wickham, RStudio 2020) R, scikitlearn (Pedregosa et al. 2011) Python, H2O autoML platform (H2O.ai 2020) allow scientists train ML models variety algorithms. packages provide tools necessary ML step, implement complete ML pipeline according good practices literature. makes difficult practitioners new ML easily begin perform ML analyses. enable broader range researchers apply ML problem domains, created mikropml, easy--use R package (R Core Team 2020) implements ML pipeline created Topçuoğlu et al. (Topçuoğlu et al. 2020) single function returns trained model, model performance metrics feature importance. mikropml leverages caret package support several ML algorithms: linear regression, logistic regression, support vector machines radial basis kernel, decision trees, random forest, gradient boosted trees. incorporates good practices ML training, testing, model evaluation (Topçuoğlu et al. 2020; Teschendorff 2019). Furthermore, provides data preprocessing steps based FIDDLE (FlexIble Data-Driven pipeLinE) framework outlined Tang et al. (Tang et al. 2020) post-training permutation importance steps estimate importance feature models trained (Breiman 2001; Fisher, Rudin, Dominici 2018). mikropml can used starting point application ML datasets many different fields. already applied microbiome data categorize patients colorectal cancer (Topçuoğlu et al. 2020), identify differences genomic clinical features associated bacterial infections (Lapp et al. 2020), predict gender-based biases academic publishing (Hagan et al. 2020).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"mikropml-package","dir":"Articles","previous_headings":"","what":"mikropml package","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml package includes functionality preprocess data, train ML models, evaluate model performance, quantify feature importance (Figure 1). also provide vignettes example Snakemake workflow (Köster Rahmann 2012) showcase run ideal ML pipeline multiple different train/test data splits. results can visualized using helper functions use ggplot2 (Wickham 2016). mikropml allows users get started quickly facilitates reproducibility, replacement understanding ML workflow still necessary interpreting results (Pollard et al. 2019). 
facilitate understanding enable one tailor code application, heavily commented code provided supporting documentation can read online.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"preprocessing-data","dir":"Articles","previous_headings":"mikropml package","what":"Preprocessing data","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"provide function preprocess_data() preprocess features using several different functions caret package. preprocess_data() takes continuous categorical data, re-factors categorical data binary features, provides options normalize continuous data, remove features near-zero variance, keep one instance perfectly correlated features. set default options based implemented FIDDLE (Tang et al. 2020). details use preprocess_data() can found accompanying vignette.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"running-ml","dir":"Articles","previous_headings":"mikropml package","what":"Running ML","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"main function mikropml, run_ml(), minimally takes model choice data frame outcome column feature columns. model choice, mikropml currently supports logistic linear regression (glmnet: Friedman, Hastie, Tibshirani 2010), support vector machines radial basis kernel (kernlab: Karatzoglou et al. 2004), decision trees (rpart: Therneau et al. 2019), random forest (randomForest: Liaw Wiener 2002), gradient-boosted trees (xgboost: Chen et al. 2020). run_ml() randomly splits data train test sets maintaining distribution outcomes found full dataset. also provides option split data train test sets based categorical variables (e.g. batch, geographic location, etc.). mikropml uses caret package (Kuhn 2008) train evaluate models, optionally quantifies feature importance. output includes best model built based tuning hyperparameters internal repeated cross-validation step, model evaluation metrics, optional feature importances. Feature importances calculated using permutation test, breaks relationship feature true outcome test data, measures change model performance. provides intuitive metric individual features influence model performance comparable across model types, particularly useful model interpretation (Topçuoğlu et al. 2020). introductory vignette contains comprehensive tutorial use run_ml(). mikropml pipeline","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"ideal-workflow-for-running-mikropml-with-many-different-traintest-splits","dir":"Articles","previous_headings":"mikropml package","what":"Ideal workflow for running mikropml with many different train/test splits","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"investigate variation model performance depending train test set used (Topçuoğlu et al. 2020; Lapp et al. 2020), provide examples run_ml() many times different train/test splits get summary information model performance local computer high-performance computing cluster using Snakemake workflow.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"tuning-visualization","dir":"Articles","previous_headings":"mikropml package","what":"Tuning & visualization","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"One particularly important aspect ML hyperparameter tuning. 
provide reasonable range default hyperparameters model type. However practitioners explore whether range appropriate data, customize hyperparameter range. Therefore, provide function plot_hp_performance() plot cross-validation performance metric single model models built using different train/test splits. helps evaluate hyperparameter range searched exhaustively allows user pick ideal set. also provide summary plots test performance metrics many train/test splits different models using plot_model_performance(). Examples described accompanying vignette hyperparameter tuning.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"dependencies","dir":"Articles","previous_headings":"mikropml package","what":"Dependencies","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml written R (R Core Team 2020) depends several packages: dplyr (Wickham et al. 2020), rlang (Henry, Wickham, RStudio 2020) caret (Kuhn 2008). ML algorithms supported mikropml require: glmnet (Friedman, Hastie, Tibshirani 2010), e1071 (Meyer et al. 2020), MLmetrics (Yan 2016) logistic regression, rpart2 (Therneau et al. 2019) decision trees, randomForest (Liaw Wiener 2002) random forest, xgboost (Chen et al. 2020) xgboost, kernlab (Karatzoglou et al. 2004) support vector machines. also allow parallelization cross-validation steps using foreach, doFuture, future.apply, future packages (Bengtsson Team 2020). Finally, use ggplot2 plotting (Wickham 2016).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"acknowledgments","dir":"Articles","previous_headings":"","what":"Acknowledgments","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"thank members Schloss Lab participated code clubs related initial development pipeline, made documentation improvements, provided general feedback. also thank Nick Lesniak designing mikropml logo. thank US Research Software Sustainability Institute (NSF #1743188) providing training KLS Winter School Research Software Engineering.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"funding","dir":"Articles","previous_headings":"","what":"Funding","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Salary support PDS came NIH grant 1R01CA215574. KLS received support NIH Training Program Bioinformatics (T32 GM070449). ZL received support National Science Foundation Graduate Research Fellowship Program Grant . DGE 1256260. opinions, findings, conclusions recommendations expressed material authors necessarily reflect views National Science Foundation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"author-contributions","dir":"Articles","previous_headings":"","what":"Author contributions","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"BDT, ZL, KLS contributed equally. Author order among co-first authors determined time since joining project. BDT, ZL, KLS conceptualized study wrote code. KLS structured code R package form. BDT, ZL, JW, PDS developed methodology. PDS, ES, JW supervised project. BDT, ZL, KLS wrote original draft. 
authors reviewed edited manuscript.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"conflicts-of-interest","dir":"Articles","previous_headings":"","what":"Conflicts of interest","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"None.","code":""},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"speed-up-single-runs","dir":"Articles","previous_headings":"","what":"Speed up single runs","title":"Parallel processing","text":"default, preprocess_data(), run_ml(), compare_models() use one process series. ’d like parallelize various steps pipeline make run faster, install foreach, future, future.apply, doFuture. , register future plan prior calling functions: , used multicore plan split work across 2 cores. See future documentation picking best plan use case. Notably, multicore work inside RStudio Windows; need use multisession instead cases. registering future plan, can call preprocess_data() run_ml() usual, run certain tasks parallel. ’s also parallel version rf engine called parRF trains trees forest parallel. See caret docs information.","code":"doFuture::registerDoFuture() future::plan(future::multicore, workers = 2) otu_data_preproc <- preprocess_data(otu_mini_bin, \"dx\")$dat_transformed #> Using 'dx' as the outcome column. result1 <- run_ml(otu_data_preproc, \"glmnet\") #> Using 'dx' as the outcome column. #> Training the model... #> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"call-run_ml-multiple-times-in-parallel-in-r","dir":"Articles","previous_headings":"","what":"Call run_ml() multiple times in parallel in R","title":"Parallel processing","text":"can use functions future.apply package call run_ml() multiple times parallel different parameters. first need run future::plan() haven’t already. , call run_ml() multiple seeds using future_lapply(): call run_ml() different seed uses different random split data training testing sets. Since using seeds, must set future.seed TRUE (see future.apply documentation blog post details parallel-safe random seeds). example uses seeds speed simplicity, real data recommend using many seeds get better estimate model performance. examples, used functions future.apply package run_ml() parallel, can accomplish thing parallel versions purrr::map() functions using furrr package (e.g. furrr::future_map_dfr()). Extract performance results combine one dataframe seeds:","code":"# NOTE: use more seeds for real-world data results_multi <- future.apply::future_lapply(seq(100, 102), function(seed) { run_ml(otu_data_preproc, \"glmnet\", seed = seed) }, future.seed = TRUE) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. 
perf_df <- future.apply::future_lapply(results_multi, function(result) { result[[\"performance\"]] %>% select(cv_metric_AUC, AUC, method) }, future.seed = TRUE ) %>% dplyr::bind_rows() perf_df #> # A tibble: 3 × 3 #> cv_metric_AUC AUC method #> #> 1 0.630 0.634 glmnet #> 2 0.591 0.608 glmnet #> 3 0.671 0.471 glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"multiple-ml-methods","dir":"Articles","previous_headings":"Call run_ml() multiple times in parallel in R","what":"Multiple ML methods","title":"Parallel processing","text":"You may also wish to compare the performance of different ML methods. Just as mapply() can iterate over multiple lists or vectors, future_mapply() works the same way: Extract and combine the results for all seeds and methods:","code":"# NOTE: use more seeds for real-world data param_grid <- expand.grid( seeds = seq(100, 102), methods = c(\"glmnet\", \"rf\") ) results_mtx <- future.apply::future_mapply( function(seed, method) { run_ml(otu_data_preproc, method, seed = seed, find_feature_importance = TRUE) }, param_grid$seeds, param_grid$methods %>% as.character(), future.seed = TRUE ) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. perf_df <- lapply( results_mtx[\"performance\", ], function(x) { x %>% select(cv_metric_AUC, AUC, method) } ) %>% dplyr::bind_rows() feat_df <- results_mtx[\"feature_importance\", ] %>% dplyr::bind_rows()"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"visualize-the-results","dir":"Articles","previous_headings":"Call run_ml() multiple times in parallel in R","what":"Visualize the results","title":"Parallel processing","text":"ggplot2 is required to use our plotting functions below. You can also create your own plots however you like using the results.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"performance","dir":"Articles","previous_headings":"Call run_ml() multiple times in parallel in R > Visualize the results","what":"Performance","title":"Parallel processing","text":"plot_model_performance() returns a ggplot2 object. 
You can add layers to customize the plot:","code":"perf_boxplot <- plot_model_performance(perf_df) perf_boxplot perf_boxplot + theme_classic() + scale_color_brewer(palette = \"Dark2\") + coord_flip()"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"feature-importance","dir":"Articles","previous_headings":"Call run_ml() multiple times in parallel in R > Visualize the results","what":"Feature importance","title":"Parallel processing","text":"The perf_metric_diff from the feature importance data frame contains the differences between the performance on the actual test data and the performance on the permuted test data (i.e. test minus permuted). If a feature is important for model performance, we expect perf_metric_diff to be positive. In other words, the features that resulted in the largest decrease in performance when permuted are the most important features. You can select the top n most important features for your models and plot them like so: See the docs for get_feature_importance() for more details on how these values are computed.","code":"top_n <- 5 top_feats <- feat_df %>% group_by(method, names) %>% summarize(median_diff = median(perf_metric_diff)) %>% filter(median_diff > 0) %>% slice_max(order_by = median_diff, n = top_n) #> `summarise()` has grouped output by 'method'. You can override using the #> `.groups` argument. feat_df %>% right_join(top_feats, by = c(\"method\", \"names\")) %>% mutate(features = factor(names, levels = rev(unique(top_feats$names)))) %>% ggplot(aes(x = perf_metric_diff, y = features, color = method)) + geom_boxplot() + theme_bw()"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"live-progress-updates","dir":"Articles","previous_headings":"","what":"Live progress updates","title":"Parallel processing","text":"preprocess_data() and get_feature_importance() support reporting live progress updates using the progressr package. The format is up to you, but we recommend using a progress bar like this: Note that some future backends support “near-live” progress updates, meaning progress may not be reported immediately when parallel processing with futures. Read more in the progressr vignette. To customize the format of the progress updates with progressr, see the progressr docs.","code":"# optionally, specify the progress bar format with the `progress` package. progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0 )) # tell progressr to always report progress in any functions that use it. # set this to FALSE to turn it back off again. progressr::handlers(global = TRUE) # run your code and watch the live progress updates. dat <- preprocess_data(otu_mini_bin, \"dx\")$dat_transformed #> Using 'dx' as the outcome column. #> preprocessing ========================>------- 78% | elapsed: 1s | eta: 0s results <- run_ml(dat, \"glmnet\", kfold = 2, cv_times = 2, find_feature_importance = TRUE ) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Feature importance =========================== 100% | elapsed: 37s | eta: 0s"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"parallelizing-with-snakemake","dir":"Articles","previous_headings":"","what":"Parallelizing with Snakemake","title":"Parallel processing","text":"When parallelizing multiple calls to run_ml() in R as in the examples above, all of the results objects are held in memory. This isn’t a big deal for a small dataset run with a few seeds. However, for large datasets run in parallel with, say, 100 seeds (recommended), you may run into problems trying to store all of those objects in memory at once. Using a workflow manager such as Snakemake or Nextflow is highly recommended to maximize the scalability and reproducibility of computational analyses. We created a template Snakemake workflow that you can use as a starting point for your ML project.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"its-running-so-slow","dir":"Articles","previous_headings":"","what":"It’s running so slow!","title":"Preprocessing data","text":"Since we assume a lot of you won’t read this entire vignette, I’m going to say this at the beginning. If the preprocess_data() function is running super slow, consider parallelizing it so it goes faster! 
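For instance, registering a future plan before the call is all it takes. Here is a minimal sketch mirroring the setup from the parallel vignette (multisession is chosen here only because it also works inside RStudio and on Windows):
doFuture::registerDoFuture()
future::plan(future::multisession, workers = 2)
dat <- preprocess_data(otu_mini_bin, \"dx\")$dat_transformed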
preprocess_data() can also report live progress updates. See vignette(\"parallel\") for details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"examples","dir":"Articles","previous_headings":"","what":"Examples","title":"Preprocessing data","text":"We’re going to start simple and get more complicated, but if you want the whole shebang at once, just scroll to the bottom. First, load mikropml:","code":"library(mikropml)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"binary-data","dir":"Articles","previous_headings":"Examples","what":"Binary data","title":"Preprocessing data","text":"Let’s start with binary variables: In addition to the dataframe itself, we provide the name of the outcome column to preprocess_data(). Here’s what the preprocessed data looks like: The output is a list: dat_transformed which has the transformed data, grp_feats which is a list of grouped features, and removed_feats which is a list of features that were removed. Here, grp_feats is NULL because there are no perfectly correlated features (e.g. c(0,1,0) and c(0,1,0), or c(0,1,0) and c(1,0,1) - see more details below). The first column (var1) of dat_transformed is a character and is changed to var1_yes, which contains zeros (no) and ones (yes). The values in the second column (var2) stay the same because it’s already binary, but the name changes to var2_1. The third column (var3) is a factor and is also changed to binary, with b as 1 and a as 0, denoted by the new column name var3_b.","code":"# raw binary dataset bin_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 1), var3 = factor(c(\"a\", \"a\", \"b\")) ) bin_df #> outcome var1 var2 var3 #> 1 normal no 0 a #> 2 normal yes 1 a #> 3 cancer no 1 b # preprocess raw binary data preprocess_data(dataset = bin_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_yes var2_1 var3_b #> #> 1 normal 0 0 0 #> 2 normal 1 1 0 #> 3 cancer 0 1 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"categorical-data","dir":"Articles","previous_headings":"Examples","what":"Categorical data","title":"Preprocessing data","text":"Here is non-binary categorical data: As you can see, this variable was split into 3 different columns - one for each type (a, b, c). And again, grp_feats is NULL.","code":"# raw categorical dataset cat_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"a\", \"b\", \"c\") ) cat_df #> outcome var1 #> 1 normal a #> 2 normal b #> 3 cancer c # preprocess raw categorical data preprocess_data(dataset = cat_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_a var1_b var1_c #> #> 1 normal 1 0 0 #> 2 normal 0 1 0 #> 3 cancer 0 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"continuous-data","dir":"Articles","previous_headings":"Examples","what":"Continuous data","title":"Preprocessing data","text":"Now, let’s look at continuous variables: Wow! Why did the numbers change? By default, we normalize the data using \"center\" and \"scale\". While this is often best practice, you may not want to normalize the data, or you may want to normalize it in a different way. If you don’t want to normalize the data, you can use method=NULL: You can also normalize the data in different ways. You can choose any method supported by the method argument of caret::preProcess() (see the caret::preProcess() docs for details). Note that these methods are only applied to continuous variables. 
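For example, to rescale each continuous feature to the interval [0, 1] instead of centering and scaling, you could pass caret’s \"range\" method (a sketch reusing the cont_df example from this section; any method accepted by caret::preProcess() should work the same way):
preprocess_data(dataset = cont_df, outcome_colname = \"outcome\", method = c(\"range\"))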
Another nice feature of preprocess_data() is that if you provide continuous variables as characters, they will be converted to numeric: If you don’t want that to happen, and you want character data to remain character data even if it can be converted to numeric, you can use to_numeric=FALSE and they will be kept as categorical: As you can see from the output, in this case the features are treated as groups rather than numbers (e.g. they are not normalized).","code":"# raw continuous dataset cont_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(1, 2, 3) ) cont_df #> outcome var1 #> 1 normal 1 #> 2 normal 2 #> 3 cancer 3 # preprocess raw continuous data preprocess_data(dataset = cont_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome var1 #> #> 1 normal -1 #> 2 normal 0 #> 3 cancer 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0) # preprocess raw continuous data, no normalization preprocess_data(dataset = cont_df, outcome_colname = \"outcome\", method = NULL) # raw continuous dataset as characters cont_char_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"1\", \"2\", \"3\") ) cont_char_df #> outcome var1 #> 1 normal 1 #> 2 normal 2 #> 3 cancer 3 # preprocess raw continuous character data as numeric preprocess_data(dataset = cont_char_df, outcome_colname = \"outcome\") # preprocess raw continuous character data as characters preprocess_data(dataset = cont_char_df, outcome_colname = \"outcome\", to_numeric = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_1 var1_2 var1_3 #> #> 1 normal 1 0 0 #> 2 normal 0 1 0 #> 3 cancer 0 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"collapse-perfectly-correlated-features","dir":"Articles","previous_headings":"Examples","what":"Collapse perfectly correlated features","title":"Preprocessing data","text":"By default, preprocess_data() collapses features that are perfectly positively or negatively correlated. This is because having multiple copies of those features does not add any information to machine learning, and it makes run_ml faster. As you can see, we end up with only one variable, as all 3 are grouped together. Also, the second element of the list is no longer NULL. Instead, it tells us that grp1 contains var1, var2, and var3. If you want to group only positively correlated features, but not negatively correlated features (e.g. for interpretability, or for another downstream application), you can do that by using group_neg_corr=FALSE: Here, var3 is kept on its own because it’s negatively correlated with var1 and var2. You can also choose to keep all features separate, even if they are perfectly correlated, by using collapse_corr_feats=FALSE: In this case, grp_feats will always be NULL.","code":"# raw correlated dataset corr_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 0), var3 = c(1, 0, 1) ) corr_df #> outcome var1 var2 var3 #> 1 normal no 0 1 #> 2 normal yes 1 0 #> 3 cancer no 0 1 # preprocess raw correlated dataset preprocess_data(dataset = corr_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome grp1 #> #> 1 normal 0 #> 2 normal 1 #> 3 cancer 0 #> #> $grp_feats #> $grp_feats$grp1 #> [1] \"var1_yes\" \"var3_1\" #> #> #> $removed_feats #> [1] \"var2\" # preprocess raw correlated dataset; don't group negatively correlated features preprocess_data(dataset = corr_df, outcome_colname = \"outcome\", group_neg_corr = FALSE) #> Using 'outcome' as the outcome column. 
#> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var3_1 #> #> 1 normal 0 1 #> 2 normal 1 0 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\" # preprocess raw correlated dataset; keep perfectly correlated features separate preprocess_data(dataset = corr_df, outcome_colname = \"outcome\", collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var3_1 #> #> 1 normal 0 1 #> 2 normal 1 0 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"data-with-near-zero-variance","dir":"Articles","previous_headings":"Examples","what":"Data with near-zero variance","title":"Preprocessing data","text":"What about variables that are all zero, or all “no”? Those ones won’t contribute any information, so we remove them: Here, var3, var4, and var5 have no variability, so those variables are removed during preprocessing: You can read the caret::preProcess() documentation for more information. By default, we remove features with “near-zero variance” (remove_var='nzv'). This uses the default arguments from caret::nearZeroVar(). However, particularly with smaller datasets, you might not want to remove features with near-zero variance. If you want to remove only features with zero variance, you can use remove_var='zv': If you want to include all features, you can use the argument remove_var=NULL. For this to work, you cannot collapse correlated features (otherwise it errors out because of the underlying caret function we use). If you want to be more nuanced in how you remove near-zero variance features (e.g. change the default 10% cutoff for the percentage of distinct values out of the total number of samples), you can use the caret::preProcess() function after running preprocess_data with remove_var=NULL (see the caret::nearZeroVar() function for more information).","code":"# raw dataset with non-variable features nonvar_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 1), var3 = c(\"no\", \"no\", \"no\"), var4 = c(0, 0, 0), var5 = c(12, 12, 12) ) nonvar_df #> outcome var1 var2 var3 var4 var5 #> 1 normal no 0 no 0 12 #> 2 normal yes 1 no 0 12 #> 3 cancer no 1 no 0 12 # remove features with near-zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\" \"var3\" \"var5\" # remove features with zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\", remove_var = \"zv\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\" \"var3\" \"var5\" # don't remove features with near-zero or zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\", remove_var = NULL, collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 5 #> outcome var1_yes var2_1 var3 var5 #> #> 1 normal 0 0 0 12 #> 2 normal 1 1 0 12 #> 3 cancer 0 1 0 12 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"missing-data","dir":"Articles","previous_headings":"Examples","what":"Missing data","title":"Preprocessing data","text":"preprocess_data() also deals with missing data. It: Removes missing outcome variables. Maintains zero variability in a feature if it already has no variability (i.e. the feature is removed if you remove features with near-zero variance). Replaces missing binary and categorical variables with zero (after splitting into multiple columns). Replaces missing continuous data with the median value of that feature. If you’d like to deal with missing data in a different way, please do so prior to inputting the data to preprocess_data().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"remove-missing-outcome-variables","dir":"Articles","previous_headings":"Examples > Missing data","what":"Remove missing outcome variables","title":"Preprocessing data","text":"","code":"# raw dataset with missing outcome value miss_oc_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\", NA), var1 = c(\"no\", \"yes\", \"no\", \"no\"), var2 = c(0, 1, 1, 1) ) miss_oc_df #> outcome var1 var2 #> 1 normal no 0 #> 2 normal yes 1 #> 3 cancer no 1 #> 4 no 1 # preprocess raw dataset with missing outcome value preprocess_data(dataset = miss_oc_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> Removed 1/4 (25%) of samples because of missing outcome value (NA). #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"maintain-zero-variability-in-a-feature-if-it-already-has-no-variability","dir":"Articles","previous_headings":"Examples > Missing data","what":"Maintain zero variability in a feature if it already has no variability","title":"Preprocessing data","text":"Here, the non-variable feature with missing data is removed because we removed features with near-zero variance. If we had maintained the feature, it’d be all ones:","code":"# raw dataset with missing value in non-variable feature miss_nonvar_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(NA, 1, 1) ) miss_nonvar_df #> outcome var1 var2 #> 1 normal no NA #> 2 normal yes 1 #> 3 cancer no 1 # preprocess raw dataset with missing value in non-variable feature preprocess_data(dataset = miss_nonvar_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome var1_yes #> #> 1 normal 0 #> 2 normal 1 #> 3 cancer 0 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\" # preprocess raw dataset with missing value in non-variable feature preprocess_data(dataset = miss_nonvar_df, outcome_colname = \"outcome\", remove_var = NULL, collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. 
#> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2 #> #> 1 normal 0 1 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"replace-missing-binary-and-categorical-variables-with-zero","dir":"Articles","previous_headings":"Examples > Missing data","what":"Replace missing binary and categorical variables with zero","title":"Preprocessing data","text":"When a binary variable is split into two columns, the missing value is considered zero for both of them.","code":"# raw dataset with missing value in categorical feature miss_cat_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", NA), var2 = c(NA, 1, 0) ) miss_cat_df #> outcome var1 var2 #> 1 normal no NA #> 2 normal yes 1 #> 3 cancer 0 # preprocess raw dataset with missing value in categorical feature preprocess_data(dataset = miss_cat_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> 2 categorical missing value(s) (NA) were replaced with 0. Note that the matrix is not full rank so missing values may be duplicated in separate columns. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_no var1_yes #> #> 1 normal 1 0 #> 2 normal 0 1 #> 3 cancer 0 0 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"replace-missing-continuous-data-with-the-median-value-of-that-feature","dir":"Articles","previous_headings":"Examples > Missing data","what":"Replace missing continuous data with the median value of that feature","title":"Preprocessing data","text":"Here we’re not normalizing the continuous features so it’s easier to see what’s going on (i.e. the median value is used):","code":"# raw dataset with missing value in continuous feature miss_cont_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\", \"normal\"), var1 = c(1, 2, 2, NA), var2 = c(1, 2, 3, NA) ) miss_cont_df #> outcome var1 var2 #> 1 normal 1 1 #> 2 normal 2 2 #> 3 cancer 2 3 #> 4 normal NA NA # preprocess raw dataset with missing value in continuous feature preprocess_data(dataset = miss_cont_df, outcome_colname = \"outcome\", method = NULL) #> Using 'outcome' as the outcome column. #> 2 missing continuous value(s) were imputed using the median value of the feature. #> $dat_transformed #> # A tibble: 4 × 3 #> outcome var1 var2 #> #> 1 normal 1 1 #> 2 normal 2 2 #> 3 cancer 2 3 #> 4 normal 2 2 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"putting-it-all-together","dir":"Articles","previous_headings":"Examples","what":"Putting it all together","title":"Preprocessing data","text":"Here’s a more complicated example of raw data that puts everything we discussed together: Let’s throw it into the preprocessing function with the default values: As you can see, we got several messages: One of the samples (row 4) was removed because the outcome value was missing. One of the variables in a feature with no variation had a missing value, which was replaced with the non-varying value (var11). Four categorical missing values were replaced with zero (var9). There are 4 missing values rather than just 1 (like in the raw data) because we split the categorical variable into 4 different columns first. One missing continuous value was imputed using the median value of that feature (var8). Additionally, you can see that the continuous variables were normalized, the categorical variables were changed to binary, and several features were grouped together. 
Which variables are in each group can be found in grp_feats.","code":"test_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\", NA), var1 = 1:4, var2 = c(\"a\", \"b\", \"c\", \"d\"), var3 = c(\"no\", \"yes\", \"no\", \"no\"), var4 = c(0, 1, 0, 0), var5 = c(0, 0, 0, 0), var6 = c(\"no\", \"no\", \"no\", \"no\"), var7 = c(1, 1, 0, 0), var8 = c(5, 6, NA, 7), var9 = c(NA, \"x\", \"y\", \"z\"), var10 = c(1, 0, NA, NA), var11 = c(1, 1, NA, NA), var12 = c(\"1\", \"2\", \"3\", \"4\") ) test_df #> outcome var1 var2 var3 var4 var5 var6 var7 var8 var9 var10 var11 var12 #> 1 normal 1 a no 0 0 no 1 5 1 1 1 #> 2 normal 2 b yes 1 0 no 1 6 x 0 1 2 #> 3 cancer 3 c no 0 0 no 0 NA y NA NA 3 #> 4 4 d no 0 0 no 0 7 z NA NA 4 preprocess_data(dataset = test_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> Removed 1/4 (25%) of samples because of missing outcome value (NA). #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. #> 2 categorical missing value(s) (NA) were replaced with 0. Note that the matrix is not full rank so missing values may be duplicated in separate columns. #> 1 missing continuous value(s) were imputed using the median value of the feature. #> $dat_transformed #> # A tibble: 3 × 6 #> outcome grp1 var2_a grp2 grp3 var8 #> #> 1 normal -1 1 0 0 -0.707 #> 2 normal 0 0 1 0 0.707 #> 3 cancer 1 0 0 1 0 #> #> $grp_feats #> $grp_feats$grp1 #> [1] \"var1\" \"var12\" #> #> $grp_feats$var2_a #> [1] \"var2_a\" #> #> $grp_feats$grp2 #> [1] \"var2_b\" \"var3_yes\" \"var9_x\" #> #> $grp_feats$grp3 #> [1] \"var2_c\" \"var7_1\" \"var9_y\" #> #> $grp_feats$var8 #> [1] \"var8\" #> #> #> $removed_feats #> [1] \"var4\" \"var5\" \"var10\" \"var6\" \"var11\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"next-step-train-and-evaluate-your-model","dir":"Articles","previous_headings":"Examples","what":"Next step: train and evaluate your model!","title":"Preprocessing data","text":"After you preprocess your data (either using preprocess_data() or by preprocessing it yourself), you’re ready to train and evaluate machine learning models! Please see run_ml() for more information on training models.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"the-simplest-way-to-run_ml","dir":"Articles","previous_headings":"","what":"The simplest way to run_ml()","title":"Hyperparameter tuning","text":"As mentioned above, the minimal input is your dataset (dataset) and the machine learning model you want to use (method). When we run_ml(), by default we do a 100 times repeated, 5-fold cross-validation, where we evaluate the hyperparameters in these 500 total iterations. Say we want to run L2 regularized logistic regression. We do: You’ll probably get a warning when you run this because the dataset is small. If you want to learn more about that, check out the introductory vignette about training and evaluating a ML model: vignette(\"introduction\"). By default, run_ml() selects hyperparameters depending on the dataset and method used. As you can see, the alpha hyperparameter is set to 0, which specifies L2 regularization. glmnet gives us the option to run both L1 and L2 regularization. If we change alpha to 1, we would run L1-regularized logistic regression. You can also tune alpha by specifying a variety of values between 0 and 1. When you use a value in between 0 and 1, you are running elastic net. The default hyperparameter lambda, which adjusts the L2 regularization penalty, is a range of values between 10^-4 and 10. When we look at the 100 repeated cross-validation performance metrics such as AUC, Accuracy, and prAUC for each tested lambda value, we see that some are more appropriate for this dataset than others.","code":"results <- run_ml(dat, \"glmnet\", outcome_colname = \"dx\", cv_times = 100, seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... 
#> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Training complete. results$trained_model #> glmnet #> #> 161 samples #> 10 predictor #> 2 classes: 'cancer', 'normal' #> #> No pre-processing #> Resampling: Cross-Validated (5 fold, repeated 100 times) #> Summary of sample sizes: 128, 129, 129, 129, 129, 130, ... #> Resampling results across tuning parameters: #> #> lambda logLoss AUC prAUC Accuracy Kappa F1 #> 1e-04 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 1e-03 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 1e-02 0.7112738 0.6123883 0.5726478 0.5854514 0.17092470 0.5731635 #> 1e-01 0.6819806 0.6210744 0.5793961 0.5918756 0.18369829 0.5779616 #> 1e+00 0.6803749 0.6278273 0.5827655 0.5896356 0.17756961 0.5408139 #> 1e+01 0.6909820 0.6271894 0.5814202 0.5218000 0.02920942 0.1875293 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision #> 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 #> 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 #> 0.5789667 0.5921250 0.5797769 0.5977182 0.5797769 #> 0.5805917 0.6032353 0.5880165 0.6026963 0.5880165 #> 0.5057833 0.6715588 0.6005149 0.5887829 0.6005149 #> 0.0607250 0.9678676 0.7265246 0.5171323 0.7265246 #> Recall Detection_Rate Balanced_Accuracy #> 0.5789667 0.2839655 0.5854870 #> 0.5789667 0.2839655 0.5854870 #> 0.5789667 0.2839636 0.5855458 #> 0.5805917 0.2847195 0.5919135 #> 0.5057833 0.2478291 0.5886711 #> 0.0607250 0.0292613 0.5142963 #> #> Tuning parameter 'alpha' was held constant at a value of 0 #> AUC was used to select the optimal model using the largest value. #> The final values used for the model were alpha = 0 and lambda = 1. 
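# An aside, not in the original vignette: caret's train object also stores
# the winning hyperparameters directly, so you can grab them without reading
# the printout above (here they would be alpha = 0 and lambda = 1):
results$trained_model$bestTune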
results$trained_model$results #> alpha lambda logLoss AUC prAUC Accuracy Kappa F1 #> 1 0 1e-04 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 2 0 1e-03 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 3 0 1e-02 0.7112738 0.6123883 0.5726478 0.5854514 0.17092470 0.5731635 #> 4 0 1e-01 0.6819806 0.6210744 0.5793961 0.5918756 0.18369829 0.5779616 #> 5 0 1e+00 0.6803749 0.6278273 0.5827655 0.5896356 0.17756961 0.5408139 #> 6 0 1e+01 0.6909820 0.6271894 0.5814202 0.5218000 0.02920942 0.1875293 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision Recall #> 1 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 0.5789667 #> 2 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 0.5789667 #> 3 0.5789667 0.5921250 0.5797769 0.5977182 0.5797769 0.5789667 #> 4 0.5805917 0.6032353 0.5880165 0.6026963 0.5880165 0.5805917 #> 5 0.5057833 0.6715588 0.6005149 0.5887829 0.6005149 0.5057833 #> 6 0.0607250 0.9678676 0.7265246 0.5171323 0.7265246 0.0607250 #> Detection_Rate Balanced_Accuracy logLossSD AUCSD prAUCSD AccuracySD #> 1 0.2839655 0.5854870 0.085315967 0.09115229 0.07296554 0.07628572 #> 2 0.2839655 0.5854870 0.085315967 0.09115229 0.07296554 0.07628572 #> 3 0.2839636 0.5855458 0.085276565 0.09122242 0.07301412 0.07637123 #> 4 0.2847195 0.5919135 0.048120032 0.09025695 0.07329214 0.07747312 #> 5 0.2478291 0.5886711 0.012189172 0.09111917 0.07505095 0.07771171 #> 6 0.0292613 0.5142963 0.001610008 0.09266875 0.07640896 0.03421597 #> KappaSD F1SD SensitivitySD SpecificitySD Pos_Pred_ValueSD #> 1 0.15265728 0.09353786 0.13091452 0.11988406 0.08316345 #> 2 0.15265728 0.09353786 0.13091452 0.11988406 0.08316345 #> 3 0.15281903 0.09350099 0.13073501 0.12002481 0.08329024 #> 4 0.15485134 0.09308733 0.12870031 0.12037225 0.08554483 #> 5 0.15563046 0.10525917 0.13381009 0.11639614 0.09957685 #> 6 0.06527242 0.09664720 0.08010494 0.06371495 0.31899811 #> Neg_Pred_ValueSD PrecisionSD RecallSD Detection_RateSD Balanced_AccuracySD #> 1 0.08384956 0.08316345 0.13091452 0.06394409 0.07640308 #> 2 0.08384956 0.08316345 0.13091452 0.06394409 0.07640308 #> 3 0.08385838 0.08329024 0.13073501 0.06384692 0.07648207 #> 4 0.08427362 0.08554483 0.12870031 0.06272897 0.07748791 #> 5 0.07597766 0.09957685 0.13381009 0.06453637 0.07773039 #> 6 0.02292294 0.31899811 0.08010494 0.03803159 0.03184136"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"customizing-hyperparameters","dir":"Articles","previous_headings":"","what":"Customizing hyperparameters","title":"Hyperparameter tuning","text":"example, want change lambda values provide better range test cross-validation step. don’t want use defaults provide named list new values. example: Now let’s run L2 logistic regression new lambda values: time, cover larger different range lambda settings cross-validation. know lambda value best one? answer , need run ML pipeline multiple data splits look mean cross-validation performance lambda across modeling experiments. describe run pipeline multiple data splits vignette(\"parallel\"). train model new lambda range defined . run 3 times different seed, result different splits data training testing sets. can use plot_hp_performance see lambda gives us largest mean AUC value across modeling experiments. can see, get mean maxima 0.03 best lambda value dataset run 3 data splits. fact seeing maxima middle range edges, shows providing large enough range exhaust lambda search build model. recommend user use plot make sure best hyperparameter edges provided list. 
better understanding global maxima, better run data splits using seeds. picked 3 seeds keep runtime vignette, real-world data recommend using many seeds.","code":"new_hp <- list( alpha = 1, lambda = c(0.00001, 0.0001, 0.001, 0.01, 0.015, 0.02, 0.025, 0.03, 0.04, 0.05, 0.06, 0.1) ) new_hp #> $alpha #> [1] 1 #> #> $lambda #> [1] 0.00001 0.00010 0.00100 0.01000 0.01500 0.02000 0.02500 0.03000 0.04000 #> [10] 0.05000 0.06000 0.10000 results <- run_ml(dat, \"glmnet\", outcome_colname = \"dx\", cv_times = 100, hyperparameters = new_hp, seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. results$trained_model #> glmnet #> #> 161 samples #> 10 predictor #> 2 classes: 'cancer', 'normal' #> #> No pre-processing #> Resampling: Cross-Validated (5 fold, repeated 100 times) #> Summary of sample sizes: 128, 129, 129, 129, 129, 130, ... #> Resampling results across tuning parameters: #> #> lambda logLoss AUC prAUC Accuracy Kappa F1 #> 0.00001 0.7215038 0.6112253 0.5720005 0.5842184 0.1684871 0.5726974 #> 0.00010 0.7215038 0.6112253 0.5720005 0.5842184 0.1684871 0.5726974 #> 0.00100 0.7209099 0.6112771 0.5719601 0.5845329 0.1691285 0.5730414 #> 0.01000 0.6984432 0.6156112 0.5758977 0.5830960 0.1665062 0.5759265 #> 0.01500 0.6913332 0.6169396 0.5770496 0.5839720 0.1683912 0.5786347 #> 0.02000 0.6870103 0.6177313 0.5779563 0.5833645 0.1673234 0.5796891 #> 0.02500 0.6846387 0.6169757 0.5769305 0.5831907 0.1669901 0.5792840 #> 0.03000 0.6834369 0.6154763 0.5754118 0.5821394 0.1649081 0.5786336 #> 0.04000 0.6833322 0.6124776 0.5724802 0.5786224 0.1578750 0.5735757 #> 0.05000 0.6850454 0.6069059 0.5668928 0.5732197 0.1468699 0.5624480 #> 0.06000 0.6880861 0.5974311 0.5596714 0.5620224 0.1240112 0.5375824 #> 0.10000 0.6944846 0.5123565 0.3034983 0.5120114 0.0110144 0.3852423 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision #> 0.5798500 0.5888162 0.5780748 0.5971698 0.5780748 #> 0.5798500 0.5888162 0.5780748 0.5971698 0.5780748 #> 0.5801167 0.5891912 0.5784544 0.5974307 0.5784544 #> 0.5883667 0.5783456 0.5755460 0.5977390 0.5755460 #> 0.5929750 0.5756471 0.5763123 0.5987220 0.5763123 #> 0.5967167 0.5708824 0.5748385 0.5990649 0.5748385 #> 0.5970250 0.5702721 0.5743474 0.5997928 0.5743474 #> 0.5964500 0.5687721 0.5734044 0.5982451 0.5734044 #> 0.5904500 0.5677353 0.5699817 0.5943308 0.5699817 #> 0.5734833 0.5736176 0.5668523 0.5864448 0.5668523 #> 0.5360333 0.5881250 0.5595918 0.5722851 0.5595918 #> 0.1145917 0.8963456 0.5255752 0.5132665 0.5255752 #> Recall Detection_Rate Balanced_Accuracy #> 0.5798500 0.28441068 0.5843331 #> 0.5798500 0.28441068 0.5843331 #> 0.5801167 0.28453770 0.5846539 #> 0.5883667 0.28860521 0.5833561 #> 0.5929750 0.29084305 0.5843110 #> 0.5967167 0.29264681 0.5837995 #> 0.5970250 0.29278708 0.5836485 #> 0.5964500 0.29248583 0.5826110 #> 0.5904500 0.28951992 0.5790926 #> 0.5734833 0.28119862 0.5735505 #> 0.5360333 0.26270204 0.5620792 #> 0.1145917 0.05585777 0.5054686 #> #> Tuning parameter 'alpha' was held constant at a value of 1 #> AUC was used to select the optimal model using the largest value. #> The final values used for the model were alpha = 1 and lambda = 0.02. results <- lapply(seq(100, 102), function(seed) { run_ml(dat, \"glmnet\", seed = seed, hyperparameters = new_hp) }) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... 
#> Training complete. models <- lapply(results, function(x) x$trained_model) hp_metrics <- combine_hp_performance(models) plot_hp_performance(hp_metrics$dat, lambda, AUC)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"hyperparameter-options","dir":"Articles","previous_headings":"","what":"Hyperparameter options","title":"Hyperparameter tuning","text":"You can see which default hyperparameters would be used for your dataset with get_hyperparams_list(). Here are some examples with the built-in datasets we provide: These are the hyperparameters that get tuned for each of the modeling methods. The output is similar for the others, so we won’t go into the details.","code":"get_hyperparams_list(otu_mini_bin, \"glmnet\") #> $lambda #> [1] 1e-04 1e-03 1e-02 1e-01 1e+00 1e+01 #> #> $alpha #> [1] 0 get_hyperparams_list(otu_mini_bin, \"rf\") #> $mtry #> [1] 2 3 6 get_hyperparams_list(otu_small, \"rf\") #> $mtry #> [1] 4 8 16"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"regression","dir":"Articles","previous_headings":"Hyperparameter options","what":"Regression","title":"Hyperparameter tuning","text":"As mentioned above, glmnet uses the alpha parameter and the lambda hyperparameter. alpha of 0 specifies L2 regularization (ridge). alpha of 1 specifies L1 regularization (lasso). Anything in between is elastic net. You can also tune alpha like any other hyperparameter. Please refer to the original glmnet documentation for more information: https://web.stanford.edu/~hastie/glmnet/glmnet_alpha.html The default hyperparameters chosen by run_ml() are fixed for glmnet.","code":"#> $lambda #> [1] 1e-04 1e-03 1e-02 1e-01 1e+00 1e+01 #> #> $alpha #> [1] 0"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"random-forest","dir":"Articles","previous_headings":"Hyperparameter options","what":"Random forest","title":"Hyperparameter tuning","text":"When we run rf or parRF, we are using the randomForest package implementation and tuning the mtry hyperparameter. This is the number of features that are randomly collected to be sampled at each tree node. This number needs to be less than the number of features in the dataset. Please refer to the original documentation for more information: https://cran.r-project.org/web/packages/randomForest/randomForest.pdf By default, we take the square root of the number of features in the dataset and provide the range [sqrt_features / 2, sqrt_features, sqrt_features * 2]. For example, if the number of features is 1000: Similar to the glmnet method, you can provide your own mtry range.","code":"#> $mtry #> [1] 16 32 64"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"decision-tree","dir":"Articles","previous_headings":"Hyperparameter options","what":"Decision tree","title":"Hyperparameter tuning","text":"When we run rpart2, we are running the rpart package implementation of decision trees and tuning the maxdepth hyperparameter. This is the maximum depth of any node of the final tree. Please refer to the original documentation for more information on maxdepth: https://cran.r-project.org/web/packages/rpart/rpart.pdf By default, we provide a range that is less than the number of features in the dataset. For example with 1000 features: Or with 10 features:","code":"#> $maxdepth #> [1] 1 2 4 8 16 30 #> $maxdepth #> [1] 1 2 4 8"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"svm-with-radial-basis-kernel","dir":"Articles","previous_headings":"Hyperparameter options","what":"SVM with radial basis kernel","title":"Hyperparameter tuning","text":"When we run the svmRadial method, we are tuning the C and sigma hyperparameters. sigma defines how far the influence of a single training example reaches, and C behaves as a regularization parameter. 
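As with the glmnet examples earlier in this vignette, you can pass a custom grid for these two hyperparameters straight to run_ml() (a sketch; the values are illustrative, not tuned recommendations):
run_ml(otu_mini_bin, \"svmRadial\",
  hyperparameters = list(
    C = c(0.01, 0.1, 1, 10),
    sigma = c(1e-4, 1e-3, 1e-2)
  )
)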
Please refer great sklearn resource information hyperparameters: https://scikit-learn.org/stable/auto_examples/svm/plot_rbf_parameters.html default, provide 2 separate range values two hyperparameters.","code":"#> $C #> [1] 1e-03 1e-02 1e-01 1e+00 1e+01 1e+02 #> #> $sigma #> [1] 1e-06 1e-05 1e-04 1e-03 1e-02 1e-01"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"xgboost","dir":"Articles","previous_headings":"Hyperparameter options","what":"XGBoost","title":"Hyperparameter tuning","text":"run xgbTree method, tuning nrounds, gamma, eta max_depth, colsample_bytree, min_child_weight subsample hyperparameters. can read hyperparameters : https://xgboost.readthedocs.io/en/latest/parameter.html default, set nrounds, gamma, colsample_bytree min_child_weight fixed values provide range values eta, max_depth subsample. can changed optimized user supplying custom named list hyperparameters run_ml().","code":"#> $nrounds #> [1] 100 #> #> $gamma #> [1] 0 #> #> $eta #> [1] 0.001 0.010 0.100 1.000 #> #> $max_depth #> [1] 1 2 4 8 16 30 #> #> $colsample_bytree #> [1] 0.8 #> #> $min_child_weight #> [1] 1 #> #> $subsample #> [1] 0.4 0.5 0.6 0.7"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"other-ml-methods","dir":"Articles","previous_headings":"","what":"Other ML methods","title":"Hyperparameter tuning","text":"ML methods tested set default hyperparameters , theory may able use methods supported caret run_ml(). Take look available models caret (see list tag). need give run_ml() custom hyperparameters just like examples :","code":"run_ml(otu_mini_bin, \"regLogistic\", hyperparameters = list( cost = 10^seq(-4, 1, 1), epsilon = c(0.01), loss = c(\"L2_primal\") ) )"},{"path":"http://www.schlosslab.org/mikropml/dev/authors.html","id":null,"dir":"","previous_headings":"","what":"Authors","title":"Authors and Citation","text":"Begüm Topçuoğlu. Author. Zena Lapp. Author. Kelly Sovacool. Author, maintainer. Evan Snitkin. Author. Jenna Wiens. Author. Patrick Schloss. Author. Nick Lesniak. Contributor. Courtney Armour. Contributor. Sarah Lucas. Contributor.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/authors.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"Authors and Citation","text":"Topçuoğlu et al., (2021). mikropml: User-Friendly R Package Supervised Machine Learning Pipelines. Journal Open Source Software, 6(61), 3073, https://doi.org/10.21105/joss.03073","code":"@Article{, title = {{mikropml}: User-Friendly R Package for Supervised Machine Learning Pipelines}, author = {Begüm D. Topçuoğlu and Zena Lapp and Kelly L. Sovacool and Evan Snitkin and Jenna Wiens and Patrick D. Schloss}, journal = {Journal of Open Source Software}, year = {2021}, month = {May}, volume = {6}, number = {61}, pages = {3073}, doi = {10.21105/joss.03073}, url = {https://joss.theoj.org/papers/10.21105/joss.03073}, }"},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"mikropml-","dir":"","previous_headings":"","what":"User-Friendly R Package for Supervised Machine Learning Pipelines","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"meek-ROPE em el User-Friendly R Package Supervised Machine Learning Pipelines interface build machine learning models classification regression problems. mikropml implements ML pipeline described Topçuoğlu et al. (2020) reasonable default options data preprocessing, hyperparameter tuning, cross-validation, testing, model evaluation, interpretation steps. 
See website information, documentation, examples.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"installation","dir":"","previous_headings":"","what":"Installation","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"can install latest release CRAN: development version GitHub: install terminal using conda mamba:","code":"install.packages('mikropml') # install.packages(\"devtools\") devtools::install_github(\"SchlossLab/mikropml\") mamba install -c conda-forge r-mikropml"},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"dependencies","dir":"","previous_headings":"Installation","what":"Dependencies","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Imports: caret, dplyr, e1071, glmnet, kernlab, MLmetrics, randomForest, rlang, rpart, stats, utils, xgboost Suggests: doFuture, foreach, future, future.apply, ggplot2, knitr, progress, progressr, purrr, rmarkdown, testthat, tidyr","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"usage","dir":"","previous_headings":"","what":"Usage","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Check introductory vignette quick start tutorial. -depth discussion, read vignettes /take look reference documentation. can watch Riffomonas Project series video tutorials covering mikropml skills related machine learning. also provide example Snakemake workflow running mikropml HPC.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"help--contributing","dir":"","previous_headings":"","what":"Help & Contributing","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"come across bug, open issue include minimal reproducible example. questions, create new post Discussions. ’d like contribute, see guidelines .","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"code-of-conduct","dir":"","previous_headings":"","what":"Code of Conduct","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Please note mikropml project released Contributor Code Conduct. contributing project, agree abide terms.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"license","dir":"","previous_headings":"","what":"License","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml package licensed MIT license. Text images included repository, including mikropml logo, licensed CC 4.0 license.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"cite mikropml publications, use: Topçuoğlu BD, Lapp Z, Sovacool KL, Snitkin E, Wiens J, Schloss PD (2021). “mikropml: User-Friendly R Package Supervised Machine Learning Pipelines.” Journal Open Source Software, 6(61), 3073. doi:10.21105/joss.03073, https://joss.theoj.org/papers/10.21105/joss.03073. BibTeX entry LaTeX users :","code":"@Article{, title = {{mikropml}: User-Friendly R Package for Supervised Machine Learning Pipelines}, author = {Begüm D. Topçuoğlu and Zena Lapp and Kelly L. Sovacool and Evan Snitkin and Jenna Wiens and Patrick D. 
Schloss}, journal = {Journal of Open Source Software}, year = {2021}, month = {May}, volume = {6}, number = {61}, pages = {3073}, doi = {10.21105/joss.03073}, url = {https://joss.theoj.org/papers/10.21105/joss.03073}, }"},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"why-the-name","dir":"","previous_headings":"","what":"Why the name?","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"word “mikrop” (pronounced “meek-ROPE”) Turkish “microbe”. package originally implemented machine learning pipeline microbiome-based classification problems (see Topçuoğlu et al. 2020). realized methods applicable many fields , stuck name like !","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/pull_request_template.html","id":"issues","dir":"","previous_headings":"","what":"Issues","title":"NA","text":"Resolves # .","code":""},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/pull_request_template.html","id":"checklist","dir":"","previous_headings":"","what":"Checklist","title":"NA","text":"(Strikethrough points applicable.) Write unit tests new functionality bug fixes. roxygen comments vignettes Update NEWS.md includes user-facing changes. check workflow succeeds recent commit. always required PR can merged.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":null,"dir":"Reference","previous_headings":"","what":"Get performance metrics for test data — calc_perf_metrics","title":"Get performance metrics for test data — calc_perf_metrics","text":"Get performance metrics test data","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get performance metrics for test data — calc_perf_metrics","text":"","code":"calc_perf_metrics( test_data, trained_model, outcome_colname, perf_metric_function, class_probs )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get performance metrics for test data — calc_perf_metrics","text":"test_data Held test data: dataframe outcome features. trained_model Trained model caret::train(). outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. 
class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get performance metrics for test data — calc_perf_metrics","text":"Dataframe performance metrics.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get performance metrics for test data — calc_perf_metrics","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get performance metrics for test data — calc_perf_metrics","text":"","code":"if (FALSE) { results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) calc_perf_metrics(results$test_data, results$trained_model, \"dx\", multiClassSummary, class_probs = TRUE ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Combine hyperparameter performance metrics multiple train/test splits generated , instance, looping R using snakemake workflow high-performance computer.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"","code":"combine_hp_performance(trained_model_lst)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"trained_model_lst List trained models.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Named list: dat: Dataframe performance metric group hyperparameters params: Hyperparameters tuned. 
Metric: Performance metric used.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"","code":"if (FALSE) { results <- lapply(seq(100, 102), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed, cv_times = 2, kfold = 2) }) models <- lapply(results, function(x) x$trained_model) combine_hp_performance(models) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":null,"dir":"Reference","previous_headings":"","what":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"wrapper permute_p_value().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"","code":"compare_models(merged_data, metric, group_name, nperm = 10000)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"merged_data concatenated performance data run_ml metric metric compare, must numeric group_name column group variables compare nperm number permutations, default=10000","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"table p-values pairs group variable","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"Courtney R Armour, armourc@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. 
— compare_models","text":"","code":"df <- dplyr::tibble( model = c(\"rf\", \"rf\", \"glmnet\", \"glmnet\", \"svmRadial\", \"svmRadial\"), AUC = c(.2, 0.3, 0.8, 0.9, 0.85, 0.95) ) set.seed(123) compare_models(df, \"AUC\", \"model\", nperm = 10) #> group1 group2 p_value #> 1 glmnet svmRadial 0.7272727 #> 2 rf glmnet 0.2727273 #> 3 rf svmRadial 0.5454545"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":null,"dir":"Reference","previous_headings":"","what":"Define cross-validation scheme and training parameters — define_cv","title":"Define cross-validation scheme and training parameters — define_cv","text":"Define cross-validation scheme training parameters","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Define cross-validation scheme and training parameters — define_cv","text":"","code":"define_cv( train_data, outcome_colname, hyperparams_list, perf_metric_function, class_probs, kfold = 5, cv_times = 100, groups = NULL, group_partitions = NULL )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Define cross-validation scheme and training parameters — define_cv","text":"train_data Dataframe training model. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). hyperparams_list Named list lists hyperparameters. perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes). kfold Fold number k-fold cross-validation (default: 5). cv_times Number cross-validation partitions create (default: 100). groups Vector groups keep together splitting data train test sets. number groups training set larger kfold, groups also kept together cross-validation. Length matches number rows dataset (default: NULL). group_partitions Specify assign groups training testing partitions (default: NULL). groups specifies samples belong group \"\" belong group \"B\", setting group_partitions = list(train = c(\"\", \"B\"), test = c(\"B\")) result samples group \"\" placed training set, samples \"B\" also training set, remaining samples \"B\" testing set. partition sizes close training_frac possible. 
number groups training set larger kfold, groups also kept together cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Define cross-validation scheme and training parameters — define_cv","text":"Caret object trainControl controls cross-validation","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Define cross-validation scheme and training parameters — define_cv","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Define cross-validation scheme and training parameters — define_cv","text":"","code":"training_inds <- get_partition_indices(otu_small %>% dplyr::pull(\"dx\"), training_frac = 0.8, groups = NULL ) train_data <- otu_small[training_inds, ] test_data <- otu_small[-training_inds, ] cv <- define_cv(train_data, outcome_colname = \"dx\", hyperparams_list = get_hyperparams_list(otu_small, \"glmnet\"), perf_metric_function = caret::multiClassSummary, class_probs = TRUE, kfold = 5 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":null,"dir":"Reference","previous_headings":"","what":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Get preprocessed dataframe continuous variables","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"","code":"get_caret_processed_df(features, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"features Dataframe features machine learning method Methods preprocess data, described caret::preProcess() (default: c(\"center\",\"scale\"), use NULL normalization).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Named list: processed: Dataframe processed features. 
removed: Names of any features removed during preprocessing.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"","code":"get_caret_processed_df(mikropml::otu_small[, 2:ncol(otu_small)], c(\"center\", \"scale\")) #> $processed #> Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 #> 1 -0.4198476322 -0.218855527 -0.174296240 -0.59073845 -0.048774220 #> 2 -0.1045750483 1.754032339 -0.718419364 0.03805034 1.537072974 #> [example output truncated: 200 rows of centered and scaled values for the 60 OTU features]
#> 82 0.0087252956 1.145023115 -0.39042654 -0.23938131 -0.11045955 -0.270293402 #> 83 1.9885457251 -0.315476575 -0.33452063 -0.60396543 -0.40654915 -0.257926390 #> 84 0.2747881201 -0.721651045 -0.32943828 2.66571759 2.25221464 -0.314814643 #> 85 -0.8833677043 -0.229056475 -0.46157952 1.49673357 0.05269186 0.911992891 #> 86 -0.9068438359 -0.626588935 -0.45141481 1.59511342 1.12224003 -0.322234850 #> 87 -0.2495121518 5.517880175 -0.38534419 -0.61553953 -0.40352783 -0.309867838 #> 88 -0.2886390377 0.721564625 -0.08040286 -0.22780721 -0.21922716 -0.275240206 #> 89 -0.5234003535 0.133907945 -0.30910886 -0.19308491 -0.41561312 -0.173830713 #> 90 0.0008999184 0.082055885 -0.41075596 0.40876825 -0.42165577 -0.302447632 #> 91 -0.7659870464 -0.393254665 -0.44633245 0.45506465 -0.33705874 -0.302447632 #> 92 -0.7738124236 0.954898895 0.85983289 -0.30882590 -0.41561312 1.837045346 #> 93 0.1417567078 -0.721651045 6.81127108 -0.62132658 -0.14369410 -0.302447632 #> 94 -0.6016541254 -0.341402605 -0.46157952 1.02798256 -0.10743823 -0.149096690 #> 95 0.7286599972 0.254896085 -0.07532051 -0.53452084 -0.30080287 -0.319761448 #> 96 -0.9068438359 0.194402015 -0.46157952 -0.34354820 -0.42467709 -0.322234850 #> 97 1.9181173304 -0.704367025 -0.27353237 -0.62132658 0.98325919 -0.248032781 #> 98 -0.4529719588 0.142549955 0.31093850 0.24094381 -0.35820799 -0.277713609 #> 99 0.7286599972 -0.713009035 -0.07023815 -0.59239133 0.11311831 -0.280187011 #> 100 -0.5234003535 -0.704367025 -0.46666187 -0.60396543 0.06175583 3.006964628 #> 101 0.0243760500 0.514156385 -0.28369708 -0.61553953 3.79913175 -0.322234850 #> 102 5.4160609352 -0.609304915 -0.43108539 -0.61553953 5.83248179 -0.275240206 #> 103 1.1512303656 -0.609304915 -0.44125010 -0.54609494 0.83823571 -0.205984942 #> 104 -0.9068438359 -0.574736875 -0.28369708 0.40298120 -0.42467709 -0.319761448 #> 105 0.1495820850 0.254896085 -0.11597935 -0.59817838 -0.22526980 -0.282660413 #> 106 -0.7972885552 -0.056216275 -0.21254410 -0.59239133 0.43942114 -0.312341241 #> 107 -0.2260360202 -0.229056475 -0.34468534 0.61710203 -0.30080287 0.169972205 #> 108 -0.5468764851 1.335147335 -0.45141481 1.46779833 -0.12254484 -0.309867838 #> 109 1.1121034796 -0.678440995 -0.39550890 -0.59817838 -0.32195212 -0.312341241 #> 110 0.7599615060 -0.479674765 -0.45141481 0.94696386 -0.05305442 -0.309867838 #> 111 -0.6407810114 -0.289550545 1.47479789 0.06154527 -0.40957048 0.058669102 #> 112 -0.5468764851 -0.721651045 -0.25320295 -0.40141870 -0.07722500 -0.314814643 #> 113 -0.8990184587 -0.721651045 -0.24303824 -0.61553953 -0.42165577 -0.314814643 #> 114 -0.6486063886 -0.082142305 -0.30910886 -0.20465901 -0.22829113 -0.319761448 #> 115 -0.4842734675 0.073413875 -0.41583832 -0.62132658 0.20980063 -0.277713609 #> 116 0.1261059534 0.583292465 -0.43108539 -0.60396543 -0.40352783 -0.025426576 #> 117 0.0243760500 -0.514242805 -0.45141481 -0.62132658 -0.39748519 0.763588754 #> 118 -0.0304015904 -0.721651045 -0.27861472 -0.15257556 0.01945732 -0.319761448 #> 119 -0.7033840289 2.389472555 -0.45141481 -0.62132658 -0.38237857 -0.317288045 #> 120 1.8320381813 -0.652514965 -0.20237939 -0.61553953 0.10103302 -0.309867838 #> 121 -0.5547018623 -0.548810845 -0.47174423 -0.44771509 0.03154261 -0.272766804 #> 122 -0.1869091342 -0.254982505 3.03508101 -0.53452084 -0.31893080 -0.250506184 #> 123 -0.2260360202 -0.462390745 -0.46157952 2.06965148 -0.42467709 6.323797094 #> 124 0.1652328394 1.170949145 -0.44125010 -0.60975248 -0.42467709 3.514012096 #> 125 -0.9068438359 -0.531526825 -0.33960299 4.84743529 -0.38842122 
-0.299974229 #> 126 -0.6329556342 3.564785915 -0.24812059 -0.52294674 -0.39748519 -0.245559379 #> 127 -0.9068438359 -0.367328635 -0.40059125 0.37983300 -0.36727196 -0.314814643 #> 128 1.6677052603 0.185760005 3.05032807 0.39140710 0.28533370 -0.314814643 #> 129 -0.0851792307 -0.522884815 -0.16680290 5.25252877 0.85032100 -0.280187011 #> 130 -0.6251302570 -0.695725015 0.10764429 -0.60975248 -0.27663229 -0.322234850 #> 131 -0.9068438359 -0.419180695 -0.42600303 -0.51715969 -0.02586252 -0.317288045 #> 132 1.4407693217 -0.592020895 -0.44125010 -0.55188199 1.61169427 -0.285133816 #> 133 0.4547717955 -0.488316775 0.03649131 -0.17572376 -0.21318451 -0.248032781 #> 134 -0.2808136605 0.427736285 0.24486788 -0.45928919 -0.29476022 -0.314814643 #> 135 -0.0695284764 -0.678440995 -0.33452063 -0.59239133 0.91679010 -0.317288045 #> 136 0.3217403832 -0.280908535 -0.39550890 -0.54030789 0.65997768 0.031461677 #> 137 0.4547717955 0.868478795 -0.44125010 0.07890642 -0.36727196 -0.136729678 #> 138 -0.5312257307 0.453662315 -0.47174423 -0.44192804 -0.40957048 1.082657649 #> 139 0.0400268043 -0.133994365 -0.41583832 1.91918820 0.06477715 -0.322234850 #> 140 -0.9068438359 2.795647025 -0.44125010 -0.55188199 -0.41561312 -0.317288045 #> 141 -0.4920988447 -0.583378885 -0.47174423 2.26062412 0.17656609 -0.116942460 #> 142 -0.7894631780 -0.237698485 -0.21762646 -0.42456689 -0.42467709 -0.099628644 #> 143 -0.5155749763 0.038845835 -0.24812059 0.23515676 -0.42467709 -0.015532966 #> 144 0.1417567078 0.142549955 0.09239722 1.66455801 -0.27663229 0.320849745 #> 145 -0.8833677043 -0.315476575 -0.15155584 -0.61553953 -0.40050651 5.809329418 #> 146 -0.3668928096 -0.609304915 -0.44633245 0.68075958 -0.42467709 -0.292554022 #> 147 -0.8990184587 -0.713009035 -0.44125010 -0.60975248 -0.31893080 -0.314814643 #> 148 -0.1869091342 -0.073500295 -0.41075596 1.02798256 0.45452776 -0.223298758 #> 149 -0.1008299851 -0.626588935 -0.39042654 -0.11785327 -0.39748519 -0.299974229 #> 150 0.0322014271 2.372188535 -0.39042654 0.42612940 -0.40352783 -0.322234850 #> 151 -0.2495121518 1.231443215 -0.46157952 -0.60396543 -0.42467709 -0.304921034 #> 152 0.3921687780 1.352431355 -0.20746175 -0.46507624 -0.41259180 -0.280187011 #> 153 -0.8442408184 0.548724425 -0.43108539 0.60552794 -0.34008006 -0.307394436 #> 154 1.2060080059 -0.617946925 -0.36501477 -0.62132658 0.43639982 -0.245559379 #> 155 0.9086436726 -0.531526825 -0.22779117 -0.56924313 0.30648295 0.706700501 #> 156 -0.4686227131 -0.522884815 -0.42092068 -0.61553953 -0.42165577 -0.314814643 #> 157 -0.8911930815 -0.687083005 0.98180942 -0.62132658 -0.33705874 -0.210931747 #> 158 0.9947228218 -0.220414465 0.74293871 0.07311937 -0.41561312 -0.295027425 #> 159 -0.6564317657 -0.125352355 -0.40567361 2.60784710 -0.41561312 -0.277713609 #> 160 -0.6877332745 -0.713009035 -0.34468534 -0.59239133 0.64184975 -0.139203081 #> 161 0.4078195324 -0.669798985 -0.47174423 3.04187582 -0.41561312 -0.314814643 #> 162 -0.8990184587 -0.721651045 -0.14647348 -0.62132658 -0.37633593 -0.285133816 #> 163 1.1121034796 -0.721651045 -0.35993241 0.74441713 -0.29173890 -0.290080620 #> 164 0.9712466902 -0.168562405 -0.32435592 -0.59817838 0.79895852 -0.272766804 #> 165 0.2356612341 -0.566094865 -0.33960299 -0.49979854 5.67839434 -0.297500827 #> 166 -0.3434166781 1.369715375 -0.46157952 -0.60975248 -0.41561312 4.716085608 #> 167 -0.5468764851 0.419094275 -0.46666187 3.73053472 -0.40654915 -0.307394436 #> 168 -0.5155749763 -0.721651045 -0.40567361 -0.59817838 -0.34008006 -0.287607218 #> 169 3.5849226723 -0.704367025 0.95639764 
-0.53452084 0.37597337 -0.304921034 #> 170 -0.9068438359 -0.687083005 -0.39042654 -0.62132658 -0.41863444 -0.312341241 #> 171 -0.5390511079 0.617860505 -0.07532051 -0.37827050 -0.37633593 -0.314814643 #> 172 -0.4529719588 -0.626588935 -0.46157952 -0.26252951 2.99243865 -0.077368024 #> 173 -0.8207646868 -0.687083005 -0.40567361 -0.62132658 0.99836580 0.019094666 #> 174 0.4312956639 1.741321805 -0.39042654 -0.51137264 -0.15275807 -0.290080620 #> 175 -0.0695284764 0.107981915 -0.45649716 -0.50558559 -0.29778154 -0.295027425 #> 176 0.4547717955 4.307998775 1.64759798 -0.58660428 -0.37029328 -0.304921034 #> 177 -0.1321314939 -0.220414465 -0.24812059 0.70969483 -0.38842122 -0.319761448 #> 178 -0.9068438359 -0.410538685 -0.45649716 -0.62132658 -0.42165577 -0.299974229 #> 179 0.2982642517 -0.574736875 -0.16680290 -0.06576982 0.68414826 -0.319761448 #> 180 -0.5077495991 0.280822115 -0.44633245 -0.33776115 -0.37029328 0.244174274 #> 181 -0.6877332745 -0.522884815 0.01616189 0.77335237 -0.08931029 -0.302447632 #> 182 -0.5938287482 0.436378295 -0.46157952 1.04534371 -0.20109922 -0.196091333 #> 183 -0.4451465816 -0.367328635 -0.22779117 -0.19308491 -0.30684551 0.273855101 #> 184 -0.7738124236 0.151191965 0.03649131 -0.51137264 -0.36727196 1.483348819 #> 185 3.0997492864 -0.617946925 -0.42092068 -0.56924313 0.18260873 -0.314814643 #> 186 -0.8677169499 0.393168245 -0.47174423 0.21200856 -0.39144254 -0.069947817 #> 187 -0.9068438359 -0.609304915 -0.46157952 -0.61553953 -0.42165577 -0.309867838 #> 188 2.7710834443 -0.721651045 -0.34468534 -0.60396543 -0.08628897 0.773482363 #> 189 -0.8755423271 -0.047574265 -0.43108539 -0.43614099 -0.41863444 0.187286021 #> 190 -0.3355913009 -0.246340495 -0.40567361 1.58353932 -0.11650220 -0.302447632 #> 191 -0.6094795026 -0.479674765 -0.42092068 -0.45350214 -0.41259180 -0.245559379 #> 192 0.1104551991 -0.721651045 0.80900933 -0.59239133 -0.40957048 -0.307394436 #> 193 -0.5077495991 0.609218495 0.12289135 -0.56924313 -0.14671542 -0.297500827 #> 194 3.4518912600 -0.687083005 -0.40567361 1.55460407 0.06175583 -0.260399793 #> 195 -0.4842734675 0.315390155 2.58783373 -0.52873379 0.17958741 -0.282660413 #> 196 2.4658937338 -0.721651045 1.35282136 -0.16414966 -0.42467709 -0.322234850 #> 197 -0.0382269676 -0.669798985 -0.39550890 -0.58660428 -0.40352783 -0.161463701 #> 198 -0.9068438359 -0.721651045 0.15338549 -0.62132658 -0.41561312 -0.297500827 #> 199 -0.8598915727 0.107981915 0.40750326 -0.60396543 -0.27058964 -0.299974229 #> 200 -0.0304015904 0.004277795 -0.14647348 -0.55766903 -0.23131245 -0.317288045 #> Otu00029 Otu00030 Otu00031 Otu00032 Otu00033 #> 1 0.695821495 0.39193166 0.2730666130 1.850227727 -0.352365855 #> 2 -0.252260766 0.44720466 -0.1402887916 -0.493938512 0.152851091 #> 3 0.066720182 -0.59377025 -0.4629076438 -0.357825634 -0.288065517 #> 4 -0.473775313 -0.71352842 1.5937875395 -0.501500339 -0.435037719 #> 5 -0.571241714 0.33665866 -0.5637260352 -0.577118604 0.952012441 #> 6 -0.216818439 -0.52928508 -0.2411071829 0.337862411 0.079364989 #> 7 3.079318020 0.19847615 -0.3520074134 -0.395634767 -0.618752972 #> 8 0.031277854 -0.17001055 -0.3822529308 -0.357825634 -0.444223482 #> 9 -0.730732188 -0.11473754 0.3335576478 -0.070476224 -0.168650602 #> 10 0.137604837 -0.76880143 -0.4830713221 -0.516623992 0.740739900 #> 11 -0.305424257 0.16162748 -0.5939715526 -0.577118604 -0.600381447 #> 12 -0.730732188 -0.54770941 -0.5233986787 0.148816747 0.465167021 #> 13 -0.269981930 -0.62140675 -0.2209435046 0.103445788 -0.453409245 #> 14 -0.526938804 0.54853851 0.1420027042 
0.572279035 -0.646310260 #> 15 -0.535799386 -0.33582956 -0.2411071829 0.436166157 -0.655496023 #> 16 -0.340866585 -0.38189040 -0.4729894830 -0.569556778 1.071427356 #> 17 -0.181376111 1.20260239 -0.4427439656 1.071359589 -0.582009922 #> 18 0.279374147 0.65908451 0.0109387955 -0.100723530 0.106922277 #> 19 0.270513565 0.72356969 -0.0797977567 0.466413463 -0.232950941 #> 20 1.431249791 0.85254003 0.4646215565 -0.546871298 0.446795495 #> 21 -0.730732188 -0.76880143 -0.5939715526 -0.569556778 1.787916843 #> 22 2.937548710 -0.28055656 -0.5536441961 -0.456129379 -0.159464840 #> 23 -0.004164473 0.04186930 -0.3217618960 0.141254920 -0.673867548 #> 24 0.146465418 1.07363205 -0.5838897135 0.504222596 0.116108040 #> 25 -0.730732188 0.79726702 -0.1806161481 -0.577118604 -0.021678400 #> 26 -0.730732188 -0.70431626 -0.5637260352 -0.138532663 4.424230724 #> 27 -0.686429278 -0.76880143 -0.5838897135 -0.531747645 1.705244979 #> 28 0.562912767 -0.76880143 -0.5939715526 -0.577118604 -0.490152295 #> 29 0.279374147 -0.52928508 -0.1402887916 -0.357825634 1.098984644 #> 30 -0.721871606 7.25499635 -0.5637260352 0.020265695 -0.692239074 #> 31 -0.128212620 1.34078490 1.6643604135 -0.569556778 -0.012492637 #> 32 1.378086300 -0.06867671 -0.5838897135 2.530792119 -0.627938735 #> 33 0.075580763 -0.43716340 -0.5939715526 -0.577118604 0.428423970 #> 34 -0.243400184 -0.76880143 -0.5838897135 -0.577118604 -0.223765178 #> 35 0.199628910 0.76041836 0.3033121304 -0.441005726 -0.407480431 #> 36 2.388192634 3.49643206 -0.5939715526 -0.509062165 -0.407480431 #> 37 -0.695289860 -0.67667975 -0.4830713221 0.821819312 -0.701424836 #> 38 -0.721871606 -0.03182804 -0.5939715526 -0.577118604 -0.012492637 #> 39 -0.234539602 2.08697046 0.5251125913 -0.350263807 -0.591195684 #> 40 -0.323145421 0.04186930 -0.1402887916 0.065636655 -0.609567210 #> 41 1.316062227 -0.34504173 -0.5233986787 -0.448567553 0.290637530 #> 42 -0.367448331 -0.06867671 -0.2713527003 -0.123409010 -0.692239074 #> 43 -0.721871606 -0.76880143 -0.5738078743 -0.577118604 -0.609567210 #> 44 0.748984986 0.39193166 1.3316597220 -0.478814859 -0.379923143 #> 45 1.989466449 -0.75037709 -0.4931531613 -0.289769194 2.936137175 #> 46 -0.057327965 -0.76880143 -0.4729894830 -0.569556778 2.467663279 #> 47 -0.730732188 -0.73195276 -0.3217618960 -0.297331021 -0.141093314 #> 48 3.495765369 -0.20685922 -0.5435623569 -0.524185818 -0.058421450 #> 49 -0.385169494 -0.72274059 -0.2108616655 -0.229274582 0.492724309 #> 50 -0.624405205 -0.63983108 -0.4124984482 0.489098943 0.042621939 #> 51 -0.588962878 2.18830430 -0.4830713221 -0.561994951 3.110666665 #> 52 -0.137073202 0.12477881 0.6662583392 1.056235936 -0.232950941 #> 53 -0.730732188 -0.76880143 -0.5939715526 -0.561994951 -0.692239074 #> 54 -0.305424257 -0.75037709 -0.5738078743 -0.577118604 -0.398294669 #> 55 -0.535799386 -0.63983108 -0.4225802873 0.050513002 -0.591195684 #> 56 -0.730732188 0.92623737 -0.5536441961 -0.478814859 0.446795495 #> 57 -0.367448331 2.16066779 -0.2511890220 5.563084576 -0.600381447 #> 58 -0.721871606 -0.75037709 -0.5838897135 -0.546871298 0.042621939 #> 59 -0.721871606 -0.23449572 2.7128716834 -0.577118604 1.622573115 #> 60 0.376840547 0.43799250 -0.4024166090 -0.115847183 -0.122721789 #> 61 0.111023091 0.09714230 4.3360477841 -0.055352571 -0.582009922 #> 62 -0.562381132 0.13399097 -0.2209435046 -0.577118604 -0.021678400 #> 63 1.750230739 0.22611265 -0.5133168395 -0.463691206 -0.554452634 #> 64 -0.314284839 0.36429516 2.6422988095 0.254682319 0.079364989 #> 65 -0.721871606 -0.75958926 -0.3923347699 -0.577118604 
-0.085978738 #> 66 0.252792401 -0.54770941 -0.5939715526 -0.569556778 -0.333994330 #> 67 -0.358587749 -0.54770941 -0.4024166090 -0.554433125 -0.471780770 #> 68 -0.677568696 0.15241531 0.6965038566 0.012703869 -0.315622805 #> 69 0.642658004 -0.19764705 -0.0596340785 0.156378574 -0.517709583 #> 70 0.155326000 0.24453698 2.8741811096 -0.577118604 -0.499338058 #> 71 0.935057206 -0.48322424 -0.5939715526 0.942808538 -0.389108906 #> 72 -0.491496477 0.21690048 0.1117571868 -0.577118604 -0.343180093 #> 73 -0.730732188 -0.02261587 -0.4729894830 0.186625880 -0.673867548 #> 74 0.048999018 -0.46479990 -0.4225802873 -0.191465449 -0.425851957 #> 75 -0.145933784 1.34078490 -0.3217618960 0.436166157 -0.232950941 #> 76 -0.730732188 1.31314840 4.7393213494 0.141254920 -0.453409245 #> 77 -0.730732188 -0.05025237 4.3864569797 1.404079959 0.079364989 #> 78 -0.730732188 -0.76880143 -0.1302069524 -0.289769194 2.081861248 #> 79 -0.243400184 0.63144801 -0.3520074134 -0.168779969 -0.673867548 #> 80 6.614690190 0.31823432 -0.5939715526 -0.577118604 -0.389108906 #> 81 -0.394030076 -0.05025237 -0.5334805178 -0.342701980 -0.664681786 #> 82 1.759091320 -0.76880143 -0.5939715526 -0.577118604 0.162036853 #> 83 2.007187613 -0.28055656 -0.5334805178 -0.350263807 0.520281597 #> 84 -0.730732188 0.35508299 -0.5939715526 -0.478814859 -0.205393653 #> 85 -0.633265787 -0.08710104 -0.1201251133 -0.577118604 -0.710610599 #> 86 -0.101630874 0.08793014 -0.3419255742 -0.577118604 -0.269693992 #> 87 1.218595826 0.21690048 0.2125755781 1.094045069 -0.131907552 #> 88 -0.721871606 -0.40031473 -0.1906979872 -0.577118604 0.125293803 #> 89 -0.207957857 -0.45558774 -0.5939715526 -0.509062165 -0.425851957 #> 90 -0.730732188 -0.30819306 0.8376496045 -0.577118604 0.667253799 #> 91 -0.730732188 -0.76880143 1.7450151266 -0.093161703 -0.067607213 #> 92 -0.544659968 -0.17001055 -0.1503706307 -0.078038050 -0.582009922 #> 93 0.881893714 -0.76880143 -0.3520074134 -0.577118604 -0.398294669 #> 94 -0.137073202 -0.73195276 -0.1402887916 -0.577118604 -0.554452634 #> 95 -0.624405205 -0.29898089 -0.2612708612 0.383233371 -0.333994330 #> 96 -0.730732188 -0.76880143 -0.5939715526 2.349308281 -0.591195684 #> 97 0.243931819 -0.59377025 -0.5939715526 -0.577118604 2.807536497 #> 98 -0.482635895 0.42878033 1.4223962743 2.530792119 -0.159464840 #> 99 -0.730732188 -0.69510409 -0.5939715526 -0.561994951 -0.600381447 #> 100 -0.730732188 0.40114383 0.1420027042 -0.569556778 -0.600381447 #> 101 -0.704150442 0.91702520 -0.5637260352 -0.561994951 -0.389108906 #> 102 -0.491496477 2.38175981 -0.5939715526 -0.577118604 -0.683053311 #> 103 -0.243400184 -0.30819306 -0.4326621264 -0.569556778 -0.370737381 #> 104 1.316062227 -0.76880143 -0.5939715526 -0.009981611 -0.343180093 #> 105 0.040138436 0.56696284 -0.1201251133 0.156378574 -0.232950941 #> 106 -0.668708114 -0.23449572 -0.4528258047 0.020265695 -0.710610599 #> 107 0.261652983 1.19339022 0.4444578782 -0.138532663 -0.600381447 #> 108 -0.730732188 0.74199402 -0.5838897135 0.564717209 -0.582009922 #> 109 -0.704150442 -0.55692158 -0.4931531613 -0.561994951 -0.040049925 #> 110 -0.261121348 1.46975524 0.3133939695 -0.183903622 -0.288065517 #> 111 -0.367448331 -0.22528355 3.8823650230 -0.055352571 -0.572824159 #> 112 -0.721871606 -0.75958926 -0.5939715526 -0.531747645 -0.710610599 #> 113 -0.128212620 0.83411569 3.5496643316 0.678144607 -0.315622805 #> 114 -0.650986951 -0.10552538 -0.4830713221 -0.546871298 -0.664681786 #> 115 -0.500357059 0.99072254 3.0052450183 0.715953740 0.033436176 #> 116 -0.243400184 -0.56613375 -0.3419255742 
-0.259521888 -0.361551618 #> 117 0.917336042 -0.76880143 -0.4427439656 -0.365387460 2.100232773 #> 118 0.616076258 0.43799250 0.7569948914 3.377716696 -0.563638396 #> 119 -0.225679020 -0.76880143 1.0090408698 2.939130754 0.703996850 #> 120 2.512240780 0.53932634 -0.5838897135 -0.546871298 -0.131907552 #> 121 -0.394030076 0.44720466 -0.4830713221 -0.531747645 -0.683053311 #> 122 0.111023091 -0.41873907 1.2409231698 0.950370364 -0.333994330 #> 123 -0.721871606 -0.75037709 -0.2915163786 -0.448567553 -0.683053311 #> 124 0.261652983 0.06029364 -0.3520074134 -0.161218143 -0.609567210 #> 125 -0.721871606 0.94466170 -0.3822529308 0.247120493 -0.012492637 #> 126 0.137604837 -0.75958926 -0.4225802873 -0.569556778 -0.058421450 #> 127 -0.713011024 -0.56613375 0.1117571868 -0.554433125 -0.232950941 #> 128 0.075580763 -0.51086074 -0.5233986787 -0.168779969 3.955756829 #> 129 -0.500357059 -0.56613375 -0.4427439656 -0.463691206 -0.471780770 #> 130 -0.642126369 -0.05946454 -0.5939715526 -0.456129379 -0.333994330 #> 131 2.972991038 -0.66746759 -0.5233986787 0.050513002 1.493972438 #> 132 -0.730732188 0.35508299 -0.4024166090 -0.040228917 0.823411764 #> 133 2.078072268 -0.70431626 0.0109387955 -0.463691206 -0.040049925 #> 134 -0.473775313 -0.54770941 -0.1402887916 0.315176932 -0.517709583 #> 135 2.645149508 -0.53849724 -0.5838897135 -0.561994951 1.319442948 #> 136 0.350258802 -0.45558774 1.1804321350 1.313338040 -0.049235688 #> 137 -0.269981930 -0.20685922 3.0254086966 1.857789554 -0.591195684 #> 138 0.093301927 -0.54770941 -0.4528258047 2.583724905 -0.683053311 #> 139 0.607215676 -0.66746759 -0.2209435046 7.158629984 -0.517709583 #> 140 -0.730732188 0.83411569 2.2087797267 -0.577118604 3.312753443 #> 141 -0.110491456 1.50660391 0.2125755781 0.368109718 -0.600381447 #> 142 -0.305424257 -0.75037709 -0.1705343090 -0.569556778 -0.710610599 #> 143 -0.278842512 -0.06867671 -0.3217618960 0.179064053 -0.683053311 #> 144 -0.571241714 0.50247767 -0.0293885611 2.349308281 -0.582009922 #> 145 1.271759317 -0.29898089 -0.4427439656 -0.365387460 -0.710610599 #> 146 -0.110491456 0.47484117 0.0008569563 0.549593556 0.051807701 #> 147 -0.730732188 -0.76880143 -0.5838897135 -0.577118604 -0.673867548 #> 148 -0.367448331 0.19847615 1.9164063918 0.632773648 -0.710610599 #> 149 -0.642126369 -0.74116493 -0.4326621264 -0.569556778 -0.701424836 #> 150 -0.730732188 4.27025412 -0.5939715526 -0.577118604 -0.701424836 #> 151 -0.402890658 -0.38189040 -0.4629076438 -0.577118604 0.805040239 #> 152 0.740124404 -0.36346606 -0.2511890220 0.050513002 -0.609567210 #> 153 -0.580102296 -0.65825542 0.0109387955 1.162101508 1.025498543 #> 154 -0.704150442 -0.74116493 -0.2209435046 2.825703355 -0.655496023 #> 155 0.004696108 0.90781303 -0.5133168395 -0.448567553 0.005878888 #> 156 0.846451387 -0.07788888 -0.2612708612 -0.561994951 -0.664681786 #> 157 -0.713011024 -0.76880143 -0.5838897135 -0.561994951 -0.710610599 #> 158 -0.367448331 -0.76880143 -0.0797977567 0.156378574 -0.637124498 #> 159 -0.163654947 -0.40031473 2.0676339788 -0.569556778 -0.646310260 #> 160 0.004696108 -0.48322424 -0.5738078743 -0.539309471 -0.370737381 #> 161 1.094547680 -0.48322424 -0.3923347699 -0.433443899 -0.591195684 #> 162 -0.730732188 0.41956816 -0.5939715526 -0.577118604 1.319442948 #> 163 0.181907746 -0.61219458 -0.5637260352 -0.569556778 -0.444223482 #> 164 -0.721871606 -0.25292005 -0.4830713221 -0.501500339 0.465167021 #> 165 -0.030746219 0.01423280 -0.5838897135 -0.554433125 -0.223765178 #> 166 -0.713011024 -0.76880143 0.6662583392 -0.577118604 -0.710610599 #> 167 
-0.713011024 4.09522294 1.1602684568 -0.577118604 2.302319551 #> 168 2.388192634 -0.70431626 -0.5939715526 -0.577118604 1.007127017 #> 169 0.270513565 -0.76880143 -0.5738078743 -0.539309471 0.593767698 #> 170 -0.730732188 -0.76880143 0.1016753477 -0.569556778 -0.710610599 #> 171 -0.571241714 -0.61219458 -0.1100432742 0.534469902 -0.600381447 #> 172 -0.287703094 -0.48322424 -0.4225802873 -0.524185818 -0.407480431 #> 173 1.422389209 -0.61219458 -0.5738078743 -0.577118604 2.752421921 #> 174 0.456585784 0.14320314 -0.1705343090 -0.546871298 1.806288368 #> 175 -0.296563675 -0.39110257 -0.0697159176 -0.493938512 -0.627938735 #> 176 0.562912767 1.38684574 -0.5939715526 0.587402689 -0.012492637 #> 177 0.952778369 -0.48322424 -0.1604524698 -0.244398235 -0.683053311 #> 178 -0.721871606 -0.75037709 -0.5838897135 -0.214150929 1.705244979 #> 179 0.217350073 -0.52928508 -0.5435623569 -0.577118604 5.278506651 #> 180 -0.261121348 0.88017653 -0.1604524698 0.557155382 -0.673867548 #> 181 -0.039606801 -0.54770941 -0.1604524698 0.111007614 -0.627938735 #> 182 -0.083909710 -0.64904325 -0.2612708612 -0.577118604 -0.306437042 #> 183 -0.199097275 1.20260239 -0.2108616655 -0.123409010 -0.554452634 #> 184 -0.668708114 -0.30819306 -0.3116800568 1.600687450 -0.572824159 #> 185 0.297095310 2.55679099 -0.5939715526 -0.554433125 -0.627938735 #> 186 -0.713011024 -0.62140675 -0.0293885611 -0.380511113 -0.701424836 #> 187 -0.721871606 -0.75958926 -0.4225802873 -0.085599877 -0.609567210 #> 188 2.990712202 -0.41873907 -0.5939715526 -0.554433125 1.392929049 #> 189 -0.730732188 -0.56613375 -0.4326621264 -0.380511113 -0.710610599 #> 190 0.102162509 -0.25292005 0.0815116694 -0.304892848 -0.609567210 #> 191 -0.668708114 -0.25292005 -0.5133168395 -0.554433125 -0.343180093 #> 192 -0.730732188 -0.32661739 0.6158491435 -0.577118604 -0.205393653 #> 193 0.057859600 -0.63061892 -0.3822529308 0.413480677 -0.278879754 #> 194 -0.509217641 0.14320314 -0.4528258047 -0.577118604 0.162036853 #> 195 -0.668708114 0.11556664 -0.3721710916 0.526908076 -0.692239074 #> 196 -0.730732188 -0.76880143 -0.5838897135 -0.577118604 0.906083628 #> 197 -0.154794365 -0.47401207 2.1079613354 -0.093161703 -0.572824159 #> 198 -0.721871606 -0.67667975 -0.5939715526 -0.577118604 -0.627938735 #> 199 -0.713011024 -0.74116493 -0.4225802873 -0.161218143 -0.232950941 #> 200 -0.730732188 -0.47401207 -0.3217618960 0.511784423 -0.278879754 #> Otu00034 Otu00035 Otu00036 Otu00037 Otu00038 #> 1 -0.1482914828 -0.28857253 -0.337797955 -0.28026882 -0.269009738 #> 2 -0.1507314908 1.32771762 -0.337797955 -0.40104181 -0.269009738 #> 3 -0.1360914431 -0.09645535 -0.309626997 5.43380328 -0.251964926 #> 4 -0.1507314908 -0.24263146 -0.337797955 -0.28781713 -0.254805728 #> 5 0.0469091527 -0.38463111 -0.332163763 -0.55200805 -0.269009738 #> 6 -0.1507314908 -0.31363129 -0.337797955 -0.02362622 -0.269009738 #> 7 -0.1507314908 -0.38880757 3.099058896 -0.19723739 -0.269009738 #> 8 -0.1507314908 -0.25098438 -0.337797955 -0.13685089 -0.266168936 #> 9 -0.0775312524 -0.38880757 -0.337797955 0.32359613 -0.084357613 #> 10 -0.0604511968 -0.30110191 0.811577123 -0.51426649 -0.254805728 #> 11 -0.1507314908 1.31518824 -0.337797955 0.52740055 -0.269009738 #> 12 0.6935112580 -0.25098438 -0.337797955 -0.54445974 -0.266168936 #> 13 -0.1458514749 5.21182571 -0.337797955 -0.55200805 -0.257646530 #> 14 -0.1507314908 -0.31780775 -0.337797955 -0.43878337 -0.269009738 #> 15 -0.1507314908 -0.20921978 0.158010902 -0.40859012 -0.269009738 #> 16 -0.0824112683 -0.36792527 -0.337797955 1.16145875 -0.269009738 #> 17 
-0.1507314908 -0.38880757 0.963700295 -0.29536544 0.049160077 #> 18 -0.1507314908 -0.17580810 -0.337797955 0.01411534 -0.200830492 #> 19 -0.1458514749 0.28360254 -0.337797955 -0.43123506 -0.269009738 #> 20 -0.1482914828 -0.36792527 -0.337797955 1.87100007 -0.269009738 #> 21 0.3616701775 -0.38880757 -0.337797955 7.21520489 -0.251964926 #> 22 -0.1214513954 -0.38463111 -0.337797955 0.18772652 -0.232079313 #> 23 -0.1507314908 0.35460236 -0.337797955 -0.25007557 -0.269009738 #> 24 -0.1507314908 -0.38880757 -0.337797955 0.06695353 -0.260487332 #> 25 -0.1360914431 -0.23010208 1.746852922 -0.54445974 0.270742627 #> 26 0.9887522192 -0.38463111 -0.337797955 -0.51426649 -0.260487332 #> 27 13.8524741014 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 28 -0.1507314908 -0.38880757 -0.337797955 -0.55200805 -0.101402425 #> 29 -0.1507314908 0.05807368 -0.337797955 -0.31801038 -0.266168936 #> 30 -0.1458514749 -0.38880757 -0.337797955 -0.46897662 -0.260487332 #> 31 -0.1141313716 1.80383409 -0.320895380 0.42927250 0.301991448 #> 32 -0.1482914828 -0.38045465 -0.332163763 -0.33310700 -0.269009738 #> 33 -0.1507314908 -0.30945483 0.929895146 1.22184525 -0.269009738 #> 34 0.3836302490 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 35 -0.1434114669 -0.38880757 -0.337797955 0.05940521 -0.266168936 #> 36 0.0542291766 -0.38880757 -0.337797955 -0.55200805 -0.254805728 #> 37 -0.1068113478 -0.38880757 -0.337797955 -0.52936311 2.219532746 #> 38 0.0883892878 -0.38463111 -0.337797955 -0.55200805 0.196881777 #> 39 -0.1507314908 -0.31780775 -0.337797955 -0.20478570 -0.226397709 #> 40 -0.1507314908 -0.27604314 -0.337797955 -0.14439921 0.114498521 #> 41 -0.1385314510 -0.38463111 -0.332163763 0.98029927 -0.269009738 #> 42 -0.0848512763 -0.30945483 -0.072990952 -0.01607790 -0.146855255 #> 43 -0.0360511174 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 44 -0.1434114669 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 45 -0.1019313319 -0.38880757 -0.337797955 -0.46142831 -0.266168936 #> 46 -0.1409714590 -0.38880757 3.262450451 0.53494886 -0.266168936 #> 47 -0.0214110697 -0.38880757 -0.337797955 0.82933303 -0.269009738 #> 48 -0.1312114272 -0.35121943 -0.337797955 2.98060192 -0.266168936 #> 49 -0.1287714193 -0.38880757 2.969472490 -0.52936311 -0.192308086 #> 50 -0.0946113080 -0.38880757 -0.337797955 -0.49162155 -0.269009738 #> 51 -0.1458514749 -0.18833748 -0.337797955 -0.44633168 -0.135492048 #> 52 -0.1458514749 3.57047681 -0.337797955 -0.54445974 0.392897110 #> 53 0.0493491607 -0.38880757 -0.337797955 1.64455071 -0.229238511 #> 54 0.1249894069 -0.38880757 -0.337797955 -0.54445974 -0.149696057 #> 55 -0.1482914828 -0.19251394 -0.337797955 -0.41613843 -0.269009738 #> 56 -0.0311711015 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 57 -0.1507314908 -0.07139659 -0.337797955 -0.43123506 -0.254805728 #> 58 -0.0287310935 -0.37210173 -0.326529572 -0.54445974 -0.269009738 #> 59 -0.1092513557 -0.38880757 -0.337797955 -0.48407324 0.017911256 #> 60 -0.1507314908 -0.11733765 -0.337797955 -0.41613843 -0.269009738 #> 61 -0.1409714590 -0.38880757 -0.337797955 -0.32555869 0.071886493 #> 62 -0.1287714193 -0.28439607 -0.005380653 0.23301639 1.310476131 #> 63 -0.0458111492 -0.38880757 -0.332163763 -0.04627115 -0.007655961 #> 64 -0.1507314908 0.63442520 -0.281456039 0.48965899 -0.226397709 #> 65 -0.1507314908 -0.38880757 -0.337797955 -0.55200805 -0.220716105 #> 66 -0.1409714590 1.92912790 -0.337797955 -0.55200805 -0.090039217 #> 67 -0.1482914828 -0.32198421 -0.337797955 -0.09910934 -0.269009738 #> 68 -0.1507314908 0.04972076 
2.293369503 -0.53691142 -0.269009738 #> 69 -0.1507314908 -0.05469075 -0.337797955 -0.42368675 -0.266168936 #> 70 -0.0653312127 0.55507246 -0.337797955 -0.18968908 1.685461984 #> 71 -0.1068113478 -0.38880757 -0.332163763 0.24056470 -0.260487332 #> 72 -0.1482914828 0.44230803 -0.337797955 -0.40104181 -0.226397709 #> 73 -0.1482914828 -0.38880757 -0.337797955 -0.29536544 -0.217875303 #> 74 -0.1482914828 -0.38880757 -0.337797955 -0.25762388 -0.269009738 #> 75 -0.1458514749 -0.34704297 0.011521922 -0.48407324 -0.257646530 #> 76 -0.0897312922 -0.17998456 -0.337797955 -0.55200805 -0.232079313 #> 77 -0.1409714590 -0.25933730 -0.326529572 -0.46897662 0.032115266 #> 78 -0.1482914828 0.07895598 -0.337797955 -0.55200805 -0.246283323 #> 79 -0.1507314908 -0.29692545 -0.337797955 -0.50671818 -0.269009738 #> 80 0.1591495182 -0.38463111 -0.337797955 -0.55200805 -0.269009738 #> 81 -0.1507314908 -0.01292614 0.203084435 -0.53691142 -0.266168936 #> 82 -0.0287310935 -0.36374881 7.662754058 -0.55200805 -0.269009738 #> 83 -0.1190113875 -0.38045465 -0.337797955 2.54279983 -0.195148888 #> 84 -0.1434114669 0.12489705 -0.337797955 2.80699074 -0.266168936 #> 85 0.9009119332 1.03536539 -0.337797955 -0.52936311 -0.269009738 #> 86 -0.1507314908 -0.19669040 -0.337797955 -0.55200805 -0.269009738 #> 87 -0.1507314908 0.47989617 -0.337797955 0.46701406 -0.240601719 #> 88 -0.1141313716 0.53419016 2.304637886 -0.34820363 -0.192308086 #> 89 -0.1507314908 -0.38880757 -0.337797955 -0.29536544 0.398578714 #> 90 -0.0214110697 -0.38880757 -0.337797955 -0.07646440 -0.266168936 #> 91 -0.1434114669 -0.38880757 -0.332163763 -0.46897662 -0.246283323 #> 92 -0.1482914828 1.78712825 -0.337797955 -0.55200805 -0.169581671 #> 93 -0.1507314908 -0.38880757 -0.337797955 -0.39349350 -0.240601719 #> 94 -0.1482914828 -0.32616067 1.284849214 -0.29536544 -0.158218463 #> 95 -0.0824112683 -0.35121943 -0.337797955 -0.25007557 -0.269009738 #> 96 -0.0580111889 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 97 0.3909502729 -0.38880757 -0.337797955 -0.52936311 -0.266168936 #> 98 -0.1482914828 1.37365868 -0.337797955 -0.03117453 -0.266168936 #> 99 0.0005490018 -0.35539589 -0.337797955 -0.55200805 -0.269009738 #> 100 0.1786695817 -0.38463111 -0.337797955 -0.55200805 8.500545795 #> 101 -0.0946113080 -0.37210173 -0.247650890 -0.01607790 -0.266168936 #> 102 -0.1434114669 -0.38880757 -0.332163763 -0.42368675 -0.263328134 #> 103 -0.1019313319 -0.38880757 -0.337797955 0.73875328 -0.237760917 #> 104 -0.1482914828 0.41724927 1.160897000 -0.55200805 -0.251964926 #> 105 -0.1263314113 -0.38880757 -0.337797955 -0.52936311 -0.118447236 #> 106 0.5324707336 -0.38463111 0.496062396 -0.55200805 -0.269009738 #> 107 -0.1507314908 1.03954186 -0.337797955 0.11224340 -0.172422473 #> 108 -0.1385314510 -0.38880757 -0.337797955 -0.34820363 -0.095720821 #> 109 -0.1214513954 -0.38045465 -0.337797955 0.74630160 -0.269009738 #> 110 -0.1458514749 -0.38463111 -0.337797955 -0.47652493 -0.266168936 #> 111 -0.1507314908 -0.38463111 -0.337797955 -0.03872284 -0.269009738 #> 112 -0.0165310538 -0.17163164 -0.337797955 0.17262989 -0.263328134 #> 113 0.0200690653 -0.38880757 -0.337797955 -0.45387999 -0.200830492 #> 114 -0.1507314908 -0.32198421 -0.337797955 -0.42368675 -0.075835207 #> 115 -0.1507314908 -0.09645535 -0.337797955 -0.38594519 0.120180125 #> 116 0.1323094308 -0.35539589 -0.332163763 0.55759380 -0.206512096 #> 117 -0.1507314908 -0.30945483 1.476411727 -0.49162155 -0.260487332 #> 118 -0.1434114669 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 119 -0.1507314908 -0.38880757 
-0.337797955 0.57269042 -0.269009738 #> 120 -0.1409714590 -0.38045465 -0.332163763 0.88971952 -0.269009738 #> 121 -0.1507314908 -0.38880757 -0.332163763 -0.48407324 -0.269009738 #> 122 -0.1507314908 3.68741770 -0.337797955 -0.55200805 -0.030382377 #> 123 -0.1458514749 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 124 -0.1019313319 -0.10063181 -0.337797955 0.85952627 -0.215034501 #> 125 -0.1287714193 -0.29692545 -0.337797955 0.49720730 -0.217875303 #> 126 -0.1092513557 0.78477778 -0.337797955 -0.10665765 0.228130598 #> 127 -0.1434114669 -0.38880757 -0.337797955 0.17262989 0.151428946 #> 128 -0.1360914431 -0.38045465 -0.332163763 -0.37839688 0.012229652 #> 129 -0.1507314908 -0.38880757 -0.337797955 -0.53691142 0.179836966 #> 130 -0.1482914828 0.61354290 -0.337797955 -0.35575194 1.557625898 #> 131 -0.1409714590 -0.38880757 -0.337797955 1.72003383 -0.234920115 #> 132 -0.1190113875 -0.34286651 -0.332163763 0.27830626 -0.269009738 #> 133 -0.1385314510 0.68454273 6.113351379 0.40662756 -0.146855255 #> 134 -0.1507314908 -0.38880757 -0.337797955 -0.43878337 -0.269009738 #> 135 -0.1336514351 -0.37210173 -0.332163763 -0.53691142 -0.260487332 #> 136 -0.1507314908 0.21260271 -0.337797955 -0.35575194 -0.254805728 #> 137 -0.1360914431 -0.38880757 -0.281456039 -0.55200805 -0.269009738 #> 138 -0.1409714590 1.77042241 -0.332163763 0.11224340 -0.124128840 #> 139 -0.1507314908 0.57595476 0.056595454 -0.52181480 -0.254805728 #> 140 -0.0458111492 0.54254308 -0.337797955 -0.55200805 -0.237760917 #> 141 -0.1507314908 0.12489705 -0.337797955 -0.40104181 -0.192308086 #> 142 -0.1482914828 0.18336749 -0.315261189 -0.55200805 -0.183785680 #> 143 -0.1238914034 -0.36374881 -0.337797955 -0.45387999 -0.243442521 #> 144 -0.1482914828 -0.38880757 1.955318009 -0.24252726 0.441190742 #> 145 -0.1312114272 -0.35957235 -0.337797955 -0.55200805 -0.260487332 #> 146 -0.1507314908 -0.10898473 -0.270187656 -0.55200805 0.784927775 #> 147 -0.0580111889 -0.38880757 -0.332163763 -0.55200805 -0.269009738 #> 148 -0.1507314908 -0.36792527 1.521485259 -0.51426649 -0.001974357 #> 149 0.2201497168 -0.33869005 -0.337797955 0.32359613 -0.269009738 #> 150 -0.0677712207 -0.38880757 -0.337797955 0.21791976 0.509369989 #> 151 -0.1507314908 -0.23845500 -0.337797955 -0.49162155 0.023592860 #> 152 -0.1482914828 -0.38463111 -0.337797955 0.77649484 -0.263328134 #> 153 -0.1482914828 -0.38880757 -0.292724422 -0.06136778 0.162792154 #> 154 -0.1385314510 -0.36374881 -0.337797955 -0.55200805 4.418313433 #> 155 0.2665098677 -0.32198421 -0.337797955 1.95403150 0.091772106 #> 156 -0.1482914828 -0.16745518 -0.337797955 0.35378938 -0.254805728 #> 157 0.4812305668 -0.37210173 -0.332163763 -0.55200805 -0.223556907 #> 158 -0.0824112683 2.04606879 -0.337797955 -0.51426649 0.052000879 #> 159 -0.1263314113 -0.10063181 -0.337797955 -0.53691142 -0.263328134 #> 160 -0.1482914828 -0.38880757 0.203084435 4.20342844 -0.260487332 #> 161 -0.1507314908 -0.38880757 0.974968678 0.32359613 -0.269009738 #> 162 -0.0994913239 -0.38880757 -0.337797955 -0.55200805 -0.263328134 #> 163 -0.1507314908 -0.18416102 -0.337797955 0.35378938 -0.269009738 #> 164 0.1079093513 -0.37627819 -0.163138017 0.90481615 -0.266168936 #> 165 -0.1287714193 -0.37627819 -0.337797955 -0.50671818 -0.237760917 #> 166 0.0347091130 0.50495493 -0.337797955 -0.54445974 5.517703777 #> 167 -0.1507314908 0.04136784 -0.337797955 -0.55200805 -0.269009738 #> 168 -0.1482914828 -0.38463111 -0.337797955 -0.55200805 -0.266168936 #> 169 -0.1482914828 -0.38880757 2.535639740 -0.55200805 -0.240601719 #> 170 0.5861509084 
-0.38463111 -0.337797955 -0.55200805 0.941171881 #> 171 -0.1507314908 -0.29274899 -0.337797955 -0.50671818 -0.260487332 #> 172 -0.0799712604 -0.22592562 0.005887730 -0.35575194 -0.144014453 #> 173 0.0127490415 -0.33869005 -0.264553465 -0.12175427 -0.257646530 #> 174 -0.1507314908 -0.38463111 -0.208211549 -0.15949583 -0.001974357 #> 175 -0.1458514749 0.56342538 -0.298358614 0.11224340 -0.260487332 #> 176 -0.1312114272 1.81218701 -0.337797955 0.33869275 -0.266168936 #> 177 -0.1507314908 -0.31363129 1.279215022 -0.28781713 -0.269009738 #> 178 -0.0775312524 -0.38463111 -0.337797955 -0.55200805 -0.215034501 #> 179 0.1298694228 -0.33451359 -0.337797955 2.56544476 -0.269009738 #> 180 0.3445901219 -0.33033713 0.890455805 -0.37084856 0.091772106 #> 181 -0.1507314908 2.17136260 0.777771974 -0.43878337 -0.269009738 #> 182 -0.1507314908 5.69629511 -0.337797955 -0.50671818 -0.115606434 #> 183 -0.0994913239 -0.38045465 -0.337797955 -0.53691142 -0.269009738 #> 184 0.0371491210 -0.20086686 -0.095527718 -0.25762388 -0.223556907 #> 185 -0.1507314908 -0.38880757 2.259564353 0.05940521 -0.234920115 #> 186 -0.1385314510 -0.35957235 -0.089893526 -0.54445974 0.375852298 #> 187 -0.1360914431 -0.38880757 -0.337797955 -0.55200805 -0.246283323 #> 188 -0.1092513557 -0.38880757 -0.337797955 1.79551695 -0.266168936 #> 189 -0.1165713795 -0.36792527 0.417183714 -0.52936311 -0.246283323 #> 190 -0.1507314908 -0.35957235 -0.337797955 -0.34065532 -0.269009738 #> 191 -0.0628912048 -0.29692545 -0.337797955 0.72365666 -0.266168936 #> 192 -0.0189710618 -0.38463111 2.693397103 0.36888600 7.210821722 #> 193 -0.1360914431 -0.38880757 -0.337797955 0.26320964 -0.186626482 #> 194 0.0298290971 -0.38880757 -0.337797955 2.06725618 0.515051592 #> 195 -0.1458514749 -0.38880757 -0.337797955 -0.44633168 -0.269009738 #> 196 -0.1312114272 -0.38880757 -0.337797955 2.57299307 -0.269009738 #> 197 -0.1190113875 -0.34704297 2.225759204 -0.52936311 -0.257646530 #> 198 0.4446304476 -0.38880757 -0.332163763 0.83688134 -0.269009738 #> 199 0.0200690653 -0.38880757 -0.337797955 -0.54445974 0.128702531 #> 200 -0.1092513557 7.49217304 -0.337797955 -0.15194752 -0.269009738 #> Otu00039 Otu00040 Otu00041 Otu00042 Otu00043 #> 1 -0.369691676 -0.20704023 0.122728281 0.690525991 0.719828577 #> 2 0.504524822 -0.32139200 -0.630775883 -0.301679743 -0.243967502 #> 3 -0.439414464 0.35201286 0.855588495 -0.293479696 -0.461086399 #> 4 0.064734927 -0.33409775 -0.620453908 0.641325706 -0.127464679 #> 5 0.252450126 -0.85503359 4.860514738 2.211634782 -0.461086399 #> 6 -0.214156225 0.05978056 0.277557904 -0.301679743 0.545074343 #> 7 -0.385781550 -0.81691633 -0.424336386 -0.301679743 0.126723298 #> 8 -0.278515722 0.30118985 -0.661741808 -0.301679743 -0.381652656 #> 9 -0.133706855 -0.33409775 3.467048133 -0.297579720 -0.455790816 #> 10 -0.412598007 -0.46115527 0.071118407 -0.301679743 -0.461086399 #> 11 0.102277967 0.50448189 -0.661741808 -0.301679743 -0.461086399 #> 12 -0.417961299 -0.63903580 0.081440382 -0.301679743 0.312068697 #> 13 0.080824801 0.37742437 0.205304080 -0.010578061 -0.461086399 #> 14 -0.396508133 -0.55009554 0.298201853 4.581448478 -0.095691182 #> 15 -0.289242305 -0.37221501 1.712312408 3.257140824 -0.026848605 #> 16 -0.439414464 0.75859693 -0.651419833 -0.301679743 0.539778760 #> 17 -0.289242305 -0.33409775 0.659470973 -0.301679743 0.269704035 #> 18 -0.251699265 0.17413233 -0.155965040 -0.277079601 -0.005666274 #> 19 -0.058620775 -0.60091855 0.628505049 -0.256579483 -0.164533759 #> 20 1.362651445 1.52094206 -0.372726512 -0.297579720 -0.461086399 #> 21 
-0.439414464 4.04938672 -0.661741808 -0.301679743 -0.455790816 #> 22 -0.310695471 -0.85503359 -0.661741808 -0.256579483 -0.249263085 #> 23 -0.407234716 0.79671419 -0.021779367 -0.297579720 0.132018880 #> 24 -0.305332179 1.34306153 1.640058584 -0.236079364 -0.365765907 #> 25 -0.439414464 0.25036685 -0.651419833 -0.301679743 -0.461086399 #> 26 -0.434051173 -0.74068182 0.721402822 -0.289379672 0.010220475 #> 27 -0.439414464 -0.85503359 -0.641097858 -0.231979341 -0.424017319 #> 28 -0.230246100 -0.57550704 -0.558522059 -0.002378014 -0.418721736 #> 29 0.466981782 -0.72797607 -0.290150713 -0.301679743 -0.392243822 #> 30 8.093582148 -0.74068182 -0.455302311 -0.268879554 3.399393499 #> 31 -0.310695471 0.14872083 -0.661741808 -0.297579720 -0.455790816 #> 32 -0.439414464 -0.30868625 -0.661741808 -0.281179625 -0.424017319 #> 33 -0.192703060 1.16518100 -0.630775883 -0.301679743 1.180544285 #> 34 0.139821007 0.84753719 0.174338155 -0.289379672 -0.413426153 #> 35 -0.273152431 -0.10539421 -0.475946260 -0.301679743 -0.085100016 #> 36 -0.332148636 1.02541772 -0.661741808 -0.297579720 -0.413426153 #> 37 0.542067861 -0.63903580 -0.269506763 -0.301679743 -0.053326519 #> 38 -0.439414464 -0.85503359 -0.651419833 -0.301679743 -0.461086399 #> 39 -0.417961299 -0.14351147 1.412975137 -0.301679743 -0.249263085 #> 40 0.247086835 -0.29598050 -0.114677141 -0.297579720 0.184974709 #> 41 0.043281762 0.31389561 -0.434658361 -0.301679743 -0.238671919 #> 42 -0.412598007 0.14872083 -0.279828738 -0.260679507 -0.392243822 #> 43 -0.439414464 -0.85503359 -0.641097858 -0.301679743 -0.429312902 #> 44 -0.203429643 -0.85503359 0.287879879 -0.289379672 -0.344583576 #> 45 -0.428687881 -0.82962208 -0.475946260 -0.301679743 -0.339287993 #> 46 0.129094424 0.37742437 -0.506912185 -0.252479459 -0.461086399 #> 47 -0.428687881 -0.80421058 -0.032101342 -0.297579720 0.290886366 #> 48 0.123731133 -0.05457121 -0.166287015 -0.301679743 -0.461086399 #> 49 -0.230246100 -0.62633005 -0.424336386 -0.301679743 0.820444651 #> 50 -0.417961299 0.16142658 0.019508532 -0.297579720 0.449753851 #> 51 0.450891908 -0.43574377 -0.455302311 -0.297579720 -0.461086399 #> 52 0.214907086 -0.74068182 -0.465624286 4.749549449 -0.302218913 #> 53 -0.434051173 0.17413233 -0.620453908 0.973427626 -0.461086399 #> 54 -0.439414464 1.10165224 -0.661741808 -0.297579720 -0.450495233 #> 55 -0.037167609 -0.37221501 0.225948029 -0.301679743 0.412684771 #> 56 -0.439414464 -0.85503359 -0.661741808 1.563831038 -0.461086399 #> 57 -0.235609391 -0.51197828 -0.434658361 1.157928692 -0.386948239 #> 58 -0.369691676 -0.84232784 -0.641097858 -0.293479696 -0.445199650 #> 59 -0.026441027 1.69882259 2.032293628 -0.293479696 -0.445199650 #> 60 -0.305332179 0.13601508 -0.228218864 -0.277079601 -0.010961856 #> 61 -0.412598007 -0.48656678 2.352274849 -0.293479696 -0.445199650 #> 62 -0.026441027 0.19954384 -0.290150713 -0.289379672 -0.439904067 #> 63 0.096914676 2.25787568 -0.073389241 -0.293479696 -0.445199650 #> 64 1.389467902 -0.32139200 -0.651419833 -0.289379672 0.052585138 #> 65 -0.439414464 -0.85503359 -0.424336386 -0.301679743 5.326985656 #> 66 -0.010351152 1.20329825 0.143372231 -0.301679743 -0.461086399 #> 67 -0.407234716 -0.81691633 -0.506912185 3.232540682 2.599760488 #> 68 -0.396508133 -0.55009554 1.784566232 -0.301679743 -0.455790816 #> 69 -0.316058762 0.40283587 -0.661741808 -0.301679743 0.063176303 #> 70 -0.273152431 -0.20704023 -0.661741808 -0.297579720 -0.455790816 #> 71 1.603999558 0.40283587 -0.114677141 -0.301679743 -0.381652656 #> 72 -0.273152431 0.05978056 -0.661741808 
-0.301679743 -0.450495233 #> 73 -0.417961299 0.08519207 1.113637867 -0.301679743 -0.286332165 #> 74 0.048645053 0.26307260 -0.197252939 -0.297579720 0.211452623 #> 75 -0.310695471 -0.24515749 1.268467489 -0.297579720 0.788671154 #> 76 -0.257062557 -0.85503359 -0.114677141 -0.293479696 -0.116873513 #> 77 -0.358965093 -0.56280129 1.361365263 -0.289379672 -0.418721736 #> 78 -0.439414464 -0.43574377 1.144603791 -0.297579720 -0.461086399 #> 79 -0.396508133 -0.39762651 -0.052745291 -0.301679743 0.089654218 #> 80 -0.439414464 -0.81691633 -0.661741808 -0.301679743 -0.461086399 #> 81 -0.423324590 -0.23245173 -0.661741808 -0.301679743 -0.233376336 #> 82 -0.439414464 1.07624073 0.102084331 0.292823692 0.910469559 #> 83 3.760042699 0.92377171 -0.238540839 -0.297579720 -0.365765907 #> 84 2.816103414 3.09645532 -0.661741808 2.219834829 -0.450495233 #> 85 -0.439414464 -0.82962208 0.463353451 -0.100778582 0.274999617 #> 86 -0.439414464 -0.74068182 0.525285300 -0.297579720 -0.074508851 #> 87 0.820959014 -0.72797607 -0.279828738 -0.285279649 -0.402834987 #> 88 -0.273152431 -0.85503359 -0.651419833 -0.289379672 -0.333992410 #> 89 0.359715954 0.94918321 0.504641350 -0.293479696 -0.376357073 #> 90 -0.434051173 1.01271197 -0.661741808 -0.301679743 -0.461086399 #> 91 -0.391144842 -0.47386102 0.287879879 -0.301679743 -0.455790816 #> 92 -0.283879014 -0.84232784 -0.651419833 -0.301679743 -0.392243822 #> 93 -0.181976477 -0.85503359 -0.661741808 -0.297579720 -0.307514496 #> 94 -0.364328385 -0.85503359 -0.661741808 -0.297579720 -0.455790816 #> 95 -0.251699265 -0.34680350 0.463353451 -0.297579720 0.666872748 #> 96 -0.439414464 -0.09268846 0.153694206 -0.301679743 -0.461086399 #> 97 0.912134968 1.03812348 -0.641097858 -0.301679743 -0.439904067 #> 98 0.096914676 -0.51197828 0.834944546 -0.301679743 -0.461086399 #> 99 0.075461510 0.49177614 -0.661741808 -0.301679743 6.846817934 #> 100 -0.439414464 -0.85503359 -0.620453908 -0.289379672 4.109001601 #> 101 -0.294605596 -0.68985881 -0.372726512 -0.293479696 1.127588456 #> 102 -0.160523311 -0.65174155 -0.517234160 -0.244279412 -0.376357073 #> 103 -0.214156225 1.57176506 -0.589487984 -0.174579009 -0.386948239 #> 104 2.767833791 1.35576728 -0.383048487 -0.297579720 -0.450495233 #> 105 -0.407234716 -0.49927253 0.019508532 0.219023266 0.417980354 #> 106 1.051580544 -0.71527031 0.060796432 -0.301679743 2.864539631 #> 107 -0.396508133 -0.05457121 -0.444980336 -0.301679743 0.476231766 #> 108 -0.439414464 2.90586903 -0.661741808 0.145222839 -0.439904067 #> 109 -0.348238510 0.98730047 -0.630775883 -0.297579720 1.350002936 #> 110 0.134457715 -0.58821279 0.029830507 0.719226157 -0.016257439 #> 111 -0.364328385 -0.65174155 -0.661741808 -0.244279412 -0.445199650 #> 112 -0.439414464 4.51949955 0.339489753 -0.301679743 4.956294857 #> 113 -0.198066351 -0.85503359 -0.661741808 1.752432128 -0.455790816 #> 114 -0.171249894 -0.60091855 2.589680270 -0.297579720 -0.286332165 #> 115 -0.348238510 -0.04186545 -0.661741808 -0.301679743 0.089654218 #> 116 -0.181976477 -0.52468403 -0.001135417 -0.108978630 -0.291627748 #> 117 -0.396508133 0.04707481 0.969130219 -0.301679743 -0.461086399 #> 118 -0.439414464 -0.23245173 2.259377075 -0.301679743 -0.461086399 #> 119 0.107641258 -0.85503359 2.042615603 -0.293479696 -0.461086399 #> 120 6.806392213 1.94023187 -0.651419833 -0.297579720 -0.455790816 #> 121 -0.401871424 -0.65174155 1.113637867 0.018122105 -0.206898422 #> 122 0.745872935 -0.71527031 -0.661741808 1.756532152 -0.455790816 #> 123 -0.439414464 -0.85503359 -0.465624286 -0.297579720 -0.455790816 #> 124 
#> [printed output truncated: standardized values for the remaining feature columns through Otu00060 (rows 1-200) omitted for brevity]
#> #> $removed #> character(0) #>"},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":null,"dir":"Reference","previous_headings":"","what":"Get feature importance using the permutation method — get_feature_importance","title":"Get feature importance using the permutation method — get_feature_importance","text":"Calculates feature importance using a trained model and test data. Requires the future.apply package.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get feature importance using the permutation method — get_feature_importance","text":"","code":"get_feature_importance( trained_model, test_data, outcome_colname, perf_metric_function, perf_metric_name, class_probs, method, seed = NA, corr_thresh = 1, groups = NULL, nperms = 100, corr_method = \"spearman\" )"},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get feature importance using the permutation method — get_feature_importance","text":"trained_model Trained model from caret::train(). test_data Held-out test data: a dataframe of the outcome and features. outcome_colname Column name as a string of the outcome variable (default NULL; the first column is chosen automatically). perf_metric_function Function to calculate the performance metric to be used for cross-validation and test performance. Some functions are provided by caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name The column name from the output of the function provided to perf_metric_function that is to be used as the performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". class_probs Whether to use class probabilities (TRUE for categorical outcomes, FALSE for numeric outcomes). method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, or multiclass regression. rf: random forest. rpart2: decision tree. svmRadial: support vector machine. xgbTree: xgboost. seed Random seed (default: NA). Your results will only be reproducible if you set a seed. corr_thresh For feature importance, group the features with correlations above or equal to corr_thresh (range 0 to 1; default: 1). groups Vector of feature names to group together during permutation. Each element should be a string with feature names separated by a pipe character (|). If this is NULL (default), correlated features will be grouped together based on corr_thresh. nperms The number of permutations to perform (default: 100). corr_method Correlation method. The options are the same as those supported by stats::cor: spearman, pearson, kendall (default: spearman).","code":""},
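A small sketch of how the corr_thresh and nperms arguments above can be combined; this call assumes the `results` object prepared in the Examples further below, and the threshold and permutation count are arbitrary illustration values, not recommendations:

# Sketch: group correlated features automatically by lowering corr_thresh,
# and reduce nperms to trade p-value precision for runtime.
feat_imp <- get_feature_importance(results$trained_model, results$test_data, "dx",
  caret::multiClassSummary, "AUC",
  class_probs = TRUE, method = "glmnet",
  corr_thresh = 0.9, # group features whose correlations are >= 0.9
  nperms = 50 # fewer permutations run faster but estimate the null less precisely
)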
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get feature importance using the permutation method — get_feature_importance","text":"Data frame with the performance metric for when each feature (or group of correlated features; names) is permuted (perf_metric), the difference between the actual test performance metric and the permuted performance metric (perf_metric_diff; test minus permuted performance), and the p-value (pvalue: the probability of obtaining the actual performance value under the null hypothesis). Features with a larger perf_metric_diff are more important. The performance metric name (perf_metric_name) and seed (seed) are also returned.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Get feature importance using the permutation method — get_feature_importance","text":"For permutation tests, the p-value is the number of permutation statistics that are greater than the test statistic, divided by the number of permutations. In our case, the permutation statistic is the model performance (e.g. AUROC) after randomizing the order of the observations for one feature, and the test statistic is the actual performance on the test data. By default we perform 100 permutations per feature; increasing this will increase the precision of the estimate of the null distribution, but it also increases runtime. The p-value represents the probability of obtaining the actual performance in the event that the null hypothesis is true, where the null hypothesis is that the feature is not important for model performance. We strongly recommend providing multiple cores to speed up computation time. See our vignette on parallel processing for more details.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get feature importance using the permutation method — get_feature_importance","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},
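The p-value computation described in Details above can be sketched in a few lines of R. All values below are hypothetical, not output from mikropml, and the permuted performances are aggregated with a simple mean purely for illustration:

# Sketch of the permutation p-value: compare the real test performance to the
# distribution of performances obtained after permuting one feature.
actual_perf <- 0.80 # hypothetical AUROC on the intact test data
permuted_perf <- runif(100, 0.45, 0.55) # hypothetical AUROCs from 100 permutations
perf_metric_diff <- actual_perf - mean(permuted_perf) # test minus permuted
pvalue <- sum(permuted_perf > actual_perf) / length(permuted_perf)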
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get feature importance using the permutation method — get_feature_importance","text":"","code":"if (FALSE) { # If you called `run_ml()` with `feature_importance = FALSE` (the default), # you can use `get_feature_importance()` later as long as you have the # trained model and test data. results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) names(results$trained_model$trainingData)[1] <- \"dx\" feat_imp <- get_feature_importance(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) # We strongly recommend providing multiple cores to speed up computation time. # Do this before calling `get_feature_importance()`. doFuture::registerDoFuture() future::plan(future::multicore, workers = 2) # Optionally, you can group features together with a custom grouping feat_imp <- get_feature_importance(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\", groups = c( \"Otu00007\", \"Otu00008\", \"Otu00009\", \"Otu00011\", \"Otu00012\", \"Otu00015\", \"Otu00016\", \"Otu00018\", \"Otu00019\", \"Otu00020\", \"Otu00022\", \"Otu00023\", \"Otu00025\", \"Otu00028\", \"Otu00029\", \"Otu00030\", \"Otu00035\", \"Otu00036\", \"Otu00037\", \"Otu00038\", \"Otu00039\", \"Otu00040\", \"Otu00047\", \"Otu00050\", \"Otu00052\", \"Otu00054\", \"Otu00055\", \"Otu00056\", \"Otu00060\", \"Otu00003|Otu00002|Otu00005|Otu00024|Otu00032|Otu00041|Otu00053\", \"Otu00014|Otu00021|Otu00017|Otu00031|Otu00057\", \"Otu00013|Otu00006\", \"Otu00026|Otu00001|Otu00034|Otu00048\", \"Otu00033|Otu00010\", \"Otu00042|Otu00004\", \"Otu00043|Otu00027|Otu00049\", \"Otu00051|Otu00045\", \"Otu00058|Otu00044\", \"Otu00059|Otu00046\" ) ) # the function can show a progress bar if you have the `progressr` package installed. ## optionally, specify the progress bar format: progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0 )) ## tell progressr to always report progress progressr::handlers(global = TRUE) ## run the function and watch the live progress updates feat_imp <- get_feature_importance(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) # You can specify any correlation method supported by `stats::cor`: feat_imp <- get_feature_importance(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\", corr_method = \"pearson\" ) }"},
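One portability note on the plan used in the examples above: future::multicore relies on process forking, which is unavailable on Windows, so a multisession plan is a safe cross-platform alternative (a sketch using the same worker count as the example):

# Cross-platform alternative to future::multicore; multisession launches
# background R sessions instead of forked processes.
doFuture::registerDoFuture()
future::plan(future::multisession, workers = 2)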
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Get hyperparameter performance metrics — get_hp_performance","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Get hyperparameter performance metrics.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get hyperparameter performance metrics — get_hp_performance","text":"","code":"get_hp_performance(trained_model)"},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get hyperparameter performance metrics — get_hp_performance","text":"trained_model The trained model (e.g. from run_ml()).","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Named list: dat: Dataframe of the performance metric for each group of hyperparameters. params: Hyperparameters tuned. metric: Performance metric used.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get hyperparameter performance metrics — get_hp_performance","text":"","code":"get_hp_performance(otu_mini_bin_results_glmnet$trained_model) #> $dat #> alpha lambda AUC #> 1 0 1e-04 0.6082552 #> 2 0 1e-03 0.6082552 #> 3 0 1e-02 0.6086458 #> 4 0 1e-01 0.6166789 #> 5 0 1e+00 0.6221737 #> 6 0 1e+01 0.6187408 #> #> $params #> [1] \"lambda\" #> #> $metric #> [1] \"AUC\" #>"},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":null,"dir":"Reference","previous_headings":"","what":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"For more details, see the vignette on hyperparameter tuning.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"","code":"get_hyperparams_list(dataset, method)"},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"dataset Dataframe with an outcome variable and other columns as features. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, or multiclass regression. rf: random forest. rpart2: decision tree. svmRadial: support vector machine. xgbTree: xgboost.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"Named list of hyperparameters.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"Kelly Sovacool, sovacool@umich.edu","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"","code":"get_hyperparams_list(otu_mini_bin, \"rf\") #> $mtry #> [1] 2 3 6 #> get_hyperparams_list(otu_small, \"rf\") #> $mtry #> [1] 4 8 16 #> get_hyperparams_list(otu_mini_bin, \"rpart2\") #> $maxdepth #> [1] 1 2 4 8 16 30 #> get_hyperparams_list(otu_small, \"rpart2\") #> $maxdepth #> [1] 1 2 4 8 16 30 #>"},
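The list returned by get_hyperparams_list() can be edited and passed back to run_ml() via its hyperparameters argument; a brief sketch, where the extra mtry value of 32 is an arbitrary illustration rather than a recommendation:

# Sketch: start from the default grid and train with a customized one.
hparams <- get_hyperparams_list(otu_small, "rf")
hparams$mtry <- c(hparams$mtry, 32) # add a hypothetical extra value to tune over
results <- run_ml(otu_small, "rf", hyperparameters = hparams, seed = 2019)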
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":null,"dir":"Reference","previous_headings":"","what":"Get outcome type. — get_outcome_type","title":"Get outcome type. — get_outcome_type","text":"If the outcome is numeric, the type is continuous. Otherwise, the outcome type is binary if there are two outcomes or multiclass if there are more than two outcomes.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get outcome type. — get_outcome_type","text":"","code":"get_outcome_type(outcomes_vec)"},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get outcome type. — get_outcome_type","text":"outcomes_vec Vector of outcomes.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get outcome type. — get_outcome_type","text":"Outcome type (continuous, binary, or multiclass).","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get outcome type. — get_outcome_type","text":"Zena Lapp, zenalapp@umich.edu","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get outcome type. — get_outcome_type","text":"","code":"get_outcome_type(c(1, 2, 1)) #> [1] \"continuous\" get_outcome_type(c(\"a\", \"b\", \"b\")) #> [1] \"binary\" get_outcome_type(c(\"a\", \"b\", \"c\")) #> [1] \"multiclass\""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":null,"dir":"Reference","previous_headings":"","what":"Select indices to partition the data into training & testing sets. — get_partition_indices","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Use this function to get the row indices for the training set.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"","code":"get_partition_indices( outcomes, training_frac = 0.8, groups = NULL, group_partitions = NULL )"},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"outcomes Vector of outcomes. training_frac Fraction of the data for the training set (default: 0.8). Rows in the dataset will be randomly selected for the training set, and all remaining rows will be used in the testing set. Alternatively, if you provide a vector of integers, these will be used as the row indices for the training set, and all remaining rows will be used in the testing set. groups Vector of groups to keep together when splitting the data into train and test sets. If the number of groups in the training set is larger than kfold, the groups will also be kept together for cross-validation. Length matches the number of rows in the dataset (default: NULL). group_partitions Specify how to assign groups to the training and testing partitions (default: NULL). If groups specifies that some samples belong to group \"A\" and some belong to group \"B\", then setting group_partitions = list(train = c(\"A\", \"B\"), test = c(\"B\")) will result in all samples from group \"A\" being placed in the training set, some samples from \"B\" also in the training set, and the remaining samples from \"B\" in the testing set. The partition sizes will be as close to training_frac as possible. If the number of groups in the training set is larger than kfold, the groups will also be kept together for cross-validation.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Vector of row indices for the training set.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"If groups is NULL, this function uses createDataPartition. Otherwise, it uses create_grouped_data_partition(). Set the seed prior to calling this function if you would like your data partitions to be reproducible (recommended).","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Kelly Sovacool, sovacool@umich.edu","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"","code":"training_inds <- get_partition_indices(otu_mini_bin$dx) train_data <- otu_mini_bin[training_inds, ] test_data <- otu_mini_bin[-training_inds, ]"},
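A short sketch of the group_partitions behavior described above, using hypothetical group labels (one label per row of otu_mini_bin; the 60/140 split is arbitrary):

# Sketch: all of group "A" goes to training; group "B" is split so the
# partition size stays as close to training_frac as possible.
sample_groups <- rep(c("A", "B"), times = c(60, nrow(otu_mini_bin) - 60))
set.seed(2019) # set a seed so the partition is reproducible
training_inds <- get_partition_indices(otu_mini_bin$dx,
  training_frac = 0.8,
  groups = sample_groups,
  group_partitions = list(train = c("A", "B"), test = c("B"))
)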
number groups training set larger kfold, groups also kept together cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Vector row indices training set.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"groups NULL, uses createDataPartition. Otherwise, uses create_grouped_data_partition(). Set seed prior calling function like data partitions reproducible (recommended).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"","code":"training_inds <- get_partition_indices(otu_mini_bin$dx) train_data <- otu_mini_bin[training_inds, ] test_data <- otu_mini_bin[-training_inds, ]"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":null,"dir":"Reference","previous_headings":"","what":"Get default performance metric function — get_perf_metric_fn","title":"Get default performance metric function — get_perf_metric_fn","text":"Get default performance metric function","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get default performance metric function — get_perf_metric_fn","text":"","code":"get_perf_metric_fn(outcome_type)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get default performance metric function — get_perf_metric_fn","text":"outcome_type Type outcome (one : \"continuous\",\"binary\",\"multiclass\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get default performance metric function — get_perf_metric_fn","text":"Performance metric function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get default performance metric function — get_perf_metric_fn","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get default performance metric function — get_perf_metric_fn","text":"","code":"get_perf_metric_fn(\"continuous\") #> function (data, lev = NULL, model = NULL) #> { #> if (is.character(data$obs)) #> data$obs <- factor(data$obs, levels = lev) #> postResample(data[, \"pred\"], data[, \"obs\"]) #> 
} #> #> get_perf_metric_fn(\"binary\") #> function (data, lev = NULL, model = NULL) #> { #> if (!all(levels(data[, \"pred\"]) == levels(data[, \"obs\"]))) #> stop(\"levels of observed and predicted data do not match\") #> has_class_probs <- all(lev %in% colnames(data)) #> if (has_class_probs) { #> lloss <- mnLogLoss(data = data, lev = lev, model = model) #> requireNamespaceQuietStop(\"pROC\") #> requireNamespaceQuietStop(\"MLmetrics\") #> prob_stats <- lapply(levels(data[, \"pred\"]), function(x) { #> obs <- ifelse(data[, \"obs\"] == x, 1, 0) #> prob <- data[, x] #> roc_auc <- try(pROC::roc(obs, data[, x], direction = \"<\", #> quiet = TRUE), silent = TRUE) #> roc_auc <- if (inherits(roc_auc, \"try-error\")) #> NA #> else roc_auc$auc #> pr_auc <- try(MLmetrics::PRAUC(y_pred = data[, x], #> y_true = obs), silent = TRUE) #> if (inherits(pr_auc, \"try-error\")) #> pr_auc <- NA #> res <- c(ROC = roc_auc, AUC = pr_auc) #> return(res) #> }) #> prob_stats <- do.call(\"rbind\", prob_stats) #> prob_stats <- colMeans(prob_stats, na.rm = TRUE) #> } #> CM <- confusionMatrix(data[, \"pred\"], data[, \"obs\"], mode = \"everything\") #> if (length(levels(data[, \"pred\"])) == 2) { #> class_stats <- CM$byClass #> } #> else { #> class_stats <- colMeans(CM$byClass) #> names(class_stats) <- paste(\"Mean\", names(class_stats)) #> } #> overall_stats <- if (has_class_probs) #> c(CM$overall, logLoss = as.numeric(lloss), AUC = unname(prob_stats[\"ROC\"]), #> prAUC = unname(prob_stats[\"AUC\"])) #> else CM$overall #> stats <- c(overall_stats, class_stats) #> stats <- stats[!names(stats) %in% c(\"AccuracyNull\", \"AccuracyLower\", #> \"AccuracyUpper\", \"AccuracyPValue\", \"McnemarPValue\", \"Mean Prevalence\", #> \"Mean Detection Prevalence\")] #> names(stats) <- gsub(\"[[:blank:]]+\", \"_\", names(stats)) #> stat_list <- c(\"Accuracy\", \"Kappa\", \"Mean_F1\", \"Mean_Sensitivity\", #> \"Mean_Specificity\", \"Mean_Pos_Pred_Value\", \"Mean_Neg_Pred_Value\", #> \"Mean_Precision\", \"Mean_Recall\", \"Mean_Detection_Rate\", #> \"Mean_Balanced_Accuracy\") #> if (has_class_probs) #> stat_list <- c(\"logLoss\", \"AUC\", \"prAUC\", stat_list) #> if (length(levels(data[, \"pred\"])) == 2) #> stat_list <- gsub(\"^Mean_\", \"\", stat_list) #> stats <- stats[c(stat_list)] #> return(stats) #> } #> #> get_perf_metric_fn(\"multiclass\") #> function (data, lev = NULL, model = NULL) #> { #> if (!all(levels(data[, \"pred\"]) == levels(data[, \"obs\"]))) #> stop(\"levels of observed and predicted data do not match\") #> has_class_probs <- all(lev %in% colnames(data)) #> if (has_class_probs) { #> lloss <- mnLogLoss(data = data, lev = lev, model = model) #> requireNamespaceQuietStop(\"pROC\") #> requireNamespaceQuietStop(\"MLmetrics\") #> prob_stats <- lapply(levels(data[, \"pred\"]), function(x) { #> obs <- ifelse(data[, \"obs\"] == x, 1, 0) #> prob <- data[, x] #> roc_auc <- try(pROC::roc(obs, data[, x], direction = \"<\", #> quiet = TRUE), silent = TRUE) #> roc_auc <- if (inherits(roc_auc, \"try-error\")) #> NA #> else roc_auc$auc #> pr_auc <- try(MLmetrics::PRAUC(y_pred = data[, x], #> y_true = obs), silent = TRUE) #> if (inherits(pr_auc, \"try-error\")) #> pr_auc <- NA #> res <- c(ROC = roc_auc, AUC = pr_auc) #> return(res) #> }) #> prob_stats <- do.call(\"rbind\", prob_stats) #> prob_stats <- colMeans(prob_stats, na.rm = TRUE) #> } #> CM <- confusionMatrix(data[, \"pred\"], data[, \"obs\"], mode = \"everything\") #> if (length(levels(data[, \"pred\"])) == 2) { #> class_stats <- CM$byClass #> } #> else { #> class_stats <- 
colMeans(CM$byClass) #> names(class_stats) <- paste(\"Mean\", names(class_stats)) #> } #> overall_stats <- if (has_class_probs) #> c(CM$overall, logLoss = as.numeric(lloss), AUC = unname(prob_stats[\"ROC\"]), #> prAUC = unname(prob_stats[\"AUC\"])) #> else CM$overall #> stats <- c(overall_stats, class_stats) #> stats <- stats[!names(stats) %in% c(\"AccuracyNull\", \"AccuracyLower\", #> \"AccuracyUpper\", \"AccuracyPValue\", \"McnemarPValue\", \"Mean Prevalence\", #> \"Mean Detection Prevalence\")] #> names(stats) <- gsub(\"[[:blank:]]+\", \"_\", names(stats)) #> stat_list <- c(\"Accuracy\", \"Kappa\", \"Mean_F1\", \"Mean_Sensitivity\", #> \"Mean_Specificity\", \"Mean_Pos_Pred_Value\", \"Mean_Neg_Pred_Value\", #> \"Mean_Precision\", \"Mean_Recall\", \"Mean_Detection_Rate\", #> \"Mean_Balanced_Accuracy\") #> if (has_class_probs) #> stat_list <- c(\"logLoss\", \"AUC\", \"prAUC\", stat_list) #> if (length(levels(data[, \"pred\"])) == 2) #> stat_list <- gsub(\"^Mean_\", \"\", stat_list) #> stats <- stats[c(stat_list)] #> return(stats) #> } #> #> "},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":null,"dir":"Reference","previous_headings":"","what":"Get default performance metric name — get_perf_metric_name","title":"Get default performance metric name — get_perf_metric_name","text":"Get default performance metric name cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get default performance metric name — get_perf_metric_name","text":"","code":"get_perf_metric_name(outcome_type)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get default performance metric name — get_perf_metric_name","text":"outcome_type Type outcome (one : \"continuous\",\"binary\",\"multiclass\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get default performance metric name — get_perf_metric_name","text":"Performance metric name.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get default performance metric name — get_perf_metric_name","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get default performance metric name — get_perf_metric_name","text":"","code":"get_perf_metric_name(\"continuous\") #> [1] \"RMSE\" get_perf_metric_name(\"binary\") #> [1] \"AUC\" get_perf_metric_name(\"multiclass\") #> [1] \"logLoss\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":null,"dir":"Reference","previous_headings":"","what":"Get model performance metrics as a one-row tibble — get_performance_tbl","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"Get model performance metrics one-row tibble","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get model performance metrics as a one-row tibble — 
get_performance_tbl","text":"","code":"get_performance_tbl( trained_model, test_data, outcome_colname, perf_metric_function, perf_metric_name, class_probs, method, seed = NA )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"trained_model Trained model caret::train(). test_data Held test data: dataframe outcome features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes). method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost seed Random seed (default: NA). results reproducible set seed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"one-row tibble columns cv_auroc, column performance metrics test data method, seed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"Kelly Sovacool, sovacool@umich.edu Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"","code":"if (FALSE) { results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) names(results$trained_model$trainingData)[1] <- \"dx\" get_performance_tbl(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":null,"dir":"Reference","previous_headings":"","what":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"Generate tuning grid tuning hyperparameters","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"","code":"get_tuning_grid(hyperparams_list, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Generate the 
tuning grid for tuning hyperparameters — get_tuning_grid","text":"hyperparams_list Named list lists hyperparameters. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"tuning grid.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"","code":"ml_method <- \"glmnet\" hparams_list <- get_hyperparams_list(otu_small, ml_method) get_tuning_grid(hparams_list, ml_method) #> lambda alpha #> 1 1e-04 0 #> 2 1e-03 0 #> 3 1e-02 0 #> 4 1e-01 0 #> 5 1e+00 0 #> 6 1e+01 0"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":null,"dir":"Reference","previous_headings":"","what":"Group correlated features — group_correlated_features","title":"Group correlated features — group_correlated_features","text":"Group correlated features","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Group correlated features — group_correlated_features","text":"","code":"group_correlated_features( features, corr_thresh = 1, group_neg_corr = TRUE, corr_method = \"spearman\" )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Group correlated features — group_correlated_features","text":"features dataframe column feature ML corr_thresh feature importance, group correlations equal corr_thresh (range 0 1; default: 1). group_neg_corr Whether group negatively correlated features together (e.g. c(0,1) c(1,0)). corr_method correlation method. options supported stats::cor: spearman, pearson, kendall. 
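The corr_thresh and corr_method arguments are both adjustable; a hedged sketch, reusing the toy features data frame from the example below (output omitted, since which features group depends on the correlations in the data):

# sketch: group features whose Spearman correlation is at least 0.9
features <- data.frame(
  a = 1:3, b = 2:4, c = c(1, 0, 1),
  d = (5:7), e = c(5, 1, 4), f = c(-1, 0, -1)
)
group_correlated_features(features, corr_thresh = 0.9, corr_method = "spearman")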
(default: spearman)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Group correlated features — group_correlated_features","text":"vector element group correlated features separated pipes (|)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Group correlated features — group_correlated_features","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Group correlated features — group_correlated_features","text":"","code":"features <- data.frame( a = 1:3, b = 2:4, c = c(1, 0, 1), d = (5:7), e = c(5, 1, 4), f = c(-1, 0, -1) ) group_correlated_features(features) #> [1] \"a|b|d\" \"c|f\" \"e\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":null,"dir":"Reference","previous_headings":"","what":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"mikropml implements supervised machine learning pipelines using regression, support vector machines, decision trees, random forest, gradient-boosted trees. main functions preprocess_data() process data prior running machine learning, run_ml() run machine learning.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":"authors","dir":"Reference","previous_headings":"","what":"Authors","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"Begüm D. Topçuoğlu (ORCID) Zena Lapp (ORCID) Kelly L. Sovacool (ORCID) Evan Snitkin (ORCID) Jenna Wiens (ORCID) Patrick D. Schloss (ORCID)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":"see-vignettes","dir":"Reference","previous_headings":"","what":"See vignettes","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"Introduction Preprocessing data Hyperparameter tuning Parallel processing mikropml paper","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":null,"dir":"Reference","previous_headings":"","what":"Mini OTU abundance dataset — otu_mini_bin","title":"Mini OTU abundance dataset — otu_mini_bin","text":"dataset containing relative abundances OTUs human stool samples binary outcome, dx. subset otu_small.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Mini OTU abundance dataset — otu_mini_bin","text":"","code":"otu_mini_bin"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Mini OTU abundance dataset — otu_mini_bin","text":"data frame dx column diagnosis: healthy cancerous (colorectal). 
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"Results running pipeline L2 logistic regression otu_mini_bin feature importance grouping","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"","code":"otu_mini_bin_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"Results running pipeline random forest otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"","code":"otu_mini_bin_results_rf"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"Results running pipeline rpart2 otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"","code":"otu_mini_bin_results_rpart2"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"object class list length 
4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"Results running pipeline svmRadial otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"","code":"otu_mini_bin_results_svmRadial"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"Results running pipeline xbgTree otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"","code":"otu_mini_bin_results_xgbTree"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"Results running pipeline glmnet otu_mini_bin Otu00001 outcome","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"","code":"otu_mini_cont_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_bin with 
Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"Results running pipeline glmnet otu_mini_bin Otu00001 outcome column, using custom train control scheme perform cross-validation","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"","code":"otu_mini_cont_results_nocv"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":null,"dir":"Reference","previous_headings":"","what":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"Cross validation train_data_mini grouped features.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"","code":"otu_mini_cv"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"object class list length 27.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":null,"dir":"Reference","previous_headings":"","what":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"dataset containing relative abundances OTUs human stool samples","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"","code":"otu_mini_multi"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"data frame dx column colorectal cancer diagnosis: adenoma, carcinoma, normal. 
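A hedged sketch of how a multiclass dataset like this can be combined with its grouping vector (otu_mini_multi_group, documented below); kfold and cv_times are reduced here only to keep the run fast:

results_multi <- run_ml(otu_mini_multi, "glmnet",
  outcome_colname = "dx",
  groups = otu_mini_multi_group,
  kfold = 2,
  cv_times = 2,
  seed = 2019
)
results_multi$performance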
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":null,"dir":"Reference","previous_headings":"","what":"Groups for otu_mini_multi — otu_mini_multi_group","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"Groups otu_mini_multi","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"","code":"otu_mini_multi_group"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"object class character length 490.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"Results running pipeline glmnet otu_mini_multi multiclass outcomes","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"","code":"otu_mini_multi_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":null,"dir":"Reference","previous_headings":"","what":"Small OTU abundance dataset — otu_small","title":"Small OTU abundance dataset — otu_small","text":"dataset containing relative abundances 60 OTUs 60 human stool samples. subset data provided extdata/otu_large.csv, used Topçuoğlu et al. 2020.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Small OTU abundance dataset — otu_small","text":"","code":"otu_small"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Small OTU abundance dataset — otu_small","text":"data frame 60 rows 61 variables. dx column diagnosis: healthy cancerous (colorectal). 
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":null,"dir":"Reference","previous_headings":"","what":"Calculate a permuted p-value comparing two models — permute_p_value","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"Calculate permuted p-value comparing two models","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"","code":"permute_p_value( merged_data, metric, group_name, group_1, group_2, nperm = 10000 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"merged_data concatenated performance data run_ml metric metric compare, must numeric group_name column group variables compare group_1 name one group compare group_2 name group compare nperm number permutations, default=10000","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"numeric p-value comparing two models","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Courtney R Armour, armourc@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"","code":"df <- dplyr::tibble( model = c(\"rf\", \"rf\", \"glmnet\", \"glmnet\", \"svmRadial\", \"svmRadial\"), AUC = c(.2, 0.3, 0.8, 0.9, 0.85, 0.95) ) set.seed(123) permute_p_value(df, \"AUC\", \"model\", \"rf\", \"glmnet\", nperm = 100) #> [1] 0.3663366"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot hyperparameter performance metrics — plot_hp_performance","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"Plot hyperparameter performance metrics","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"","code":"plot_hp_performance(dat, param_col, metric_col)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"dat dataframe hyperparameters performance metric (e.g. get_hp_performance() combine_hp_performance()) param_col hyperparameter plotted. must column dat. metric_col performance metric. 
must column dat.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"ggplot hyperparameter performance.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"","code":"# plot for a single `run_ml()` call hp_metrics <- get_hp_performance(otu_mini_bin_results_glmnet$trained_model) hp_metrics #> $dat #> alpha lambda AUC #> 1 0 1e-04 0.6082552 #> 2 0 1e-03 0.6082552 #> 3 0 1e-02 0.6086458 #> 4 0 1e-01 0.6166789 #> 5 0 1e+00 0.6221737 #> 6 0 1e+01 0.6187408 #> #> $params #> [1] \"lambda\" #> #> $metric #> [1] \"AUC\" #> plot_hp_performance(hp_metrics$dat, lambda, AUC) if (FALSE) { # plot for multiple `run_ml()` calls results <- lapply(seq(100, 102), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) models <- lapply(results, function(x) x$trained_model) hp_metrics <- combine_hp_performance(models) plot_hp_performance(hp_metrics$dat, lambda, AUC) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"ggplot2 required use function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"","code":"plot_model_performance(performance_df)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"performance_df dataframe performance results multiple calls run_ml()","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"ggplot2 plot performance.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"Begüm Topçuoglu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot performance metrics for multiple ML runs 
with different parameters — plot_model_performance","text":"","code":"if (FALSE) { # call `run_ml()` multiple times with different seeds results_lst <- lapply(seq(100, 104), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) # extract and combine the performance results perf_df <- lapply(results_lst, function(result) { result[[\"performance\"]] }) %>% dplyr::bind_rows() # plot the performance results p <- plot_model_performance(perf_df) # call `run_ml()` with different ML methods param_grid <- expand.grid( seeds = seq(100, 104), methods = c(\"glmnet\", \"rf\") ) results_mtx <- mapply( function(seed, method) { run_ml(otu_mini_bin, method, seed = seed, kfold = 2) }, param_grid$seeds, param_grid$methods ) # extract and combine the performance results perf_df2 <- dplyr::bind_rows(results_mtx[\"performance\", ]) # plot the performance results p <- plot_model_performance(perf_df2) # you can continue adding layers to customize the plot p + theme_classic() + scale_color_brewer(palette = \"Dark2\") + coord_flip() }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Preprocess data prior to running machine learning — preprocess_data","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Function preprocess data input run_ml().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Preprocess data prior to running machine learning — preprocess_data","text":"","code":"preprocess_data( dataset, outcome_colname, method = c(\"center\", \"scale\"), remove_var = \"nzv\", collapse_corr_feats = TRUE, to_numeric = TRUE, group_neg_corr = TRUE, prefilter_threshold = 1 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Preprocess data prior to running machine learning — preprocess_data","text":"dataset Dataframe outcome variable columns features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). method Methods preprocess data, described caret::preProcess() (default: c(\"center\",\"scale\"), use NULL normalization). remove_var Whether remove variables near-zero variance ('nzv'; default), zero variance ('zv'), none (NULL). collapse_corr_feats Whether keep one perfectly correlated features. to_numeric Whether change features numeric possible. group_neg_corr Whether group negatively correlated features together (e.g. c(0,1) c(1,0)). prefilter_threshold Remove features non-zero & non-NA values N rows fewer (default: 1). Set -1 keep columns step. step also skipped to_numeric set FALSE.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Named list including: dat_transformed: Preprocessed data. grp_feats: features grouped together, named list features corresponding group. removed_feats: features removed preprocessing (e.g. zero variance near-zero variance features). 
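The preprocessing options are all adjustable; a minimal sketch of a lighter-touch configuration (no centering/scaling, dropping only zero-variance features), using the documented method and remove_var arguments:

# sketch: skip normalization and remove only zero-variance features
preproc <- preprocess_data(mikropml::otu_small, "dx",
  method = NULL,
  remove_var = "zv"
)
preproc$removed_feats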
progressr package installed, progress bar time elapsed estimated time completion can displayed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"more-details","dir":"Reference","previous_headings":"","what":"More details","title":"Preprocess data prior to running machine learning — preprocess_data","text":"See preprocessing vignette details. Note values outcome_colname contain spaces, converted underscores compatibility caret.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Preprocess data prior to running machine learning — preprocess_data","text":"","code":"preprocess_data(mikropml::otu_small, \"dx\") #> Using 'dx' as the outcome column. #> $dat_transformed #> # A tibble: 200 × 61 #> dx Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 Otu00006 Otu00…¹ Otu00008 #> #> 1 normal -0.420 -0.219 -0.174 -0.591 -0.0488 -0.167 -0.569 -0.0624 #> 2 normal -0.105 1.75 -0.718 0.0381 1.54 -0.573 -0.643 -0.132 #> 3 normal -0.708 0.696 1.43 0.604 -0.265 -0.0364 -0.612 -0.207 #> 4 normal -0.494 -0.665 2.02 -0.593 -0.676 -0.586 -0.552 -0.470 #> 5 normal 1.11 -0.395 -0.754 -0.586 -0.754 2.73 0.191 -0.676 #> 6 normal -0.685 0.614 -0.174 -0.584 0.376 0.804 -0.337 -0.00608 #> 7 cancer -0.770 -0.496 -0.318 0.159 -0.658 2.20 -0.717 0.0636 #> 8 normal -0.424 -0.478 -0.397 -0.556 -0.391 -0.0620 0.376 -0.0222 #> 9 normal -0.556 1.14 1.62 -0.352 -0.275 -0.465 -0.804 0.294 #> 10 cancer 1.46 -0.451 -0.694 -0.0567 -0.706 0.689 -0.370 1.59 #> # … with 190 more rows, 52 more variables: Otu00009 , Otu00010 , #> # Otu00011 , Otu00012 , Otu00013 , Otu00014 , #> # Otu00015 , Otu00016 , Otu00017 , Otu00018 , #> # Otu00019 , Otu00020 , Otu00021 , Otu00022 , #> # Otu00023 , Otu00024 , Otu00025 , Otu00026 , #> # Otu00027 , Otu00028 , Otu00029 , Otu00030 , #> # Otu00031 , Otu00032 , Otu00033 , Otu00034 , … #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0) #> # the function can show a progress bar if you have the progressr package installed ## optionally, specify the progress bar format progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0 )) ## tell progressor to always report progress if (FALSE) { progressr::handlers(global = TRUE) ## run the function and watch the live progress updates dat_preproc <- preprocess_data(mikropml::otu_small, \"dx\") }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":null,"dir":"Reference","previous_headings":"","what":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Randomize feature order eliminate position-dependent effects","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Randomize feature order to eliminate any position-dependent effects — 
randomize_feature_order","text":"","code":"randomize_feature_order(dataset, outcome_colname)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"dataset Dataframe outcome variable columns features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Dataset feature order randomized.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Nick Lesniak, nlesniak@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"","code":"dat <- data.frame( outcome = c(\"1\", \"2\", \"3\"), a = 4:6, b = 7:9, c = 10:12, d = 13:15 ) randomize_feature_order(dat, \"outcome\") #> outcome c b a d #> 1 1 10 7 4 13 #> 2 2 11 8 5 14 #> 3 3 12 9 6 15"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/reexports.html","id":null,"dir":"Reference","previous_headings":"","what":"dplyr pipe — reexports","title":"dplyr pipe — reexports","text":"objects imported packages. Follow links see documentation. caret contr.ltfr dplyr %>% rlang :=, !!, .data","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":null,"dir":"Reference","previous_headings":"","what":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"Removes columns non-zero & non-NA values threshold row(s) fewer.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"","code":"remove_singleton_columns(dat, threshold = 1)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"dat dataframe threshold Number rows. column non-zero & non-NA values threshold row(s) fewer, removed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Remove columns appearing in only threshold row(s) or fewer. 
— remove_singleton_columns","text":"dataframe without singleton columns","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"Kelly Sovacool, sovacool@umich.edu Courtney Armour","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"","code":"remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, 0), c = 4:6)) #> $dat #> a c #> 1 1 4 #> 2 2 5 #> 3 3 6 #> #> $removed_feats #> [1] \"b\" #> remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, 0), c = 4:6), threshold = 0) #> $dat #> a b c #> 1 1 0 4 #> 2 2 1 5 #> 3 3 0 6 #> #> $removed_feats #> character(0) #> remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, NA), c = 4:6)) #> $dat #> a c #> 1 1 4 #> 2 2 5 #> 3 3 6 #> #> $removed_feats #> [1] \"b\" #> remove_singleton_columns(data.frame(a = 1:3, b = c(1, 1, 1), c = 4:6)) #> $dat #> a b c #> 1 1 1 4 #> 2 2 1 5 #> 3 3 1 6 #> #> $removed_feats #> character(0) #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":null,"dir":"Reference","previous_headings":"","what":"Replace spaces in all elements of a character vector with underscores — replace_spaces","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"Replace spaces elements character vector underscores","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"","code":"replace_spaces(x, new_char = \"_\")"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"x character vector new_char character replace spaces (default: _)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"character vector spaces replaced new_char","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"","code":"dat <- data.frame( dx = c(\"outcome 1\", \"outcome 2\", \"outcome 1\"), a = 1:3, b = c(5, 7, 1) ) dat$dx <- replace_spaces(dat$dx) dat #> dx a b #> 1 outcome_1 1 5 #> 2 outcome_2 2 7 #> 3 outcome_1 3 
1"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":null,"dir":"Reference","previous_headings":"","what":"Run the machine learning pipeline — run_ml","title":"Run the machine learning pipeline — run_ml","text":"function runs machine learning (ML), evaluates best model, optionally calculates feature importance using framework outlined Topçuoğlu et al. 2020 (doi:10.1128/mBio.00434-20 ). Required inputs dataframe outcome variable columns features, well ML method. See vignette('introduction') details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Run the machine learning pipeline — run_ml","text":"","code":"run_ml( dataset, method, outcome_colname = NULL, hyperparameters = NULL, find_feature_importance = FALSE, calculate_performance = TRUE, kfold = 5, cv_times = 100, cross_val = NULL, training_frac = 0.8, perf_metric_function = NULL, perf_metric_name = NULL, groups = NULL, group_partitions = NULL, corr_thresh = 1, seed = NA, ... )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Run the machine learning pipeline — run_ml","text":"dataset Dataframe outcome variable columns features. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). hyperparameters Dataframe hyperparameters (default NULL; sensible defaults chosen automatically). find_feature_importance Run permutation importance (default: FALSE). TRUE recommended like identify features important predicting outcome, resource-intensive. calculate_performance Whether calculate performance metrics (default: TRUE). might choose skip perform cross-validation model training. kfold Fold number k-fold cross-validation (default: 5). cv_times Number cross-validation partitions create (default: 100). cross_val custom cross-validation scheme caret::trainControl() (default: NULL, uses kfold cross validation repeated cv_times). kfold cv_times ignored user provides custom cross-validation scheme. See caret::trainControl() docs information use . training_frac Fraction data training set (default: 0.8). Rows dataset randomly selected training set, remaining rows used testing set. Alternatively, provide vector integers, used row indices training set. remaining rows used testing set. perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". groups Vector groups keep together splitting data train test sets. number groups training set larger kfold, groups also kept together cross-validation. Length matches number rows dataset (default: NULL). group_partitions Specify assign groups training testing partitions (default: NULL). 
groups specifies samples belong group \"\" belong group \"B\", setting group_partitions = list(train = c(\"\", \"B\"), test = c(\"B\")) result samples group \"\" placed training set, samples \"B\" also training set, remaining samples \"B\" testing set. partition sizes close training_frac possible. number groups training set larger kfold, groups also kept together cross-validation. corr_thresh feature importance, group correlations equal corr_thresh (range 0 1; default: 1). seed Random seed (default: NA). results reproducible set seed. ... additional arguments passed caret::train(), case weights via weights argument ntree rf models. See caret::train() docs details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Run the machine learning pipeline — run_ml","text":"Named list results: trained_model: Output caret::train(), including best model. test_data: Part data used testing. performance: Dataframe performance metrics. first column cross-validation performance metric, last two columns ML method used seed (one set), respectively. columns performance metrics calculated test data. contains one row, can easily combine performance dataframes multiple calls run_ml() (see vignette(\"parallel\")). feature_importance: feature importances calculated, dataframe row feature correlated group. columns performance metric permuted data, difference true performance metric performance metric permuted data (true - permuted), feature name, ML method, performance metric name, seed (provided). AUC RMSE, higher perf_metric_diff , important feature predicting outcome. log loss, lower perf_metric_diff , important feature predicting outcome.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"more-details","dir":"Reference","previous_headings":"","what":"More details","title":"Run the machine learning pipeline — run_ml","text":"details, please see vignettes.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Run the machine learning pipeline — run_ml","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Run the machine learning pipeline — run_ml","text":"","code":"if (FALSE) { # regression run_ml(otu_small, \"glmnet\", seed = 2019 ) # random forest w/ feature importance run_ml(otu_small, \"rf\", outcome_colname = \"dx\", find_feature_importance = TRUE ) # custom cross validation & hyperparameters run_ml(otu_mini_bin[, 2:11], \"glmnet\", outcome_colname = \"Otu00001\", seed = 2019, hyperparameters = list(lambda = c(1e-04), alpha = 0), cross_val = caret::trainControl(method = \"none\"), calculate_performance = FALSE ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Tidy the performance dataframe — tidy_perf_data","title":"Tidy the performance dataframe — tidy_perf_data","text":"Used plot_model_performance().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Tidy the performance dataframe — 
tidy_perf_data","text":"","code":"tidy_perf_data(performance_df)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Tidy the performance dataframe — tidy_perf_data","text":"performance_df dataframe performance results multiple calls run_ml()","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Tidy the performance dataframe — tidy_perf_data","text":"Tidy dataframe model performance metrics.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Tidy the performance dataframe — tidy_perf_data","text":"Begüm Topçuoglu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Tidy the performance dataframe — tidy_perf_data","text":"","code":"if (FALSE) { # call `run_ml()` multiple times with different seeds results_lst <- lapply(seq(100, 104), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) # extract and combine the performance results perf_df <- lapply(results_lst, function(result) { result[[\"performance\"]] }) %>% dplyr::bind_rows() # make it pretty! tidy_perf_data(perf_df) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":null,"dir":"Reference","previous_headings":"","what":"Train model using caret::train(). — train_model","title":"Train model using caret::train(). — train_model","text":"Train model using caret::train().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Train model using caret::train(). — train_model","text":"","code":"train_model( train_data, outcome_colname, method, cv, perf_metric_name, tune_grid, ... )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Train model using caret::train(). — train_model","text":"train_data Training data. Expected subset full dataset. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost cv Cross-validation caret scheme define_cv(). perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". tune_grid Tuning grid get_tuning_grid().#' ... additional arguments passed caret::train(), case weights via weights argument ntree rf models. See caret::train() docs details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Train model using caret::train(). 
— train_model","text":"Trained model caret::train().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Train model using caret::train(). — train_model","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Train model using caret::train(). — train_model","text":"","code":"if (FALSE) { training_data <- otu_mini_bin_results_glmnet$trained_model$trainingData %>% dplyr::rename(dx = .outcome) method <- \"rf\" hyperparameters <- get_hyperparams_list(otu_mini_bin, method) cross_val <- define_cv(training_data, \"dx\", hyperparameters, perf_metric_function = caret::multiClassSummary, class_probs = TRUE, cv_times = 2 ) tune_grid <- get_tuning_grid(hyperparameters, method) rf_model <- train_model( training_data, \"dx\", method, cross_val, \"AUC\", tune_grid, ntree = 1000 ) rf_model$results %>% dplyr::select(mtry, AUC, prAUC) }"},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-140","dir":"Changelog","previous_headings":"","what":"mikropml 1.4.0","title":"mikropml 1.4.0","text":"CRAN release: 2022-10-16 Users can now pass model-specific arguments (e.g. weights) caret::train(), allowing greater flexibility. Improved tests (#298, #300, #303 #kelly-sovacool) Minor documentation improvements.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-130","dir":"Changelog","previous_headings":"","what":"mikropml 1.3.0","title":"mikropml 1.3.0","text":"CRAN release: 2022-05-20 mikropml now requires R version 4.1.0 greater due update randomForest package (#292). New function compare_models() compares performance two models permutation test (#295, @courtneyarmour). Fixed bug cv_times affect reported repeats cross-validation (#291, @kelly-sovacool). Made minor documentation improvements (#293, @kelly-sovacool)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-122","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.2","title":"mikropml 1.2.2","text":"CRAN release: 2022-02-03 minor patch fixes test failure platforms long doubles. actual package code remains unchanged.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-121","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.1","title":"mikropml 1.2.1","text":"CRAN release: 2022-01-30 using groups parameter, groups kept together cross-validation partitions kfold <= number groups training set. Previously, error thrown condition met. Now, enough groups training set groups kept together CV, groups allowed split across CV partitions. Report p-values permutation feature importance (#288, @kelly-sovacool).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-120","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.0","title":"mikropml 1.2.0","text":"CRAN release: 2021-11-10 Also added new parameter calculate_performance, controls whether performance metrics calculated (default: TRUE). Users may wish skip performance calculations training models cross-validation. New parameter group_partitions added run_ml() allows users control groups go partition train/test split (#281, @kelly-sovacool). 
default, training_frac fraction 0 1 specifies much dataset used training fraction train/test split. Users can instead give training_frac vector indices correspond rows dataset go training fraction train/test split. gives users direct control exactly observations training fraction desired.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-111","dir":"Changelog","previous_headings":"","what":"mikropml 1.1.1","title":"mikropml 1.1.1","text":"CRAN release: 2021-09-14 Also, group_correlated_features() now user-facing function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-110","dir":"Changelog","previous_headings":"","what":"mikropml 1.1.0","title":"mikropml 1.1.0","text":"CRAN release: 2021-08-10 default still “spearman”, now can use methods supported stats::cor corr_method parameter: get_feature_importance(corr_method = \"pearson\") now video tutorials covering mikropml skills related machine learning, created @pschloss (#270). Fixed bug preprocess_data() converted outcome column character vector (#273, @kelly-sovacool, @ecmaggioncalda).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-100","dir":"Changelog","previous_headings":"","what":"mikropml 1.0.0","title":"mikropml 1.0.0","text":"CRAN release: 2021-05-13 mikropml now logo created @NLesniak! Made documentation improvements (#238, #231 @kelly-sovacool; #256 @BTopcuoglu). Remove features appear N=prefilter_threshold fewer rows data. Created function remove_singleton_columns() called preprocess_data() carry . Provide custom groups features permute together permutation importance. groups NULL default; case, correlated features corr_thresh grouped together. preprocess_data() now replaces spaces outcome column underscores (#247, @kelly-sovacool, @JonnyTran). Clarify intro vignette support multi-label outcomes. (#254, @zenalapp) Optional progress bar preprocess_data() get_feature_importance() using progressr package (#257, @kelly-sovacool, @JonnyTran, @FedericoComoglio). mikropml paper soon published JOSS!","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-002","dir":"Changelog","previous_headings":"","what":"mikropml 0.0.2","title":"mikropml 0.0.2","text":"CRAN release: 2020-12-03 Fixed test failure Solaris. Fixed multiple test failures R 3.6.2 due stringsAsFactors behavior. Made minor documentation improvements. Moved rpart Suggests Imports consistency packages used model training.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-001","dir":"Changelog","previous_headings":"","what":"mikropml 0.0.1","title":"mikropml 0.0.1","text":"CRAN release: 2020-11-23 first release version mikropml! 🎉 Added NEWS.md file track changes package. 
run_ml() preprocess_data() plot_model_performance() plot_hp_performance() glmnet: logistic linear regression rf: random forest rpart2: decision trees svmRadial: support vector machines xgbTree: gradient-boosted trees Introduction Preprocess data Hyperparameter tuning Parallel processing mikropml paper","code":""}] +[{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":null,"dir":"","previous_headings":"","what":"Contributor Covenant Code of Conduct","title":"Contributor Covenant Code of Conduct","text":"document adapted Tidyverse Code Conduct.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"our-pledge","dir":"","previous_headings":"","what":"Our Pledge","title":"Contributor Covenant Code of Conduct","text":"members, contributors, leaders pledge make participation community harassment-free experience everyone, regardless age, body size, visible invisible disability, ethnicity, sex characteristics, gender identity expression, level experience, education, socio-economic status, nationality, personal appearance, race, religion, sexual identity orientation. pledge act interact ways contribute open, welcoming, diverse, inclusive, healthy community.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"our-standards","dir":"","previous_headings":"","what":"Our Standards","title":"Contributor Covenant Code of Conduct","text":"Examples behavior contributes positive environment community include: Demonstrating empathy kindness toward people respectful differing opinions, viewpoints, experiences Giving gracefully accepting constructive feedback Accepting responsibility apologizing affected mistakes, learning experience Focusing best just us individuals, overall community Examples unacceptable behavior include: use sexualized language imagery, sexual attention advances kind Trolling, insulting derogatory comments, personal political attacks Public private harassment Publishing others’ private information, physical email address, without explicit permission conduct reasonably considered inappropriate professional setting","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement-responsibilities","dir":"","previous_headings":"","what":"Enforcement Responsibilities","title":"Contributor Covenant Code of Conduct","text":"Community leaders responsible clarifying enforcing standards acceptable behavior take appropriate fair corrective action response behavior deem inappropriate, threatening, offensive, harmful. Community leaders right responsibility remove, edit, reject comments, commits, code, wiki edits, issues, contributions aligned Code Conduct, communicate reasons moderation decisions appropriate.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"scope","dir":"","previous_headings":"","what":"Scope","title":"Contributor Covenant Code of Conduct","text":"Code Conduct applies within community spaces, also applies individual officially representing community public spaces. 
Examples representing community include using official e-mail address, posting via official social media account, acting appointed representative online offline event.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement","dir":"","previous_headings":"","what":"Enforcement","title":"Contributor Covenant Code of Conduct","text":"Instances abusive, harassing, otherwise unacceptable behavior may reported community leaders responsible enforcement [INSERT CONTACT METHOD]. complaints reviewed investigated promptly fairly. community leaders obligated respect privacy security reporter incident.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement-guidelines","dir":"","previous_headings":"","what":"Enforcement Guidelines","title":"Contributor Covenant Code of Conduct","text":"Community leaders follow Community Impact Guidelines determining consequences action deem violation Code Conduct:","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"id_1-correction","dir":"","previous_headings":"Enforcement Guidelines","what":"1. Correction","title":"Contributor Covenant Code of Conduct","text":"Community Impact: Use inappropriate language behavior deemed unprofessional unwelcome community. Consequence: private, written warning community leaders, providing clarity around nature violation explanation behavior inappropriate. public apology may requested.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"id_2-warning","dir":"","previous_headings":"Enforcement Guidelines","what":"2. Warning","title":"Contributor Covenant Code of Conduct","text":"Community Impact: violation single incident series actions. Consequence: warning consequences continued behavior. interaction people involved, including unsolicited interaction enforcing Code Conduct, specified period time. includes avoiding interactions community spaces well external channels like social media. Violating terms may lead temporary permanent ban.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"id_3-temporary-ban","dir":"","previous_headings":"Enforcement Guidelines","what":"3. Temporary Ban","title":"Contributor Covenant Code of Conduct","text":"Community Impact: serious violation community standards, including sustained inappropriate behavior. Consequence: temporary ban sort interaction public communication community specified period time. public private interaction people involved, including unsolicited interaction enforcing Code Conduct, allowed period. Violating terms may lead permanent ban.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"id_4-permanent-ban","dir":"","previous_headings":"Enforcement Guidelines","what":"4. Permanent Ban","title":"Contributor Covenant Code of Conduct","text":"Community Impact: Demonstrating pattern violation community standards, including sustained inappropriate behavior, harassment individual, aggression toward disparagement classes individuals. Consequence: permanent ban sort public interaction within community.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"attribution","dir":"","previous_headings":"","what":"Attribution","title":"Contributor Covenant Code of Conduct","text":"Code Conduct adapted Contributor Covenant, version 2.0, available https://www.contributor-covenant.org/version/2/0/ code_of_conduct.html. 
Community Impact Guidelines inspired Mozilla’s code conduct enforcement ladder. answers common questions code conduct, see FAQ https://www.contributor-covenant.org/faq. Translations available https:// www.contributor-covenant.org/translations.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":null,"dir":"","previous_headings":"","what":"Contributing to mikropml","title":"Contributing to mikropml","text":"document adapted Tidyverse Contributing guide.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"fixing-typos","dir":"","previous_headings":"","what":"Fixing typos","title":"Contributing to mikropml","text":"can fix typos, spelling mistakes, grammatical errors documentation directly using GitHub web interface, long changes made source file. generally means ’ll need edit roxygen2 comments .R, .Rd file. can find .R file generates .Rd reading comment first line.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"bigger-changes","dir":"","previous_headings":"","what":"Bigger changes","title":"Contributing to mikropml","text":"want make bigger change, ’s good idea first file issue make sure someone team agrees ’s needed. ’ve found bug, please file issue illustrates bug minimal reprex (also help write unit test, needed).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"pull-request-process","dir":"","previous_headings":"Bigger changes","what":"Pull request process","title":"Contributing to mikropml","text":"Fork package clone onto computer. haven’t done , recommend using usethis::create_from_github(\"SchlossLab/mikropml\", fork = TRUE). Install development dependences devtools::install_dev_deps(), make sure package passes R CMD check running devtools::check(). R CMD check doesn’t pass cleanly, ’s good idea ask help continuing. Create Git branch pull request (PR). recommend using usethis::pr_init(\"brief-description--change\"). Make changes, commit git, create PR running usethis::pr_push(), following prompts browser. title PR briefly describe change. body PR contain Fixes #issue-number. user-facing changes, add bullet top NEWS.md (.e. just first header). Follow style described https://style.tidyverse.org/news.html.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"code-style","dir":"","previous_headings":"Bigger changes","what":"Code style","title":"Contributing to mikropml","text":"New code follow tidyverse style guide. can use styler package apply styles, please don’t restyle code nothing PR. use roxygen2, Markdown syntax, documentation. use testthat unit tests. Contributions test cases included easier accept.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"code-of-conduct","dir":"","previous_headings":"","what":"Code of Conduct","title":"Contributing to mikropml","text":"Please note mikropml project released Contributor Code Conduct. contributing project agree abide terms.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/LICENSE.html","id":null,"dir":"","previous_headings":"","what":"MIT License","title":"MIT License","text":"Copyright (c) 2019-2021 Begüm D. Topçuoğlu, Zena Lapp, Kelly L. Sovacool, Evan Snitkin, Jenna Wiens, Patrick D. 
Schloss Permission hereby granted, free charge, person obtaining copy software associated documentation files (“Software”), deal Software without restriction, including without limitation rights use, copy, modify, merge, publish, distribute, sublicense, /sell copies Software, permit persons Software furnished , subject following conditions: copyright notice permission notice shall included copies substantial portions Software. SOFTWARE PROVIDED “”, WITHOUT WARRANTY KIND, EXPRESS IMPLIED, INCLUDING LIMITED WARRANTIES MERCHANTABILITY, FITNESS PARTICULAR PURPOSE NONINFRINGEMENT. EVENT SHALL AUTHORS COPYRIGHT HOLDERS LIABLE CLAIM, DAMAGES LIABILITY, WHETHER ACTION CONTRACT, TORT OTHERWISE, ARISING , CONNECTION SOFTWARE USE DEALINGS SOFTWARE.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":null,"dir":"","previous_headings":"","what":"Getting help with mikropml","title":"Getting help with mikropml","text":"Thanks using mikropml! filing issue, places explore pieces put together make process smooth possible.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"make-a-reprex","dir":"","previous_headings":"","what":"Make a reprex","title":"Getting help with mikropml","text":"Start making minimal reproducible example using reprex package. haven’t heard used reprex , ’re treat! Seriously, reprex make R-question-asking endeavors easier (pretty insane ROI five ten minutes ’ll take learn ’s ). additional reprex pointers, check Get help! section tidyverse site.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"where-to-ask","dir":"","previous_headings":"","what":"Where to ask?","title":"Getting help with mikropml","text":"Armed reprex, next step figure ask. ’s question: start community.rstudio.com, /StackOverflow. people answer questions. ’s bug: ’re right place, file issue. ’re sure: let community help figure ! problem bug feature request, can easily return report . opening new issue, sure search issues pull requests make sure bug hasn’t reported /already fixed development version. default, search pre-populated :issue :open. can edit qualifiers (e.g. :pr, :closed) needed. example, ’d simply remove :open search issues repo, open closed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"what-happens-next","dir":"","previous_headings":"","what":"What happens next?","title":"Getting help with mikropml","text":"efficient possible, development tidyverse packages tends bursty, shouldn’t worry don’t get immediate response. Typically don’t look repo sufficient quantity issues accumulates, ’s burst intense activity focus efforts. makes development efficient avoids expensive context switching problems, cost taking longer get back . process makes good reprex particularly important might multiple months initial report start working . can’t reproduce bug, can’t fix !","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"its-running-so-slow","dir":"Articles","previous_headings":"","what":"It’s running so slow!","title":"Introduction to mikropml","text":"Since assume lot won’t read entire vignette, ’m going say beginning. run_ml() function running super slow, consider parallelizing. 
See vignette(\"parallel\") examples.","code":""},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-input-data","dir":"Articles","previous_headings":"Understanding the inputs","what":"The input data","title":"Introduction to mikropml","text":"input data run_ml() dataframe row sample observation. One column (assumed first) outcome interest, columns features. package otu_mini_bin small example dataset mikropml. , dx outcome column (normal cancer), 10 features (Otu00001 Otu00010). 2 outcomes, performing binary classification majority examples . bottom, also briefly provide examples multi-class continuous outcomes. ’ll see, run way binary classification! feature columns amount Operational Taxonomic Unit (OTU) microbiome samples patients cancer without cancer. goal predict dx, stands diagnosis. diagnosis can cancer based individual’s microbiome. need understand exactly means, ’re interested can read original paper (Topçuoğlu et al. 2020). real machine learning applications ’ll need use features, purposes vignette ’ll stick example dataset everything runs faster.","code":"# install.packages(\"devtools\") # devtools::install_github(\"SchlossLab/mikropml\") library(mikropml) head(otu_mini_bin) #> dx Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 Otu00006 Otu00007 #> 1 normal 350 268 213 1 208 230 70 #> 2 normal 568 1320 13 293 671 103 48 #> 3 normal 151 756 802 556 145 271 57 #> 4 normal 299 30 1018 0 25 99 75 #> 5 normal 1409 174 0 3 2 1136 296 #> 6 normal 167 712 213 4 332 534 139 #> Otu00008 Otu00009 Otu00010 #> 1 230 235 64 #> 2 204 119 115 #> 3 176 37 710 #> 4 78 255 197 #> 5 1 537 533 #> 6 251 155 122"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-methods-we-support","dir":"Articles","previous_headings":"Understanding the inputs","what":"The methods we support","title":"Introduction to mikropml","text":"methods use supported great ML wrapper package caret, use train machine learning models. methods tested (backend packages) : Logistic/multiclass/linear regression (\"glmnet\") Random forest (\"rf\") Decision tree (\"rpart2\") Support vector machine radial basis kernel (\"svmRadial\") xgboost (\"xgbTree\") documentation methods, well many others, can look available models (see list tag). vetted models used caret, function general enough others might work. can’t promise can help models, feel free [start new discussion GitHub]https://github.com/SchlossLab/mikropml/discussions) questions models might able help. first focus glmnet, default implementation L2-regularized logistic regression. cover examples towards end.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"before-running-ml","dir":"Articles","previous_headings":"","what":"Before running ML","title":"Introduction to mikropml","text":"execute run_ml(), consider preprocessing data, either preprocess_data() function. can learn preprocessing vignette: vignette(\"preprocess\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-simplest-way-to-run_ml","dir":"Articles","previous_headings":"","what":"The simplest way to run_ml()","title":"Introduction to mikropml","text":"mentioned , minimal input dataset (dataset) machine learning model want use (method). may also want provide: outcome column name. default run_ml() pick first column, ’s best practice specify column name explicitly. seed results reproducible, get results see (.e train/test split). 
Say want use logistic regression, method use glmnet. , run ML pipeline : ’ll notice things: takes little run. parameters use. message stating ‘dx’ used outcome column. want, ’s nice sanity check! warning. Don’t worry warning right now - just means hyperparameters aren’t good fit - ’re interested learning , see vignette(\"tuning\"). Now, let’s dig output bit. results list 4 things: trained_model trained model caret. bunch info won’t get , can learn caret::train() documentation. test_data partition dataset used testing. machine learning, ’s always important held-test dataset used training stage. pipeline using run_ml() split data training testing sets. training data used build model (e.g. tune hyperparameters, learn data) test data used evaluate well model performs. performance dataframe (mainly) performance metrics (1 column cross-validation performance metric, several test performance metrics, 2 columns end ML method seed): using logistic regression binary classification, area receiver-operator characteristic curve (AUC) useful metric evaluate model performance. , ’s default use mikropml. However, crucial evaluate model performance using multiple metrics. can find information performance metrics use package. cv_metric_AUC AUC cross-validation folds training data. gives us sense well model performs training data. columns performance metrics test data — data wasn’t used build model. , can see AUC test data much 0.5, suggesting model predict much better chance, model overfit cross-validation AUC (cv_metric_AUC, measured training) much higher testing AUC. isn’t surprising since ’re using features example dataset, don’t discouraged. default option also provides number performance metrics might interested , including area precision-recall curve (prAUC). last columns results$performance method seed (set one) help combining results multiple runs (see vignette(\"parallel\")). feature_importance information feature importance values find_feature_importance = TRUE (default FALSE). 
Since used defaults, ’s nothing :","code":"results <- run_ml(otu_mini_bin, \"glmnet\", outcome_colname = \"dx\", seed = 2019 ) names(results) #> [1] \"trained_model\" \"test_data\" \"performance\" #> [4] \"feature_importance\" names(results$trained_model) #> [1] \"method\" \"modelInfo\" \"modelType\" \"results\" \"pred\" #> [6] \"bestTune\" \"call\" \"dots\" \"metric\" \"control\" #> [11] \"finalModel\" \"preProcess\" \"trainingData\" \"ptype\" \"resample\" #> [16] \"resampledCM\" \"perfNames\" \"maximize\" \"yLimits\" \"times\" #> [21] \"levels\" head(results$test_data) #> dx Otu00009 Otu00005 Otu00010 Otu00001 Otu00008 Otu00004 Otu00003 #> 9 normal 119 142 248 256 363 112 871 #> 14 normal 60 209 70 86 96 1 123 #> 16 cancer 205 5 180 1668 95 22 3 #> 17 normal 188 356 107 381 1035 915 315 #> 27 normal 4 21 161 7 1 27 8 #> 30 normal 13 166 5 31 33 5 58 #> Otu00002 Otu00007 Otu00006 #> 9 995 0 137 #> 14 426 54 40 #> 16 20 590 570 #> 17 357 253 341 #> 27 25 322 5 #> 30 179 6 30 results$performance #> # A tibble: 1 × 17 #> cv_metric_AUC logLoss AUC prAUC Accuracy Kappa F1 Sensi…¹ Speci…² Pos_P…³ #> #> 1 0.622 0.684 0.647 0.606 0.590 0.179 0.6 0.6 0.579 0.6 #> # … with 7 more variables: Neg_Pred_Value , Precision , Recall , #> # Detection_Rate , Balanced_Accuracy , method , seed , #> # and abbreviated variable names ¹​Sensitivity, ²​Specificity, ³​Pos_Pred_Value results$feature_importance #> [1] \"Skipped feature importance\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"customizing-parameters","dir":"Articles","previous_headings":"","what":"Customizing parameters","title":"Introduction to mikropml","text":"arguments allow change execute run_ml(). ’ve chosen reasonable defaults , encourage change think something else better data.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"changing-kfold-cv_times-and-training_frac","dir":"Articles","previous_headings":"Customizing parameters","what":"Changing kfold, cv_times, and training_frac","title":"Introduction to mikropml","text":"kfold: number folds run cross-validation (default: 5). cv_times: number times run repeated cross-validation (default: 100). training_frac: fraction data training set (default: 0.8). rest data used testing. ’s example change default parameters: might noticed one ran faster — ’s reduced kfold cv_times. okay testing things may even necessary smaller datasets. general may better larger numbers parameters; think defaults good starting point (Topçuoğlu et al. 2020).","code":"results_custom <- run_ml(otu_mini_bin, \"glmnet\", kfold = 2, cv_times = 5, training_frac = 0.5, seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... #> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. 
#> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"custom-training-indices","dir":"Articles","previous_headings":"Customizing parameters > Changing kfold, cv_times, and training_frac","what":"Custom training indices","title":"Introduction to mikropml","text":"training_frac fraction 0 1, random sample observations dataset chosen training set satisfy training_frac. However, cases might wish control exactly observations training set. can instead assign training_frac vector indices correspond rows dataset go training set (remaining sequences go testing set). ’s example ~80% data training set:","code":"n_obs <- otu_mini_bin %>% nrow() training_size <- 0.8 * n_obs training_rows <- sample(n_obs, training_size) results_custom_train <- run_ml(otu_mini_bin, \"glmnet\", kfold = 2, cv_times = 5, training_frac = training_rows, seed = 2019 ) #> Using 'dx' as the outcome column. #> Using the custom training set indices provided by `training_frac`. #> The fraction of data in the training set will be 0.8 #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"changing-the-performance-metric","dir":"Articles","previous_headings":"Customizing parameters","what":"Changing the performance metric","title":"Introduction to mikropml","text":"two arguments allow change performance metric use model evaluation, performance metrics calculate using test data. perf_metric_function function used calculate performance metrics. default classification caret::multiClassSummary() default regression caret::defaultSummary(). ’d suggest changing unless really know ’re . perf_metric_name column name output perf_metric_function. chose reasonable defaults (AUC binary, logLoss multiclass, RMSE continuous), default functions calculate bunch different performance metrics, can choose different one ’d like. default performance metrics available classification : default performance metrics available regression : ’s example using prAUC instead AUC: ’ll see cross-validation metric prAUC, instead default AUC:","code":"#> [1] \"logLoss\" \"AUC\" \"prAUC\" #> [4] \"Accuracy\" \"Kappa\" \"Mean_F1\" #> [7] \"Mean_Sensitivity\" \"Mean_Specificity\" \"Mean_Pos_Pred_Value\" #> [10] \"Mean_Neg_Pred_Value\" \"Mean_Precision\" \"Mean_Recall\" #> [13] \"Mean_Detection_Rate\" \"Mean_Balanced_Accuracy\" #> [1] \"RMSE\" \"Rsquared\" \"MAE\" results_pr <- run_ml(otu_mini_bin, \"glmnet\", cv_times = 5, perf_metric_name = \"prAUC\", seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... #> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. #> Training complete. 
results_pr$performance #> # A tibble: 1 × 17 #> cv_metric_p…¹ logLoss AUC prAUC Accur…² Kappa F1 Sensi…³ Speci…⁴ Pos_P…⁵ #> #> 1 0.577 0.691 0.663 0.605 0.538 0.0539 0.690 1 0.0526 0.526 #> # … with 7 more variables: Neg_Pred_Value , Precision , Recall , #> # Detection_Rate , Balanced_Accuracy , method , seed , #> # and abbreviated variable names ¹​cv_metric_prAUC, ²​Accuracy, ³​Sensitivity, #> # ⁴​Specificity, ⁵​Pos_Pred_Value"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"using-groups","dir":"Articles","previous_headings":"Customizing parameters","what":"Using groups","title":"Introduction to mikropml","text":"optional groups vector groups keep together splitting data train test sets cross-validation. Sometimes ’s important split data based grouping instead just randomly. allows control similarities within groups don’t want skew predictions (.e. batch effects). example, biological data may samples collected multiple hospitals, might like keep observations hospital partition. ’s example split data train/test sets based groups: one difference run_ml() report much data training set run code chunk. can little finicky depending many samples groups . won’t exactly specify training_frac, since include one group either training set test set.","code":"# make random groups set.seed(2019) grps <- sample(LETTERS[1:8], nrow(otu_mini_bin), replace = TRUE) results_grp <- run_ml(otu_mini_bin, \"glmnet\", cv_times = 2, training_frac = 0.8, groups = grps, seed = 2019 ) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.795 #> Groups in the training set: A B D F G H #> Groups in the testing set: C E #> Groups will be kept together in CV partitions #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"controlling-how-groups-are-assigned-to-partitions","dir":"Articles","previous_headings":"Customizing parameters > Using groups","what":"Controlling how groups are assigned to partitions","title":"Introduction to mikropml","text":"use groups parameter , default run_ml() assume want observations group placed partition train/test split. makes sense want use groups control batch effects. However, cases might prefer control exactly groups end partition, might even okay observations group assigned different partitions. example, say want groups B used training, C D testing, don’t preference happens groups. can give group_partitions parameter named list specify groups go training set go testing set. case, observations & B used training, C & D used testing, remaining groups randomly assigned one satisfy training_frac closely possible. another scenario, maybe want groups F used training, also want allow observations selected training F used testing: need even control , take look setting custom training indices. might also prefer provide train control scheme cross_val parameter run_ml().","code":"results_grp_part <- run_ml(otu_mini_bin, \"glmnet\", cv_times = 2, training_frac = 0.8, groups = grps, group_partitions = list( train = c(\"A\", \"B\"), test = c(\"C\", \"D\") ), seed = 2019 ) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.785 #> Groups in the training set: A B E F G H #> Groups in the testing set: C D #> Groups will not be kept together in CV partitions because the number of groups in the training set is not larger than `kfold` #> Training the model... #> Training complete. 
results_grp_trainA <- run_ml(otu_mini_bin, \"glmnet\", cv_times = 2, kfold = 2, training_frac = 0.5, groups = grps, group_partitions = list( train = c(\"A\", \"B\", \"C\", \"D\", \"E\", \"F\"), test = c(\"A\", \"B\", \"C\", \"D\", \"E\", \"F\", \"G\", \"H\") ), seed = 2019 ) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.5 #> Groups in the training set: A B C D E F #> Groups in the testing set: A B C D E F G H #> Groups will be kept together in CV partitions #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"more-arguments","dir":"Articles","previous_headings":"Customizing parameters","what":"More arguments","title":"Introduction to mikropml","text":"ML methods take optional arguments, ntree randomForest-based models case weights. additional arguments give run_ml() forwarded along caret::train() can leverage options.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"case-weights","dir":"Articles","previous_headings":"Customizing parameters > More arguments","what":"Case weights","title":"Introduction to mikropml","text":"want use case weights, also need use custom indices training data (.e. perform partition run_ml() ). ’s one way weights calculated proportion class data set, ~70% data training set: See caret docs list models accept case weights.","code":"library(dplyr) case_weights_dat <- otu_mini_bin %>% count(dx) %>% mutate(p = n / sum(n)) %>% select(dx, p) %>% right_join(otu_mini_bin, by = \"dx\") %>% select(-starts_with(\"Otu\")) %>% mutate( in_train = sample(c(TRUE, FALSE), size = nrow(otu_mini_bin), replace = TRUE, prob = c(0.70, 0.30) ), row_num = row_number() ) %>% filter(in_train) head(case_weights_dat) #> dx p in_train row_num #> 1 cancer 0.49 TRUE 3 #> 2 cancer 0.49 TRUE 4 #> 3 cancer 0.49 TRUE 5 #> 4 cancer 0.49 TRUE 6 #> 5 cancer 0.49 TRUE 8 #> 6 cancer 0.49 TRUE 9 nrow(case_weights_dat) / nrow(otu_mini_bin) #> [1] 0.75 results_weighted <- run_ml(otu_mini_bin, \"glmnet\", outcome_colname = \"dx\", seed = 2019, training_frac = case_weights_dat %>% pull(row_num), weights = case_weights_dat %>% pull(p) )"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"finding-feature-importance","dir":"Articles","previous_headings":"","what":"Finding feature importance","title":"Introduction to mikropml","text":"find features contributing predictive power, can use find_feature_importance = TRUE. use permutation importance determine feature importance described (Topçuoğlu et al. 2020). Briefly, permutes features individually (correlated ones together) evaluates much performance metric decreases. performance decreases feature randomly shuffled, important feature . default FALSE takes run useful want know features important predicting outcome. Let’s look feature importance results: Now, can check feature importances: several columns: perf_metric: performance value permuted feature. perf_metric_diff: difference performance actual permuted data (.e. test performance minus permuted performance). Features larger perf_metric_diff important. pvalue: probability obtaining actual performance value null hypothesis. names: feature permuted. method: ML method used. perf_metric_name: performance metric used. seed: seed (set). can see , differences negligible (close zero), makes sense since model isn’t great. 
’re interested feature importance, ’s especially useful run multiple different train/test splits, shown example snakemake workflow. can also choose permute correlated features together using corr_thresh (default: 1). features correlation threshold permuted together; .e. perfectly correlated features permuted together using default value. can see features permuted together names column. 3 features permuted together (doesn’t really make sense, ’s just example). previously executed run_ml() without feature importance now wish find feature importance fact, see example code get_feature_importance() documentation. get_feature_importance() can show live progress bar, see vignette(\"parallel\") examples.","code":"results_imp <- run_ml(otu_mini_bin, \"rf\", outcome_colname = \"dx\", find_feature_importance = TRUE, seed = 2019 ) results_imp$feature_importance #> perf_metric perf_metric_diff pvalue names method perf_metric_name #> 1 0.5459125 0.0003375 0.51485149 Otu00001 rf AUC #> 2 0.5682625 -0.0220125 0.73267327 Otu00002 rf AUC #> 3 0.5482875 -0.0020375 0.56435644 Otu00003 rf AUC #> 4 0.6314375 -0.0851875 1.00000000 Otu00004 rf AUC #> 5 0.4991750 0.0470750 0.08910891 Otu00005 rf AUC #> 6 0.5364875 0.0097625 0.28712871 Otu00006 rf AUC #> 7 0.5382875 0.0079625 0.39603960 Otu00007 rf AUC #> 8 0.5160500 0.0302000 0.09900990 Otu00008 rf AUC #> 9 0.5293375 0.0169125 0.17821782 Otu00009 rf AUC #> 10 0.4976500 0.0486000 0.12871287 Otu00010 rf AUC #> seed #> 1 2019 #> 2 2019 #> 3 2019 #> 4 2019 #> 5 2019 #> 6 2019 #> 7 2019 #> 8 2019 #> 9 2019 #> 10 2019 results_imp_corr <- run_ml(otu_mini_bin, \"glmnet\", cv_times = 5, find_feature_importance = TRUE, corr_thresh = 0.2, seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... #> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. #> Training complete. #> Finding feature importance... #> Feature importance complete. results_imp_corr$feature_importance #> perf_metric perf_metric_diff pvalue #> 1 0.4941842 0.1531842 0.05940594 #> names #> 1 Otu00001|Otu00002|Otu00003|Otu00004|Otu00005|Otu00006|Otu00007|Otu00008|Otu00009|Otu00010 #> method perf_metric_name seed #> 1 glmnet AUC 2019"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"tuning-hyperparameters-using-the-hyperparameter-argument","dir":"Articles","previous_headings":"","what":"Tuning hyperparameters (using the hyperparameter argument)","title":"Introduction to mikropml","text":"important, whole vignette . bottom line provide default hyperparameters can start , ’s important tune hyperparameters. information default hyperparameters , tune hyperparameters, see vignette(\"tuning\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"other-models","dir":"Articles","previous_headings":"","what":"Other models","title":"Introduction to mikropml","text":"examples train evaluate models. 
output similar, won’t go details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"random-forest","dir":"Articles","previous_headings":"Other models","what":"Random forest","title":"Introduction to mikropml","text":"rf engine takes optional argument ntree: number trees use random forest. can’t tuned using rf package implementation random forest. Please refer caret documentation interested packages random forest implementations.","code":"results_rf <- run_ml(otu_mini_bin, \"rf\", cv_times = 5, seed = 2019 ) results_rf_nt <- run_ml(otu_mini_bin, \"rf\", cv_times = 5, ntree = 1000, seed = 2019 )"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"decision-tree","dir":"Articles","previous_headings":"Other models","what":"Decision tree","title":"Introduction to mikropml","text":"","code":"results_dt <- run_ml(otu_mini_bin, \"rpart2\", cv_times = 5, seed = 2019 )"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"svm","dir":"Articles","previous_headings":"Other models","what":"SVM","title":"Introduction to mikropml","text":"get message “maximum number iterations reached”, see issue caret.","code":"results_svm <- run_ml(otu_mini_bin, \"svmRadial\", cv_times = 5, seed = 2019 )"},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"multiclass-data","dir":"Articles","previous_headings":"Other data","what":"Multiclass data","title":"Introduction to mikropml","text":"provide otu_mini_multi multiclass outcome (three outcomes): ’s example running multiclass data: performance metrics slightly different, format everything else :","code":"otu_mini_multi %>% dplyr::pull(\"dx\") %>% unique() #> [1] \"adenoma\" \"carcinoma\" \"normal\" results_multi <- run_ml(otu_mini_multi, outcome_colname = \"dx\", seed = 2019 ) results_multi$performance #> # A tibble: 1 × 17 #> cv_metric…¹ logLoss AUC prAUC Accur…² Kappa Mean_F1 Mean_…³ Mean_…⁴ Mean_…⁵ #> #> 1 1.07 1.11 0.506 0.353 0.382 0.0449 NA 0.360 0.682 NaN #> # … with 7 more variables: Mean_Neg_Pred_Value , Mean_Precision , #> # Mean_Recall , Mean_Detection_Rate , Mean_Balanced_Accuracy , #> # method , seed , and abbreviated variable names #> # ¹​cv_metric_logLoss, ²​Accuracy, ³​Mean_Sensitivity, ⁴​Mean_Specificity, #> # ⁵​Mean_Pos_Pred_Value"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"continuous-data","dir":"Articles","previous_headings":"Other data","what":"Continuous data","title":"Introduction to mikropml","text":"’s example running continuous data, outcome column numerical: , performance metrics slightly different, format rest :","code":"results_cont <- run_ml(otu_mini_bin[, 2:11], \"glmnet\", outcome_colname = \"Otu00001\", seed = 2019 ) results_cont$performance #> # A tibble: 1 × 6 #> cv_metric_RMSE RMSE Rsquared MAE method seed #> #> 1 622. 731. 0.0893 472. glmnet 2019"},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"summary","dir":"Articles","previous_headings":"","what":"Summary","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Machine learning (ML) classification prediction based set features used make decisions healthcare, economics, criminal justice . However, implementing ML pipeline including preprocessing, model selection, evaluation can time-consuming, confusing, difficult. 
, present mikropml (pronounced “meek-ROPE em el”), easy--use R package implements ML pipelines using regression, support vector machines, decision trees, random forest, gradient-boosted trees. package available GitHub, CRAN, conda.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"statement-of-need","dir":"Articles","previous_headings":"","what":"Statement of need","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"applications machine learning (ML) require reproducible steps data pre-processing, cross-validation, testing, model evaluation, often interpretation model makes particular predictions. Performing steps important, failure implement can result incorrect misleading results (Teschendorff 2019; Wiens et al. 2019). Supervised ML widely used recognize patterns large datasets make predictions outcomes interest. Several packages including caret (Kuhn 2008) tidymodels (Kuhn, Wickham, RStudio 2020) R, scikitlearn (Pedregosa et al. 2011) Python, H2O autoML platform (H2O.ai 2020) allow scientists train ML models variety algorithms. packages provide tools necessary ML step, implement complete ML pipeline according good practices literature. makes difficult practitioners new ML easily begin perform ML analyses. enable broader range researchers apply ML problem domains, created mikropml, easy--use R package (R Core Team 2020) implements ML pipeline created Topçuoğlu et al. (Topçuoğlu et al. 2020) single function returns trained model, model performance metrics feature importance. mikropml leverages caret package support several ML algorithms: linear regression, logistic regression, support vector machines radial basis kernel, decision trees, random forest, gradient boosted trees. incorporates good practices ML training, testing, model evaluation (Topçuoğlu et al. 2020; Teschendorff 2019). Furthermore, provides data preprocessing steps based FIDDLE (FlexIble Data-Driven pipeLinE) framework outlined Tang et al. (Tang et al. 2020) post-training permutation importance steps estimate importance feature models trained (Breiman 2001; Fisher, Rudin, Dominici 2018). mikropml can used starting point application ML datasets many different fields. already applied microbiome data categorize patients colorectal cancer (Topçuoğlu et al. 2020), identify differences genomic clinical features associated bacterial infections (Lapp et al. 2020), predict gender-based biases academic publishing (Hagan et al. 2020).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"mikropml-package","dir":"Articles","previous_headings":"","what":"mikropml package","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml package includes functionality preprocess data, train ML models, evaluate model performance, quantify feature importance (Figure 1). also provide vignettes example Snakemake workflow (Köster Rahmann 2012) showcase run ideal ML pipeline multiple different train/test data splits. results can visualized using helper functions use ggplot2 (Wickham 2016). mikropml allows users get started quickly facilitates reproducibility, replacement understanding ML workflow still necessary interpreting results (Pollard et al. 2019). 
facilitate understanding enable one tailor code application, heavily commented code provided supporting documentation can read online.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"preprocessing-data","dir":"Articles","previous_headings":"mikropml package","what":"Preprocessing data","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"provide function preprocess_data() preprocess features using several different functions caret package. preprocess_data() takes continuous categorical data, re-factors categorical data binary features, provides options normalize continuous data, remove features near-zero variance, keep one instance perfectly correlated features. set default options based implemented FIDDLE (Tang et al. 2020). details use preprocess_data() can found accompanying vignette.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"running-ml","dir":"Articles","previous_headings":"mikropml package","what":"Running ML","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"main function mikropml, run_ml(), minimally takes model choice data frame outcome column feature columns. model choice, mikropml currently supports logistic linear regression (glmnet: Friedman, Hastie, Tibshirani 2010), support vector machines radial basis kernel (kernlab: Karatzoglou et al. 2004), decision trees (rpart: Therneau et al. 2019), random forest (randomForest: Liaw Wiener 2002), gradient-boosted trees (xgboost: Chen et al. 2020). run_ml() randomly splits data train test sets maintaining distribution outcomes found full dataset. also provides option split data train test sets based categorical variables (e.g. batch, geographic location, etc.). mikropml uses caret package (Kuhn 2008) train evaluate models, optionally quantifies feature importance. output includes best model built based tuning hyperparameters internal repeated cross-validation step, model evaluation metrics, optional feature importances. Feature importances calculated using permutation test, breaks relationship feature true outcome test data, measures change model performance. provides intuitive metric individual features influence model performance comparable across model types, particularly useful model interpretation (Topçuoğlu et al. 2020). introductory vignette contains comprehensive tutorial use run_ml(). mikropml pipeline","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"ideal-workflow-for-running-mikropml-with-many-different-traintest-splits","dir":"Articles","previous_headings":"mikropml package","what":"Ideal workflow for running mikropml with many different train/test splits","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"investigate variation model performance depending train test set used (Topçuoğlu et al. 2020; Lapp et al. 2020), provide examples run_ml() many times different train/test splits get summary information model performance local computer high-performance computing cluster using Snakemake workflow.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"tuning-visualization","dir":"Articles","previous_headings":"mikropml package","what":"Tuning & visualization","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"One particularly important aspect ML hyperparameter tuning. 
We provide a reasonable range of default hyperparameters for each model type. However, practitioners should explore whether that range is appropriate for their data, or customize the hyperparameter range themselves. Therefore, we provide a function plot_hp_performance() to plot the cross-validation performance metric of a single model or of models built using different train/test splits. This helps evaluate whether the hyperparameter range was searched exhaustively and allows the user to pick an ideal set. We also provide summary plots of test performance metrics for the many train/test splits with different models using plot_model_performance(). Examples are described in the accompanying vignette on hyperparameter tuning.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"dependencies","dir":"Articles","previous_headings":"mikropml package","what":"Dependencies","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml is written in R (R Core Team 2020) and depends on several packages: dplyr (Wickham et al. 2020), rlang (Henry, Wickham, and RStudio 2020), and caret (Kuhn 2008). The ML algorithms supported by mikropml require: glmnet (Friedman, Hastie, and Tibshirani 2010), e1071 (Meyer et al. 2020), and MLmetrics (Yan 2016) for logistic regression, rpart2 (Therneau et al. 2019) for decision trees, randomForest (Liaw and Wiener 2002) for random forest, xgboost (Chen et al. 2020) for gradient-boosted trees, and kernlab (Karatzoglou et al. 2004) for support vector machines. We also allow for parallelization of cross-validation and other steps using the foreach, doFuture, future.apply, and future packages (Bengtsson and Team 2020). Finally, we use ggplot2 for plotting (Wickham 2016).","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"acknowledgments","dir":"Articles","previous_headings":"","what":"Acknowledgments","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"We thank members of the Schloss Lab who participated in code clubs related to the initial development of the pipeline, made documentation improvements, and provided general feedback. We also thank Nick Lesniak for designing the mikropml logo. We thank the US Research Software Sustainability Institute (NSF #1743188) for providing training to KLS at the Winter School in Research Software Engineering.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"funding","dir":"Articles","previous_headings":"","what":"Funding","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Salary support for PDS came from NIH grant 1R01CA215574. KLS received support from the NIH Training Program in Bioinformatics (T32 GM070449). ZL received support from the National Science Foundation Graduate Research Fellowship Program under Grant No. DGE 1256260. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"author-contributions","dir":"Articles","previous_headings":"","what":"Author contributions","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"BDT, ZL, and KLS contributed equally. Author order among the co-first authors was determined by time since joining the project. BDT, ZL, and KLS conceptualized the study and wrote the code. KLS structured the code in R package form. BDT, ZL, JW, and PDS developed the methodology. PDS, ES, and JW supervised the project. BDT, ZL, and KLS wrote the original draft.
All authors reviewed and edited the manuscript.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"conflicts-of-interest","dir":"Articles","previous_headings":"","what":"Conflicts of interest","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"None.","code":""},{"path":[]},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"speed-up-single-runs","dir":"Articles","previous_headings":"","what":"Speed up single runs","title":"Parallel processing","text":"By default, preprocess_data(), run_ml(), and compare_models() use only one process in series. If you’d like to parallelize various steps of the pipeline to make them run faster, install foreach, future, future.apply, and doFuture. Then, register a future plan prior to calling these functions: Above, we used the multicore plan to split the work across 2 cores. See the future documentation for picking the best plan for your use case. Notably, multicore does not work inside RStudio or on Windows; you will need to use multisession instead in those cases. After registering a future plan, you can call preprocess_data() and run_ml() as usual, and they will run certain tasks in parallel. There’s also a parallel version of the rf engine called parRF which trains the trees in the forest in parallel. See the caret docs for more information.","code":"doFuture::registerDoFuture() future::plan(future::multicore, workers = 2) otu_data_preproc <- preprocess_data(otu_mini_bin, \"dx\")$dat_transformed #> Using 'dx' as the outcome column. result1 <- run_ml(otu_data_preproc, \"glmnet\") #> Using 'dx' as the outcome column. #> Training the model... #> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Training complete."},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"call-run_ml-multiple-times-in-parallel-in-r","dir":"Articles","previous_headings":"","what":"Call run_ml() multiple times in parallel in R","title":"Parallel processing","text":"You can use functions from the future.apply package to call run_ml() multiple times in parallel with different parameters. You will first need to run future::plan() as above if you haven’t already. Then, call run_ml() with multiple seeds using future_lapply(): Each call to run_ml() with a different seed uses a different random split of the data into training and testing sets. Since we are using seeds, we must set future.seed to TRUE (see the future.apply documentation and this blog post for details on parallel-safe random seeds). This example uses only a few seeds for speed and simplicity, but for real data we recommend using many more seeds to get a better estimate of model performance. In these examples, we used functions from the future.apply package to run run_ml() in parallel, but you can accomplish the same thing with parallel versions of the purrr::map() functions using the furrr package (e.g. furrr::future_map_dfr()). Extract the performance results and combine into one dataframe for all seeds:","code":"# NOTE: use more seeds for real-world data results_multi <- future.apply::future_lapply(seq(100, 102), function(seed) { run_ml(otu_data_preproc, \"glmnet\", seed = seed) }, future.seed = TRUE) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete.
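# Hedged aside (not part of the original vignette output): the multicore plan
# used above is unavailable on Windows and inside RStudio. In those cases, a
# multisession plan is the drop-in replacement:
# future::plan(future::multisession, workers = 2)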
perf_df <- future.apply::future_lapply(results_multi, function(result) { result[[\"performance\"]] %>% select(cv_metric_AUC, AUC, method) }, future.seed = TRUE ) %>% dplyr::bind_rows() perf_df #> # A tibble: 3 × 3 #> cv_metric_AUC AUC method #> #> 1 0.630 0.634 glmnet #> 2 0.591 0.608 glmnet #> 3 0.671 0.471 glmnet"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"multiple-ml-methods","dir":"Articles","previous_headings":"Call run_ml() multiple times in parallel in R","what":"Multiple ML methods","title":"Parallel processing","text":"You may also wish to compare the performance of different ML methods. mapply() can iterate over multiple lists or vectors, and future_mapply() works the same way: Extract and combine the results for all seeds and methods:","code":"# NOTE: use more seeds for real-world data param_grid <- expand.grid( seeds = seq(100, 102), methods = c(\"glmnet\", \"rf\") ) results_mtx <- future.apply::future_mapply( function(seed, method) { run_ml(otu_data_preproc, method, seed = seed, find_feature_importance = TRUE) }, param_grid$seeds, param_grid$methods %>% as.character(), future.seed = TRUE ) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Finding feature importance... #> Feature importance complete. perf_df <- lapply( results_mtx[\"performance\", ], function(x) { x %>% select(cv_metric_AUC, AUC, method) } ) %>% dplyr::bind_rows() feat_df <- results_mtx[\"feature_importance\", ] %>% dplyr::bind_rows()"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"visualize-the-results","dir":"Articles","previous_headings":"Call run_ml() multiple times in parallel in R","what":"Visualize the results","title":"Parallel processing","text":"ggplot2 is required to use our plotting functions below. You can also create plots however you like using the results.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"performance","dir":"Articles","previous_headings":"Call run_ml() multiple times in parallel in R > Visualize the results","what":"Performance","title":"Parallel processing","text":"plot_model_performance() returns a ggplot2 object, so you can add layers to customize the plot:","code":"perf_boxplot <- plot_model_performance(perf_df) perf_boxplot perf_boxplot + theme_classic() + scale_color_brewer(palette = \"Dark2\") + coord_flip()"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"feature-importance","dir":"Articles","previous_headings":"Call run_ml() multiple times in parallel in R > Visualize the results","what":"Feature importance","title":"Parallel processing","text":"The perf_metric_diff column of the feature importance data frame contains the differences between the performance on the actual test data and the performance on the permuted test data (i.e. test minus permuted).
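To make that sign convention concrete, here is a minimal sketch with made-up numbers (not from the vignette output):
auc_test <- 0.80      # performance on the real (unpermuted) test data
auc_permuted <- 0.72  # performance after permuting one feature
auc_test - auc_permuted  # perf_metric_diff = 0.08 > 0: permuting hurt the model, so the feature matters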
If a feature is important for model performance, we expect perf_metric_diff to be positive. In other words, the features that resulted in the largest decrease in performance when permuted are the most important features. You can select the top n most important features for your models and plot them like so: See the docs for get_feature_importance() for more details on how these values are computed.","code":"top_n <- 5 top_feats <- feat_df %>% group_by(method, names) %>% summarize(median_diff = median(perf_metric_diff)) %>% filter(median_diff > 0) %>% slice_max(order_by = median_diff, n = top_n) #> `summarise()` has grouped output by 'method'. You can override using the #> `.groups` argument. feat_df %>% right_join(top_feats, by = c(\"method\", \"names\")) %>% mutate(features = factor(names, levels = rev(unique(top_feats$names)))) %>% ggplot(aes(x = perf_metric_diff, y = features, color = method)) + geom_boxplot() + theme_bw()"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"live-progress-updates","dir":"Articles","previous_headings":"","what":"Live progress updates","title":"Parallel processing","text":"preprocess_data() and get_feature_importance() support reporting live progress updates using the progressr package. The format is up to you, but we recommend using a progress bar like this: Note that some future backends support “near-live” progress updates, meaning progress may not be reported immediately when parallel processing with futures. Read more in the progressr vignette. To customize the format of the progress updates, see the progressr docs.","code":"# optionally, specify the progress bar format with the `progress` package. progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0 )) # tell progressr to always report progress in any functions that use it. # set this to FALSE to turn it back off again. progressr::handlers(global = TRUE) # run your code and watch the live progress updates. dat <- preprocess_data(otu_mini_bin, \"dx\")$dat_transformed #> Using 'dx' as the outcome column. #> preprocessing ========================>------- 78% | elapsed: 1s | eta: 0s results <- run_ml(dat, \"glmnet\", kfold = 2, cv_times = 2, find_feature_importance = TRUE ) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Feature importance =========================== 100% | elapsed: 37s | eta: 0s"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"parallelizing-with-snakemake","dir":"Articles","previous_headings":"","what":"Parallelizing with Snakemake","title":"Parallel processing","text":"When parallelizing multiple calls to run_ml() in R as in the examples above, all of the results objects are held in memory. This isn’t a big deal for a small dataset run with a few seeds. However, for large datasets run in parallel with, say, 100 seeds (recommended), you may run into problems trying to store all of those objects in memory at once. Using a workflow manager such as Snakemake or Nextflow is highly recommended to maximize the scalability and reproducibility of computational analyses. We created a template Snakemake workflow that you can use as a starting point for your ML project.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"its-running-so-slow","dir":"Articles","previous_headings":"","what":"It’s running so slow!","title":"Preprocessing data","text":"Since I assume a lot of you won’t read this entire vignette, I’m going to say this at the beginning: if the preprocess_data() function is running super slow, consider parallelizing it so it goes faster! preprocess_data() can also report live progress updates.
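As a minimal sketch (assuming the future and doFuture packages are installed), parallelizing preprocess_data() only requires registering a plan first:
doFuture::registerDoFuture()
future::plan(future::multisession, workers = 2)
# preprocess_data() will now run certain steps in parallel
dat <- preprocess_data(otu_mini_bin, "dx")$dat_transformed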
See vignette(\"parallel\") details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"examples","dir":"Articles","previous_headings":"","what":"Examples","title":"Preprocessing data","text":"’re going start simple get complicated, want whole shebang , just scroll bottom. First, load mikropml:","code":"library(mikropml)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"binary-data","dir":"Articles","previous_headings":"Examples","what":"Binary data","title":"Preprocessing data","text":"Let’s start binary variables: addition dataframe , provide name outcome column preprocess_data(). ’s preprocessed data looks like: output list: dat_transformed transformed data, grp_feats list grouped features, removed_feats list features removed. , grp_feats NULL perfectly correlated features (e.g. c(0,1,0) c(0,1,0), c(0,1,0) c(1,0,1) - see details). first column (var1) dat_transformed character changed var1_yes zeros () ones (yes). values second column (var2) stay ’s already binary, name changes var2_1. third column (var3) factor also changed binary b 1 0, denoted new column name var3_b.","code":"# raw binary dataset bin_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 1), var3 = factor(c(\"a\", \"a\", \"b\")) ) bin_df #> outcome var1 var2 var3 #> 1 normal no 0 a #> 2 normal yes 1 a #> 3 cancer no 1 b # preprocess raw binary data preprocess_data(dataset = bin_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_yes var2_1 var3_b #> #> 1 normal 0 0 0 #> 2 normal 1 1 0 #> 3 cancer 0 1 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"categorical-data","dir":"Articles","previous_headings":"Examples","what":"Categorical data","title":"Preprocessing data","text":"non-binary categorical data: can see, variable split 3 different columns - one type (, b, c). , grp_feats NULL.","code":"# raw categorical dataset cat_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"a\", \"b\", \"c\") ) cat_df #> outcome var1 #> 1 normal a #> 2 normal b #> 3 cancer c # preprocess raw categorical data preprocess_data(dataset = cat_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_a var1_b var1_c #> #> 1 normal 1 0 0 #> 2 normal 0 1 0 #> 3 cancer 0 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"continuous-data","dir":"Articles","previous_headings":"Examples","what":"Continuous data","title":"Preprocessing data","text":"Now, looking continuous variables: Wow! numbers change? default normalize data using \"center\" \"scale\". often best practice, may want normalize data, may want normalize data different way. don’t want normalize data, can use method=NULL: can also normalize data different ways. can choose method supported method argument caret::preProcess() (see caret::preProcess() docs details). Note methods applied continuous variables. 
Another feature of preprocess_data() is that if you provide continuous variables as characters, they will be converted to numeric: If you don’t want this to happen, and you want character data to remain character data even if it can be converted to numeric, you can use to_numeric=FALSE and they will be kept as categorical: As you can see from the output, in this case the features are treated as groups rather than numbers (e.g. they are not normalized).","code":"# raw continuous dataset cont_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(1, 2, 3) ) cont_df #> outcome var1 #> 1 normal 1 #> 2 normal 2 #> 3 cancer 3 # preprocess raw continuous data preprocess_data(dataset = cont_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome var1 #> #> 1 normal -1 #> 2 normal 0 #> 3 cancer 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0) # preprocess raw continuous data, no normalization preprocess_data(dataset = cont_df, outcome_colname = \"outcome\", method = NULL) # raw continuous dataset as characters cont_char_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"1\", \"2\", \"3\") ) cont_char_df #> outcome var1 #> 1 normal 1 #> 2 normal 2 #> 3 cancer 3 # preprocess raw continuous character data as numeric preprocess_data(dataset = cont_char_df, outcome_colname = \"outcome\") # preprocess raw continuous character data as characters preprocess_data(dataset = cont_char_df, outcome_colname = \"outcome\", to_numeric = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_1 var1_2 var1_3 #> #> 1 normal 1 0 0 #> 2 normal 0 1 0 #> 3 cancer 0 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"collapse-perfectly-correlated-features","dir":"Articles","previous_headings":"Examples","what":"Collapse perfectly correlated features","title":"Preprocessing data","text":"By default, preprocess_data() collapses features that are perfectly positively or negatively correlated. This is because having multiple copies of those features does not add information to machine learning, and it makes run_ml faster. As you can see, we end up with one variable, as all 3 were grouped together. Also, the second element of the list is no longer NULL. Instead, it tells you that grp1 contains var1, var2, and var3. If you want to group only positively correlated features, but not negatively correlated features (e.g. for interpretability, or for another downstream application), you can do that by using group_neg_corr=FALSE: Here, var3 is kept on its own because it’s negatively correlated with var1 and var2. You can also choose to keep all features separate, even if they are perfectly correlated, by using collapse_corr_feats=FALSE: In this case, grp_feats will always be NULL.","code":"# raw correlated dataset corr_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 0), var3 = c(1, 0, 1) ) corr_df #> outcome var1 var2 var3 #> 1 normal no 0 1 #> 2 normal yes 1 0 #> 3 cancer no 0 1 # preprocess raw correlated dataset preprocess_data(dataset = corr_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome grp1 #> #> 1 normal 0 #> 2 normal 1 #> 3 cancer 0 #> #> $grp_feats #> $grp_feats$grp1 #> [1] \"var1_yes\" \"var3_1\" #> #> #> $removed_feats #> [1] \"var2\" # preprocess raw correlated dataset; don't group negatively correlated features preprocess_data(dataset = corr_df, outcome_colname = \"outcome\", group_neg_corr = FALSE) #> Using 'outcome' as the outcome column.
#> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var3_1 #> #> 1 normal 0 1 #> 2 normal 1 0 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\" # preprocess raw correlated dataset; don't collapse correlated features preprocess_data(dataset = corr_df, outcome_colname = \"outcome\", collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var3_1 #> #> 1 normal 0 1 #> 2 normal 1 0 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"data-with-near-zero-variance","dir":"Articles","previous_headings":"Examples","what":"Data with near-zero variance","title":"Preprocessing data","text":"What if the data have variables that are all zero, or all “no”? Those ones won’t contribute any information, so we remove them: Here, var3, var4, and var5 have no variability, so those variables are removed during preprocessing: You can read the caret::preProcess() documentation for more information. By default, we remove features with “near-zero variance” (remove_var='nzv'). This uses the default arguments from caret::nearZeroVar(). However, particularly with smaller datasets, you might not want to remove features with near-zero variance. If you want to remove only features with zero variance, you can use remove_var='zv': If you want to include all features, you can use the argument remove_var=NULL. For this to work, you cannot collapse correlated features (otherwise it errors out because of the underlying caret function we use). If you want to be more nuanced in how you remove near-zero variance features (e.g. change the default 10% cutoff for the percentage of distinct values out of the total number of samples), you can use the caret::preProcess() function after running preprocess_data() with remove_var=NULL (see the caret::nearZeroVar() function for more information).","code":"# raw dataset with non-variable features nonvar_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 1), var3 = c(\"no\", \"no\", \"no\"), var4 = c(0, 0, 0), var5 = c(12, 12, 12) ) nonvar_df #> outcome var1 var2 var3 var4 var5 #> 1 normal no 0 no 0 12 #> 2 normal yes 1 no 0 12 #> 3 cancer no 1 no 0 12 # remove features with near-zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\" \"var3\" \"var5\" # remove features with zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\", remove_var = \"zv\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\" \"var3\" \"var5\" # don't remove features with near-zero or zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\", remove_var = NULL, collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 5 #> outcome var1_yes var2_1 var3 var5 #> #> 1 normal 0 0 0 12 #> 2 normal 1 1 0 12 #> 3 cancer 0 1 0 12 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"missing-data","dir":"Articles","previous_headings":"Examples","what":"Missing data","title":"Preprocessing data","text":"preprocess_data() also deals with missing data. It: Removes missing outcome variables. Maintains zero variability in a feature if it already has no variability (i.e. the
feature is removed if you remove features with near-zero variance). Replaces missing binary and categorical variables with zero (after splitting into multiple columns). Replaces missing continuous data with the median value of that feature. If you’d like to deal with missing data in a different way, please do that prior to inputting the data to preprocess_data().","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"remove-missing-outcome-variables","dir":"Articles","previous_headings":"Examples > Missing data","what":"Remove missing outcome variables","title":"Preprocessing data","text":"","code":"# raw dataset with missing outcome value miss_oc_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\", NA), var1 = c(\"no\", \"yes\", \"no\", \"no\"), var2 = c(0, 1, 1, 1) ) miss_oc_df #> outcome var1 var2 #> 1 normal no 0 #> 2 normal yes 1 #> 3 cancer no 1 #> 4 no 1 # preprocess raw dataset with missing outcome value preprocess_data(dataset = miss_oc_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> Removed 1/4 (25%) of samples because of missing outcome value (NA). #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"maintain-zero-variability-in-a-feature-if-it-already-has-no-variability","dir":"Articles","previous_headings":"Examples > Missing data","what":"Maintain zero variability in a feature if it already has no variability","title":"Preprocessing data","text":"Here, the non-variable feature with missing data is removed because we removed features with near-zero variance. If we had maintained the feature, it’d be all ones:","code":"# raw dataset with missing value in non-variable feature miss_nonvar_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(NA, 1, 1) ) miss_nonvar_df #> outcome var1 var2 #> 1 normal no NA #> 2 normal yes 1 #> 3 cancer no 1 # preprocess raw dataset with missing value in non-variable feature preprocess_data(dataset = miss_nonvar_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome var1_yes #> #> 1 normal 0 #> 2 normal 1 #> 3 cancer 0 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\" # preprocess raw dataset with missing value in non-variable feature preprocess_data(dataset = miss_nonvar_df, outcome_colname = \"outcome\", remove_var = NULL, collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value.
#> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2 #> #> 1 normal 0 1 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"replace-missing-binary-and-categorical-variables-with-zero","dir":"Articles","previous_headings":"Examples > Missing data","what":"Replace missing binary and categorical variables with zero","title":"Preprocessing data","text":"When a binary variable is split into two columns, a missing value is considered zero for both of them.","code":"# raw dataset with missing value in categorical feature miss_cat_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", NA), var2 = c(NA, 1, 0) ) miss_cat_df #> outcome var1 var2 #> 1 normal no NA #> 2 normal yes 1 #> 3 cancer 0 # preprocess raw dataset with missing value in categorical feature preprocess_data(dataset = miss_cat_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> 2 categorical missing value(s) (NA) were replaced with 0. Note that the matrix is not full rank so missing values may be duplicated in separate columns. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_no var1_yes #> #> 1 normal 1 0 #> 2 normal 0 1 #> 3 cancer 0 0 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"replace-missing-continuous-data-with-the-median-value-of-that-feature","dir":"Articles","previous_headings":"Examples > Missing data","what":"Replace missing continuous data with the median value of that feature","title":"Preprocessing data","text":"Here we’re not normalizing the continuous features so it’s easier to see what’s going on (i.e. the median value is used):","code":"# raw dataset with missing value in continuous feature miss_cont_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\", \"normal\"), var1 = c(1, 2, 2, NA), var2 = c(1, 2, 3, NA) ) miss_cont_df #> outcome var1 var2 #> 1 normal 1 1 #> 2 normal 2 2 #> 3 cancer 2 3 #> 4 normal NA NA # preprocess raw dataset with missing value in continuous feature preprocess_data(dataset = miss_cont_df, outcome_colname = \"outcome\", method = NULL) #> Using 'outcome' as the outcome column. #> 2 missing continuous value(s) were imputed using the median value of the feature. #> $dat_transformed #> # A tibble: 4 × 3 #> outcome var1 var2 #> #> 1 normal 1 1 #> 2 normal 2 2 #> 3 cancer 2 3 #> 4 normal 2 2 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"putting-it-all-together","dir":"Articles","previous_headings":"Examples","what":"Putting it all together","title":"Preprocessing data","text":"Here’s a more complicated example of raw data that puts everything we discussed together: Let’s throw this into the preprocessing function with the default values: As you can see, we got several messages: One of the samples (row 4) was removed because the outcome value was missing. One of the variables in a feature with no variation had a missing value that was replaced with the non-varying value (var11). Four categorical missing values were replaced with zero (var9). There are 4 missing rather than just 1 (like in the raw data) because we split the categorical variable into 4 different columns first. One missing continuous value was imputed using the median value of that feature (var8). Additionally, you can see that the continuous variables were normalized, the categorical variables were changed to binary, and several features were grouped together.
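As a small hedged sketch (the object name prep is hypothetical; test_df is defined in the code below), you can pull those groupings out programmatically:
prep <- preprocess_data(dataset = test_df, outcome_colname = "outcome")
prep$grp_feats$grp1  # the original features that were collapsed into grp1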
The variables within each group can be found in grp_feats.","code":"test_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\", NA), var1 = 1:4, var2 = c(\"a\", \"b\", \"c\", \"d\"), var3 = c(\"no\", \"yes\", \"no\", \"no\"), var4 = c(0, 1, 0, 0), var5 = c(0, 0, 0, 0), var6 = c(\"no\", \"no\", \"no\", \"no\"), var7 = c(1, 1, 0, 0), var8 = c(5, 6, NA, 7), var9 = c(NA, \"x\", \"y\", \"z\"), var10 = c(1, 0, NA, NA), var11 = c(1, 1, NA, NA), var12 = c(\"1\", \"2\", \"3\", \"4\") ) test_df #> outcome var1 var2 var3 var4 var5 var6 var7 var8 var9 var10 var11 var12 #> 1 normal 1 a no 0 0 no 1 5 1 1 1 #> 2 normal 2 b yes 1 0 no 1 6 x 0 1 2 #> 3 cancer 3 c no 0 0 no 0 NA y NA NA 3 #> 4 4 d no 0 0 no 0 7 z NA NA 4 preprocess_data(dataset = test_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> Removed 1/4 (25%) of samples because of missing outcome value (NA). #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. #> 2 categorical missing value(s) (NA) were replaced with 0. Note that the matrix is not full rank so missing values may be duplicated in separate columns. #> 1 missing continuous value(s) were imputed using the median value of the feature. #> $dat_transformed #> # A tibble: 3 × 6 #> outcome grp1 var2_a grp2 grp3 var8 #> #> 1 normal -1 1 0 0 -0.707 #> 2 normal 0 0 1 0 0.707 #> 3 cancer 1 0 0 1 0 #> #> $grp_feats #> $grp_feats$grp1 #> [1] \"var1\" \"var12\" #> #> $grp_feats$var2_a #> [1] \"var2_a\" #> #> $grp_feats$grp2 #> [1] \"var2_b\" \"var3_yes\" \"var9_x\" #> #> $grp_feats$grp3 #> [1] \"var2_c\" \"var7_1\" \"var9_y\" #> #> $grp_feats$var8 #> [1] \"var8\" #> #> #> $removed_feats #> [1] \"var4\" \"var5\" \"var10\" \"var6\" \"var11\""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"next-step-train-and-evaluate-your-model","dir":"Articles","previous_headings":"Examples","what":"Next step: train and evaluate your model!","title":"Preprocessing data","text":"Once you’ve preprocessed your data (either using preprocess_data() or by preprocessing the data yourself), you’re ready to train and evaluate machine learning models! Please see run_ml() for more information on training models.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"the-simplest-way-to-run_ml","dir":"Articles","previous_headings":"","what":"The simplest way to run_ml()","title":"Hyperparameter tuning","text":"As mentioned above, the minimal input is your dataset (dataset) and the machine learning model you want to use (method). When we run_ml(), by default we do a 100 times repeated, 5-fold cross-validation, where we evaluate the hyperparameters in these 500 total iterations. Let’s say we want to run L2 regularized logistic regression. We do this with: You’ll probably get a warning when you run this because the dataset is so small. If you want to learn more about that, check out the introductory vignette about training and evaluating an ML model: vignette(\"introduction\"). By default, run_ml() selects hyperparameters depending on the dataset and method used. As you can see, the alpha hyperparameter is set to 0, which specifies L2 regularization. glmnet gives us the option to run both L1 and L2 regularization. If we change alpha to 1, we would run L1-regularized logistic regression. You can also tune alpha by specifying a variety of values between 0 and 1. When you use a value between 0 and 1, you are running elastic net. The default hyperparameter lambda, which adjusts the L2 regularization penalty, is a range of values between 10^-4 and 10. When we look at the 100 repeated cross-validation performance metrics such as AUC, Accuracy, and prAUC for each tested lambda value, we see that some are not appropriate for this dataset and some do better than others.","code":"results <- run_ml(dat, \"glmnet\", outcome_colname = \"dx\", cv_times = 100, seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model...
#> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Training complete. results$trained_model #> glmnet #> #> 161 samples #> 10 predictor #> 2 classes: 'cancer', 'normal' #> #> No pre-processing #> Resampling: Cross-Validated (5 fold, repeated 100 times) #> Summary of sample sizes: 128, 129, 129, 129, 129, 130, ... #> Resampling results across tuning parameters: #> #> lambda logLoss AUC prAUC Accuracy Kappa F1 #> 1e-04 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 1e-03 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 1e-02 0.7112738 0.6123883 0.5726478 0.5854514 0.17092470 0.5731635 #> 1e-01 0.6819806 0.6210744 0.5793961 0.5918756 0.18369829 0.5779616 #> 1e+00 0.6803749 0.6278273 0.5827655 0.5896356 0.17756961 0.5408139 #> 1e+01 0.6909820 0.6271894 0.5814202 0.5218000 0.02920942 0.1875293 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision #> 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 #> 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 #> 0.5789667 0.5921250 0.5797769 0.5977182 0.5797769 #> 0.5805917 0.6032353 0.5880165 0.6026963 0.5880165 #> 0.5057833 0.6715588 0.6005149 0.5887829 0.6005149 #> 0.0607250 0.9678676 0.7265246 0.5171323 0.7265246 #> Recall Detection_Rate Balanced_Accuracy #> 0.5789667 0.2839655 0.5854870 #> 0.5789667 0.2839655 0.5854870 #> 0.5789667 0.2839636 0.5855458 #> 0.5805917 0.2847195 0.5919135 #> 0.5057833 0.2478291 0.5886711 #> 0.0607250 0.0292613 0.5142963 #> #> Tuning parameter 'alpha' was held constant at a value of 0 #> AUC was used to select the optimal model using the largest value. #> The final values used for the model were alpha = 0 and lambda = 1. 
results$trained_model$results #> alpha lambda logLoss AUC prAUC Accuracy Kappa F1 #> 1 0 1e-04 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 2 0 1e-03 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 3 0 1e-02 0.7112738 0.6123883 0.5726478 0.5854514 0.17092470 0.5731635 #> 4 0 1e-01 0.6819806 0.6210744 0.5793961 0.5918756 0.18369829 0.5779616 #> 5 0 1e+00 0.6803749 0.6278273 0.5827655 0.5896356 0.17756961 0.5408139 #> 6 0 1e+01 0.6909820 0.6271894 0.5814202 0.5218000 0.02920942 0.1875293 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision Recall #> 1 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 0.5789667 #> 2 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 0.5789667 #> 3 0.5789667 0.5921250 0.5797769 0.5977182 0.5797769 0.5789667 #> 4 0.5805917 0.6032353 0.5880165 0.6026963 0.5880165 0.5805917 #> 5 0.5057833 0.6715588 0.6005149 0.5887829 0.6005149 0.5057833 #> 6 0.0607250 0.9678676 0.7265246 0.5171323 0.7265246 0.0607250 #> Detection_Rate Balanced_Accuracy logLossSD AUCSD prAUCSD AccuracySD #> 1 0.2839655 0.5854870 0.085315967 0.09115229 0.07296554 0.07628572 #> 2 0.2839655 0.5854870 0.085315967 0.09115229 0.07296554 0.07628572 #> 3 0.2839636 0.5855458 0.085276565 0.09122242 0.07301412 0.07637123 #> 4 0.2847195 0.5919135 0.048120032 0.09025695 0.07329214 0.07747312 #> 5 0.2478291 0.5886711 0.012189172 0.09111917 0.07505095 0.07771171 #> 6 0.0292613 0.5142963 0.001610008 0.09266875 0.07640896 0.03421597 #> KappaSD F1SD SensitivitySD SpecificitySD Pos_Pred_ValueSD #> 1 0.15265728 0.09353786 0.13091452 0.11988406 0.08316345 #> 2 0.15265728 0.09353786 0.13091452 0.11988406 0.08316345 #> 3 0.15281903 0.09350099 0.13073501 0.12002481 0.08329024 #> 4 0.15485134 0.09308733 0.12870031 0.12037225 0.08554483 #> 5 0.15563046 0.10525917 0.13381009 0.11639614 0.09957685 #> 6 0.06527242 0.09664720 0.08010494 0.06371495 0.31899811 #> Neg_Pred_ValueSD PrecisionSD RecallSD Detection_RateSD Balanced_AccuracySD #> 1 0.08384956 0.08316345 0.13091452 0.06394409 0.07640308 #> 2 0.08384956 0.08316345 0.13091452 0.06394409 0.07640308 #> 3 0.08385838 0.08329024 0.13073501 0.06384692 0.07648207 #> 4 0.08427362 0.08554483 0.12870031 0.06272897 0.07748791 #> 5 0.07597766 0.09957685 0.13381009 0.06453637 0.07773039 #> 6 0.02292294 0.31899811 0.08010494 0.03803159 0.03184136"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"customizing-hyperparameters","dir":"Articles","previous_headings":"","what":"Customizing hyperparameters","title":"Hyperparameter tuning","text":"In this example, we want to change the lambda values to provide a better range to test in the cross-validation step. We don’t want to use the defaults, so we provide a named list with new values. For example: Now let’s run L2 logistic regression with the new lambda values: This time, we cover a larger and different range of lambda settings in cross-validation. How do we know which lambda value is the best one? To answer that, we need to run the ML pipeline on multiple data splits and look at the mean cross-validation performance of each lambda across those modeling experiments. We describe how to run the pipeline with multiple data splits in vignette(\"parallel\"). Here we train the model with the new lambda range we defined above. We run it 3 times, each with a different seed, which results in different splits of the data into training and testing sets. Then we can use plot_hp_performance() to see which lambda gives us the largest mean AUC value across the modeling experiments. As you can see, we get a mean maximum at 0.03, which is the best lambda value for this dataset when we run 3 data splits. The fact that we see this maximum in the middle of our range and not at the edges shows that we are providing a large enough range to exhaust our lambda search as we build the model. We recommend that users use this plot to make sure the best hyperparameter is not at the edges of the provided list.
For a better understanding of the global maximum, it would be better to run the data splits with more seeds. We picked 3 seeds to keep the runtime down for this vignette, but for real-world data we recommend using many more seeds.","code":"new_hp <- list( alpha = 1, lambda = c(0.00001, 0.0001, 0.001, 0.01, 0.015, 0.02, 0.025, 0.03, 0.04, 0.05, 0.06, 0.1) ) new_hp #> $alpha #> [1] 1 #> #> $lambda #> [1] 0.00001 0.00010 0.00100 0.01000 0.01500 0.02000 0.02500 0.03000 0.04000 #> [10] 0.05000 0.06000 0.10000 results <- run_ml(dat, \"glmnet\", outcome_colname = \"dx\", cv_times = 100, hyperparameters = new_hp, seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. results$trained_model #> glmnet #> #> 161 samples #> 10 predictor #> 2 classes: 'cancer', 'normal' #> #> No pre-processing #> Resampling: Cross-Validated (5 fold, repeated 100 times) #> Summary of sample sizes: 128, 129, 129, 129, 129, 130, ... #> Resampling results across tuning parameters: #> #> lambda logLoss AUC prAUC Accuracy Kappa F1 #> 0.00001 0.7215038 0.6112253 0.5720005 0.5842184 0.1684871 0.5726974 #> 0.00010 0.7215038 0.6112253 0.5720005 0.5842184 0.1684871 0.5726974 #> 0.00100 0.7209099 0.6112771 0.5719601 0.5845329 0.1691285 0.5730414 #> 0.01000 0.6984432 0.6156112 0.5758977 0.5830960 0.1665062 0.5759265 #> 0.01500 0.6913332 0.6169396 0.5770496 0.5839720 0.1683912 0.5786347 #> 0.02000 0.6870103 0.6177313 0.5779563 0.5833645 0.1673234 0.5796891 #> 0.02500 0.6846387 0.6169757 0.5769305 0.5831907 0.1669901 0.5792840 #> 0.03000 0.6834369 0.6154763 0.5754118 0.5821394 0.1649081 0.5786336 #> 0.04000 0.6833322 0.6124776 0.5724802 0.5786224 0.1578750 0.5735757 #> 0.05000 0.6850454 0.6069059 0.5668928 0.5732197 0.1468699 0.5624480 #> 0.06000 0.6880861 0.5974311 0.5596714 0.5620224 0.1240112 0.5375824 #> 0.10000 0.6944846 0.5123565 0.3034983 0.5120114 0.0110144 0.3852423 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision #> 0.5798500 0.5888162 0.5780748 0.5971698 0.5780748 #> 0.5798500 0.5888162 0.5780748 0.5971698 0.5780748 #> 0.5801167 0.5891912 0.5784544 0.5974307 0.5784544 #> 0.5883667 0.5783456 0.5755460 0.5977390 0.5755460 #> 0.5929750 0.5756471 0.5763123 0.5987220 0.5763123 #> 0.5967167 0.5708824 0.5748385 0.5990649 0.5748385 #> 0.5970250 0.5702721 0.5743474 0.5997928 0.5743474 #> 0.5964500 0.5687721 0.5734044 0.5982451 0.5734044 #> 0.5904500 0.5677353 0.5699817 0.5943308 0.5699817 #> 0.5734833 0.5736176 0.5668523 0.5864448 0.5668523 #> 0.5360333 0.5881250 0.5595918 0.5722851 0.5595918 #> 0.1145917 0.8963456 0.5255752 0.5132665 0.5255752 #> Recall Detection_Rate Balanced_Accuracy #> 0.5798500 0.28441068 0.5843331 #> 0.5798500 0.28441068 0.5843331 #> 0.5801167 0.28453770 0.5846539 #> 0.5883667 0.28860521 0.5833561 #> 0.5929750 0.29084305 0.5843110 #> 0.5967167 0.29264681 0.5837995 #> 0.5970250 0.29278708 0.5836485 #> 0.5964500 0.29248583 0.5826110 #> 0.5904500 0.28951992 0.5790926 #> 0.5734833 0.28119862 0.5735505 #> 0.5360333 0.26270204 0.5620792 #> 0.1145917 0.05585777 0.5054686 #> #> Tuning parameter 'alpha' was held constant at a value of 1 #> AUC was used to select the optimal model using the largest value. #> The final values used for the model were alpha = 1 and lambda = 0.02. results <- lapply(seq(100, 102), function(seed) { run_ml(dat, \"glmnet\", seed = seed, hyperparameters = new_hp) }) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model...
#> Training complete. models <- lapply(results, function(x) x$trained_model) hp_metrics <- combine_hp_performance(models) plot_hp_performance(hp_metrics$dat, lambda, AUC)"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"hyperparameter-options","dir":"Articles","previous_headings":"","what":"Hyperparameter options","title":"Hyperparameter tuning","text":"You can see which default hyperparameters would be used for your dataset with get_hyperparams_list(). Here are a few examples with the built-in datasets we provide: These hyperparameters are tuned for all of the modeling methods. The output is similar for each, so we won’t go into the details.","code":"get_hyperparams_list(otu_mini_bin, \"glmnet\") #> $lambda #> [1] 1e-04 1e-03 1e-02 1e-01 1e+00 1e+01 #> #> $alpha #> [1] 0 get_hyperparams_list(otu_mini_bin, \"rf\") #> $mtry #> [1] 2 3 6 get_hyperparams_list(otu_small, \"rf\") #> $mtry #> [1] 4 8 16"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"regression","dir":"Articles","previous_headings":"Hyperparameter options","what":"Regression","title":"Hyperparameter tuning","text":"As mentioned above, glmnet uses the alpha parameter and the lambda hyperparameter. alpha of 0 is L2 regularization (ridge). alpha of 1 is L1 regularization (lasso). Anything in between is elastic net. You can also tune alpha like you would any other hyperparameter. Please refer to the original glmnet documentation for more information: https://web.stanford.edu/~hastie/glmnet/glmnet_alpha.html The default hyperparameters chosen by run_ml() are fixed for glmnet.","code":"#> $lambda #> [1] 1e-04 1e-03 1e-02 1e-01 1e+00 1e+01 #> #> $alpha #> [1] 0"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"random-forest","dir":"Articles","previous_headings":"Hyperparameter options","what":"Random forest","title":"Hyperparameter tuning","text":"When we run rf or parRF, we are using the randomForest package implementation. We are tuning the mtry hyperparameter. This is the number of features that are randomly collected to be sampled at each tree node. This number needs to be less than the number of features in the dataset. Please refer to the original documentation for more information: https://cran.r-project.org/web/packages/randomForest/randomForest.pdf By default, we take the square root of the number of features in the dataset and provide a range that is [sqrt_features / 2, sqrt_features, sqrt_features * 2]. For example, if the number of features is 1000: Similar to the glmnet method, you can provide your own mtry range.","code":"#> $mtry #> [1] 16 32 64"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"decision-tree","dir":"Articles","previous_headings":"Hyperparameter options","what":"Decision tree","title":"Hyperparameter tuning","text":"When we run rpart2, we are running the rpart package implementation of decision tree. We are tuning the maxdepth hyperparameter. This is the maximum depth of any node of the final tree. Please refer to the original documentation for more information on maxdepth: https://cran.r-project.org/web/packages/rpart/rpart.pdf By default, we provide a range that is less than the number of features in the dataset. For example with 1000 features: Or with 10 features:","code":"#> $maxdepth #> [1] 1 2 4 8 16 30 #> $maxdepth #> [1] 1 2 4 8"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"svm-with-radial-basis-kernel","dir":"Articles","previous_headings":"Hyperparameter options","what":"SVM with radial basis kernel","title":"Hyperparameter tuning","text":"When we run the svmRadial method, we are tuning the C and sigma hyperparameters. sigma defines how far the influence of a single training example reaches, and C behaves as a regularization parameter.
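Both defaults can be overridden by supplying a custom named list, as with the other methods; a hedged sketch with hypothetical ranges:
run_ml(otu_mini_bin, "svmRadial",
  hyperparameters = list(C = c(0.1, 1, 10), sigma = c(0.001, 0.01)),
  seed = 2019
)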
Please refer to this great sklearn resource for more information on these hyperparameters: https://scikit-learn.org/stable/auto_examples/svm/plot_rbf_parameters.html By default, we provide 2 separate ranges of values for the two hyperparameters.","code":"#> $C #> [1] 1e-03 1e-02 1e-01 1e+00 1e+01 1e+02 #> #> $sigma #> [1] 1e-06 1e-05 1e-04 1e-03 1e-02 1e-01"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"xgboost","dir":"Articles","previous_headings":"Hyperparameter options","what":"XGBoost","title":"Hyperparameter tuning","text":"When we run the xgbTree method, we are tuning the nrounds, gamma, eta, max_depth, colsample_bytree, min_child_weight, and subsample hyperparameters. You can read more about these hyperparameters here: https://xgboost.readthedocs.io/en/latest/parameter.html By default, we set nrounds, gamma, colsample_bytree, and min_child_weight to fixed values and provide a range of values for eta, max_depth, and subsample. All of these can be changed and optimized by the user by supplying a custom named list of hyperparameters to run_ml().","code":"#> $nrounds #> [1] 100 #> #> $gamma #> [1] 0 #> #> $eta #> [1] 0.001 0.010 0.100 1.000 #> #> $max_depth #> [1] 1 2 4 8 16 30 #> #> $colsample_bytree #> [1] 0.8 #> #> $min_child_weight #> [1] 1 #> #> $subsample #> [1] 0.4 0.5 0.6 0.7"},
{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"other-ml-methods","dir":"Articles","previous_headings":"","what":"Other ML methods","title":"Hyperparameter tuning","text":"While the ML methods we have tested and set default hyperparameters for are those listed above, in theory you may be able to use other methods supported by caret with run_ml(). Take a look at the available models in caret (or see this list with their tags). You will need to give run_ml() your own custom hyperparameters, just like in the examples above:","code":"run_ml(otu_mini_bin, \"regLogistic\", hyperparameters = list( cost = 10^seq(-4, 1, 1), epsilon = c(0.01), loss = c(\"L2_primal\") ) )"},
{"path":"http://www.schlosslab.org/mikropml/dev/authors.html","id":null,"dir":"","previous_headings":"","what":"Authors","title":"Authors and Citation","text":"Begüm Topçuoğlu. Author. Zena Lapp. Author. Kelly Sovacool. Author, maintainer. Evan Snitkin. Author. Jenna Wiens. Author. Patrick Schloss. Author. Nick Lesniak. Contributor. Courtney Armour. Contributor. Sarah Lucas. Contributor.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/authors.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"Authors and Citation","text":"Topçuoğlu et al., (2021). mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines. Journal of Open Source Software, 6(61), 3073, https://doi.org/10.21105/joss.03073","code":"@Article{, title = {{mikropml}: User-Friendly R Package for Supervised Machine Learning Pipelines}, author = {Begüm D. Topçuoğlu and Zena Lapp and Kelly L. Sovacool and Evan Snitkin and Jenna Wiens and Patrick D. Schloss}, journal = {Journal of Open Source Software}, year = {2021}, month = {May}, volume = {6}, number = {61}, pages = {3073}, doi = {10.21105/joss.03073}, url = {https://joss.theoj.org/papers/10.21105/joss.03073}, }"},
{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"mikropml-","dir":"","previous_headings":"","what":"User-Friendly R Package for Supervised Machine Learning Pipelines","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"meek-ROPE em el User-Friendly R Package for Supervised Machine Learning Pipelines An interface to build machine learning models for classification and regression problems. mikropml implements the ML pipeline described in Topçuoğlu et al. (2020) with reasonable default options for the data preprocessing, hyperparameter tuning, cross-validation, testing, model evaluation, and interpretation steps.
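As a minimal quick-start sketch using the built-in example dataset (default settings; the seed is arbitrary):
library(mikropml)
# train a default L2-regularized logistic regression model on otu_mini_bin
results <- run_ml(otu_mini_bin, "glmnet", outcome_colname = "dx", seed = 2019)
results$performance  # cross-validation and test performance metrics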
See the website for more information, documentation, and examples.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"installation","dir":"","previous_headings":"","what":"Installation","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"You can install the latest release from CRAN: or the development version from GitHub: or install from your terminal using conda or mamba:","code":"install.packages('mikropml') # install.packages(\"devtools\") devtools::install_github(\"SchlossLab/mikropml\") mamba install -c conda-forge r-mikropml"},
{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"dependencies","dir":"","previous_headings":"Installation","what":"Dependencies","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Imports: caret, dplyr, e1071, glmnet, kernlab, MLmetrics, randomForest, rlang, rpart, stats, utils, xgboost Suggests: doFuture, foreach, future, future.apply, ggplot2, knitr, progress, progressr, purrr, rmarkdown, testthat, tidyr","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"usage","dir":"","previous_headings":"","what":"Usage","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Check out the introductory vignette for a quick start tutorial. For a more in-depth discussion, read all the vignettes and/or take a look at the reference documentation. You can also watch the Riffomonas Project series of video tutorials covering mikropml and skills related to machine learning. We also provide an example Snakemake workflow for running mikropml on an HPC.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"help--contributing","dir":"","previous_headings":"","what":"Help & Contributing","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"If you come across a bug, open an issue and include a minimal reproducible example. If you have questions, create a new post in Discussions. If you’d like to contribute, see our guidelines.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"code-of-conduct","dir":"","previous_headings":"","what":"Code of Conduct","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Please note that the mikropml project is released with a Contributor Code of Conduct. By contributing to this project, you agree to abide by its terms.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"license","dir":"","previous_headings":"","what":"License","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"The mikropml package is licensed under the MIT license. Text and images included in this repository, including the mikropml logo, are licensed under the CC BY 4.0 license.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"To cite mikropml in publications, use: Topçuoğlu BD, Lapp Z, Sovacool KL, Snitkin E, Wiens J, Schloss PD (2021). “mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines.” Journal of Open Source Software, 6(61), 3073. doi:10.21105/joss.03073, https://joss.theoj.org/papers/10.21105/joss.03073. A BibTeX entry for LaTeX users is:","code":"@Article{, title = {{mikropml}: User-Friendly R Package for Supervised Machine Learning Pipelines}, author = {Begüm D. Topçuoğlu and Zena Lapp and Kelly L. Sovacool and Evan Snitkin and Jenna Wiens and Patrick D.
Schloss}, journal = {Journal of Open Source Software}, year = {2021}, month = {May}, volume = {6}, number = {61}, pages = {3073}, doi = {10.21105/joss.03073}, url = {https://joss.theoj.org/papers/10.21105/joss.03073}, }"},
{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"why-the-name","dir":"","previous_headings":"","what":"Why the name?","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"The word “mikrop” (pronounced “meek-ROPE”) is Turkish for “microbe”. This package was originally implemented as a machine learning pipeline for microbiome-based classification problems (see Topçuoğlu et al. 2020). We realized that these methods are applicable in many other fields too, but we stuck with the name because we like it!","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/pull_request_template.html","id":"issues","dir":"","previous_headings":"","what":"Issues","title":"NA","text":"Resolves # .","code":""},{"path":[]},
{"path":"http://www.schlosslab.org/mikropml/dev/pull_request_template.html","id":"checklist","dir":"","previous_headings":"","what":"Checklist","title":"NA","text":"(Strikethrough any points that are not applicable.) Write unit tests for any new functionality or bug fixes. Update roxygen comments and vignettes if applicable. Update NEWS.md if this includes any user-facing changes. The check workflow succeeds on your most recent commit. This is always required before the PR can be merged.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":null,"dir":"Reference","previous_headings":"","what":"Get performance metrics for test data — calc_perf_metrics","title":"Get performance metrics for test data — calc_perf_metrics","text":"Get performance metrics for test data","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get performance metrics for test data — calc_perf_metrics","text":"","code":"calc_perf_metrics( test_data, trained_model, outcome_colname, perf_metric_function, class_probs )"},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get performance metrics for test data — calc_perf_metrics","text":"test_data Held-out test data: dataframe of the outcome and features. trained_model Trained model from caret::train(). outcome_colname Column name as a string of the outcome variable (default NULL; the first column is chosen automatically). perf_metric_function Function to calculate the performance metric to be used for cross-validation and test performance. Some functions are provided by caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary.
class_probs Whether to use class probabilities (TRUE for categorical outcomes, FALSE for numeric outcomes).","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get performance metrics for test data — calc_perf_metrics","text":"Dataframe of performance metrics.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get performance metrics for test data — calc_perf_metrics","text":"Zena Lapp, zenalapp@umich.edu","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get performance metrics for test data — calc_perf_metrics","text":"","code":"if (FALSE) { results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) calc_perf_metrics(results$test_data, results$trained_model, \"dx\", multiClassSummary, class_probs = TRUE ) }"},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Combine hyperparameter performance metrics for multiple train/test splits generated by, for instance, looping in R or using a snakemake workflow on a high-performance computer.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"","code":"combine_hp_performance(trained_model_lst)"},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"trained_model_lst List of trained models.","code":""},
{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Named list: dat: Dataframe of the performance metric for each group of hyperparameters. params: Hyperparameters that were tuned.
{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Combine hyperparameter performance metrics for multiple train/test splits generated by, for instance, looping in R or using a Snakemake workflow on a high-performance computer.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"","code":"combine_hp_performance(trained_model_lst)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"trained_model_lst List of trained models.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Named list: dat: Dataframe of the performance metric for each group of hyperparameters. params: Hyperparameters that were tuned. Metric: Performance metric used.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"","code":"if (FALSE) { results <- lapply(seq(100, 102), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed, cv_times = 2, kfold = 2) }) models <- lapply(results, function(x) x$trained_model) combine_hp_performance(models) }"},
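A natural follow-up is to inspect how performance varies across the tuned hyperparameter values. The sketch below follows the pattern of the package's parallel vignette; the seed range and the glmnet method are illustrative choices:

library(mikropml)
if (FALSE) {
  # train the same model specification across several seeds
  results <- lapply(seq(100, 102), function(seed) {
    run_ml(otu_small, "glmnet", seed = seed, cv_times = 2, kfold = 2)
  })
  models <- lapply(results, function(x) x$trained_model)
  hp_metrics <- combine_hp_performance(models)
  # hp_metrics$dat has one row per seed and hyperparameter combination
  plot_hp_performance(hp_metrics$dat, lambda, AUC)
}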
{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":null,"dir":"Reference","previous_headings":"","what":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"A wrapper for permute_p_value().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"","code":"compare_models(merged_data, metric, group_name, nperm = 10000)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"merged_data The concatenated performance data from run_ml. metric The metric to compare; must be numeric. group_name The column with the group variables to compare. nperm Number of permutations (default = 10000).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"A table of p-values for all pairs of the group variable.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"Courtney R Armour, armourc@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"","code":"df <- dplyr::tibble( model = c(\"rf\", \"rf\", \"glmnet\", \"glmnet\", \"svmRadial\", \"svmRadial\"), AUC = c(.2, 0.3, 0.8, 0.9, 0.85, 0.95) ) set.seed(123) compare_models(df, \"AUC\", \"model\", nperm = 10) #> group1 group2 p_value #> 1 glmnet svmRadial 0.7272727 #> 2 rf glmnet 0.2727273 #> 3 rf svmRadial 0.5454545"},
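To see where merged_data typically comes from, here is a minimal sketch that trains models over a small grid of seeds and methods, binds their performance dataframes, and tests for AUC differences between methods. The grid values are illustrative, and nperm is lowered only to keep the sketch fast:

library(mikropml)
if (FALSE) {
  param_grid <- expand.grid(
    seeds = 100:101,
    methods = c("glmnet", "rf"),
    stringsAsFactors = FALSE
  )
  results <- mapply(
    function(seed, method) {
      run_ml(otu_small, method, seed = seed, kfold = 2, cv_times = 2)
    },
    param_grid$seeds, param_grid$methods,
    SIMPLIFY = FALSE
  )
  # one performance row per model; the method column is the group variable
  perf_df <- dplyr::bind_rows(lapply(results, function(x) x$performance))
  compare_models(perf_df, metric = "AUC", group_name = "method", nperm = 100)
}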
{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":null,"dir":"Reference","previous_headings":"","what":"Define cross-validation scheme and training parameters — define_cv","title":"Define cross-validation scheme and training parameters — define_cv","text":"Define the cross-validation scheme and training parameters.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Define cross-validation scheme and training parameters — define_cv","text":"","code":"define_cv( train_data, outcome_colname, hyperparams_list, perf_metric_function, class_probs, kfold = 5, cv_times = 100, groups = NULL, group_partitions = NULL )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Define cross-validation scheme and training parameters — define_cv","text":"train_data Dataframe for training the model. outcome_colname Column name as string of the outcome variable (default NULL; the first column is chosen automatically). hyperparams_list Named list of lists of hyperparameters. perf_metric_function Function to calculate the performance metric to be used for cross-validation and test performance. Some functions are provided by caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. class_probs Whether to use class probabilities (TRUE for categorical outcomes, FALSE for numeric outcomes). kfold Fold number for k-fold cross-validation (default: 5). cv_times Number of cross-validation partitions to create (default: 100). groups Vector of groups to keep together when splitting the data into train and test sets. If the number of groups in the training set is larger than kfold, the groups will also be kept together for cross-validation. Length matches the number of rows in the dataset (default: NULL). group_partitions Specify how to assign groups to the training and testing partitions (default: NULL). If groups specifies that some samples belong to group \"A\" and others belong to group \"B\", then setting group_partitions = list(train = c(\"A\", \"B\"), test = c(\"B\")) will result in all samples from group \"A\" being placed in the training set, some samples from \"B\" also in the training set, and the remaining samples from \"B\" in the testing set. The partition sizes will be as close to training_frac as possible. If the number of groups in the training set is larger than kfold, the groups will also be kept together for cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Define cross-validation scheme and training parameters — define_cv","text":"Caret object for trainControl that controls cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Define cross-validation scheme and training parameters — define_cv","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Define cross-validation scheme and training parameters — define_cv","text":"","code":"training_inds <- get_partition_indices(otu_small %>% dplyr::pull(\"dx\"), training_frac = 0.8, groups = NULL ) train_data <- otu_small[training_inds, ] test_data <- otu_small[-training_inds, ] cv <- define_cv(train_data, outcome_colname = \"dx\", hyperparams_list = get_hyperparams_list(otu_small, \"glmnet\"), perf_metric_function = caret::multiClassSummary, class_probs = TRUE, kfold = 5 )"},
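As a rough sketch of how the returned trainControl object is consumed downstream, it can be handed to caret::train(); this approximates what run_ml() does internally (the full pipeline also handles preprocessing, data splitting, and more). The sketch reuses train_data and cv from the example above, and the metric name "AUC" assumes the multiClassSummary setup shown there:

if (FALSE) {
  hyperparams <- get_hyperparams_list(otu_small, "glmnet")
  model <- caret::train(dx ~ .,
    data = train_data,
    method = "glmnet",
    metric = "AUC",
    trControl = cv,
    tuneGrid = expand.grid(hyperparams) # full grid over the named list
  )
}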
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":null,"dir":"Reference","previous_headings":"","what":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Get preprocessed dataframe for continuous variables.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"","code":"get_caret_processed_df(features, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"features Dataframe of features for machine learning. method Methods to preprocess the data, as described in caret::preProcess() (default: c(\"center\",\"scale\"), use NULL for no normalization).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Named list: processed: Dataframe of processed features. removed: Names of features that were removed during preprocessing.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"","code":"get_caret_processed_df(mikropml::otu_small[, 2:ncol(otu_small)], c(\"center\", \"scale\")) #> $processed #> (output truncated: centered and scaled values for all 200 samples across the OTU feature columns, Otu00001 onward)
106 1.42787796 -0.679309939 1.39012474 -0.673304853 0.20574059 -0.301203812 #> 107 -0.53452056 -0.188447454 0.50712240 -0.272285723 0.61919057 2.274997508 #> 108 -0.25978477 0.681496950 0.22389524 0.222722265 -0.62608140 1.337840559 #> 109 -0.52470857 -0.217607602 2.99785542 2.096233513 -0.60639331 -0.352178876 #> 110 -0.50263159 -0.382848438 -0.41336588 -0.203360560 -0.61623735 -0.269834542 #> 111 -0.53206756 -0.421728635 -0.62578626 -0.416401973 -0.62608140 -0.199253683 #> 112 -0.21072481 -0.669589890 -0.64244668 0.454561450 -0.62608140 -0.383548146 #> 113 -0.53452056 -0.032926667 -0.41336588 0.053542320 2.00227919 -0.316888447 #> 114 -0.40941766 -0.412008586 -0.06349703 -0.491593060 -0.54240700 0.286970005 #> 115 -0.53206756 0.054553776 -0.08848766 -0.052978387 -0.43412248 -0.128672825 #> 116 -0.45111862 1.211239632 0.01147487 0.015946776 0.82591556 -0.336494241 #> 117 -0.53452056 -0.013486568 0.57792920 -0.685836701 -0.39966831 -0.371784670 #> 118 -0.16902384 -0.465468857 0.42798540 0.028478624 0.34847927 0.094833225 #> 119 -0.53452056 -0.679309939 0.72370788 1.739075850 -0.63100342 -0.383548146 #> 120 -0.53452056 0.244094736 -0.21344082 -0.159499093 -0.63100342 -0.383548146 #> 121 -0.52716157 -0.679309939 -0.44252162 -0.679570777 -0.23724154 -0.383548146 #> 122 -0.53452056 -0.679309939 0.23639056 -0.522922680 0.03346976 -0.383548146 #> 123 -0.53452056 4.550076536 -0.48417267 1.544832209 -0.56701712 -0.340415400 #> 124 -0.53206756 -0.421728635 -0.48833778 0.009680852 -0.15356714 -0.352178876 #> 125 -0.48055460 -0.139847208 -0.13846893 -0.215892408 -0.63100342 -0.375705829 #> 126 -0.53452056 -0.309948069 -0.03017619 0.141265254 0.65364473 -0.348257717 #> 127 -0.47319561 -0.596689521 -0.45085183 -0.516656756 1.18522328 -0.156120937 #> 128 -0.49772559 1.687522044 -0.63828157 -0.140701321 -0.63100342 -0.332573082 #> 129 0.10571196 0.919638156 -0.57580499 2.716559980 0.73239711 -0.238465271 #> 130 1.58486984 -0.023206617 0.17391397 -0.660773005 -0.63100342 -0.383548146 #> 131 -0.51489658 0.419055622 -0.64244668 0.084871939 -0.25200761 -0.301203812 #> 132 -0.52470857 -0.669589890 1.18186948 -0.604379690 -0.54732902 -0.379626987 #> 133 -0.53452056 0.030253653 0.86115636 -0.234690180 -0.52764093 -0.285519177 #> 134 3.26762657 -0.650149792 0.57376409 -0.485327136 1.72172385 -0.328651923 #> 135 -0.53452056 0.880757959 1.11106268 2.478454871 -0.59654926 -0.324730765 #> 136 0.11552395 -0.679309939 -0.13430382 -0.547986375 0.70778699 0.118360178 #> 137 -0.53452056 -0.679309939 -0.64244668 -0.667038929 -0.61623735 -0.379626987 #> 138 -0.53206756 -0.460608832 0.26138119 -0.685836701 4.39438266 0.032094685 #> 139 0.17439590 0.380175425 -0.54248415 -0.109371702 -0.62115938 -0.324730765 #> 140 -0.52716157 -0.674449915 -0.63411647 -0.259753876 0.83083758 -0.265913383 #> 141 -0.53452056 0.428775671 0.59042451 -0.009116919 0.05807988 0.141887131 #> 142 -0.37262268 -0.523789152 -0.56330968 -0.673304853 0.61919057 2.714167291 #> 143 -0.53452056 -0.538369226 -0.35921951 -0.109371702 -0.61623735 -0.277676859 #> 144 -0.49527259 0.973098427 -0.53831904 0.786655417 -0.63100342 -0.277676859 #> 145 -0.08807490 -0.528649176 -0.63411647 -0.566784147 3.53302853 -0.352178876 #> 146 -0.51244358 -0.222467626 -0.60079562 -0.435199745 -0.62115938 -0.363942352 #> 147 -0.53452056 -0.679309939 -0.64244668 -0.466529364 -0.62608140 3.682693510 #> 148 0.14741292 -0.081526913 -0.50499820 -0.366274582 -0.62608140 2.231864761 #> 149 -0.53452056 -0.655009816 0.59042451 5.498630194 -0.49810879 -0.383548146 #> 150 -0.53452056 
-0.679309939 -0.64244668 -0.554252299 -0.20770940 0.443816357 #> 151 -0.43394764 -0.679309939 -0.39254036 -0.360008658 -0.60147128 -0.261992224 #> 152 -0.48546060 -0.314808094 -0.62162115 0.091137863 1.57898517 -0.352178876 #> 153 -0.53452056 -0.596689521 -0.58413520 -0.591847843 0.34847927 0.130123654 #> 154 -0.52961456 -0.679309939 -0.63828157 4.320636500 0.09745607 -0.191411366 #> 155 -0.53452056 0.214934588 0.20306971 1.024760525 -0.57193914 -0.379626987 #> 156 -0.52470857 0.030253653 -0.63828157 -0.353742734 -0.63100342 -0.328651923 #> 157 -0.53206756 -0.679309939 -0.64244668 -0.685836701 -0.63100342 -0.383548146 #> 158 -0.53452056 -0.091246962 4.23489171 -0.673304853 -0.62608140 -0.211017160 #> 159 -0.53452056 2.523446276 -0.63828157 -0.328679038 0.54043819 1.333919400 #> 160 -0.53452056 1.002258574 0.05312592 1.569895905 -0.63100342 -0.371784670 #> 161 -0.52225557 0.428775671 -0.57997010 0.066074168 -0.63100342 -0.344336558 #> 162 -0.53452056 1.998563618 -0.64244668 0.066074168 -0.63100342 7.666590833 #> 163 -0.53206756 -0.266207848 -0.25925698 2.459657100 -0.63100342 -0.383548146 #> 164 -0.51244358 -0.674449915 -0.62578626 -0.228424256 -0.61623735 -0.371784670 #> 165 -0.51489658 0.351015277 0.32385777 -0.103105778 -0.63100342 -0.375705829 #> 166 -0.53452056 -0.674449915 -0.64244668 -0.648241158 0.11222214 -0.383548146 #> 167 -0.49036659 -0.514069103 -0.63828157 0.279115580 1.49038874 -0.258071065 #> 168 -0.53452056 -0.412008586 0.18224419 -0.159499093 -0.62608140 -0.360021194 #> 169 -0.53206756 -0.679309939 -0.63828157 -0.504124908 -0.63100342 -0.383548146 #> 170 -0.04882693 -0.679309939 -0.63828157 -0.685836701 -0.63100342 -0.261992224 #> 171 3.46877241 -0.407148561 1.34847369 -0.009116919 1.17045721 -0.132593984 #> 172 -0.50753758 1.109179116 -0.31340335 -0.616911538 -0.52764093 -0.167884413 #> 173 -0.53452056 -0.562669349 -0.60912584 2.171424600 -0.62115938 -0.309046129 #> 174 -0.45602462 0.423915646 -0.36754972 0.698932482 -0.63100342 -0.175726731 #> 175 0.17439590 0.039973702 -0.54248415 -0.554252299 0.23527273 -0.258071065 #> 176 0.70914950 -0.679309939 -0.64244668 -0.121903550 2.44526132 -0.375705829 #> 177 0.95444931 -0.271067872 -0.38004504 -0.585581919 -0.06989273 -0.344336558 #> 178 -0.11996387 1.279279977 -0.64244668 -0.685836701 3.24755116 -0.136515143 #> 179 -0.53452056 -0.679309939 -0.19261530 0.435763678 -0.61131533 -0.360021194 #> 180 -0.48546060 -0.518929127 -0.26342209 -0.479061212 -0.63100342 -0.320809606 #> 181 -0.49772559 -0.635569718 -0.56747478 -0.673304853 -0.60639331 2.278918667 #> 182 -0.53206756 1.964543446 -0.63411647 0.391902211 -0.06004869 -0.375705829 #> 183 -0.52716157 -0.169007356 -0.42169609 3.180238349 -0.62608140 -0.383548146 #> 184 -0.32601572 -0.314808094 -0.50499820 -0.610645614 -0.13387904 -0.062013126 #> 185 -0.51489658 3.373950582 -0.27591741 -0.510390832 -0.61131533 -0.383548146 #> 186 -0.51980257 -0.679309939 -0.63411647 -0.641975234 -0.29630582 0.651637772 #> 187 0.38535374 0.783557467 -0.64244668 -0.504124908 1.10154888 -0.371784670 #> 188 -0.53452056 1.993703594 0.05729102 0.084871939 -0.63100342 -0.383548146 #> 189 -0.49281959 -0.353688291 -0.55081436 4.583805304 -0.60639331 3.910120720 #> 190 -0.37262268 -0.339108217 -0.08015745 -0.347476810 -0.62608140 -0.062013126 #> 191 -0.53452056 1.532001256 1.58588470 -0.428933821 -0.57193914 -0.081618920 #> 192 -0.53452056 -0.669589890 -0.27175230 -0.266019799 -0.63100342 -0.379626987 #> 193 3.84898713 -0.518929127 -0.16345956 -0.510390832 0.37308939 -0.348257717 #> 194 -0.52716157 
0.715517123 0.39466456 -0.497858984 -0.21755344 -0.379626987 #> 195 3.26026757 0.268394859 -0.03017619 0.153797102 0.67825485 -0.211017160 #> 196 -0.48546060 4.652137053 0.77785425 -0.416401973 -0.63100342 -0.383548146 #> 197 -0.51244358 0.351015277 -0.14679914 -0.685836701 0.41738760 -0.367863511 #> 198 -0.53452056 -0.679309939 -0.63828157 -0.623177462 -0.63100342 -0.383548146 #> 199 1.06483423 -0.674449915 -0.53831904 -0.667038929 -0.18309928 -0.375705829 #> 200 -0.53452056 -0.552949299 0.14059313 -0.002850995 0.27957094 0.196783353 #> Otu00023 Otu00024 Otu00025 Otu00026 Otu00027 Otu00028 #> 1 -0.0069254588 -0.177204415 -0.24303824 -0.22202016 -0.24641906 -0.292554022 #> 2 -0.6642571429 -0.678440995 -0.43616774 -0.29146475 -0.38539990 -0.307394436 #> 3 -0.3747181868 0.177117995 0.04157367 -0.47086329 -0.41259180 -0.168883908 #> 4 -0.3199405465 0.954898895 -0.28369708 0.43770350 -0.36425064 -0.314814643 #> 5 -0.9068438359 -0.695725015 -0.39550890 -0.61553953 -0.06816104 -0.314814643 #> 6 -0.3434166781 0.851194775 0.03649131 -0.45350214 -0.38842122 -0.319761448 #> 7 0.4078195324 -0.669798985 -0.42600303 0.87751927 -0.23131245 -0.295027425 #> 8 -0.0851792307 -0.592020895 -0.35485005 -0.57503018 0.01945732 -0.322234850 #> 9 -0.8990184587 -0.393254665 -0.45141481 -0.62132658 -0.31288816 -0.319761448 #> 10 -0.4060196956 -0.341402605 1.42397434 -0.62132658 -0.40957048 0.214493446 #> 11 0.1965343482 3.962318375 -0.07023815 0.46085170 -0.20412055 -0.322234850 #> 12 1.2451348919 0.324032165 -0.14647348 -0.58660428 0.02852128 -0.319761448 #> 13 0.0713283131 0.488230355 -0.30402650 -0.37248345 -0.39748519 -0.314814643 #> 14 -0.5625272394 -0.280908535 -0.26845001 1.35205733 -0.37935725 -0.322234850 #> 15 -0.6955586517 0.107981915 -0.37009712 -0.26252951 -0.31288816 -0.312341241 #> 16 1.6911813918 -0.713009035 -0.43616774 -0.01368637 -0.32497345 -0.307394436 #> 17 -0.1399568711 0.099339905 0.21437375 -0.25095541 -0.38237857 -0.314814643 #> 18 -0.4138450728 -0.030290245 0.21437375 -0.22780721 -0.39144254 -0.183724322 #> 19 -0.7581616692 -0.021648235 -0.37517948 0.53608334 -0.12556616 -0.307394436 #> 20 0.8538660323 -0.592020895 -0.45141481 -0.54030789 -0.30986683 -0.312341241 #> 21 -0.8911930815 -0.704367025 5.62708227 -0.62132658 -0.41259180 -0.297500827 #> 22 0.7756122604 -0.704367025 0.61587983 -0.32618705 -0.31288816 -0.205984942 #> 23 0.3686926464 -0.721651045 -0.45649716 0.48978694 0.23699254 -0.299974229 #> 24 -0.1243061167 0.203044025 -0.40059125 -0.62132658 0.44848511 -0.314814643 #> 25 1.1434049884 -0.013006225 -0.29386179 -0.62132658 -0.41863444 -0.235665770 #> 26 -0.8285900640 0.168475985 -0.03974402 -0.58660428 0.33367486 -0.089735035 #> 27 -0.8677169499 -0.721651045 -0.14139113 -0.62132658 -0.41561312 1.485822222 #> 28 0.2200104798 -0.678440995 -0.44125010 2.96085712 -0.42467709 4.458851770 #> 29 -0.4216704500 -0.522884815 -0.43616774 -0.10049212 -0.32195212 -0.319761448 #> 30 -0.7816378008 -0.142636375 -0.37517948 -0.58660428 -0.40654915 -0.314814643 #> 31 -0.4920988447 1.680827735 -0.42600303 -0.60396543 -0.40352783 -0.317288045 #> 32 -0.6642571429 1.853667935 -0.31419121 -0.41299279 -0.40957048 -0.210931747 #> 33 1.3546901726 -0.721651045 -0.34976770 -0.59239133 0.49682627 -0.228245563 #> 34 -0.8990184587 -0.410538685 3.72119899 -0.49979854 -0.05909707 -0.260399793 #> 35 -0.2729882833 4.938865505 -0.18204997 -0.52873379 -0.33101609 -0.309867838 #> 36 2.7789088215 -0.661156975 1.47988025 -0.61553953 -0.15275807 -0.314814643 #> 37 -0.5234003535 2.026508135 0.45324446 -0.58081723 
0.09801170 -0.314814643 #> 38 -0.9068438359 -0.721651045 0.34143264 -0.59817838 -0.36122932 -0.307394436 #> 39 -0.0069254588 -0.661156975 -0.26845001 -0.43614099 0.49984759 -0.287607218 #> 40 -0.6407810114 0.038845835 -0.25320295 -0.21623311 -0.37935725 -0.314814643 #> 41 1.1825318744 -0.609304915 -0.42092068 -0.61553953 0.26418444 -0.317288045 #> 42 -0.4529719588 0.073413875 -0.42092068 -0.37248345 -0.37935725 5.443265880 #> 43 3.1388761724 -0.721651045 -0.37517948 -0.62132658 -0.34914403 -0.297500827 #> 44 0.4391210411 0.090697895 -0.34976770 -0.59817838 -0.31288816 -0.295027425 #> 45 0.5252001902 -0.410538685 1.46971554 -0.61553953 -0.09535294 -0.317288045 #> 46 1.3077379094 -0.436464715 -0.24303824 0.16571217 -0.37633593 -0.210931747 #> 47 0.5173748130 0.393168245 0.04665602 -0.60396543 0.54818875 -0.317288045 #> 48 1.4877215849 -0.661156975 -0.33960299 -0.62132658 -0.41561312 -0.314814643 #> 49 -0.8442408184 0.151191965 -0.24812059 -0.60396543 -0.41863444 -0.290080620 #> 50 -0.6720825201 0.747490655 -0.18204997 -0.58660428 -0.38842122 -0.267820000 #> 51 -0.3590674325 -0.574736875 -0.44125010 1.11478830 -0.42467709 1.305263855 #> 52 -0.6407810114 0.427736285 -0.21762646 -0.60975248 -0.35518667 -0.302447632 #> 53 1.7459590322 -0.704367025 6.00825892 -0.60975248 0.58746594 -0.223298758 #> 54 1.4877215849 -0.522884815 1.16985657 -0.41877984 -0.36425064 -0.262873195 #> 55 -0.7425109149 0.254896085 -0.17188526 0.50714809 -0.10441691 -0.314814643 #> 56 0.8225645235 -0.713009035 0.03649131 -0.61553953 -0.36727196 -0.314814643 #> 57 -0.3590674325 -0.557452855 -0.45141481 1.07427895 0.25209915 -0.109522253 #> 58 -0.8911930815 -0.669798985 1.25117426 -0.62132658 -0.42467709 0.738854731 #> 59 -0.1008299851 0.445020305 -0.45141481 -0.38984460 0.56027404 -0.312341241 #> 60 0.0165506728 -0.254982505 0.61587983 0.62867613 0.19167270 -0.277713609 #> 61 -0.4294958272 -0.488316775 -0.45649716 -0.28567770 -0.37331461 -0.317288045 #> 62 -0.2338613974 -0.427822705 0.39733855 -0.40720575 -0.17390732 2.002763299 #> 63 1.9259427076 -0.592020895 -0.44633245 0.99904731 -0.42165577 -0.230718965 #> 64 -0.3981943184 -0.713009035 0.88524467 0.14256397 0.11613964 -0.317288045 #> 65 -0.6564317657 -0.531526825 -0.47174423 -0.55188199 8.52145880 0.006727654 #> 66 -0.6955586517 -0.177204415 -0.47174423 -0.62132658 -0.23433377 -0.322234850 #> 67 -0.5625272394 -0.687083005 -0.47174423 2.85669023 0.33367486 -0.322234850 #> 68 -0.3121151693 0.393168245 -0.45649716 0.17728626 -0.39748519 -0.319761448 #> 69 1.1590557428 -0.721651045 0.02124425 1.73400261 0.03758525 -0.309867838 #> 70 0.1808835938 1.940088035 -0.43616774 -0.54030789 -0.38539990 -0.319761448 #> 71 1.0181989533 -0.358686625 1.11395066 -0.61553953 -0.31893080 -0.304921034 #> 72 -0.3355913009 -0.721651045 -0.30910886 1.01640846 -0.16182203 -0.275240206 #> 73 -0.5860033710 -0.038932255 -0.42092068 -0.23359426 -0.26756832 -0.314814643 #> 74 -0.5781779938 -0.177204415 -0.36501477 0.14256397 0.83521439 0.006727654 #> 75 -0.4686227131 0.894404825 0.01107953 -0.30882590 -0.35216535 -0.304921034 #> 76 -0.6486063886 0.531440405 -0.44125010 -0.52294674 -0.36727196 -0.307394436 #> 77 -0.4842734675 0.721564625 -0.47174423 2.76409744 -0.37029328 -0.309867838 #> 78 -0.9068438359 1.015392965 0.94115058 -0.23938131 -0.39446386 -0.292554022 #> 79 -0.4451465816 -0.237698485 -0.26336766 -0.08313097 -0.28569625 -0.314814643 #> 80 0.0791536903 -0.721651045 0.36176206 -0.61553953 -0.42467709 -0.248032781 #> 81 -0.7190347833 -0.687083005 -0.29894415 0.60552794 -0.30986683 -0.322234850 
#> 82 0.0087252956 1.145023115 -0.39042654 -0.23938131 -0.11045955 -0.270293402 #> 83 1.9885457251 -0.315476575 -0.33452063 -0.60396543 -0.40654915 -0.257926390 #> 84 0.2747881201 -0.721651045 -0.32943828 2.66571759 2.25221464 -0.314814643 #> 85 -0.8833677043 -0.229056475 -0.46157952 1.49673357 0.05269186 0.911992891 #> 86 -0.9068438359 -0.626588935 -0.45141481 1.59511342 1.12224003 -0.322234850 #> 87 -0.2495121518 5.517880175 -0.38534419 -0.61553953 -0.40352783 -0.309867838 #> 88 -0.2886390377 0.721564625 -0.08040286 -0.22780721 -0.21922716 -0.275240206 #> 89 -0.5234003535 0.133907945 -0.30910886 -0.19308491 -0.41561312 -0.173830713 #> 90 0.0008999184 0.082055885 -0.41075596 0.40876825 -0.42165577 -0.302447632 #> 91 -0.7659870464 -0.393254665 -0.44633245 0.45506465 -0.33705874 -0.302447632 #> 92 -0.7738124236 0.954898895 0.85983289 -0.30882590 -0.41561312 1.837045346 #> 93 0.1417567078 -0.721651045 6.81127108 -0.62132658 -0.14369410 -0.302447632 #> 94 -0.6016541254 -0.341402605 -0.46157952 1.02798256 -0.10743823 -0.149096690 #> 95 0.7286599972 0.254896085 -0.07532051 -0.53452084 -0.30080287 -0.319761448 #> 96 -0.9068438359 0.194402015 -0.46157952 -0.34354820 -0.42467709 -0.322234850 #> 97 1.9181173304 -0.704367025 -0.27353237 -0.62132658 0.98325919 -0.248032781 #> 98 -0.4529719588 0.142549955 0.31093850 0.24094381 -0.35820799 -0.277713609 #> 99 0.7286599972 -0.713009035 -0.07023815 -0.59239133 0.11311831 -0.280187011 #> 100 -0.5234003535 -0.704367025 -0.46666187 -0.60396543 0.06175583 3.006964628 #> 101 0.0243760500 0.514156385 -0.28369708 -0.61553953 3.79913175 -0.322234850 #> 102 5.4160609352 -0.609304915 -0.43108539 -0.61553953 5.83248179 -0.275240206 #> 103 1.1512303656 -0.609304915 -0.44125010 -0.54609494 0.83823571 -0.205984942 #> 104 -0.9068438359 -0.574736875 -0.28369708 0.40298120 -0.42467709 -0.319761448 #> 105 0.1495820850 0.254896085 -0.11597935 -0.59817838 -0.22526980 -0.282660413 #> 106 -0.7972885552 -0.056216275 -0.21254410 -0.59239133 0.43942114 -0.312341241 #> 107 -0.2260360202 -0.229056475 -0.34468534 0.61710203 -0.30080287 0.169972205 #> 108 -0.5468764851 1.335147335 -0.45141481 1.46779833 -0.12254484 -0.309867838 #> 109 1.1121034796 -0.678440995 -0.39550890 -0.59817838 -0.32195212 -0.312341241 #> 110 0.7599615060 -0.479674765 -0.45141481 0.94696386 -0.05305442 -0.309867838 #> 111 -0.6407810114 -0.289550545 1.47479789 0.06154527 -0.40957048 0.058669102 #> 112 -0.5468764851 -0.721651045 -0.25320295 -0.40141870 -0.07722500 -0.314814643 #> 113 -0.8990184587 -0.721651045 -0.24303824 -0.61553953 -0.42165577 -0.314814643 #> 114 -0.6486063886 -0.082142305 -0.30910886 -0.20465901 -0.22829113 -0.319761448 #> 115 -0.4842734675 0.073413875 -0.41583832 -0.62132658 0.20980063 -0.277713609 #> 116 0.1261059534 0.583292465 -0.43108539 -0.60396543 -0.40352783 -0.025426576 #> 117 0.0243760500 -0.514242805 -0.45141481 -0.62132658 -0.39748519 0.763588754 #> 118 -0.0304015904 -0.721651045 -0.27861472 -0.15257556 0.01945732 -0.319761448 #> 119 -0.7033840289 2.389472555 -0.45141481 -0.62132658 -0.38237857 -0.317288045 #> 120 1.8320381813 -0.652514965 -0.20237939 -0.61553953 0.10103302 -0.309867838 #> 121 -0.5547018623 -0.548810845 -0.47174423 -0.44771509 0.03154261 -0.272766804 #> 122 -0.1869091342 -0.254982505 3.03508101 -0.53452084 -0.31893080 -0.250506184 #> 123 -0.2260360202 -0.462390745 -0.46157952 2.06965148 -0.42467709 6.323797094 #> 124 0.1652328394 1.170949145 -0.44125010 -0.60975248 -0.42467709 3.514012096 #> 125 -0.9068438359 -0.531526825 -0.33960299 4.84743529 -0.38842122 
-0.299974229 #> 126 -0.6329556342 3.564785915 -0.24812059 -0.52294674 -0.39748519 -0.245559379 #> 127 -0.9068438359 -0.367328635 -0.40059125 0.37983300 -0.36727196 -0.314814643 #> 128 1.6677052603 0.185760005 3.05032807 0.39140710 0.28533370 -0.314814643 #> 129 -0.0851792307 -0.522884815 -0.16680290 5.25252877 0.85032100 -0.280187011 #> 130 -0.6251302570 -0.695725015 0.10764429 -0.60975248 -0.27663229 -0.322234850 #> 131 -0.9068438359 -0.419180695 -0.42600303 -0.51715969 -0.02586252 -0.317288045 #> 132 1.4407693217 -0.592020895 -0.44125010 -0.55188199 1.61169427 -0.285133816 #> 133 0.4547717955 -0.488316775 0.03649131 -0.17572376 -0.21318451 -0.248032781 #> 134 -0.2808136605 0.427736285 0.24486788 -0.45928919 -0.29476022 -0.314814643 #> 135 -0.0695284764 -0.678440995 -0.33452063 -0.59239133 0.91679010 -0.317288045 #> 136 0.3217403832 -0.280908535 -0.39550890 -0.54030789 0.65997768 0.031461677 #> 137 0.4547717955 0.868478795 -0.44125010 0.07890642 -0.36727196 -0.136729678 #> 138 -0.5312257307 0.453662315 -0.47174423 -0.44192804 -0.40957048 1.082657649 #> 139 0.0400268043 -0.133994365 -0.41583832 1.91918820 0.06477715 -0.322234850 #> 140 -0.9068438359 2.795647025 -0.44125010 -0.55188199 -0.41561312 -0.317288045 #> 141 -0.4920988447 -0.583378885 -0.47174423 2.26062412 0.17656609 -0.116942460 #> 142 -0.7894631780 -0.237698485 -0.21762646 -0.42456689 -0.42467709 -0.099628644 #> 143 -0.5155749763 0.038845835 -0.24812059 0.23515676 -0.42467709 -0.015532966 #> 144 0.1417567078 0.142549955 0.09239722 1.66455801 -0.27663229 0.320849745 #> 145 -0.8833677043 -0.315476575 -0.15155584 -0.61553953 -0.40050651 5.809329418 #> 146 -0.3668928096 -0.609304915 -0.44633245 0.68075958 -0.42467709 -0.292554022 #> 147 -0.8990184587 -0.713009035 -0.44125010 -0.60975248 -0.31893080 -0.314814643 #> 148 -0.1869091342 -0.073500295 -0.41075596 1.02798256 0.45452776 -0.223298758 #> 149 -0.1008299851 -0.626588935 -0.39042654 -0.11785327 -0.39748519 -0.299974229 #> 150 0.0322014271 2.372188535 -0.39042654 0.42612940 -0.40352783 -0.322234850 #> 151 -0.2495121518 1.231443215 -0.46157952 -0.60396543 -0.42467709 -0.304921034 #> 152 0.3921687780 1.352431355 -0.20746175 -0.46507624 -0.41259180 -0.280187011 #> 153 -0.8442408184 0.548724425 -0.43108539 0.60552794 -0.34008006 -0.307394436 #> 154 1.2060080059 -0.617946925 -0.36501477 -0.62132658 0.43639982 -0.245559379 #> 155 0.9086436726 -0.531526825 -0.22779117 -0.56924313 0.30648295 0.706700501 #> 156 -0.4686227131 -0.522884815 -0.42092068 -0.61553953 -0.42165577 -0.314814643 #> 157 -0.8911930815 -0.687083005 0.98180942 -0.62132658 -0.33705874 -0.210931747 #> 158 0.9947228218 -0.220414465 0.74293871 0.07311937 -0.41561312 -0.295027425 #> 159 -0.6564317657 -0.125352355 -0.40567361 2.60784710 -0.41561312 -0.277713609 #> 160 -0.6877332745 -0.713009035 -0.34468534 -0.59239133 0.64184975 -0.139203081 #> 161 0.4078195324 -0.669798985 -0.47174423 3.04187582 -0.41561312 -0.314814643 #> 162 -0.8990184587 -0.721651045 -0.14647348 -0.62132658 -0.37633593 -0.285133816 #> 163 1.1121034796 -0.721651045 -0.35993241 0.74441713 -0.29173890 -0.290080620 #> 164 0.9712466902 -0.168562405 -0.32435592 -0.59817838 0.79895852 -0.272766804 #> 165 0.2356612341 -0.566094865 -0.33960299 -0.49979854 5.67839434 -0.297500827 #> 166 -0.3434166781 1.369715375 -0.46157952 -0.60975248 -0.41561312 4.716085608 #> 167 -0.5468764851 0.419094275 -0.46666187 3.73053472 -0.40654915 -0.307394436 #> 168 -0.5155749763 -0.721651045 -0.40567361 -0.59817838 -0.34008006 -0.287607218 #> 169 3.5849226723 -0.704367025 0.95639764 
-0.53452084 0.37597337 -0.304921034 #> 170 -0.9068438359 -0.687083005 -0.39042654 -0.62132658 -0.41863444 -0.312341241 #> 171 -0.5390511079 0.617860505 -0.07532051 -0.37827050 -0.37633593 -0.314814643 #> 172 -0.4529719588 -0.626588935 -0.46157952 -0.26252951 2.99243865 -0.077368024 #> 173 -0.8207646868 -0.687083005 -0.40567361 -0.62132658 0.99836580 0.019094666 #> 174 0.4312956639 1.741321805 -0.39042654 -0.51137264 -0.15275807 -0.290080620 #> 175 -0.0695284764 0.107981915 -0.45649716 -0.50558559 -0.29778154 -0.295027425 #> 176 0.4547717955 4.307998775 1.64759798 -0.58660428 -0.37029328 -0.304921034 #> 177 -0.1321314939 -0.220414465 -0.24812059 0.70969483 -0.38842122 -0.319761448 #> 178 -0.9068438359 -0.410538685 -0.45649716 -0.62132658 -0.42165577 -0.299974229 #> 179 0.2982642517 -0.574736875 -0.16680290 -0.06576982 0.68414826 -0.319761448 #> 180 -0.5077495991 0.280822115 -0.44633245 -0.33776115 -0.37029328 0.244174274 #> 181 -0.6877332745 -0.522884815 0.01616189 0.77335237 -0.08931029 -0.302447632 #> 182 -0.5938287482 0.436378295 -0.46157952 1.04534371 -0.20109922 -0.196091333 #> 183 -0.4451465816 -0.367328635 -0.22779117 -0.19308491 -0.30684551 0.273855101 #> 184 -0.7738124236 0.151191965 0.03649131 -0.51137264 -0.36727196 1.483348819 #> 185 3.0997492864 -0.617946925 -0.42092068 -0.56924313 0.18260873 -0.314814643 #> 186 -0.8677169499 0.393168245 -0.47174423 0.21200856 -0.39144254 -0.069947817 #> 187 -0.9068438359 -0.609304915 -0.46157952 -0.61553953 -0.42165577 -0.309867838 #> 188 2.7710834443 -0.721651045 -0.34468534 -0.60396543 -0.08628897 0.773482363 #> 189 -0.8755423271 -0.047574265 -0.43108539 -0.43614099 -0.41863444 0.187286021 #> 190 -0.3355913009 -0.246340495 -0.40567361 1.58353932 -0.11650220 -0.302447632 #> 191 -0.6094795026 -0.479674765 -0.42092068 -0.45350214 -0.41259180 -0.245559379 #> 192 0.1104551991 -0.721651045 0.80900933 -0.59239133 -0.40957048 -0.307394436 #> 193 -0.5077495991 0.609218495 0.12289135 -0.56924313 -0.14671542 -0.297500827 #> 194 3.4518912600 -0.687083005 -0.40567361 1.55460407 0.06175583 -0.260399793 #> 195 -0.4842734675 0.315390155 2.58783373 -0.52873379 0.17958741 -0.282660413 #> 196 2.4658937338 -0.721651045 1.35282136 -0.16414966 -0.42467709 -0.322234850 #> 197 -0.0382269676 -0.669798985 -0.39550890 -0.58660428 -0.40352783 -0.161463701 #> 198 -0.9068438359 -0.721651045 0.15338549 -0.62132658 -0.41561312 -0.297500827 #> 199 -0.8598915727 0.107981915 0.40750326 -0.60396543 -0.27058964 -0.299974229 #> 200 -0.0304015904 0.004277795 -0.14647348 -0.55766903 -0.23131245 -0.317288045 #> Otu00029 Otu00030 Otu00031 Otu00032 Otu00033 #> 1 0.695821495 0.39193166 0.2730666130 1.850227727 -0.352365855 #> 2 -0.252260766 0.44720466 -0.1402887916 -0.493938512 0.152851091 #> 3 0.066720182 -0.59377025 -0.4629076438 -0.357825634 -0.288065517 #> 4 -0.473775313 -0.71352842 1.5937875395 -0.501500339 -0.435037719 #> 5 -0.571241714 0.33665866 -0.5637260352 -0.577118604 0.952012441 #> 6 -0.216818439 -0.52928508 -0.2411071829 0.337862411 0.079364989 #> 7 3.079318020 0.19847615 -0.3520074134 -0.395634767 -0.618752972 #> 8 0.031277854 -0.17001055 -0.3822529308 -0.357825634 -0.444223482 #> 9 -0.730732188 -0.11473754 0.3335576478 -0.070476224 -0.168650602 #> 10 0.137604837 -0.76880143 -0.4830713221 -0.516623992 0.740739900 #> 11 -0.305424257 0.16162748 -0.5939715526 -0.577118604 -0.600381447 #> 12 -0.730732188 -0.54770941 -0.5233986787 0.148816747 0.465167021 #> 13 -0.269981930 -0.62140675 -0.2209435046 0.103445788 -0.453409245 #> 14 -0.526938804 0.54853851 0.1420027042 
0.572279035 -0.646310260 #> 15 -0.535799386 -0.33582956 -0.2411071829 0.436166157 -0.655496023 #> 16 -0.340866585 -0.38189040 -0.4729894830 -0.569556778 1.071427356 #> 17 -0.181376111 1.20260239 -0.4427439656 1.071359589 -0.582009922 #> 18 0.279374147 0.65908451 0.0109387955 -0.100723530 0.106922277 #> 19 0.270513565 0.72356969 -0.0797977567 0.466413463 -0.232950941 #> 20 1.431249791 0.85254003 0.4646215565 -0.546871298 0.446795495 #> 21 -0.730732188 -0.76880143 -0.5939715526 -0.569556778 1.787916843 #> 22 2.937548710 -0.28055656 -0.5536441961 -0.456129379 -0.159464840 #> 23 -0.004164473 0.04186930 -0.3217618960 0.141254920 -0.673867548 #> 24 0.146465418 1.07363205 -0.5838897135 0.504222596 0.116108040 #> 25 -0.730732188 0.79726702 -0.1806161481 -0.577118604 -0.021678400 #> 26 -0.730732188 -0.70431626 -0.5637260352 -0.138532663 4.424230724 #> 27 -0.686429278 -0.76880143 -0.5838897135 -0.531747645 1.705244979 #> 28 0.562912767 -0.76880143 -0.5939715526 -0.577118604 -0.490152295 #> 29 0.279374147 -0.52928508 -0.1402887916 -0.357825634 1.098984644 #> 30 -0.721871606 7.25499635 -0.5637260352 0.020265695 -0.692239074 #> 31 -0.128212620 1.34078490 1.6643604135 -0.569556778 -0.012492637 #> 32 1.378086300 -0.06867671 -0.5838897135 2.530792119 -0.627938735 #> 33 0.075580763 -0.43716340 -0.5939715526 -0.577118604 0.428423970 #> 34 -0.243400184 -0.76880143 -0.5838897135 -0.577118604 -0.223765178 #> 35 0.199628910 0.76041836 0.3033121304 -0.441005726 -0.407480431 #> 36 2.388192634 3.49643206 -0.5939715526 -0.509062165 -0.407480431 #> 37 -0.695289860 -0.67667975 -0.4830713221 0.821819312 -0.701424836 #> 38 -0.721871606 -0.03182804 -0.5939715526 -0.577118604 -0.012492637 #> 39 -0.234539602 2.08697046 0.5251125913 -0.350263807 -0.591195684 #> 40 -0.323145421 0.04186930 -0.1402887916 0.065636655 -0.609567210 #> 41 1.316062227 -0.34504173 -0.5233986787 -0.448567553 0.290637530 #> 42 -0.367448331 -0.06867671 -0.2713527003 -0.123409010 -0.692239074 #> 43 -0.721871606 -0.76880143 -0.5738078743 -0.577118604 -0.609567210 #> 44 0.748984986 0.39193166 1.3316597220 -0.478814859 -0.379923143 #> 45 1.989466449 -0.75037709 -0.4931531613 -0.289769194 2.936137175 #> 46 -0.057327965 -0.76880143 -0.4729894830 -0.569556778 2.467663279 #> 47 -0.730732188 -0.73195276 -0.3217618960 -0.297331021 -0.141093314 #> 48 3.495765369 -0.20685922 -0.5435623569 -0.524185818 -0.058421450 #> 49 -0.385169494 -0.72274059 -0.2108616655 -0.229274582 0.492724309 #> 50 -0.624405205 -0.63983108 -0.4124984482 0.489098943 0.042621939 #> 51 -0.588962878 2.18830430 -0.4830713221 -0.561994951 3.110666665 #> 52 -0.137073202 0.12477881 0.6662583392 1.056235936 -0.232950941 #> 53 -0.730732188 -0.76880143 -0.5939715526 -0.561994951 -0.692239074 #> 54 -0.305424257 -0.75037709 -0.5738078743 -0.577118604 -0.398294669 #> 55 -0.535799386 -0.63983108 -0.4225802873 0.050513002 -0.591195684 #> 56 -0.730732188 0.92623737 -0.5536441961 -0.478814859 0.446795495 #> 57 -0.367448331 2.16066779 -0.2511890220 5.563084576 -0.600381447 #> 58 -0.721871606 -0.75037709 -0.5838897135 -0.546871298 0.042621939 #> 59 -0.721871606 -0.23449572 2.7128716834 -0.577118604 1.622573115 #> 60 0.376840547 0.43799250 -0.4024166090 -0.115847183 -0.122721789 #> 61 0.111023091 0.09714230 4.3360477841 -0.055352571 -0.582009922 #> 62 -0.562381132 0.13399097 -0.2209435046 -0.577118604 -0.021678400 #> 63 1.750230739 0.22611265 -0.5133168395 -0.463691206 -0.554452634 #> 64 -0.314284839 0.36429516 2.6422988095 0.254682319 0.079364989 #> 65 -0.721871606 -0.75958926 -0.3923347699 -0.577118604 
-0.085978738 #> 66 0.252792401 -0.54770941 -0.5939715526 -0.569556778 -0.333994330 #> 67 -0.358587749 -0.54770941 -0.4024166090 -0.554433125 -0.471780770 #> 68 -0.677568696 0.15241531 0.6965038566 0.012703869 -0.315622805 #> 69 0.642658004 -0.19764705 -0.0596340785 0.156378574 -0.517709583 #> 70 0.155326000 0.24453698 2.8741811096 -0.577118604 -0.499338058 #> 71 0.935057206 -0.48322424 -0.5939715526 0.942808538 -0.389108906 #> 72 -0.491496477 0.21690048 0.1117571868 -0.577118604 -0.343180093 #> 73 -0.730732188 -0.02261587 -0.4729894830 0.186625880 -0.673867548 #> 74 0.048999018 -0.46479990 -0.4225802873 -0.191465449 -0.425851957 #> 75 -0.145933784 1.34078490 -0.3217618960 0.436166157 -0.232950941 #> 76 -0.730732188 1.31314840 4.7393213494 0.141254920 -0.453409245 #> 77 -0.730732188 -0.05025237 4.3864569797 1.404079959 0.079364989 #> 78 -0.730732188 -0.76880143 -0.1302069524 -0.289769194 2.081861248 #> 79 -0.243400184 0.63144801 -0.3520074134 -0.168779969 -0.673867548 #> 80 6.614690190 0.31823432 -0.5939715526 -0.577118604 -0.389108906 #> 81 -0.394030076 -0.05025237 -0.5334805178 -0.342701980 -0.664681786 #> 82 1.759091320 -0.76880143 -0.5939715526 -0.577118604 0.162036853 #> 83 2.007187613 -0.28055656 -0.5334805178 -0.350263807 0.520281597 #> 84 -0.730732188 0.35508299 -0.5939715526 -0.478814859 -0.205393653 #> 85 -0.633265787 -0.08710104 -0.1201251133 -0.577118604 -0.710610599 #> 86 -0.101630874 0.08793014 -0.3419255742 -0.577118604 -0.269693992 #> 87 1.218595826 0.21690048 0.2125755781 1.094045069 -0.131907552 #> 88 -0.721871606 -0.40031473 -0.1906979872 -0.577118604 0.125293803 #> 89 -0.207957857 -0.45558774 -0.5939715526 -0.509062165 -0.425851957 #> 90 -0.730732188 -0.30819306 0.8376496045 -0.577118604 0.667253799 #> 91 -0.730732188 -0.76880143 1.7450151266 -0.093161703 -0.067607213 #> 92 -0.544659968 -0.17001055 -0.1503706307 -0.078038050 -0.582009922 #> 93 0.881893714 -0.76880143 -0.3520074134 -0.577118604 -0.398294669 #> 94 -0.137073202 -0.73195276 -0.1402887916 -0.577118604 -0.554452634 #> 95 -0.624405205 -0.29898089 -0.2612708612 0.383233371 -0.333994330 #> 96 -0.730732188 -0.76880143 -0.5939715526 2.349308281 -0.591195684 #> 97 0.243931819 -0.59377025 -0.5939715526 -0.577118604 2.807536497 #> 98 -0.482635895 0.42878033 1.4223962743 2.530792119 -0.159464840 #> 99 -0.730732188 -0.69510409 -0.5939715526 -0.561994951 -0.600381447 #> 100 -0.730732188 0.40114383 0.1420027042 -0.569556778 -0.600381447 #> 101 -0.704150442 0.91702520 -0.5637260352 -0.561994951 -0.389108906 #> 102 -0.491496477 2.38175981 -0.5939715526 -0.577118604 -0.683053311 #> 103 -0.243400184 -0.30819306 -0.4326621264 -0.569556778 -0.370737381 #> 104 1.316062227 -0.76880143 -0.5939715526 -0.009981611 -0.343180093 #> 105 0.040138436 0.56696284 -0.1201251133 0.156378574 -0.232950941 #> 106 -0.668708114 -0.23449572 -0.4528258047 0.020265695 -0.710610599 #> 107 0.261652983 1.19339022 0.4444578782 -0.138532663 -0.600381447 #> 108 -0.730732188 0.74199402 -0.5838897135 0.564717209 -0.582009922 #> 109 -0.704150442 -0.55692158 -0.4931531613 -0.561994951 -0.040049925 #> 110 -0.261121348 1.46975524 0.3133939695 -0.183903622 -0.288065517 #> 111 -0.367448331 -0.22528355 3.8823650230 -0.055352571 -0.572824159 #> 112 -0.721871606 -0.75958926 -0.5939715526 -0.531747645 -0.710610599 #> 113 -0.128212620 0.83411569 3.5496643316 0.678144607 -0.315622805 #> 114 -0.650986951 -0.10552538 -0.4830713221 -0.546871298 -0.664681786 #> 115 -0.500357059 0.99072254 3.0052450183 0.715953740 0.033436176 #> 116 -0.243400184 -0.56613375 -0.3419255742 
-0.259521888 -0.361551618 #> 117 0.917336042 -0.76880143 -0.4427439656 -0.365387460 2.100232773 #> 118 0.616076258 0.43799250 0.7569948914 3.377716696 -0.563638396 #> 119 -0.225679020 -0.76880143 1.0090408698 2.939130754 0.703996850 #> 120 2.512240780 0.53932634 -0.5838897135 -0.546871298 -0.131907552 #> 121 -0.394030076 0.44720466 -0.4830713221 -0.531747645 -0.683053311 #> 122 0.111023091 -0.41873907 1.2409231698 0.950370364 -0.333994330 #> 123 -0.721871606 -0.75037709 -0.2915163786 -0.448567553 -0.683053311 #> 124 0.261652983 0.06029364 -0.3520074134 -0.161218143 -0.609567210 #> 125 -0.721871606 0.94466170 -0.3822529308 0.247120493 -0.012492637 #> 126 0.137604837 -0.75958926 -0.4225802873 -0.569556778 -0.058421450 #> 127 -0.713011024 -0.56613375 0.1117571868 -0.554433125 -0.232950941 #> 128 0.075580763 -0.51086074 -0.5233986787 -0.168779969 3.955756829 #> 129 -0.500357059 -0.56613375 -0.4427439656 -0.463691206 -0.471780770 #> 130 -0.642126369 -0.05946454 -0.5939715526 -0.456129379 -0.333994330 #> 131 2.972991038 -0.66746759 -0.5233986787 0.050513002 1.493972438 #> 132 -0.730732188 0.35508299 -0.4024166090 -0.040228917 0.823411764 #> 133 2.078072268 -0.70431626 0.0109387955 -0.463691206 -0.040049925 #> 134 -0.473775313 -0.54770941 -0.1402887916 0.315176932 -0.517709583 #> 135 2.645149508 -0.53849724 -0.5838897135 -0.561994951 1.319442948 #> 136 0.350258802 -0.45558774 1.1804321350 1.313338040 -0.049235688 #> 137 -0.269981930 -0.20685922 3.0254086966 1.857789554 -0.591195684 #> 138 0.093301927 -0.54770941 -0.4528258047 2.583724905 -0.683053311 #> 139 0.607215676 -0.66746759 -0.2209435046 7.158629984 -0.517709583 #> 140 -0.730732188 0.83411569 2.2087797267 -0.577118604 3.312753443 #> 141 -0.110491456 1.50660391 0.2125755781 0.368109718 -0.600381447 #> 142 -0.305424257 -0.75037709 -0.1705343090 -0.569556778 -0.710610599 #> 143 -0.278842512 -0.06867671 -0.3217618960 0.179064053 -0.683053311 #> 144 -0.571241714 0.50247767 -0.0293885611 2.349308281 -0.582009922 #> 145 1.271759317 -0.29898089 -0.4427439656 -0.365387460 -0.710610599 #> 146 -0.110491456 0.47484117 0.0008569563 0.549593556 0.051807701 #> 147 -0.730732188 -0.76880143 -0.5838897135 -0.577118604 -0.673867548 #> 148 -0.367448331 0.19847615 1.9164063918 0.632773648 -0.710610599 #> 149 -0.642126369 -0.74116493 -0.4326621264 -0.569556778 -0.701424836 #> 150 -0.730732188 4.27025412 -0.5939715526 -0.577118604 -0.701424836 #> 151 -0.402890658 -0.38189040 -0.4629076438 -0.577118604 0.805040239 #> 152 0.740124404 -0.36346606 -0.2511890220 0.050513002 -0.609567210 #> 153 -0.580102296 -0.65825542 0.0109387955 1.162101508 1.025498543 #> 154 -0.704150442 -0.74116493 -0.2209435046 2.825703355 -0.655496023 #> 155 0.004696108 0.90781303 -0.5133168395 -0.448567553 0.005878888 #> 156 0.846451387 -0.07788888 -0.2612708612 -0.561994951 -0.664681786 #> 157 -0.713011024 -0.76880143 -0.5838897135 -0.561994951 -0.710610599 #> 158 -0.367448331 -0.76880143 -0.0797977567 0.156378574 -0.637124498 #> 159 -0.163654947 -0.40031473 2.0676339788 -0.569556778 -0.646310260 #> 160 0.004696108 -0.48322424 -0.5738078743 -0.539309471 -0.370737381 #> 161 1.094547680 -0.48322424 -0.3923347699 -0.433443899 -0.591195684 #> 162 -0.730732188 0.41956816 -0.5939715526 -0.577118604 1.319442948 #> 163 0.181907746 -0.61219458 -0.5637260352 -0.569556778 -0.444223482 #> 164 -0.721871606 -0.25292005 -0.4830713221 -0.501500339 0.465167021 #> 165 -0.030746219 0.01423280 -0.5838897135 -0.554433125 -0.223765178 #> 166 -0.713011024 -0.76880143 0.6662583392 -0.577118604 -0.710610599 #> 167 
-0.713011024 4.09522294 1.1602684568 -0.577118604 2.302319551 #> 168 2.388192634 -0.70431626 -0.5939715526 -0.577118604 1.007127017 #> 169 0.270513565 -0.76880143 -0.5738078743 -0.539309471 0.593767698 #> 170 -0.730732188 -0.76880143 0.1016753477 -0.569556778 -0.710610599 #> 171 -0.571241714 -0.61219458 -0.1100432742 0.534469902 -0.600381447 #> 172 -0.287703094 -0.48322424 -0.4225802873 -0.524185818 -0.407480431 #> 173 1.422389209 -0.61219458 -0.5738078743 -0.577118604 2.752421921 #> 174 0.456585784 0.14320314 -0.1705343090 -0.546871298 1.806288368 #> 175 -0.296563675 -0.39110257 -0.0697159176 -0.493938512 -0.627938735 #> 176 0.562912767 1.38684574 -0.5939715526 0.587402689 -0.012492637 #> 177 0.952778369 -0.48322424 -0.1604524698 -0.244398235 -0.683053311 #> 178 -0.721871606 -0.75037709 -0.5838897135 -0.214150929 1.705244979 #> 179 0.217350073 -0.52928508 -0.5435623569 -0.577118604 5.278506651 #> 180 -0.261121348 0.88017653 -0.1604524698 0.557155382 -0.673867548 #> 181 -0.039606801 -0.54770941 -0.1604524698 0.111007614 -0.627938735 #> 182 -0.083909710 -0.64904325 -0.2612708612 -0.577118604 -0.306437042 #> 183 -0.199097275 1.20260239 -0.2108616655 -0.123409010 -0.554452634 #> 184 -0.668708114 -0.30819306 -0.3116800568 1.600687450 -0.572824159 #> 185 0.297095310 2.55679099 -0.5939715526 -0.554433125 -0.627938735 #> 186 -0.713011024 -0.62140675 -0.0293885611 -0.380511113 -0.701424836 #> 187 -0.721871606 -0.75958926 -0.4225802873 -0.085599877 -0.609567210 #> 188 2.990712202 -0.41873907 -0.5939715526 -0.554433125 1.392929049 #> 189 -0.730732188 -0.56613375 -0.4326621264 -0.380511113 -0.710610599 #> 190 0.102162509 -0.25292005 0.0815116694 -0.304892848 -0.609567210 #> 191 -0.668708114 -0.25292005 -0.5133168395 -0.554433125 -0.343180093 #> 192 -0.730732188 -0.32661739 0.6158491435 -0.577118604 -0.205393653 #> 193 0.057859600 -0.63061892 -0.3822529308 0.413480677 -0.278879754 #> 194 -0.509217641 0.14320314 -0.4528258047 -0.577118604 0.162036853 #> 195 -0.668708114 0.11556664 -0.3721710916 0.526908076 -0.692239074 #> 196 -0.730732188 -0.76880143 -0.5838897135 -0.577118604 0.906083628 #> 197 -0.154794365 -0.47401207 2.1079613354 -0.093161703 -0.572824159 #> 198 -0.721871606 -0.67667975 -0.5939715526 -0.577118604 -0.627938735 #> 199 -0.713011024 -0.74116493 -0.4225802873 -0.161218143 -0.232950941 #> 200 -0.730732188 -0.47401207 -0.3217618960 0.511784423 -0.278879754 #> Otu00034 Otu00035 Otu00036 Otu00037 Otu00038 #> 1 -0.1482914828 -0.28857253 -0.337797955 -0.28026882 -0.269009738 #> 2 -0.1507314908 1.32771762 -0.337797955 -0.40104181 -0.269009738 #> 3 -0.1360914431 -0.09645535 -0.309626997 5.43380328 -0.251964926 #> 4 -0.1507314908 -0.24263146 -0.337797955 -0.28781713 -0.254805728 #> 5 0.0469091527 -0.38463111 -0.332163763 -0.55200805 -0.269009738 #> 6 -0.1507314908 -0.31363129 -0.337797955 -0.02362622 -0.269009738 #> 7 -0.1507314908 -0.38880757 3.099058896 -0.19723739 -0.269009738 #> 8 -0.1507314908 -0.25098438 -0.337797955 -0.13685089 -0.266168936 #> 9 -0.0775312524 -0.38880757 -0.337797955 0.32359613 -0.084357613 #> 10 -0.0604511968 -0.30110191 0.811577123 -0.51426649 -0.254805728 #> 11 -0.1507314908 1.31518824 -0.337797955 0.52740055 -0.269009738 #> 12 0.6935112580 -0.25098438 -0.337797955 -0.54445974 -0.266168936 #> 13 -0.1458514749 5.21182571 -0.337797955 -0.55200805 -0.257646530 #> 14 -0.1507314908 -0.31780775 -0.337797955 -0.43878337 -0.269009738 #> 15 -0.1507314908 -0.20921978 0.158010902 -0.40859012 -0.269009738 #> 16 -0.0824112683 -0.36792527 -0.337797955 1.16145875 -0.269009738 #> 17 
-0.1507314908 -0.38880757 0.963700295 -0.29536544 0.049160077 #> 18 -0.1507314908 -0.17580810 -0.337797955 0.01411534 -0.200830492 #> 19 -0.1458514749 0.28360254 -0.337797955 -0.43123506 -0.269009738 #> 20 -0.1482914828 -0.36792527 -0.337797955 1.87100007 -0.269009738 #> 21 0.3616701775 -0.38880757 -0.337797955 7.21520489 -0.251964926 #> 22 -0.1214513954 -0.38463111 -0.337797955 0.18772652 -0.232079313 #> 23 -0.1507314908 0.35460236 -0.337797955 -0.25007557 -0.269009738 #> 24 -0.1507314908 -0.38880757 -0.337797955 0.06695353 -0.260487332 #> 25 -0.1360914431 -0.23010208 1.746852922 -0.54445974 0.270742627 #> 26 0.9887522192 -0.38463111 -0.337797955 -0.51426649 -0.260487332 #> 27 13.8524741014 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 28 -0.1507314908 -0.38880757 -0.337797955 -0.55200805 -0.101402425 #> 29 -0.1507314908 0.05807368 -0.337797955 -0.31801038 -0.266168936 #> 30 -0.1458514749 -0.38880757 -0.337797955 -0.46897662 -0.260487332 #> 31 -0.1141313716 1.80383409 -0.320895380 0.42927250 0.301991448 #> 32 -0.1482914828 -0.38045465 -0.332163763 -0.33310700 -0.269009738 #> 33 -0.1507314908 -0.30945483 0.929895146 1.22184525 -0.269009738 #> 34 0.3836302490 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 35 -0.1434114669 -0.38880757 -0.337797955 0.05940521 -0.266168936 #> 36 0.0542291766 -0.38880757 -0.337797955 -0.55200805 -0.254805728 #> 37 -0.1068113478 -0.38880757 -0.337797955 -0.52936311 2.219532746 #> 38 0.0883892878 -0.38463111 -0.337797955 -0.55200805 0.196881777 #> 39 -0.1507314908 -0.31780775 -0.337797955 -0.20478570 -0.226397709 #> 40 -0.1507314908 -0.27604314 -0.337797955 -0.14439921 0.114498521 #> 41 -0.1385314510 -0.38463111 -0.332163763 0.98029927 -0.269009738 #> 42 -0.0848512763 -0.30945483 -0.072990952 -0.01607790 -0.146855255 #> 43 -0.0360511174 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 44 -0.1434114669 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 45 -0.1019313319 -0.38880757 -0.337797955 -0.46142831 -0.266168936 #> 46 -0.1409714590 -0.38880757 3.262450451 0.53494886 -0.266168936 #> 47 -0.0214110697 -0.38880757 -0.337797955 0.82933303 -0.269009738 #> 48 -0.1312114272 -0.35121943 -0.337797955 2.98060192 -0.266168936 #> 49 -0.1287714193 -0.38880757 2.969472490 -0.52936311 -0.192308086 #> 50 -0.0946113080 -0.38880757 -0.337797955 -0.49162155 -0.269009738 #> 51 -0.1458514749 -0.18833748 -0.337797955 -0.44633168 -0.135492048 #> 52 -0.1458514749 3.57047681 -0.337797955 -0.54445974 0.392897110 #> 53 0.0493491607 -0.38880757 -0.337797955 1.64455071 -0.229238511 #> 54 0.1249894069 -0.38880757 -0.337797955 -0.54445974 -0.149696057 #> 55 -0.1482914828 -0.19251394 -0.337797955 -0.41613843 -0.269009738 #> 56 -0.0311711015 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 57 -0.1507314908 -0.07139659 -0.337797955 -0.43123506 -0.254805728 #> 58 -0.0287310935 -0.37210173 -0.326529572 -0.54445974 -0.269009738 #> 59 -0.1092513557 -0.38880757 -0.337797955 -0.48407324 0.017911256 #> 60 -0.1507314908 -0.11733765 -0.337797955 -0.41613843 -0.269009738 #> 61 -0.1409714590 -0.38880757 -0.337797955 -0.32555869 0.071886493 #> 62 -0.1287714193 -0.28439607 -0.005380653 0.23301639 1.310476131 #> 63 -0.0458111492 -0.38880757 -0.332163763 -0.04627115 -0.007655961 #> 64 -0.1507314908 0.63442520 -0.281456039 0.48965899 -0.226397709 #> 65 -0.1507314908 -0.38880757 -0.337797955 -0.55200805 -0.220716105 #> 66 -0.1409714590 1.92912790 -0.337797955 -0.55200805 -0.090039217 #> 67 -0.1482914828 -0.32198421 -0.337797955 -0.09910934 -0.269009738 #> 68 -0.1507314908 0.04972076 
2.293369503 -0.53691142 -0.269009738 #> 69 -0.1507314908 -0.05469075 -0.337797955 -0.42368675 -0.266168936 #> 70 -0.0653312127 0.55507246 -0.337797955 -0.18968908 1.685461984 #> 71 -0.1068113478 -0.38880757 -0.332163763 0.24056470 -0.260487332 #> 72 -0.1482914828 0.44230803 -0.337797955 -0.40104181 -0.226397709 #> 73 -0.1482914828 -0.38880757 -0.337797955 -0.29536544 -0.217875303 #> 74 -0.1482914828 -0.38880757 -0.337797955 -0.25762388 -0.269009738 #> 75 -0.1458514749 -0.34704297 0.011521922 -0.48407324 -0.257646530 #> 76 -0.0897312922 -0.17998456 -0.337797955 -0.55200805 -0.232079313 #> 77 -0.1409714590 -0.25933730 -0.326529572 -0.46897662 0.032115266 #> 78 -0.1482914828 0.07895598 -0.337797955 -0.55200805 -0.246283323 #> 79 -0.1507314908 -0.29692545 -0.337797955 -0.50671818 -0.269009738 #> 80 0.1591495182 -0.38463111 -0.337797955 -0.55200805 -0.269009738 #> 81 -0.1507314908 -0.01292614 0.203084435 -0.53691142 -0.266168936 #> 82 -0.0287310935 -0.36374881 7.662754058 -0.55200805 -0.269009738 #> 83 -0.1190113875 -0.38045465 -0.337797955 2.54279983 -0.195148888 #> 84 -0.1434114669 0.12489705 -0.337797955 2.80699074 -0.266168936 #> 85 0.9009119332 1.03536539 -0.337797955 -0.52936311 -0.269009738 #> 86 -0.1507314908 -0.19669040 -0.337797955 -0.55200805 -0.269009738 #> 87 -0.1507314908 0.47989617 -0.337797955 0.46701406 -0.240601719 #> 88 -0.1141313716 0.53419016 2.304637886 -0.34820363 -0.192308086 #> 89 -0.1507314908 -0.38880757 -0.337797955 -0.29536544 0.398578714 #> 90 -0.0214110697 -0.38880757 -0.337797955 -0.07646440 -0.266168936 #> 91 -0.1434114669 -0.38880757 -0.332163763 -0.46897662 -0.246283323 #> 92 -0.1482914828 1.78712825 -0.337797955 -0.55200805 -0.169581671 #> 93 -0.1507314908 -0.38880757 -0.337797955 -0.39349350 -0.240601719 #> 94 -0.1482914828 -0.32616067 1.284849214 -0.29536544 -0.158218463 #> 95 -0.0824112683 -0.35121943 -0.337797955 -0.25007557 -0.269009738 #> 96 -0.0580111889 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 97 0.3909502729 -0.38880757 -0.337797955 -0.52936311 -0.266168936 #> 98 -0.1482914828 1.37365868 -0.337797955 -0.03117453 -0.266168936 #> 99 0.0005490018 -0.35539589 -0.337797955 -0.55200805 -0.269009738 #> 100 0.1786695817 -0.38463111 -0.337797955 -0.55200805 8.500545795 #> 101 -0.0946113080 -0.37210173 -0.247650890 -0.01607790 -0.266168936 #> 102 -0.1434114669 -0.38880757 -0.332163763 -0.42368675 -0.263328134 #> 103 -0.1019313319 -0.38880757 -0.337797955 0.73875328 -0.237760917 #> 104 -0.1482914828 0.41724927 1.160897000 -0.55200805 -0.251964926 #> 105 -0.1263314113 -0.38880757 -0.337797955 -0.52936311 -0.118447236 #> 106 0.5324707336 -0.38463111 0.496062396 -0.55200805 -0.269009738 #> 107 -0.1507314908 1.03954186 -0.337797955 0.11224340 -0.172422473 #> 108 -0.1385314510 -0.38880757 -0.337797955 -0.34820363 -0.095720821 #> 109 -0.1214513954 -0.38045465 -0.337797955 0.74630160 -0.269009738 #> 110 -0.1458514749 -0.38463111 -0.337797955 -0.47652493 -0.266168936 #> 111 -0.1507314908 -0.38463111 -0.337797955 -0.03872284 -0.269009738 #> 112 -0.0165310538 -0.17163164 -0.337797955 0.17262989 -0.263328134 #> 113 0.0200690653 -0.38880757 -0.337797955 -0.45387999 -0.200830492 #> 114 -0.1507314908 -0.32198421 -0.337797955 -0.42368675 -0.075835207 #> 115 -0.1507314908 -0.09645535 -0.337797955 -0.38594519 0.120180125 #> 116 0.1323094308 -0.35539589 -0.332163763 0.55759380 -0.206512096 #> 117 -0.1507314908 -0.30945483 1.476411727 -0.49162155 -0.260487332 #> 118 -0.1434114669 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 119 -0.1507314908 -0.38880757 
-0.337797955 0.57269042 -0.269009738 #> 120 -0.1409714590 -0.38045465 -0.332163763 0.88971952 -0.269009738 #> 121 -0.1507314908 -0.38880757 -0.332163763 -0.48407324 -0.269009738 #> 122 -0.1507314908 3.68741770 -0.337797955 -0.55200805 -0.030382377 #> 123 -0.1458514749 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 124 -0.1019313319 -0.10063181 -0.337797955 0.85952627 -0.215034501 #> 125 -0.1287714193 -0.29692545 -0.337797955 0.49720730 -0.217875303 #> 126 -0.1092513557 0.78477778 -0.337797955 -0.10665765 0.228130598 #> 127 -0.1434114669 -0.38880757 -0.337797955 0.17262989 0.151428946 #> 128 -0.1360914431 -0.38045465 -0.332163763 -0.37839688 0.012229652 #> 129 -0.1507314908 -0.38880757 -0.337797955 -0.53691142 0.179836966 #> 130 -0.1482914828 0.61354290 -0.337797955 -0.35575194 1.557625898 #> 131 -0.1409714590 -0.38880757 -0.337797955 1.72003383 -0.234920115 #> 132 -0.1190113875 -0.34286651 -0.332163763 0.27830626 -0.269009738 #> 133 -0.1385314510 0.68454273 6.113351379 0.40662756 -0.146855255 #> 134 -0.1507314908 -0.38880757 -0.337797955 -0.43878337 -0.269009738 #> 135 -0.1336514351 -0.37210173 -0.332163763 -0.53691142 -0.260487332 #> 136 -0.1507314908 0.21260271 -0.337797955 -0.35575194 -0.254805728 #> 137 -0.1360914431 -0.38880757 -0.281456039 -0.55200805 -0.269009738 #> 138 -0.1409714590 1.77042241 -0.332163763 0.11224340 -0.124128840 #> 139 -0.1507314908 0.57595476 0.056595454 -0.52181480 -0.254805728 #> 140 -0.0458111492 0.54254308 -0.337797955 -0.55200805 -0.237760917 #> 141 -0.1507314908 0.12489705 -0.337797955 -0.40104181 -0.192308086 #> 142 -0.1482914828 0.18336749 -0.315261189 -0.55200805 -0.183785680 #> 143 -0.1238914034 -0.36374881 -0.337797955 -0.45387999 -0.243442521 #> 144 -0.1482914828 -0.38880757 1.955318009 -0.24252726 0.441190742 #> 145 -0.1312114272 -0.35957235 -0.337797955 -0.55200805 -0.260487332 #> 146 -0.1507314908 -0.10898473 -0.270187656 -0.55200805 0.784927775 #> 147 -0.0580111889 -0.38880757 -0.332163763 -0.55200805 -0.269009738 #> 148 -0.1507314908 -0.36792527 1.521485259 -0.51426649 -0.001974357 #> 149 0.2201497168 -0.33869005 -0.337797955 0.32359613 -0.269009738 #> 150 -0.0677712207 -0.38880757 -0.337797955 0.21791976 0.509369989 #> 151 -0.1507314908 -0.23845500 -0.337797955 -0.49162155 0.023592860 #> 152 -0.1482914828 -0.38463111 -0.337797955 0.77649484 -0.263328134 #> 153 -0.1482914828 -0.38880757 -0.292724422 -0.06136778 0.162792154 #> 154 -0.1385314510 -0.36374881 -0.337797955 -0.55200805 4.418313433 #> 155 0.2665098677 -0.32198421 -0.337797955 1.95403150 0.091772106 #> 156 -0.1482914828 -0.16745518 -0.337797955 0.35378938 -0.254805728 #> 157 0.4812305668 -0.37210173 -0.332163763 -0.55200805 -0.223556907 #> 158 -0.0824112683 2.04606879 -0.337797955 -0.51426649 0.052000879 #> 159 -0.1263314113 -0.10063181 -0.337797955 -0.53691142 -0.263328134 #> 160 -0.1482914828 -0.38880757 0.203084435 4.20342844 -0.260487332 #> 161 -0.1507314908 -0.38880757 0.974968678 0.32359613 -0.269009738 #> 162 -0.0994913239 -0.38880757 -0.337797955 -0.55200805 -0.263328134 #> 163 -0.1507314908 -0.18416102 -0.337797955 0.35378938 -0.269009738 #> 164 0.1079093513 -0.37627819 -0.163138017 0.90481615 -0.266168936 #> 165 -0.1287714193 -0.37627819 -0.337797955 -0.50671818 -0.237760917 #> 166 0.0347091130 0.50495493 -0.337797955 -0.54445974 5.517703777 #> 167 -0.1507314908 0.04136784 -0.337797955 -0.55200805 -0.269009738 #> 168 -0.1482914828 -0.38463111 -0.337797955 -0.55200805 -0.266168936 #> 169 -0.1482914828 -0.38880757 2.535639740 -0.55200805 -0.240601719 #> 170 0.5861509084 
#> [output truncated: standardized feature table, 200 rows of centered and scaled values per Otu column through Otu00060] #> #> $removed #> character(0) #>"},
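The collapsed output above is a centered-and-scaled (z-scored) feature table. As a rough base-R sketch of that kind of transformation (assuming a hypothetical numeric data frame `features`; not necessarily how this exact output was produced):

# center each column to mean 0 and scale it to unit variance
standardized <- as.data.frame(scale(features, center = TRUE, scale = TRUE))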
{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":null,"dir":"Reference","previous_headings":"","what":"Get feature importance using the permutation method — get_feature_importance","title":"Get feature importance using the permutation method — get_feature_importance","text":"Calculates feature importance using a trained model and test data. Requires the future.apply package.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get feature importance using the permutation method — get_feature_importance","text":"","code":"get_feature_importance( trained_model, test_data, outcome_colname, perf_metric_function, perf_metric_name, class_probs, method, seed = NA, corr_thresh = 1, groups = NULL, nperms = 100, corr_method = \"spearman\" )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get feature importance using the permutation method — get_feature_importance","text":"trained_model Trained model from caret::train(). test_data Held out test data: dataframe of outcome and features. outcome_colname Column name as a string of the outcome variable (default: NULL; the first column is chosen automatically). perf_metric_function Function to calculate the performance metric to be used for cross-validation and test performance. Some functions are provided by caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name The column name from the output of the function provided to perf_metric_function that is to be used as the performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". class_probs Whether to use class probabilities (TRUE for categorical outcomes, FALSE for numeric outcomes). method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, or multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost seed Random seed (default: NA). Your results will only be reproducible if you set a seed. corr_thresh For feature importance, group correlations above or equal to corr_thresh (range 0 to 1; default: 1). groups Vector of feature names to group together during permutation. Each element should be a string with feature names separated by a pipe character (|). If this is NULL (default), correlated features will be grouped together based on corr_thresh. nperms Number of permutations to perform (default: 100). corr_method Correlation method. Options are the same as those supported by stats::cor: spearman, pearson, kendall (default: spearman).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get feature importance using the permutation method — get_feature_importance","text":"Data frame of performance metrics for when each feature (or group of correlated features; names) is permuted (perf_metric), the difference between the actual test performance metric and the permuted performance metric (perf_metric_diff; test minus permuted performance), and the p-value (pvalue: the probability of obtaining the actual performance value under the null hypothesis). Features with larger perf_metric_diff are more important. The performance metric name (perf_metric_name) and seed (seed) are also returned.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Get feature importance using the permutation method — get_feature_importance","text":"For permutation tests, the p-value is the number of permutation statistics that are greater than the test statistic, divided by the number of permutations. In our case, the permutation statistic is the model performance (e.g. AUROC) after randomizing the order of observations for one feature, and the test statistic is the actual performance on the test data. By default we perform 100 permutations per feature; increasing this number increases the precision of the estimate of the null distribution, but also increases runtime. The p-value represents the probability of obtaining the actual performance in the event that the null hypothesis is true, where the null hypothesis is that the feature is not important for model performance. We strongly recommend providing multiple cores to speed up computation time. See our vignette on parallel processing for details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get feature importance using the permutation method — get_feature_importance","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},
# Register a parallel backend before calling `get_feature_importance()` to use multiple cores. doFuture::registerDoFuture() future::plan(future::multicore, workers = 2) # Optionally, you can group features together with a custom grouping feat_imp <- get_feature_importance(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\", groups = c( \"Otu00007\", \"Otu00008\", \"Otu00009\", \"Otu00011\", \"Otu00012\", \"Otu00015\", \"Otu00016\", \"Otu00018\", \"Otu00019\", \"Otu00020\", \"Otu00022\", \"Otu00023\", \"Otu00025\", \"Otu00028\", \"Otu00029\", \"Otu00030\", \"Otu00035\", \"Otu00036\", \"Otu00037\", \"Otu00038\", \"Otu00039\", \"Otu00040\", \"Otu00047\", \"Otu00050\", \"Otu00052\", \"Otu00054\", \"Otu00055\", \"Otu00056\", \"Otu00060\", \"Otu00003|Otu00002|Otu00005|Otu00024|Otu00032|Otu00041|Otu00053\", \"Otu00014|Otu00021|Otu00017|Otu00031|Otu00057\", \"Otu00013|Otu00006\", \"Otu00026|Otu00001|Otu00034|Otu00048\", \"Otu00033|Otu00010\", \"Otu00042|Otu00004\", \"Otu00043|Otu00027|Otu00049\", \"Otu00051|Otu00045\", \"Otu00058|Otu00044\", \"Otu00059|Otu00046\" ) ) # the function can show a progress bar if you have the `progressr` package installed. ## optionally, specify the progress bar format: progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0 )) ## tell progressr to always report progress progressr::handlers(global = TRUE) ## run the function and watch the live progress updates feat_imp <- get_feature_importance(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) # You can specify any correlation method supported by `stats::cor`: feat_imp <- get_feature_importance(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\", corr_method = \"pearson\" ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Get hyperparameter performance metrics — get_hp_performance","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Get hyperparameter performance metrics","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get hyperparameter performance metrics — get_hp_performance","text":"","code":"get_hp_performance(trained_model)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get hyperparameter performance metrics — get_hp_performance","text":"trained_model trained model (e.g. run_ml())","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Named list: dat: Dataframe performance metric group hyperparameters. params: Hyperparameters tuned.
metric: Performance metric used.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get hyperparameter performance metrics — get_hp_performance","text":"","code":"get_hp_performance(otu_mini_bin_results_glmnet$trained_model) #> $dat #> alpha lambda AUC #> 1 0 1e-04 0.6082552 #> 2 0 1e-03 0.6082552 #> 3 0 1e-02 0.6086458 #> 4 0 1e-01 0.6166789 #> 5 0 1e+00 0.6221737 #> 6 0 1e+01 0.6187408 #> #> $params #> [1] \"lambda\" #> #> $metric #> [1] \"AUC\" #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":null,"dir":"Reference","previous_headings":"","what":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"details see vignette hyperparameter tuning.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"","code":"get_hyperparams_list(dataset, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"dataset Dataframe outcome variable columns features. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"Named list hyperparameters.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"","code":"get_hyperparams_list(otu_mini_bin, \"rf\") #> $mtry #> [1] 2 3 6 #> get_hyperparams_list(otu_small, \"rf\") #> $mtry #> [1] 4 8 16 #> get_hyperparams_list(otu_mini_bin, \"rpart2\") #> $maxdepth #> [1] 1 2 4 8 16 30 #> get_hyperparams_list(otu_small, \"rpart2\") #> $maxdepth #> [1] 1 2 4 8 16 30 #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":null,"dir":"Reference","previous_headings":"","what":"Get outcome type. 
— get_outcome_type","title":"Get outcome type. — get_outcome_type","text":"outcome numeric, type continuous. Otherwise, outcome type binary two outcomes multiclass more than two outcomes.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get outcome type. — get_outcome_type","text":"","code":"get_outcome_type(outcomes_vec)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get outcome type. — get_outcome_type","text":"outcomes_vec Vector outcomes.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get outcome type. — get_outcome_type","text":"Outcome type (continuous, binary, multiclass).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get outcome type. — get_outcome_type","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get outcome type. — get_outcome_type","text":"","code":"get_outcome_type(c(1, 2, 1)) #> [1] \"continuous\" get_outcome_type(c(\"a\", \"b\", \"b\")) #> [1] \"binary\" get_outcome_type(c(\"a\", \"b\", \"c\")) #> [1] \"multiclass\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":null,"dir":"Reference","previous_headings":"","what":"Select indices to partition the data into training & testing sets. — get_partition_indices","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Use function get row indices training set.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"","code":"get_partition_indices( outcomes, training_frac = 0.8, groups = NULL, group_partitions = NULL )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"outcomes vector outcomes training_frac Fraction data training set (default: 0.8). Rows dataset randomly selected training set, remaining rows used testing set. Alternatively, provide vector integers, used row indices training set. remaining rows used testing set. groups Vector groups keep together splitting data train test sets. number groups training set larger kfold, groups also kept together cross-validation. Length matches number rows dataset (default: NULL). group_partitions Specify assign groups training testing partitions (default: NULL). groups specifies samples belong group \"A\" belong group \"B\", setting group_partitions = list(train = c(\"A\", \"B\"), test = c(\"B\")) result samples group \"A\" placed training set, samples \"B\" also training set, remaining samples \"B\" testing set. partition sizes close training_frac possible.
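As a concrete sketch of the group_partitions behavior described above (the group labels here are hypothetical; otu_mini_bin only supplies a realistic outcomes vector):

```r
library(mikropml)
outcomes <- otu_mini_bin$dx
groups <- rep(c("A", "B"), length.out = length(outcomes)) # made-up group labels
# Keep all of group "A" in training; group "B" may be split across
# the training and testing sets.
train_inds <- get_partition_indices(outcomes,
  training_frac = 0.8,
  groups = groups,
  group_partitions = list(train = c("A", "B"), test = c("B"))
)
train_data <- otu_mini_bin[train_inds, ]
test_data <- otu_mini_bin[-train_inds, ]
```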
number groups training set larger kfold, groups also kept together cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Vector row indices training set.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"groups NULL, uses createDataPartition. Otherwise, uses create_grouped_data_partition(). Set seed prior calling function like data partitions reproducible (recommended).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"","code":"training_inds <- get_partition_indices(otu_mini_bin$dx) train_data <- otu_mini_bin[training_inds, ] test_data <- otu_mini_bin[-training_inds, ]"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":null,"dir":"Reference","previous_headings":"","what":"Get default performance metric function — get_perf_metric_fn","title":"Get default performance metric function — get_perf_metric_fn","text":"Get default performance metric function","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get default performance metric function — get_perf_metric_fn","text":"","code":"get_perf_metric_fn(outcome_type)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get default performance metric function — get_perf_metric_fn","text":"outcome_type Type outcome (one : \"continuous\",\"binary\",\"multiclass\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get default performance metric function — get_perf_metric_fn","text":"Performance metric function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get default performance metric function — get_perf_metric_fn","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get default performance metric function — get_perf_metric_fn","text":"","code":"get_perf_metric_fn(\"continuous\") #> function (data, lev = NULL, model = NULL) #> { #> if (is.character(data$obs)) #> data$obs <- factor(data$obs, levels = lev) #> postResample(data[, \"pred\"], data[, \"obs\"]) #> 
} #> #> get_perf_metric_fn(\"binary\") #> function (data, lev = NULL, model = NULL) #> { #> if (!all(levels(data[, \"pred\"]) == levels(data[, \"obs\"]))) #> stop(\"levels of observed and predicted data do not match\") #> has_class_probs <- all(lev %in% colnames(data)) #> if (has_class_probs) { #> lloss <- mnLogLoss(data = data, lev = lev, model = model) #> requireNamespaceQuietStop(\"pROC\") #> requireNamespaceQuietStop(\"MLmetrics\") #> prob_stats <- lapply(levels(data[, \"pred\"]), function(x) { #> obs <- ifelse(data[, \"obs\"] == x, 1, 0) #> prob <- data[, x] #> roc_auc <- try(pROC::roc(obs, data[, x], direction = \"<\", #> quiet = TRUE), silent = TRUE) #> roc_auc <- if (inherits(roc_auc, \"try-error\")) #> NA #> else roc_auc$auc #> pr_auc <- try(MLmetrics::PRAUC(y_pred = data[, x], #> y_true = obs), silent = TRUE) #> if (inherits(pr_auc, \"try-error\")) #> pr_auc <- NA #> res <- c(ROC = roc_auc, AUC = pr_auc) #> return(res) #> }) #> prob_stats <- do.call(\"rbind\", prob_stats) #> prob_stats <- colMeans(prob_stats, na.rm = TRUE) #> } #> CM <- confusionMatrix(data[, \"pred\"], data[, \"obs\"], mode = \"everything\") #> if (length(levels(data[, \"pred\"])) == 2) { #> class_stats <- CM$byClass #> } #> else { #> class_stats <- colMeans(CM$byClass) #> names(class_stats) <- paste(\"Mean\", names(class_stats)) #> } #> overall_stats <- if (has_class_probs) #> c(CM$overall, logLoss = as.numeric(lloss), AUC = unname(prob_stats[\"ROC\"]), #> prAUC = unname(prob_stats[\"AUC\"])) #> else CM$overall #> stats <- c(overall_stats, class_stats) #> stats <- stats[!names(stats) %in% c(\"AccuracyNull\", \"AccuracyLower\", #> \"AccuracyUpper\", \"AccuracyPValue\", \"McnemarPValue\", \"Mean Prevalence\", #> \"Mean Detection Prevalence\")] #> names(stats) <- gsub(\"[[:blank:]]+\", \"_\", names(stats)) #> stat_list <- c(\"Accuracy\", \"Kappa\", \"Mean_F1\", \"Mean_Sensitivity\", #> \"Mean_Specificity\", \"Mean_Pos_Pred_Value\", \"Mean_Neg_Pred_Value\", #> \"Mean_Precision\", \"Mean_Recall\", \"Mean_Detection_Rate\", #> \"Mean_Balanced_Accuracy\") #> if (has_class_probs) #> stat_list <- c(\"logLoss\", \"AUC\", \"prAUC\", stat_list) #> if (length(levels(data[, \"pred\"])) == 2) #> stat_list <- gsub(\"^Mean_\", \"\", stat_list) #> stats <- stats[c(stat_list)] #> return(stats) #> } #> #> get_perf_metric_fn(\"multiclass\") #> function (data, lev = NULL, model = NULL) #> { #> if (!all(levels(data[, \"pred\"]) == levels(data[, \"obs\"]))) #> stop(\"levels of observed and predicted data do not match\") #> has_class_probs <- all(lev %in% colnames(data)) #> if (has_class_probs) { #> lloss <- mnLogLoss(data = data, lev = lev, model = model) #> requireNamespaceQuietStop(\"pROC\") #> requireNamespaceQuietStop(\"MLmetrics\") #> prob_stats <- lapply(levels(data[, \"pred\"]), function(x) { #> obs <- ifelse(data[, \"obs\"] == x, 1, 0) #> prob <- data[, x] #> roc_auc <- try(pROC::roc(obs, data[, x], direction = \"<\", #> quiet = TRUE), silent = TRUE) #> roc_auc <- if (inherits(roc_auc, \"try-error\")) #> NA #> else roc_auc$auc #> pr_auc <- try(MLmetrics::PRAUC(y_pred = data[, x], #> y_true = obs), silent = TRUE) #> if (inherits(pr_auc, \"try-error\")) #> pr_auc <- NA #> res <- c(ROC = roc_auc, AUC = pr_auc) #> return(res) #> }) #> prob_stats <- do.call(\"rbind\", prob_stats) #> prob_stats <- colMeans(prob_stats, na.rm = TRUE) #> } #> CM <- confusionMatrix(data[, \"pred\"], data[, \"obs\"], mode = \"everything\") #> if (length(levels(data[, \"pred\"])) == 2) { #> class_stats <- CM$byClass #> } #> else { #> class_stats <- 
colMeans(CM$byClass) #> names(class_stats) <- paste(\"Mean\", names(class_stats)) #> } #> overall_stats <- if (has_class_probs) #> c(CM$overall, logLoss = as.numeric(lloss), AUC = unname(prob_stats[\"ROC\"]), #> prAUC = unname(prob_stats[\"AUC\"])) #> else CM$overall #> stats <- c(overall_stats, class_stats) #> stats <- stats[!names(stats) %in% c(\"AccuracyNull\", \"AccuracyLower\", #> \"AccuracyUpper\", \"AccuracyPValue\", \"McnemarPValue\", \"Mean Prevalence\", #> \"Mean Detection Prevalence\")] #> names(stats) <- gsub(\"[[:blank:]]+\", \"_\", names(stats)) #> stat_list <- c(\"Accuracy\", \"Kappa\", \"Mean_F1\", \"Mean_Sensitivity\", #> \"Mean_Specificity\", \"Mean_Pos_Pred_Value\", \"Mean_Neg_Pred_Value\", #> \"Mean_Precision\", \"Mean_Recall\", \"Mean_Detection_Rate\", #> \"Mean_Balanced_Accuracy\") #> if (has_class_probs) #> stat_list <- c(\"logLoss\", \"AUC\", \"prAUC\", stat_list) #> if (length(levels(data[, \"pred\"])) == 2) #> stat_list <- gsub(\"^Mean_\", \"\", stat_list) #> stats <- stats[c(stat_list)] #> return(stats) #> } #> #> "},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":null,"dir":"Reference","previous_headings":"","what":"Get default performance metric name — get_perf_metric_name","title":"Get default performance metric name — get_perf_metric_name","text":"Get default performance metric name cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get default performance metric name — get_perf_metric_name","text":"","code":"get_perf_metric_name(outcome_type)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get default performance metric name — get_perf_metric_name","text":"outcome_type Type outcome (one : \"continuous\",\"binary\",\"multiclass\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get default performance metric name — get_perf_metric_name","text":"Performance metric name.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get default performance metric name — get_perf_metric_name","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get default performance metric name — get_perf_metric_name","text":"","code":"get_perf_metric_name(\"continuous\") #> [1] \"RMSE\" get_perf_metric_name(\"binary\") #> [1] \"AUC\" get_perf_metric_name(\"multiclass\") #> [1] \"logLoss\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":null,"dir":"Reference","previous_headings":"","what":"Get model performance metrics as a one-row tibble — get_performance_tbl","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"Get model performance metrics one-row tibble","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get model performance metrics as a one-row tibble — 
get_performance_tbl","text":"","code":"get_performance_tbl( trained_model, test_data, outcome_colname, perf_metric_function, perf_metric_name, class_probs, method, seed = NA )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"trained_model Trained model caret::train(). test_data Held test data: dataframe outcome features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes). method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost seed Random seed (default: NA). results reproducible set seed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"one-row tibble columns cv_auroc, column performance metrics test data method, seed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"Kelly Sovacool, sovacool@umich.edu Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"","code":"if (FALSE) { results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) names(results$trained_model$trainingData)[1] <- \"dx\" get_performance_tbl(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":null,"dir":"Reference","previous_headings":"","what":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"Generate tuning grid tuning hyperparameters","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"","code":"get_tuning_grid(hyperparams_list, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Generate the 
tuning grid for tuning hyperparameters — get_tuning_grid","text":"hyperparams_list Named list lists hyperparameters. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"tuning grid.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"","code":"ml_method <- \"glmnet\" hparams_list <- get_hyperparams_list(otu_small, ml_method) get_tuning_grid(hparams_list, ml_method) #> lambda alpha #> 1 1e-04 0 #> 2 1e-03 0 #> 3 1e-02 0 #> 4 1e-01 0 #> 5 1e+00 0 #> 6 1e+01 0"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":null,"dir":"Reference","previous_headings":"","what":"Group correlated features — group_correlated_features","title":"Group correlated features — group_correlated_features","text":"Group correlated features","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Group correlated features — group_correlated_features","text":"","code":"group_correlated_features( features, corr_thresh = 1, group_neg_corr = TRUE, corr_method = \"spearman\" )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Group correlated features — group_correlated_features","text":"features dataframe column feature ML corr_thresh feature importance, group correlations equal corr_thresh (range 0 1; default: 1). group_neg_corr Whether group negatively correlated features together (e.g. c(0,1) c(1,0)). corr_method correlation method. options supported stats::cor: spearman, pearson, kendall. 
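A short sketch of how group_correlated_features() feeds into the permutation importance workflow; the corr_thresh value here is an arbitrary choice for illustration:

```r
library(mikropml)
features <- otu_small[, -1] # all columns except the outcome 'dx'
# Each element is a pipe-separated group of correlated features, the same
# format accepted by the `groups` argument of get_feature_importance().
grps <- group_correlated_features(features, corr_thresh = 0.9)
head(grps)
```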
(default: spearman)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Group correlated features — group_correlated_features","text":"vector element group correlated features separated pipes (|)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Group correlated features — group_correlated_features","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Group correlated features — group_correlated_features","text":"","code":"features <- data.frame( a = 1:3, b = 2:4, c = c(1, 0, 1), d = (5:7), e = c(5, 1, 4), f = c(-1, 0, -1) ) group_correlated_features(features) #> [1] \"a|b|d\" \"c|f\" \"e\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":null,"dir":"Reference","previous_headings":"","what":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"mikropml implements supervised machine learning pipelines using regression, support vector machines, decision trees, random forest, gradient-boosted trees. main functions preprocess_data() process data prior running machine learning, run_ml() run machine learning.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":"authors","dir":"Reference","previous_headings":"","what":"Authors","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"Begüm D. Topçuoğlu (ORCID) Zena Lapp (ORCID) Kelly L. Sovacool (ORCID) Evan Snitkin (ORCID) Jenna Wiens (ORCID) Patrick D. Schloss (ORCID)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":"see-vignettes","dir":"Reference","previous_headings":"","what":"See vignettes","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"Introduction Preprocessing data Hyperparameter tuning Parallel processing mikropml paper","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":null,"dir":"Reference","previous_headings":"","what":"Mini OTU abundance dataset — otu_mini_bin","title":"Mini OTU abundance dataset — otu_mini_bin","text":"dataset containing relative abundances OTUs human stool samples binary outcome, dx. subset otu_small.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Mini OTU abundance dataset — otu_mini_bin","text":"","code":"otu_mini_bin"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Mini OTU abundance dataset — otu_mini_bin","text":"data frame dx column diagnosis: healthy cancerous (colorectal).
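To ground the package overview above, a minimal end-to-end sketch (the method and seed are arbitrary choices for illustration):

```r
library(mikropml)
# Two-step pipeline: preprocess the data, then train and evaluate a model.
preproc <- preprocess_data(otu_mini_bin, outcome_colname = "dx")
results <- run_ml(preproc$dat_transformed, "glmnet", seed = 2019)
results$performance
```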
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"Results running pipeline L2 logistic regression otu_mini_bin feature importance grouping","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"","code":"otu_mini_bin_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"Results running pipeline random forest otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"","code":"otu_mini_bin_results_rf"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"Results running pipeline rpart2 otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"","code":"otu_mini_bin_results_rpart2"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"object class list length 
4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"Results running pipeline svmRadial otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"","code":"otu_mini_bin_results_svmRadial"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"Results running pipeline xbgTree otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"","code":"otu_mini_bin_results_xgbTree"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"Results running pipeline glmnet otu_mini_bin Otu00001 outcome","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"","code":"otu_mini_cont_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_bin with 
Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"Results running pipeline glmnet otu_mini_bin Otu00001 outcome column, using custom train control scheme does not perform cross-validation","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"","code":"otu_mini_cont_results_nocv"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":null,"dir":"Reference","previous_headings":"","what":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"Cross validation train_data_mini grouped features.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"","code":"otu_mini_cv"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"object class list length 27.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":null,"dir":"Reference","previous_headings":"","what":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"dataset containing relative abundances OTUs human stool samples","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"","code":"otu_mini_multi"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"data frame dx column colorectal cancer diagnosis: adenoma, carcinoma, normal.
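A hedged sketch of a multiclass run on this dataset; kfold and seed are arbitrary choices to keep the example fast:

```r
library(mikropml)
# Multiclass outcomes use logLoss as the default performance metric.
results_multi <- run_ml(otu_mini_multi, "glmnet",
  outcome_colname = "dx", kfold = 2, seed = 2019
)
results_multi$performance
```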
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":null,"dir":"Reference","previous_headings":"","what":"Groups for otu_mini_multi — otu_mini_multi_group","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"Groups otu_mini_multi","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"","code":"otu_mini_multi_group"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"object class character length 490.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"Results running pipeline glmnet otu_mini_multi multiclass outcomes","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"","code":"otu_mini_multi_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":null,"dir":"Reference","previous_headings":"","what":"Small OTU abundance dataset — otu_small","title":"Small OTU abundance dataset — otu_small","text":"dataset containing relative abundances 60 OTUs 60 human stool samples. subset data provided extdata/otu_large.csv, used Topçuoğlu et al. 2020.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Small OTU abundance dataset — otu_small","text":"","code":"otu_small"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Small OTU abundance dataset — otu_small","text":"data frame 60 rows 61 variables. dx column diagnosis: healthy cancerous (colorectal).
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":null,"dir":"Reference","previous_headings":"","what":"Calculate a permuted p-value comparing two models — permute_p_value","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"Calculate permuted p-value comparing two models","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"","code":"permute_p_value( merged_data, metric, group_name, group_1, group_2, nperm = 10000 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"merged_data concatenated performance data run_ml metric metric compare, must numeric group_name column group variables compare group_1 name one group compare group_2 name group compare nperm number permutations (default: 10000)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"numeric p-value comparing two models","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Courtney R Armour, armourc@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"","code":"df <- dplyr::tibble( model = c(\"rf\", \"rf\", \"glmnet\", \"glmnet\", \"svmRadial\", \"svmRadial\"), AUC = c(.2, 0.3, 0.8, 0.9, 0.85, 0.95) ) set.seed(123) permute_p_value(df, \"AUC\", \"model\", \"rf\", \"glmnet\", nperm = 100) #> [1] 0.3663366"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot hyperparameter performance metrics — plot_hp_performance","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"Plot hyperparameter performance metrics","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"","code":"plot_hp_performance(dat, param_col, metric_col)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"dat dataframe hyperparameters performance metric (e.g. get_hp_performance() combine_hp_performance()) param_col hyperparameter plotted. must column dat. metric_col performance metric.
must column dat.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"ggplot hyperparameter performance.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"","code":"# plot for a single `run_ml()` call hp_metrics <- get_hp_performance(otu_mini_bin_results_glmnet$trained_model) hp_metrics #> $dat #> alpha lambda AUC #> 1 0 1e-04 0.6082552 #> 2 0 1e-03 0.6082552 #> 3 0 1e-02 0.6086458 #> 4 0 1e-01 0.6166789 #> 5 0 1e+00 0.6221737 #> 6 0 1e+01 0.6187408 #> #> $params #> [1] \"lambda\" #> #> $metric #> [1] \"AUC\" #> plot_hp_performance(hp_metrics$dat, lambda, AUC) if (FALSE) { # plot for multiple `run_ml()` calls results <- lapply(seq(100, 102), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) models <- lapply(results, function(x) x$trained_model) hp_metrics <- combine_hp_performance(models) plot_hp_performance(hp_metrics$dat, lambda, AUC) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"ggplot2 required use function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"","code":"plot_model_performance(performance_df)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"performance_df dataframe performance results multiple calls run_ml()","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"ggplot2 plot performance.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"Begüm Topçuoglu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot performance metrics for multiple ML runs 
with different parameters — plot_model_performance","text":"","code":"if (FALSE) { # call `run_ml()` multiple times with different seeds results_lst <- lapply(seq(100, 104), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) # extract and combine the performance results perf_df <- lapply(results_lst, function(result) { result[[\"performance\"]] }) %>% dplyr::bind_rows() # plot the performance results p <- plot_model_performance(perf_df) # call `run_ml()` with different ML methods param_grid <- expand.grid( seeds = seq(100, 104), methods = c(\"glmnet\", \"rf\") ) results_mtx <- mapply( function(seed, method) { run_ml(otu_mini_bin, method, seed = seed, kfold = 2) }, param_grid$seeds, param_grid$methods ) # extract and combine the performance results perf_df2 <- dplyr::bind_rows(results_mtx[\"performance\", ]) # plot the performance results p <- plot_model_performance(perf_df2) # you can continue adding layers to customize the plot p + theme_classic() + scale_color_brewer(palette = \"Dark2\") + coord_flip() }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Preprocess data prior to running machine learning — preprocess_data","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Function preprocess data input run_ml().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Preprocess data prior to running machine learning — preprocess_data","text":"","code":"preprocess_data( dataset, outcome_colname, method = c(\"center\", \"scale\"), remove_var = \"nzv\", collapse_corr_feats = TRUE, to_numeric = TRUE, group_neg_corr = TRUE, prefilter_threshold = 1 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Preprocess data prior to running machine learning — preprocess_data","text":"dataset Dataframe outcome variable columns features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). method Methods preprocess data, described caret::preProcess() (default: c(\"center\",\"scale\"), use NULL normalization). remove_var Whether remove variables near-zero variance ('nzv'; default), zero variance ('zv'), none (NULL). collapse_corr_feats Whether keep one perfectly correlated features. to_numeric Whether change features numeric possible. group_neg_corr Whether group negatively correlated features together (e.g. c(0,1) c(1,0)). prefilter_threshold Remove features non-zero & non-NA values N rows fewer (default: 1). Set -1 keep columns step. step also skipped to_numeric set FALSE.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Named list including: dat_transformed: Preprocessed data. grp_feats: features grouped together, named list features corresponding group. removed_feats: features removed preprocessing (e.g. zero variance near-zero variance features). 
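A short sketch of inspecting each element of the named list returned by preprocess_data(); the calls are shown without output, which depends on your data:

```r
library(mikropml)
preproc <- preprocess_data(otu_small, "dx")
str(preproc$dat_transformed) # preprocessed outcome column plus features
preproc$grp_feats            # NULL unless perfectly correlated features were collapsed
preproc$removed_feats        # features dropped, e.g. zero or near-zero variance
```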
progressr package installed, progress bar time elapsed estimated time completion can displayed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"more-details","dir":"Reference","previous_headings":"","what":"More details","title":"Preprocess data prior to running machine learning — preprocess_data","text":"See preprocessing vignette details. Note values outcome_colname contain spaces, converted underscores compatibility caret.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Preprocess data prior to running machine learning — preprocess_data","text":"","code":"preprocess_data(mikropml::otu_small, \"dx\") #> Using 'dx' as the outcome column. #> $dat_transformed #> # A tibble: 200 × 61 #> dx Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 Otu00006 Otu00…¹ Otu00008 #> #> 1 normal -0.420 -0.219 -0.174 -0.591 -0.0488 -0.167 -0.569 -0.0624 #> 2 normal -0.105 1.75 -0.718 0.0381 1.54 -0.573 -0.643 -0.132 #> 3 normal -0.708 0.696 1.43 0.604 -0.265 -0.0364 -0.612 -0.207 #> 4 normal -0.494 -0.665 2.02 -0.593 -0.676 -0.586 -0.552 -0.470 #> 5 normal 1.11 -0.395 -0.754 -0.586 -0.754 2.73 0.191 -0.676 #> 6 normal -0.685 0.614 -0.174 -0.584 0.376 0.804 -0.337 -0.00608 #> 7 cancer -0.770 -0.496 -0.318 0.159 -0.658 2.20 -0.717 0.0636 #> 8 normal -0.424 -0.478 -0.397 -0.556 -0.391 -0.0620 0.376 -0.0222 #> 9 normal -0.556 1.14 1.62 -0.352 -0.275 -0.465 -0.804 0.294 #> 10 cancer 1.46 -0.451 -0.694 -0.0567 -0.706 0.689 -0.370 1.59 #> # … with 190 more rows, 52 more variables: Otu00009 , Otu00010 , #> # Otu00011 , Otu00012 , Otu00013 , Otu00014 , #> # Otu00015 , Otu00016 , Otu00017 , Otu00018 , #> # Otu00019 , Otu00020 , Otu00021 , Otu00022 , #> # Otu00023 , Otu00024 , Otu00025 , Otu00026 , #> # Otu00027 , Otu00028 , Otu00029 , Otu00030 , #> # Otu00031 , Otu00032 , Otu00033 , Otu00034 , … #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0) #> # the function can show a progress bar if you have the progressr package installed ## optionally, specify the progress bar format progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0 )) ## tell progressr to always report progress if (FALSE) { progressr::handlers(global = TRUE) ## run the function and watch the live progress updates dat_preproc <- preprocess_data(mikropml::otu_small, \"dx\") }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":null,"dir":"Reference","previous_headings":"","what":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Randomize feature order eliminate position-dependent effects","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Randomize feature order to eliminate any position-dependent effects — 
randomize_feature_order","text":"","code":"randomize_feature_order(dataset, outcome_colname)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"dataset Dataframe outcome variable columns features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Dataset feature order randomized.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Nick Lesniak, nlesniak@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"","code":"dat <- data.frame( outcome = c(\"1\", \"2\", \"3\"), a = 4:6, b = 7:9, c = 10:12, d = 13:15 ) randomize_feature_order(dat, \"outcome\") #> outcome c b a d #> 1 1 10 7 4 13 #> 2 2 11 8 5 14 #> 3 3 12 9 6 15"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/reexports.html","id":null,"dir":"Reference","previous_headings":"","what":"dplyr pipe — reexports","title":"dplyr pipe — reexports","text":"objects imported packages. Follow links see documentation. caret contr.ltfr dplyr %>% rlang :=, !!, .data","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":null,"dir":"Reference","previous_headings":"","what":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"Removes columns non-zero & non-NA values threshold row(s) fewer.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"","code":"remove_singleton_columns(dat, threshold = 1)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"dat dataframe threshold Number rows. column non-zero & non-NA values threshold row(s) fewer, removed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Remove columns appearing in only threshold row(s) or fewer. 
{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":null,"dir":"Reference","previous_headings":"","what":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"Removes columns which only have non-zero & non-NA values in threshold row(s) or fewer.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"","code":"remove_singleton_columns(dat, threshold = 1)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"dat: A dataframe. threshold: Number of rows. If a column only has non-zero & non-NA values in threshold row(s) or fewer, it will be removed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"A dataframe without the singleton columns.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"Kelly Sovacool, sovacool@umich.edu Courtney Armour","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"","code":"remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, 0), c = 4:6)) #> $dat #> a c #> 1 1 4 #> 2 2 5 #> 3 3 6 #> #> $removed_feats #> [1] \"b\" #> remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, 0), c = 4:6), threshold = 0) #> $dat #> a b c #> 1 1 0 4 #> 2 2 1 5 #> 3 3 0 6 #> #> $removed_feats #> character(0) #> remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, NA), c = 4:6)) #> $dat #> a c #> 1 1 4 #> 2 2 5 #> 3 3 6 #> #> $removed_feats #> [1] \"b\" #> remove_singleton_columns(data.frame(a = 1:3, b = c(1, 1, 1), c = 4:6)) #> $dat #> a b c #> 1 1 1 4 #> 2 2 1 5 #> 3 3 1 6 #> #> $removed_feats #> character(0) #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":null,"dir":"Reference","previous_headings":"","what":"Replace spaces in all elements of a character vector with underscores — replace_spaces","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"Replace spaces in all elements of a character vector with underscores.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"","code":"replace_spaces(x, new_char = \"_\")"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"x: A character vector. new_char: The character to replace spaces with (default: _).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"A character vector with the spaces replaced by new_char.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"","code":"dat <- data.frame( dx = c(\"outcome 1\", \"outcome 2\", \"outcome 1\"), a = 1:3, b = c(5, 7, 1) ) dat$dx <- replace_spaces(dat$dx) dat #> dx a b #> 1 outcome_1 1 5 #> 2 outcome_2 2 7 #> 3 outcome_1 3 1"},
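Both of these helpers are applied for you by preprocess_data() (see the changelog below), but here is a sketch of combining them manually, reusing the toy data from the examples above:

```r
# outcome labels: spaces become underscores for caret compatibility
dx <- replace_spaces(c("outcome 1", "outcome 2", "outcome 1"))
# features: drop column b, which is non-zero in only one row
feats <- remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, 0)), threshold = 1)
dat_clean <- cbind(dx = dx, feats$dat)
```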
1"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":null,"dir":"Reference","previous_headings":"","what":"Run the machine learning pipeline — run_ml","title":"Run the machine learning pipeline — run_ml","text":"function runs machine learning (ML), evaluates best model, optionally calculates feature importance using framework outlined Topçuoğlu et al. 2020 (doi:10.1128/mBio.00434-20 ). Required inputs dataframe outcome variable columns features, well ML method. See vignette('introduction') details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Run the machine learning pipeline — run_ml","text":"","code":"run_ml( dataset, method, outcome_colname = NULL, hyperparameters = NULL, find_feature_importance = FALSE, calculate_performance = TRUE, kfold = 5, cv_times = 100, cross_val = NULL, training_frac = 0.8, perf_metric_function = NULL, perf_metric_name = NULL, groups = NULL, group_partitions = NULL, corr_thresh = 1, seed = NA, ... )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Run the machine learning pipeline — run_ml","text":"dataset Dataframe outcome variable columns features. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). hyperparameters Dataframe hyperparameters (default NULL; sensible defaults chosen automatically). find_feature_importance Run permutation importance (default: FALSE). TRUE recommended like identify features important predicting outcome, resource-intensive. calculate_performance Whether calculate performance metrics (default: TRUE). might choose skip perform cross-validation model training. kfold Fold number k-fold cross-validation (default: 5). cv_times Number cross-validation partitions create (default: 100). cross_val custom cross-validation scheme caret::trainControl() (default: NULL, uses kfold cross validation repeated cv_times). kfold cv_times ignored user provides custom cross-validation scheme. See caret::trainControl() docs information use . training_frac Fraction data training set (default: 0.8). Rows dataset randomly selected training set, remaining rows used testing set. Alternatively, provide vector integers, used row indices training set. remaining rows used testing set. perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". groups Vector groups keep together splitting data train test sets. number groups training set larger kfold, groups also kept together cross-validation. Length matches number rows dataset (default: NULL). group_partitions Specify assign groups training testing partitions (default: NULL). 
group_partitions: Specify how to assign groups to the training and testing partitions (default: NULL; see the sketch after this entry). For example, if groups specifies that some samples belong to group \"A\" and others belong to group \"B\", then setting group_partitions = list(train = c(\"A\", \"B\"), test = c(\"B\")) will result in all samples from group \"A\" being placed in the training set, some samples from \"B\" also in the training set, and the remaining samples from \"B\" in the testing set. The partition sizes will be as close to training_frac as possible. If the number of groups in the training set is larger than kfold, the groups will also be kept together for cross-validation. corr_thresh: For feature importance, group correlations above or equal to corr_thresh (range 0 to 1; default: 1). seed: Random seed (default: NA). Your results will only be reproducible if you set a seed. ...: All additional arguments are passed on to caret::train(), such as case weights via the weights argument or ntree for rf models. See the caret::train() docs for details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Run the machine learning pipeline — run_ml","text":"Named list with results: trained_model: output of caret::train(), including the best model. test_data: the part of the data that was used for testing. performance: dataframe of performance metrics. The first column is the cross-validation performance metric, and the last two columns are the ML method used and the seed (if one was set), respectively. All other columns are performance metrics calculated on the test data. This contains only one row, so you can easily combine performance dataframes from multiple calls to run_ml() (see vignette(\"parallel\")). feature_importance: if feature importances were calculated, a dataframe where each row is a feature or correlated group. The columns are the performance metric of the permuted data, the difference between the true performance metric and the performance metric of the permuted data (true - permuted), the feature name, the ML method, the performance metric name, and the seed (if provided). For AUC and RMSE, the higher the perf_metric_diff, the more important that feature is for predicting the outcome. For log loss, the lower the perf_metric_diff, the more important that feature is for predicting the outcome.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"more-details","dir":"Reference","previous_headings":"","what":"More details","title":"Run the machine learning pipeline — run_ml","text":"For more details, please see the vignettes.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Run the machine learning pipeline — run_ml","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Run the machine learning pipeline — run_ml","text":"","code":"if (FALSE) { # regression run_ml(otu_small, \"glmnet\", seed = 2019 ) # random forest w/ feature importance run_ml(otu_small, \"rf\", outcome_colname = \"dx\", find_feature_importance = TRUE ) # custom cross validation & hyperparameters run_ml(otu_mini_bin[, 2:11], \"glmnet\", outcome_colname = \"Otu00001\", seed = 2019, hyperparameters = list(lambda = c(1e-04), alpha = 0), cross_val = caret::trainControl(method = \"none\"), calculate_performance = FALSE ) }"},
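A sketch of the groups argument described above; the grouping vector here is hypothetical and only needs one entry per row of the dataset:

```r
if (FALSE) {
  # keep samples from the same (hypothetical) subject together when
  # splitting into train and test sets
  subjects <- rep(c("subj1", "subj2", "subj3", "subj4"),
    length.out = nrow(otu_mini_bin)
  )
  result_grouped <- run_ml(otu_mini_bin, "glmnet",
    groups = subjects,
    kfold = 2,
    seed = 2019
  )
}
```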
tidy_perf_data","text":"","code":"tidy_perf_data(performance_df)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Tidy the performance dataframe — tidy_perf_data","text":"performance_df dataframe performance results multiple calls run_ml()","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Tidy the performance dataframe — tidy_perf_data","text":"Tidy dataframe model performance metrics.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Tidy the performance dataframe — tidy_perf_data","text":"Begüm Topçuoglu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Tidy the performance dataframe — tidy_perf_data","text":"","code":"if (FALSE) { # call `run_ml()` multiple times with different seeds results_lst <- lapply(seq(100, 104), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) # extract and combine the performance results perf_df <- lapply(results_lst, function(result) { result[[\"performance\"]] }) %>% dplyr::bind_rows() # make it pretty! tidy_perf_data(perf_df) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":null,"dir":"Reference","previous_headings":"","what":"Train model using caret::train(). — train_model","title":"Train model using caret::train(). — train_model","text":"Train model using caret::train().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Train model using caret::train(). — train_model","text":"","code":"train_model( train_data, outcome_colname, method, cv, perf_metric_name, tune_grid, ... )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Train model using caret::train(). — train_model","text":"train_data Training data. Expected subset full dataset. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost cv Cross-validation caret scheme define_cv(). perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". tune_grid Tuning grid get_tuning_grid().#' ... additional arguments passed caret::train(), case weights via weights argument ntree rf models. See caret::train() docs details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Train model using caret::train(). 
— train_model","text":"Trained model caret::train().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Train model using caret::train(). — train_model","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Train model using caret::train(). — train_model","text":"","code":"if (FALSE) { training_data <- otu_mini_bin_results_glmnet$trained_model$trainingData %>% dplyr::rename(dx = .outcome) method <- \"rf\" hyperparameters <- get_hyperparams_list(otu_mini_bin, method) cross_val <- define_cv(training_data, \"dx\", hyperparameters, perf_metric_function = caret::multiClassSummary, class_probs = TRUE, cv_times = 2 ) tune_grid <- get_tuning_grid(hyperparameters, method) rf_model <- train_model( training_data, \"dx\", method, cross_val, \"AUC\", tune_grid, ntree = 1000 ) rf_model$results %>% dplyr::select(mtry, AUC, prAUC) }"},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-140","dir":"Changelog","previous_headings":"","what":"mikropml 1.4.0","title":"mikropml 1.4.0","text":"CRAN release: 2022-10-16 Users can now pass model-specific arguments (e.g. weights) caret::train(), allowing greater flexibility. Improved tests (#298, #300, #303 #kelly-sovacool) Minor documentation improvements.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-130","dir":"Changelog","previous_headings":"","what":"mikropml 1.3.0","title":"mikropml 1.3.0","text":"CRAN release: 2022-05-20 mikropml now requires R version 4.1.0 greater due update randomForest package (#292). New function compare_models() compares performance two models permutation test (#295, @courtneyarmour). Fixed bug cv_times affect reported repeats cross-validation (#291, @kelly-sovacool). Made minor documentation improvements (#293, @kelly-sovacool)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-122","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.2","title":"mikropml 1.2.2","text":"CRAN release: 2022-02-03 minor patch fixes test failure platforms long doubles. actual package code remains unchanged.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-121","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.1","title":"mikropml 1.2.1","text":"CRAN release: 2022-01-30 using groups parameter, groups kept together cross-validation partitions kfold <= number groups training set. Previously, error thrown condition met. Now, enough groups training set groups kept together CV, groups allowed split across CV partitions. Report p-values permutation feature importance (#288, @kelly-sovacool).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-120","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.0","title":"mikropml 1.2.0","text":"CRAN release: 2021-11-10 Also added new parameter calculate_performance, controls whether performance metrics calculated (default: TRUE). Users may wish skip performance calculations training models cross-validation. New parameter group_partitions added run_ml() allows users control groups go partition train/test split (#281, @kelly-sovacool). 
{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-130","dir":"Changelog","previous_headings":"","what":"mikropml 1.3.0","title":"mikropml 1.3.0","text":"CRAN release: 2022-05-20. mikropml now requires R version 4.1.0 or greater due to an update in the randomForest package (#292). New function compare_models() compares the performance of two models with a permutation test (#295, @courtneyarmour). Fixed a bug where cv_times did not affect the reported number of repeats for cross-validation (#291, @kelly-sovacool). Made minor documentation improvements (#293, @kelly-sovacool).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-122","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.2","title":"mikropml 1.2.2","text":"CRAN release: 2022-02-03. This minor patch fixes a test failure on platforms with long doubles. The actual package code remains unchanged.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-121","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.1","title":"mikropml 1.2.1","text":"CRAN release: 2022-01-30. When using the groups parameter, groups are kept together in cross-validation partitions when kfold is less than or equal to the number of groups in the training set. Previously, an error was thrown if this condition was not met. Now, if there are not enough groups in the training set for the groups to be kept together during CV, groups are allowed to be split up across CV partitions. Report p-values for permutation feature importance (#288, @kelly-sovacool).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-120","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.0","title":"mikropml 1.2.0","text":"CRAN release: 2021-11-10. Added a new parameter calculate_performance, which controls whether performance metrics are calculated (default: TRUE); users may wish to skip performance calculations when training models without cross-validation. A new parameter group_partitions was added to run_ml() that allows users to control which groups should go to which partition of the train/test split (#281, @kelly-sovacool). By default, training_frac is a fraction between 0 and 1 that specifies how much of the dataset should be used in the training fraction of the train/test split, but users can instead give training_frac a vector of indices that correspond to which rows of the dataset should go in the training fraction; this gives users more direct control over exactly which observations are in the training fraction if desired.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-111","dir":"Changelog","previous_headings":"","what":"mikropml 1.1.1","title":"mikropml 1.1.1","text":"CRAN release: 2021-09-14. group_correlated_features() is now a user-facing function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-110","dir":"Changelog","previous_headings":"","what":"mikropml 1.1.0","title":"mikropml 1.1.0","text":"CRAN release: 2021-08-10. The default correlation method is still \"spearman\", but now you can use other methods supported by stats::cor with the corr_method parameter: get_feature_importance(corr_method = \"pearson\"). There are now video tutorials covering mikropml and other skills related to machine learning, created by @pschloss (#270). Fixed a bug where preprocess_data() converted the outcome column to a character vector (#273, @kelly-sovacool, @ecmaggioncalda).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-100","dir":"Changelog","previous_headings":"","what":"mikropml 1.0.0","title":"mikropml 1.0.0","text":"CRAN release: 2021-05-13. mikropml now has a logo created by @NLesniak! Made documentation improvements (#238, #231 @kelly-sovacool; #256 @BTopcuoglu). Remove features which appear in N = prefilter_threshold or fewer rows of the data; created the function remove_singleton_columns(), called by preprocess_data(), to carry this out. Users can provide custom groups of features to permute together during permutation importance; groups is NULL by default, in which case correlated features above corr_thresh are grouped together. preprocess_data() now replaces spaces in the outcome column with underscores (#247, @kelly-sovacool, @JonnyTran). Clarified in the intro vignette that we do not support multi-label outcomes (#254, @zenalapp). Optional progress bar for preprocess_data() and get_feature_importance() using the progressr package (#257, @kelly-sovacool, @JonnyTran, @FedericoComoglio). The mikropml paper is soon to be published in JOSS!","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-002","dir":"Changelog","previous_headings":"","what":"mikropml 0.0.2","title":"mikropml 0.0.2","text":"CRAN release: 2020-12-03. Fixed a test failure on Solaris. Fixed multiple test failures with R 3.6.2 due to stringsAsFactors behavior. Made minor documentation improvements. Moved rpart from Suggests to Imports for consistency with the other packages used for model training.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-001","dir":"Changelog","previous_headings":"","what":"mikropml 0.0.1","title":"mikropml 0.0.1","text":"CRAN release: 2020-11-23. This is the first release version of mikropml! 🎉 Added a NEWS.md file to track changes to the package. Functions: run_ml(), preprocess_data(), plot_model_performance(), plot_hp_performance(). ML methods: glmnet (logistic and linear regression), rf (random forest), rpart2 (decision trees), svmRadial (support vector machines), xgbTree (gradient-boosted trees). Vignettes: Introduction, Preprocess data, Hyperparameter tuning, Parallel processing, and the mikropml paper.","code":""}]