diff --git a/README.md b/README.md
index c9d4dda9..ff4e4e60 100644
--- a/README.md
+++ b/README.md
@@ -54,9 +54,9 @@ mamba install -c conda-forge r-mikropml
 
 ### Dependencies
 
-- Imports: caret, dplyr, e1071, glmnet, kernlab, MLmetrics,
+ - Imports: caret, dplyr, e1071, glmnet, kernlab, MLmetrics,
   randomForest, rlang, rpart, stats, utils, xgboost
-- Suggests: doFuture, foreach, future, future.apply, ggplot2, knitr,
+ - Suggests: doFuture, foreach, future, future.apply, ggplot2, knitr,
   progress, progressr, purrr, rmarkdown, testthat, tidyr
 
 ## Usage
@@ -107,29 +107,35 @@ license](https://creativecommons.org/licenses/by/4.0/).
 
 To cite mikropml in publications, use:
 
+>
+> > Topçuoğlu BD, Lapp Z, Sovacool KL, Snitkin E, Wiens J, Schloss PD
+> > (2021). “mikropml: User-Friendly R Package for Supervised Machine
+> > Learning Pipelines.” Journal of Open Source Software,
+> > 6(61), 3073.
+> > doi:10.21105/joss.03073,
+> > https://joss.theoj.org/papers/10.21105/joss.03073.
+> >
 
 A BibTeX entry for LaTeX users is:
 
- @Article{,
- title = {{mikropml}: User-Friendly R Package for Supervised Machine Learning Pipelines},
- author = {Begüm D. Topçuoğlu and Zena Lapp and Kelly L. Sovacool and Evan Snitkin and Jenna Wiens and Patrick D. Schloss},
- journal = {Journal of Open Source Software},
- year = {2021},
- month = {May},
- volume = {6},
- number = {61},
- pages = {3073},
- doi = {10.21105/joss.03073},
- url = {https://joss.theoj.org/papers/10.21105/joss.03073},
- }
+```
+ @Article{,
+ title = {{mikropml}: User-Friendly R Package for Supervised Machine Learning Pipelines},
+ author = {Begüm D. Topçuoğlu and Zena Lapp and Kelly L. Sovacool and Evan Snitkin and Jenna Wiens and Patrick D. Schloss},
+ journal = {Journal of Open Source Software},
+ year = {2021},
+ month = {May},
+ volume = {6},
+ number = {61},
+ pages = {3073},
+ doi = {10.21105/joss.03073},
+ url = {https://joss.theoj.org/papers/10.21105/joss.03073},
+}
+```
 
 ## Why the name?
 
@@ -138,4 +144,4 @@ This package was originally implemented as a machine learning pipeline
 for microbiome-based classification problems (see [Topçuoğlu *et al.*
 2020](https://doi.org/10.1128/mBio.00434-20)). We realized that these
 methods are applicable in many other fields too, but stuck with the name
-because we like it!
+because we like it\!
diff --git a/docs/dev/CODE_OF_CONDUCT.html b/docs/dev/CODE_OF_CONDUCT.html
index 388aa4f6..0428e050 100644
--- a/docs/dev/CODE_OF_CONDUCT.html
+++ b/docs/dev/CODE_OF_CONDUCT.html
@@ -1,5 +1,5 @@
-Contributor Covenant Code of Conduct • mikropml
+Contributor Covenant Code of Conduct • mikropml
@@ -134,7 +134,7 @@

Attribution -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/CONTRIBUTING.html b/docs/dev/CONTRIBUTING.html index 76ecfd5f..9e234d66 100644 --- a/docs/dev/CONTRIBUTING.html +++ b/docs/dev/CONTRIBUTING.html @@ -1,5 +1,5 @@ -Contributing to mikropml • mikropmlContributing to mikropml • mikropml @@ -96,7 +96,7 @@

Code of Conduct diff --git a/docs/dev/LICENSE-text.html b/docs/dev/LICENSE-text.html index 428a8090..8fbaa22b 100644 --- a/docs/dev/LICENSE-text.html +++ b/docs/dev/LICENSE-text.html @@ -1,5 +1,5 @@ -License • mikropmlLicense • mikropml @@ -69,7 +69,7 @@

diff --git a/docs/dev/LICENSE.html b/docs/dev/LICENSE.html index ac82e5d7..2924c279 100644 --- a/docs/dev/LICENSE.html +++ b/docs/dev/LICENSE.html @@ -1,5 +1,5 @@ -MIT License • mikropmlMIT License • mikropml @@ -72,7 +72,7 @@ diff --git a/docs/dev/SUPPORT.html b/docs/dev/SUPPORT.html index 43c7a1ca..5b2b2ef0 100644 --- a/docs/dev/SUPPORT.html +++ b/docs/dev/SUPPORT.html @@ -1,5 +1,5 @@ -Getting help with mikropml • mikropmlGetting help with mikropml • mikropml @@ -86,7 +86,7 @@

What happens next? -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/articles/index.html b/docs/dev/articles/index.html index 39f12551..12070d3a 100644 --- a/docs/dev/articles/index.html +++ b/docs/dev/articles/index.html @@ -1,5 +1,5 @@ -Articles • mikropmlArticles • mikropml @@ -85,7 +85,7 @@

Vignettes

diff --git a/docs/dev/articles/introduction.html b/docs/dev/articles/introduction.html index 6b265f4f..775e6e09 100644 --- a/docs/dev/articles/introduction.html +++ b/docs/dev/articles/introduction.html @@ -14,8 +14,8 @@ - - + + @@ -219,12 +219,12 @@

The simplest way to run_ml()
 results$performance
 #> # A tibble: 1 × 17
-#>   cv_metric_AUC logLoss   AUC prAUC Accuracy Kappa    F1 Sensitivity Specificity
-#>           <dbl>   <dbl> <dbl> <dbl>    <dbl> <dbl> <dbl>       <dbl>       <dbl>
-#> 1         0.622   0.684 0.647 0.606    0.590 0.179   0.6         0.6       0.579
-#> # … with 8 more variables: Pos_Pred_Value <dbl>, Neg_Pred_Value <dbl>,
-#> #   Precision <dbl>, Recall <dbl>, Detection_Rate <dbl>,
-#> #   Balanced_Accuracy <dbl>, method <chr>, seed <dbl>
+#>   cv_metric_AUC logLoss   AUC prAUC Accuracy Kappa    F1 Sensi…¹ Speci…² Pos_P…³
+#>           <dbl>   <dbl> <dbl> <dbl>    <dbl> <dbl> <dbl>   <dbl>   <dbl>   <dbl>
+#> 1         0.622   0.684 0.647 0.606    0.590 0.179   0.6     0.6   0.579     0.6
+#> # … with 7 more variables: Neg_Pred_Value <dbl>, Precision <dbl>, Recall <dbl>,
+#> #   Detection_Rate <dbl>, Balanced_Accuracy <dbl>, method <chr>, seed <dbl>,
+#> #   and abbreviated variable names ¹​Sensitivity, ²​Specificity, ³​Pos_Pred_Value

When using logistic regression for binary classification, the area under the receiver operating characteristic curve (AUC) is a useful metric to evaluate model performance. Because of that, it’s the default we use for mikropml. However, it is crucial to evaluate your model performance using multiple metrics. Below you can find more information about other performance metrics and how to use them in our package.

cv_metric_AUC is the AUC for the cross-validation folds for the training data. This gives us a sense of how well the model performs on the training data.

Most of the other columns are performance metrics for the test data — the data that wasn’t used to build the model. Here, you can see that the AUC for the test data is not much above 0.5, suggesting that this model does not predict much better than chance, and that the model is overfit because the cross-validation AUC (cv_metric_AUC, measured during training) is much higher than the testing AUC. This isn’t too surprising since we’re using so few features with this example dataset, so don’t be discouraged. The default option also provides a number of other performance metrics that you might be interested in, including area under the precision-recall curve (prAUC).
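A minimal sketch of the kind of call that produces the table above, following the introductory example that ships with the package (the otu_mini_bin dataset, the dx outcome column, the "glmnet" method, and the seed are assumptions carried over from that example, not the only valid choices):

```r
library(mikropml)

# otu_mini_bin is a small example dataset bundled with mikropml; "dx" is
# its binary outcome column and "glmnet" fits a regularized logistic
# regression. The seed keeps the train/test split and CV folds reproducible.
results <- run_ml(otu_mini_bin,
  "glmnet",
  outcome_colname = "dx",
  seed = 2019
)

# cv_metric_AUC comes from the cross-validation folds of the training data;
# AUC and the remaining columns are computed on the held-out test set.
results$performance
```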

@@ -329,12 +329,13 @@

Changing the performance metric
 results_pr$performance
 #> # A tibble: 1 × 17
-#>   cv_metric_prAUC logLoss   AUC prAUC Accuracy  Kappa    F1 Sensitivity
-#>             <dbl>   <dbl> <dbl> <dbl>    <dbl>  <dbl> <dbl>       <dbl>
-#> 1           0.577   0.691 0.663 0.605    0.538 0.0539 0.690           1
-#> # … with 9 more variables: Specificity <dbl>, Pos_Pred_Value <dbl>,
-#> #   Neg_Pred_Value <dbl>, Precision <dbl>, Recall <dbl>, Detection_Rate <dbl>,
-#> #   Balanced_Accuracy <dbl>, method <chr>, seed <dbl>
+#>   cv_metric_p…¹ logLoss   AUC prAUC Accur…²  Kappa    F1 Sensi…³ Speci…⁴ Pos_P…⁵
+#>           <dbl>   <dbl> <dbl> <dbl>   <dbl>  <dbl> <dbl>   <dbl>   <dbl>   <dbl>
+#> 1         0.577   0.691 0.663 0.605   0.538 0.0539 0.690       1  0.0526   0.526
+#> # … with 7 more variables: Neg_Pred_Value <dbl>, Precision <dbl>, Recall <dbl>,
+#> #   Detection_Rate <dbl>, Balanced_Accuracy <dbl>, method <chr>, seed <dbl>,
+#> #   and abbreviated variable names ¹​cv_metric_prAUC, ²​Accuracy, ³​Sensitivity,
+#> #   ⁴​Specificity, ⁵​Pos_Pred_Value
+#>   cv_metric…¹ logLoss   AUC prAUC Accur…²  Kappa Mean_F1 Mean_…³ Mean_…⁴ Mean_…⁵
+#>         <dbl>   <dbl> <dbl> <dbl>   <dbl>  <dbl> <chr>     <dbl>   <dbl> <chr>
+#> 1        1.07    1.11 0.506 0.353   0.382 0.0449 NA        0.360   0.682 NaN
+#> # … with 7 more variables: Mean_Neg_Pred_Value <dbl>, Mean_Precision <chr>,
+#> #   Mean_Recall <dbl>, Mean_Detection_Rate <dbl>, Mean_Balanced_Accuracy <dbl>,
+#> #   method <chr>, seed <dbl>, and abbreviated variable names
+#> #   ¹​cv_metric_logLoss, ²​Accuracy, ³​Mean_Sensitivity, ⁴​Mean_Specificity,
+#> #   ⁵​Mean_Pos_Pred_Value
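The prAUC-based results above come from overriding the default performance metric. A sketch of how such a call might look, assuming the perf_metric_function and perf_metric_name arguments of run_ml() together with caret's prSummary (argument names recalled from the documentation; verify against ?run_ml):

```r
# Tune and evaluate on precision-recall AUC instead of the default ROC AUC
# by supplying caret's prSummary and the name of the metric it reports.
results_pr <- run_ml(otu_mini_bin,
  "glmnet",
  outcome_colname = "dx",
  perf_metric_function = caret::prSummary,
  perf_metric_name = "prAUC",
  seed = 2019
)

results_pr$performance
```

For a multiclass outcome (the second table above), the cross-validation metric defaults to logLoss, which is why that column appears as cv_metric_logLoss.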

Continuous data @@ -611,7 +613,7 @@

References

-

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/articles/paper.html b/docs/dev/articles/paper.html index 80cf7e8a..becd1555 100644 --- a/docs/dev/articles/paper.html +++ b/docs/dev/articles/paper.html @@ -14,8 +14,8 @@ - - + + @@ -264,7 +264,7 @@

References

-

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/articles/parallel.html b/docs/dev/articles/parallel.html index 7440baf8..fcc33857 100644 --- a/docs/dev/articles/parallel.html +++ b/docs/dev/articles/parallel.html @@ -14,8 +14,8 @@ - - + + @@ -276,7 +276,7 @@

Parallelizing with Snakemake

-

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/articles/preprocess.html b/docs/dev/articles/preprocess.html index da7e7a56..b8a91be7 100644 --- a/docs/dev/articles/preprocess.html +++ b/docs/dev/articles/preprocess.html @@ -14,8 +14,8 @@ - - + + @@ -680,7 +680,7 @@

Next step: train and evaluate y diff --git a/docs/dev/articles/tuning.html b/docs/dev/articles/tuning.html index 7ccdad77..98af5acf 100644 --- a/docs/dev/articles/tuning.html +++ b/docs/dev/articles/tuning.html @@ -14,8 +14,8 @@ - - + + @@ -433,7 +433,7 @@

XGBoost

-

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/authors.html b/docs/dev/authors.html index 81c6a7a1..2c177ea1 100644 --- a/docs/dev/authors.html +++ b/docs/dev/authors.html @@ -1,5 +1,5 @@ -Authors and Citation • mikropmlAuthors and Citation • mikropml @@ -124,7 +124,7 @@

Citation

diff --git a/docs/dev/deps/bootstrap-5.1.3/bootstrap.bundle.min.js b/docs/dev/deps/bootstrap-5.1.3/bootstrap.bundle.min.js new file mode 100644 index 00000000..cc0a2556 --- /dev/null +++ b/docs/dev/deps/bootstrap-5.1.3/bootstrap.bundle.min.js @@ -0,0 +1,7 @@ +/*! + * Bootstrap v5.1.3 (https://getbootstrap.com/) + * Copyright 2011-2021 The Bootstrap Authors (https://github.com/twbs/bootstrap/graphs/contributors) + * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE) + */ +!function(t,e){"object"==typeof exports&&"undefined"!=typeof module?module.exports=e():"function"==typeof define&&define.amd?define(e):(t="undefined"!=typeof globalThis?globalThis:t||self).bootstrap=e()}(this,(function(){"use strict";const t="transitionend",e=t=>{let e=t.getAttribute("data-bs-target");if(!e||"#"===e){let i=t.getAttribute("href");if(!i||!i.includes("#")&&!i.startsWith("."))return null;i.includes("#")&&!i.startsWith("#")&&(i=`#${i.split("#")[1]}`),e=i&&"#"!==i?i.trim():null}return e},i=t=>{const i=e(t);return i&&document.querySelector(i)?i:null},n=t=>{const i=e(t);return i?document.querySelector(i):null},s=e=>{e.dispatchEvent(new Event(t))},o=t=>!(!t||"object"!=typeof t)&&(void 0!==t.jquery&&(t=t[0]),void 0!==t.nodeType),r=t=>o(t)?t.jquery?t[0]:t:"string"==typeof t&&t.length>0?document.querySelector(t):null,a=(t,e,i)=>{Object.keys(i).forEach((n=>{const s=i[n],r=e[n],a=r&&o(r)?"element":null==(l=r)?`${l}`:{}.toString.call(l).match(/\s([a-z]+)/i)[1].toLowerCase();var l;if(!new RegExp(s).test(a))throw new TypeError(`${t.toUpperCase()}: Option "${n}" provided type "${a}" but expected type "${s}".`)}))},l=t=>!(!o(t)||0===t.getClientRects().length)&&"visible"===getComputedStyle(t).getPropertyValue("visibility"),c=t=>!t||t.nodeType!==Node.ELEMENT_NODE||!!t.classList.contains("disabled")||(void 0!==t.disabled?t.disabled:t.hasAttribute("disabled")&&"false"!==t.getAttribute("disabled")),h=t=>{if(!document.documentElement.attachShadow)return null;if("function"==typeof t.getRootNode){const e=t.getRootNode();return e instanceof ShadowRoot?e:null}return t instanceof ShadowRoot?t:t.parentNode?h(t.parentNode):null},d=()=>{},u=t=>{t.offsetHeight},f=()=>{const{jQuery:t}=window;return t&&!document.body.hasAttribute("data-bs-no-jquery")?t:null},p=[],m=()=>"rtl"===document.documentElement.dir,g=t=>{var e;e=()=>{const e=f();if(e){const i=t.NAME,n=e.fn[i];e.fn[i]=t.jQueryInterface,e.fn[i].Constructor=t,e.fn[i].noConflict=()=>(e.fn[i]=n,t.jQueryInterface)}},"loading"===document.readyState?(p.length||document.addEventListener("DOMContentLoaded",(()=>{p.forEach((t=>t()))})),p.push(e)):e()},_=t=>{"function"==typeof t&&t()},b=(e,i,n=!0)=>{if(!n)return void _(e);const o=(t=>{if(!t)return 0;let{transitionDuration:e,transitionDelay:i}=window.getComputedStyle(t);const n=Number.parseFloat(e),s=Number.parseFloat(i);return n||s?(e=e.split(",")[0],i=i.split(",")[0],1e3*(Number.parseFloat(e)+Number.parseFloat(i))):0})(i)+5;let r=!1;const a=({target:n})=>{n===i&&(r=!0,i.removeEventListener(t,a),_(e))};i.addEventListener(t,a),setTimeout((()=>{r||s(i)}),o)},v=(t,e,i,n)=>{let s=t.indexOf(e);if(-1===s)return t[!i&&n?t.length-1:0];const o=t.length;return s+=i?1:-1,n&&(s=(s+o)%o),t[Math.max(0,Math.min(s,o-1))]},y=/[^.]*(?=\..*)\.|.*/,w=/\..*/,E=/::\d+$/,A={};let T=1;const O={mouseenter:"mouseover",mouseleave:"mouseout"},C=/^(mouseenter|mouseleave)/i,k=new 
Set(["click","dblclick","mouseup","mousedown","contextmenu","mousewheel","DOMMouseScroll","mouseover","mouseout","mousemove","selectstart","selectend","keydown","keypress","keyup","orientationchange","touchstart","touchmove","touchend","touchcancel","pointerdown","pointermove","pointerup","pointerleave","pointercancel","gesturestart","gesturechange","gestureend","focus","blur","change","reset","select","submit","focusin","focusout","load","unload","beforeunload","resize","move","DOMContentLoaded","readystatechange","error","abort","scroll"]);function L(t,e){return e&&`${e}::${T++}`||t.uidEvent||T++}function x(t){const e=L(t);return t.uidEvent=e,A[e]=A[e]||{},A[e]}function D(t,e,i=null){const n=Object.keys(t);for(let s=0,o=n.length;sfunction(e){if(!e.relatedTarget||e.relatedTarget!==e.delegateTarget&&!e.delegateTarget.contains(e.relatedTarget))return t.call(this,e)};n?n=t(n):i=t(i)}const[o,r,a]=S(e,i,n),l=x(t),c=l[a]||(l[a]={}),h=D(c,r,o?i:null);if(h)return void(h.oneOff=h.oneOff&&s);const d=L(r,e.replace(y,"")),u=o?function(t,e,i){return function n(s){const o=t.querySelectorAll(e);for(let{target:r}=s;r&&r!==this;r=r.parentNode)for(let a=o.length;a--;)if(o[a]===r)return s.delegateTarget=r,n.oneOff&&j.off(t,s.type,e,i),i.apply(r,[s]);return null}}(t,i,n):function(t,e){return function i(n){return n.delegateTarget=t,i.oneOff&&j.off(t,n.type,e),e.apply(t,[n])}}(t,i);u.delegationSelector=o?i:null,u.originalHandler=r,u.oneOff=s,u.uidEvent=d,c[d]=u,t.addEventListener(a,u,o)}function I(t,e,i,n,s){const o=D(e[i],n,s);o&&(t.removeEventListener(i,o,Boolean(s)),delete e[i][o.uidEvent])}function P(t){return t=t.replace(w,""),O[t]||t}const j={on(t,e,i,n){N(t,e,i,n,!1)},one(t,e,i,n){N(t,e,i,n,!0)},off(t,e,i,n){if("string"!=typeof e||!t)return;const[s,o,r]=S(e,i,n),a=r!==e,l=x(t),c=e.startsWith(".");if(void 0!==o){if(!l||!l[r])return;return void I(t,l,r,o,s?i:null)}c&&Object.keys(l).forEach((i=>{!function(t,e,i,n){const s=e[i]||{};Object.keys(s).forEach((o=>{if(o.includes(n)){const n=s[o];I(t,e,i,n.originalHandler,n.delegationSelector)}}))}(t,l,i,e.slice(1))}));const h=l[r]||{};Object.keys(h).forEach((i=>{const n=i.replace(E,"");if(!a||e.includes(n)){const e=h[i];I(t,l,r,e.originalHandler,e.delegationSelector)}}))},trigger(t,e,i){if("string"!=typeof e||!t)return null;const n=f(),s=P(e),o=e!==s,r=k.has(s);let a,l=!0,c=!0,h=!1,d=null;return o&&n&&(a=n.Event(e,i),n(t).trigger(a),l=!a.isPropagationStopped(),c=!a.isImmediatePropagationStopped(),h=a.isDefaultPrevented()),r?(d=document.createEvent("HTMLEvents"),d.initEvent(s,l,!0)):d=new CustomEvent(e,{bubbles:l,cancelable:!0}),void 0!==i&&Object.keys(i).forEach((t=>{Object.defineProperty(d,t,{get:()=>i[t]})})),h&&d.preventDefault(),c&&t.dispatchEvent(d),d.defaultPrevented&&void 0!==a&&a.preventDefault(),d}},M=new Map,H={set(t,e,i){M.has(t)||M.set(t,new Map);const n=M.get(t);n.has(e)||0===n.size?n.set(e,i):console.error(`Bootstrap doesn't allow more than one instance per element. 
Bound instance: ${Array.from(n.keys())[0]}.`)},get:(t,e)=>M.has(t)&&M.get(t).get(e)||null,remove(t,e){if(!M.has(t))return;const i=M.get(t);i.delete(e),0===i.size&&M.delete(t)}};class B{constructor(t){(t=r(t))&&(this._element=t,H.set(this._element,this.constructor.DATA_KEY,this))}dispose(){H.remove(this._element,this.constructor.DATA_KEY),j.off(this._element,this.constructor.EVENT_KEY),Object.getOwnPropertyNames(this).forEach((t=>{this[t]=null}))}_queueCallback(t,e,i=!0){b(t,e,i)}static getInstance(t){return H.get(r(t),this.DATA_KEY)}static getOrCreateInstance(t,e={}){return this.getInstance(t)||new this(t,"object"==typeof e?e:null)}static get VERSION(){return"5.1.3"}static get NAME(){throw new Error('You have to implement the static method "NAME", for each component!')}static get DATA_KEY(){return`bs.${this.NAME}`}static get EVENT_KEY(){return`.${this.DATA_KEY}`}}const R=(t,e="hide")=>{const i=`click.dismiss${t.EVENT_KEY}`,s=t.NAME;j.on(document,i,`[data-bs-dismiss="${s}"]`,(function(i){if(["A","AREA"].includes(this.tagName)&&i.preventDefault(),c(this))return;const o=n(this)||this.closest(`.${s}`);t.getOrCreateInstance(o)[e]()}))};class W extends B{static get NAME(){return"alert"}close(){if(j.trigger(this._element,"close.bs.alert").defaultPrevented)return;this._element.classList.remove("show");const t=this._element.classList.contains("fade");this._queueCallback((()=>this._destroyElement()),this._element,t)}_destroyElement(){this._element.remove(),j.trigger(this._element,"closed.bs.alert"),this.dispose()}static jQueryInterface(t){return this.each((function(){const e=W.getOrCreateInstance(this);if("string"==typeof t){if(void 0===e[t]||t.startsWith("_")||"constructor"===t)throw new TypeError(`No method named "${t}"`);e[t](this)}}))}}R(W,"close"),g(W);const $='[data-bs-toggle="button"]';class z extends B{static get NAME(){return"button"}toggle(){this._element.setAttribute("aria-pressed",this._element.classList.toggle("active"))}static jQueryInterface(t){return this.each((function(){const e=z.getOrCreateInstance(this);"toggle"===t&&e[t]()}))}}function q(t){return"true"===t||"false"!==t&&(t===Number(t).toString()?Number(t):""===t||"null"===t?null:t)}function F(t){return t.replace(/[A-Z]/g,(t=>`-${t.toLowerCase()}`))}j.on(document,"click.bs.button.data-api",$,(t=>{t.preventDefault();const e=t.target.closest($);z.getOrCreateInstance(e).toggle()})),g(z);const U={setDataAttribute(t,e,i){t.setAttribute(`data-bs-${F(e)}`,i)},removeDataAttribute(t,e){t.removeAttribute(`data-bs-${F(e)}`)},getDataAttributes(t){if(!t)return{};const e={};return Object.keys(t.dataset).filter((t=>t.startsWith("bs"))).forEach((i=>{let n=i.replace(/^bs/,"");n=n.charAt(0).toLowerCase()+n.slice(1,n.length),e[n]=q(t.dataset[i])})),e},getDataAttribute:(t,e)=>q(t.getAttribute(`data-bs-${F(e)}`)),offset(t){const e=t.getBoundingClientRect();return{top:e.top+window.pageYOffset,left:e.left+window.pageXOffset}},position:t=>({top:t.offsetTop,left:t.offsetLeft})},V={find:(t,e=document.documentElement)=>[].concat(...Element.prototype.querySelectorAll.call(e,t)),findOne:(t,e=document.documentElement)=>Element.prototype.querySelector.call(e,t),children:(t,e)=>[].concat(...t.children).filter((t=>t.matches(e))),parents(t,e){const i=[];let n=t.parentNode;for(;n&&n.nodeType===Node.ELEMENT_NODE&&3!==n.nodeType;)n.matches(e)&&i.push(n),n=n.parentNode;return i},prev(t,e){let i=t.previousElementSibling;for(;i;){if(i.matches(e))return[i];i=i.previousElementSibling}return[]},next(t,e){let 
i=t.nextElementSibling;for(;i;){if(i.matches(e))return[i];i=i.nextElementSibling}return[]},focusableChildren(t){const e=["a","button","input","textarea","select","details","[tabindex]",'[contenteditable="true"]'].map((t=>`${t}:not([tabindex^="-"])`)).join(", ");return this.find(e,t).filter((t=>!c(t)&&l(t)))}},K="carousel",X={interval:5e3,keyboard:!0,slide:!1,pause:"hover",wrap:!0,touch:!0},Y={interval:"(number|boolean)",keyboard:"boolean",slide:"(boolean|string)",pause:"(string|boolean)",wrap:"boolean",touch:"boolean"},Q="next",G="prev",Z="left",J="right",tt={ArrowLeft:J,ArrowRight:Z},et="slid.bs.carousel",it="active",nt=".active.carousel-item";class st extends B{constructor(t,e){super(t),this._items=null,this._interval=null,this._activeElement=null,this._isPaused=!1,this._isSliding=!1,this.touchTimeout=null,this.touchStartX=0,this.touchDeltaX=0,this._config=this._getConfig(e),this._indicatorsElement=V.findOne(".carousel-indicators",this._element),this._touchSupported="ontouchstart"in document.documentElement||navigator.maxTouchPoints>0,this._pointerEvent=Boolean(window.PointerEvent),this._addEventListeners()}static get Default(){return X}static get NAME(){return K}next(){this._slide(Q)}nextWhenVisible(){!document.hidden&&l(this._element)&&this.next()}prev(){this._slide(G)}pause(t){t||(this._isPaused=!0),V.findOne(".carousel-item-next, .carousel-item-prev",this._element)&&(s(this._element),this.cycle(!0)),clearInterval(this._interval),this._interval=null}cycle(t){t||(this._isPaused=!1),this._interval&&(clearInterval(this._interval),this._interval=null),this._config&&this._config.interval&&!this._isPaused&&(this._updateInterval(),this._interval=setInterval((document.visibilityState?this.nextWhenVisible:this.next).bind(this),this._config.interval))}to(t){this._activeElement=V.findOne(nt,this._element);const e=this._getItemIndex(this._activeElement);if(t>this._items.length-1||t<0)return;if(this._isSliding)return void j.one(this._element,et,(()=>this.to(t)));if(e===t)return this.pause(),void this.cycle();const i=t>e?Q:G;this._slide(i,this._items[t])}_getConfig(t){return t={...X,...U.getDataAttributes(this._element),..."object"==typeof t?t:{}},a(K,t,Y),t}_handleSwipe(){const t=Math.abs(this.touchDeltaX);if(t<=40)return;const e=t/this.touchDeltaX;this.touchDeltaX=0,e&&this._slide(e>0?J:Z)}_addEventListeners(){this._config.keyboard&&j.on(this._element,"keydown.bs.carousel",(t=>this._keydown(t))),"hover"===this._config.pause&&(j.on(this._element,"mouseenter.bs.carousel",(t=>this.pause(t))),j.on(this._element,"mouseleave.bs.carousel",(t=>this.cycle(t)))),this._config.touch&&this._touchSupported&&this._addTouchEventListeners()}_addTouchEventListeners(){const t=t=>this._pointerEvent&&("pen"===t.pointerType||"touch"===t.pointerType),e=e=>{t(e)?this.touchStartX=e.clientX:this._pointerEvent||(this.touchStartX=e.touches[0].clientX)},i=t=>{this.touchDeltaX=t.touches&&t.touches.length>1?0:t.touches[0].clientX-this.touchStartX},n=e=>{t(e)&&(this.touchDeltaX=e.clientX-this.touchStartX),this._handleSwipe(),"hover"===this._config.pause&&(this.pause(),this.touchTimeout&&clearTimeout(this.touchTimeout),this.touchTimeout=setTimeout((t=>this.cycle(t)),500+this._config.interval))};V.find(".carousel-item 
img",this._element).forEach((t=>{j.on(t,"dragstart.bs.carousel",(t=>t.preventDefault()))})),this._pointerEvent?(j.on(this._element,"pointerdown.bs.carousel",(t=>e(t))),j.on(this._element,"pointerup.bs.carousel",(t=>n(t))),this._element.classList.add("pointer-event")):(j.on(this._element,"touchstart.bs.carousel",(t=>e(t))),j.on(this._element,"touchmove.bs.carousel",(t=>i(t))),j.on(this._element,"touchend.bs.carousel",(t=>n(t))))}_keydown(t){if(/input|textarea/i.test(t.target.tagName))return;const e=tt[t.key];e&&(t.preventDefault(),this._slide(e))}_getItemIndex(t){return this._items=t&&t.parentNode?V.find(".carousel-item",t.parentNode):[],this._items.indexOf(t)}_getItemByOrder(t,e){const i=t===Q;return v(this._items,e,i,this._config.wrap)}_triggerSlideEvent(t,e){const i=this._getItemIndex(t),n=this._getItemIndex(V.findOne(nt,this._element));return j.trigger(this._element,"slide.bs.carousel",{relatedTarget:t,direction:e,from:n,to:i})}_setActiveIndicatorElement(t){if(this._indicatorsElement){const e=V.findOne(".active",this._indicatorsElement);e.classList.remove(it),e.removeAttribute("aria-current");const i=V.find("[data-bs-target]",this._indicatorsElement);for(let e=0;e{j.trigger(this._element,et,{relatedTarget:o,direction:d,from:s,to:r})};if(this._element.classList.contains("slide")){o.classList.add(h),u(o),n.classList.add(c),o.classList.add(c);const t=()=>{o.classList.remove(c,h),o.classList.add(it),n.classList.remove(it,h,c),this._isSliding=!1,setTimeout(f,0)};this._queueCallback(t,n,!0)}else n.classList.remove(it),o.classList.add(it),this._isSliding=!1,f();a&&this.cycle()}_directionToOrder(t){return[J,Z].includes(t)?m()?t===Z?G:Q:t===Z?Q:G:t}_orderToDirection(t){return[Q,G].includes(t)?m()?t===G?Z:J:t===G?J:Z:t}static carouselInterface(t,e){const i=st.getOrCreateInstance(t,e);let{_config:n}=i;"object"==typeof e&&(n={...n,...e});const s="string"==typeof e?e:n.slide;if("number"==typeof e)i.to(e);else if("string"==typeof s){if(void 0===i[s])throw new TypeError(`No method named "${s}"`);i[s]()}else n.interval&&n.ride&&(i.pause(),i.cycle())}static jQueryInterface(t){return this.each((function(){st.carouselInterface(this,t)}))}static dataApiClickHandler(t){const e=n(this);if(!e||!e.classList.contains("carousel"))return;const i={...U.getDataAttributes(e),...U.getDataAttributes(this)},s=this.getAttribute("data-bs-slide-to");s&&(i.interval=!1),st.carouselInterface(e,i),s&&st.getInstance(e).to(s),t.preventDefault()}}j.on(document,"click.bs.carousel.data-api","[data-bs-slide], [data-bs-slide-to]",st.dataApiClickHandler),j.on(window,"load.bs.carousel.data-api",(()=>{const t=V.find('[data-bs-ride="carousel"]');for(let e=0,i=t.length;et===this._element));null!==s&&o.length&&(this._selector=s,this._triggerArray.push(e))}this._initializeChildren(),this._config.parent||this._addAriaAndCollapsedClass(this._triggerArray,this._isShown()),this._config.toggle&&this.toggle()}static get Default(){return rt}static get NAME(){return ot}toggle(){this._isShown()?this.hide():this.show()}show(){if(this._isTransitioning||this._isShown())return;let t,e=[];if(this._config.parent){const t=V.find(ut,this._config.parent);e=V.find(".collapse.show, .collapse.collapsing",this._config.parent).filter((e=>!t.includes(e)))}const i=V.findOne(this._selector);if(e.length){const n=e.find((t=>i!==t));if(t=n?pt.getInstance(n):null,t&&t._isTransitioning)return}if(j.trigger(this._element,"show.bs.collapse").defaultPrevented)return;e.forEach((e=>{i!==e&&pt.getOrCreateInstance(e,{toggle:!1}).hide(),t||H.set(e,"bs.collapse",null)}));const 
n=this._getDimension();this._element.classList.remove(ct),this._element.classList.add(ht),this._element.style[n]=0,this._addAriaAndCollapsedClass(this._triggerArray,!0),this._isTransitioning=!0;const s=`scroll${n[0].toUpperCase()+n.slice(1)}`;this._queueCallback((()=>{this._isTransitioning=!1,this._element.classList.remove(ht),this._element.classList.add(ct,lt),this._element.style[n]="",j.trigger(this._element,"shown.bs.collapse")}),this._element,!0),this._element.style[n]=`${this._element[s]}px`}hide(){if(this._isTransitioning||!this._isShown())return;if(j.trigger(this._element,"hide.bs.collapse").defaultPrevented)return;const t=this._getDimension();this._element.style[t]=`${this._element.getBoundingClientRect()[t]}px`,u(this._element),this._element.classList.add(ht),this._element.classList.remove(ct,lt);const e=this._triggerArray.length;for(let t=0;t{this._isTransitioning=!1,this._element.classList.remove(ht),this._element.classList.add(ct),j.trigger(this._element,"hidden.bs.collapse")}),this._element,!0)}_isShown(t=this._element){return t.classList.contains(lt)}_getConfig(t){return(t={...rt,...U.getDataAttributes(this._element),...t}).toggle=Boolean(t.toggle),t.parent=r(t.parent),a(ot,t,at),t}_getDimension(){return this._element.classList.contains("collapse-horizontal")?"width":"height"}_initializeChildren(){if(!this._config.parent)return;const t=V.find(ut,this._config.parent);V.find(ft,this._config.parent).filter((e=>!t.includes(e))).forEach((t=>{const e=n(t);e&&this._addAriaAndCollapsedClass([t],this._isShown(e))}))}_addAriaAndCollapsedClass(t,e){t.length&&t.forEach((t=>{e?t.classList.remove(dt):t.classList.add(dt),t.setAttribute("aria-expanded",e)}))}static jQueryInterface(t){return this.each((function(){const e={};"string"==typeof t&&/show|hide/.test(t)&&(e.toggle=!1);const i=pt.getOrCreateInstance(this,e);if("string"==typeof t){if(void 0===i[t])throw new TypeError(`No method named "${t}"`);i[t]()}}))}}j.on(document,"click.bs.collapse.data-api",ft,(function(t){("A"===t.target.tagName||t.delegateTarget&&"A"===t.delegateTarget.tagName)&&t.preventDefault();const e=i(this);V.find(e).forEach((t=>{pt.getOrCreateInstance(t,{toggle:!1}).toggle()}))})),g(pt);var mt="top",gt="bottom",_t="right",bt="left",vt="auto",yt=[mt,gt,_t,bt],wt="start",Et="end",At="clippingParents",Tt="viewport",Ot="popper",Ct="reference",kt=yt.reduce((function(t,e){return t.concat([e+"-"+wt,e+"-"+Et])}),[]),Lt=[].concat(yt,[vt]).reduce((function(t,e){return t.concat([e,e+"-"+wt,e+"-"+Et])}),[]),xt="beforeRead",Dt="read",St="afterRead",Nt="beforeMain",It="main",Pt="afterMain",jt="beforeWrite",Mt="write",Ht="afterWrite",Bt=[xt,Dt,St,Nt,It,Pt,jt,Mt,Ht];function Rt(t){return t?(t.nodeName||"").toLowerCase():null}function Wt(t){if(null==t)return window;if("[object Window]"!==t.toString()){var e=t.ownerDocument;return e&&e.defaultView||window}return t}function $t(t){return t instanceof Wt(t).Element||t instanceof Element}function zt(t){return t instanceof Wt(t).HTMLElement||t instanceof HTMLElement}function qt(t){return"undefined"!=typeof ShadowRoot&&(t instanceof Wt(t).ShadowRoot||t instanceof ShadowRoot)}const Ft={name:"applyStyles",enabled:!0,phase:"write",fn:function(t){var e=t.state;Object.keys(e.elements).forEach((function(t){var i=e.styles[t]||{},n=e.attributes[t]||{},s=e.elements[t];zt(s)&&Rt(s)&&(Object.assign(s.style,i),Object.keys(n).forEach((function(t){var e=n[t];!1===e?s.removeAttribute(t):s.setAttribute(t,!0===e?"":e)})))}))},effect:function(t){var 
e=t.state,i={popper:{position:e.options.strategy,left:"0",top:"0",margin:"0"},arrow:{position:"absolute"},reference:{}};return Object.assign(e.elements.popper.style,i.popper),e.styles=i,e.elements.arrow&&Object.assign(e.elements.arrow.style,i.arrow),function(){Object.keys(e.elements).forEach((function(t){var n=e.elements[t],s=e.attributes[t]||{},o=Object.keys(e.styles.hasOwnProperty(t)?e.styles[t]:i[t]).reduce((function(t,e){return t[e]="",t}),{});zt(n)&&Rt(n)&&(Object.assign(n.style,o),Object.keys(s).forEach((function(t){n.removeAttribute(t)})))}))}},requires:["computeStyles"]};function Ut(t){return t.split("-")[0]}function Vt(t,e){var i=t.getBoundingClientRect();return{width:i.width/1,height:i.height/1,top:i.top/1,right:i.right/1,bottom:i.bottom/1,left:i.left/1,x:i.left/1,y:i.top/1}}function Kt(t){var e=Vt(t),i=t.offsetWidth,n=t.offsetHeight;return Math.abs(e.width-i)<=1&&(i=e.width),Math.abs(e.height-n)<=1&&(n=e.height),{x:t.offsetLeft,y:t.offsetTop,width:i,height:n}}function Xt(t,e){var i=e.getRootNode&&e.getRootNode();if(t.contains(e))return!0;if(i&&qt(i)){var n=e;do{if(n&&t.isSameNode(n))return!0;n=n.parentNode||n.host}while(n)}return!1}function Yt(t){return Wt(t).getComputedStyle(t)}function Qt(t){return["table","td","th"].indexOf(Rt(t))>=0}function Gt(t){return(($t(t)?t.ownerDocument:t.document)||window.document).documentElement}function Zt(t){return"html"===Rt(t)?t:t.assignedSlot||t.parentNode||(qt(t)?t.host:null)||Gt(t)}function Jt(t){return zt(t)&&"fixed"!==Yt(t).position?t.offsetParent:null}function te(t){for(var e=Wt(t),i=Jt(t);i&&Qt(i)&&"static"===Yt(i).position;)i=Jt(i);return i&&("html"===Rt(i)||"body"===Rt(i)&&"static"===Yt(i).position)?e:i||function(t){var e=-1!==navigator.userAgent.toLowerCase().indexOf("firefox");if(-1!==navigator.userAgent.indexOf("Trident")&&zt(t)&&"fixed"===Yt(t).position)return null;for(var i=Zt(t);zt(i)&&["html","body"].indexOf(Rt(i))<0;){var n=Yt(i);if("none"!==n.transform||"none"!==n.perspective||"paint"===n.contain||-1!==["transform","perspective"].indexOf(n.willChange)||e&&"filter"===n.willChange||e&&n.filter&&"none"!==n.filter)return i;i=i.parentNode}return null}(t)||e}function ee(t){return["top","bottom"].indexOf(t)>=0?"x":"y"}var ie=Math.max,ne=Math.min,se=Math.round;function oe(t,e,i){return ie(t,ne(e,i))}function re(t){return Object.assign({},{top:0,right:0,bottom:0,left:0},t)}function ae(t,e){return e.reduce((function(e,i){return e[i]=t,e}),{})}const le={name:"arrow",enabled:!0,phase:"main",fn:function(t){var e,i=t.state,n=t.name,s=t.options,o=i.elements.arrow,r=i.modifiersData.popperOffsets,a=Ut(i.placement),l=ee(a),c=[bt,_t].indexOf(a)>=0?"height":"width";if(o&&r){var h=function(t,e){return re("number"!=typeof(t="function"==typeof t?t(Object.assign({},e.rects,{placement:e.placement})):t)?t:ae(t,yt))}(s.padding,i),d=Kt(o),u="y"===l?mt:bt,f="y"===l?gt:_t,p=i.rects.reference[c]+i.rects.reference[l]-r[l]-i.rects.popper[c],m=r[l]-i.rects.reference[l],g=te(o),_=g?"y"===l?g.clientHeight||0:g.clientWidth||0:0,b=p/2-m/2,v=h[u],y=_-d[c]-h[f],w=_/2-d[c]/2+b,E=oe(v,w,y),A=l;i.modifiersData[n]=((e={})[A]=E,e.centerOffset=E-w,e)}},effect:function(t){var e=t.state,i=t.options.element,n=void 0===i?"[data-popper-arrow]":i;null!=n&&("string"!=typeof n||(n=e.elements.popper.querySelector(n)))&&Xt(e.elements.popper,n)&&(e.elements.arrow=n)},requires:["popperOffsets"],requiresIfExists:["preventOverflow"]};function ce(t){return t.split("-")[1]}var he={top:"auto",right:"auto",bottom:"auto",left:"auto"};function de(t){var 
e,i=t.popper,n=t.popperRect,s=t.placement,o=t.variation,r=t.offsets,a=t.position,l=t.gpuAcceleration,c=t.adaptive,h=t.roundOffsets,d=!0===h?function(t){var e=t.x,i=t.y,n=window.devicePixelRatio||1;return{x:se(se(e*n)/n)||0,y:se(se(i*n)/n)||0}}(r):"function"==typeof h?h(r):r,u=d.x,f=void 0===u?0:u,p=d.y,m=void 0===p?0:p,g=r.hasOwnProperty("x"),_=r.hasOwnProperty("y"),b=bt,v=mt,y=window;if(c){var w=te(i),E="clientHeight",A="clientWidth";w===Wt(i)&&"static"!==Yt(w=Gt(i)).position&&"absolute"===a&&(E="scrollHeight",A="scrollWidth"),w=w,s!==mt&&(s!==bt&&s!==_t||o!==Et)||(v=gt,m-=w[E]-n.height,m*=l?1:-1),s!==bt&&(s!==mt&&s!==gt||o!==Et)||(b=_t,f-=w[A]-n.width,f*=l?1:-1)}var T,O=Object.assign({position:a},c&&he);return l?Object.assign({},O,((T={})[v]=_?"0":"",T[b]=g?"0":"",T.transform=(y.devicePixelRatio||1)<=1?"translate("+f+"px, "+m+"px)":"translate3d("+f+"px, "+m+"px, 0)",T)):Object.assign({},O,((e={})[v]=_?m+"px":"",e[b]=g?f+"px":"",e.transform="",e))}const ue={name:"computeStyles",enabled:!0,phase:"beforeWrite",fn:function(t){var e=t.state,i=t.options,n=i.gpuAcceleration,s=void 0===n||n,o=i.adaptive,r=void 0===o||o,a=i.roundOffsets,l=void 0===a||a,c={placement:Ut(e.placement),variation:ce(e.placement),popper:e.elements.popper,popperRect:e.rects.popper,gpuAcceleration:s};null!=e.modifiersData.popperOffsets&&(e.styles.popper=Object.assign({},e.styles.popper,de(Object.assign({},c,{offsets:e.modifiersData.popperOffsets,position:e.options.strategy,adaptive:r,roundOffsets:l})))),null!=e.modifiersData.arrow&&(e.styles.arrow=Object.assign({},e.styles.arrow,de(Object.assign({},c,{offsets:e.modifiersData.arrow,position:"absolute",adaptive:!1,roundOffsets:l})))),e.attributes.popper=Object.assign({},e.attributes.popper,{"data-popper-placement":e.placement})},data:{}};var fe={passive:!0};const pe={name:"eventListeners",enabled:!0,phase:"write",fn:function(){},effect:function(t){var e=t.state,i=t.instance,n=t.options,s=n.scroll,o=void 0===s||s,r=n.resize,a=void 0===r||r,l=Wt(e.elements.popper),c=[].concat(e.scrollParents.reference,e.scrollParents.popper);return o&&c.forEach((function(t){t.addEventListener("scroll",i.update,fe)})),a&&l.addEventListener("resize",i.update,fe),function(){o&&c.forEach((function(t){t.removeEventListener("scroll",i.update,fe)})),a&&l.removeEventListener("resize",i.update,fe)}},data:{}};var me={left:"right",right:"left",bottom:"top",top:"bottom"};function ge(t){return t.replace(/left|right|bottom|top/g,(function(t){return me[t]}))}var _e={start:"end",end:"start"};function be(t){return t.replace(/start|end/g,(function(t){return _e[t]}))}function ve(t){var e=Wt(t);return{scrollLeft:e.pageXOffset,scrollTop:e.pageYOffset}}function ye(t){return Vt(Gt(t)).left+ve(t).scrollLeft}function we(t){var e=Yt(t),i=e.overflow,n=e.overflowX,s=e.overflowY;return/auto|scroll|overlay|hidden/.test(i+s+n)}function Ee(t){return["html","body","#document"].indexOf(Rt(t))>=0?t.ownerDocument.body:zt(t)&&we(t)?t:Ee(Zt(t))}function Ae(t,e){var i;void 0===e&&(e=[]);var n=Ee(t),s=n===(null==(i=t.ownerDocument)?void 0:i.body),o=Wt(n),r=s?[o].concat(o.visualViewport||[],we(n)?n:[]):n,a=e.concat(r);return s?a:a.concat(Ae(Zt(r)))}function Te(t){return Object.assign({},t,{left:t.x,top:t.y,right:t.x+t.width,bottom:t.y+t.height})}function Oe(t,e){return e===Tt?Te(function(t){var e=Wt(t),i=Gt(t),n=e.visualViewport,s=i.clientWidth,o=i.clientHeight,r=0,a=0;return 
n&&(s=n.width,o=n.height,/^((?!chrome|android).)*safari/i.test(navigator.userAgent)||(r=n.offsetLeft,a=n.offsetTop)),{width:s,height:o,x:r+ye(t),y:a}}(t)):zt(e)?function(t){var e=Vt(t);return e.top=e.top+t.clientTop,e.left=e.left+t.clientLeft,e.bottom=e.top+t.clientHeight,e.right=e.left+t.clientWidth,e.width=t.clientWidth,e.height=t.clientHeight,e.x=e.left,e.y=e.top,e}(e):Te(function(t){var e,i=Gt(t),n=ve(t),s=null==(e=t.ownerDocument)?void 0:e.body,o=ie(i.scrollWidth,i.clientWidth,s?s.scrollWidth:0,s?s.clientWidth:0),r=ie(i.scrollHeight,i.clientHeight,s?s.scrollHeight:0,s?s.clientHeight:0),a=-n.scrollLeft+ye(t),l=-n.scrollTop;return"rtl"===Yt(s||i).direction&&(a+=ie(i.clientWidth,s?s.clientWidth:0)-o),{width:o,height:r,x:a,y:l}}(Gt(t)))}function Ce(t){var e,i=t.reference,n=t.element,s=t.placement,o=s?Ut(s):null,r=s?ce(s):null,a=i.x+i.width/2-n.width/2,l=i.y+i.height/2-n.height/2;switch(o){case mt:e={x:a,y:i.y-n.height};break;case gt:e={x:a,y:i.y+i.height};break;case _t:e={x:i.x+i.width,y:l};break;case bt:e={x:i.x-n.width,y:l};break;default:e={x:i.x,y:i.y}}var c=o?ee(o):null;if(null!=c){var h="y"===c?"height":"width";switch(r){case wt:e[c]=e[c]-(i[h]/2-n[h]/2);break;case Et:e[c]=e[c]+(i[h]/2-n[h]/2)}}return e}function ke(t,e){void 0===e&&(e={});var i=e,n=i.placement,s=void 0===n?t.placement:n,o=i.boundary,r=void 0===o?At:o,a=i.rootBoundary,l=void 0===a?Tt:a,c=i.elementContext,h=void 0===c?Ot:c,d=i.altBoundary,u=void 0!==d&&d,f=i.padding,p=void 0===f?0:f,m=re("number"!=typeof p?p:ae(p,yt)),g=h===Ot?Ct:Ot,_=t.rects.popper,b=t.elements[u?g:h],v=function(t,e,i){var n="clippingParents"===e?function(t){var e=Ae(Zt(t)),i=["absolute","fixed"].indexOf(Yt(t).position)>=0&&zt(t)?te(t):t;return $t(i)?e.filter((function(t){return $t(t)&&Xt(t,i)&&"body"!==Rt(t)})):[]}(t):[].concat(e),s=[].concat(n,[i]),o=s[0],r=s.reduce((function(e,i){var n=Oe(t,i);return e.top=ie(n.top,e.top),e.right=ne(n.right,e.right),e.bottom=ne(n.bottom,e.bottom),e.left=ie(n.left,e.left),e}),Oe(t,o));return r.width=r.right-r.left,r.height=r.bottom-r.top,r.x=r.left,r.y=r.top,r}($t(b)?b:b.contextElement||Gt(t.elements.popper),r,l),y=Vt(t.elements.reference),w=Ce({reference:y,element:_,strategy:"absolute",placement:s}),E=Te(Object.assign({},_,w)),A=h===Ot?E:y,T={top:v.top-A.top+m.top,bottom:A.bottom-v.bottom+m.bottom,left:v.left-A.left+m.left,right:A.right-v.right+m.right},O=t.modifiersData.offset;if(h===Ot&&O){var C=O[s];Object.keys(T).forEach((function(t){var e=[_t,gt].indexOf(t)>=0?1:-1,i=[mt,gt].indexOf(t)>=0?"y":"x";T[t]+=C[i]*e}))}return T}function Le(t,e){void 0===e&&(e={});var i=e,n=i.placement,s=i.boundary,o=i.rootBoundary,r=i.padding,a=i.flipVariations,l=i.allowedAutoPlacements,c=void 0===l?Lt:l,h=ce(n),d=h?a?kt:kt.filter((function(t){return ce(t)===h})):yt,u=d.filter((function(t){return c.indexOf(t)>=0}));0===u.length&&(u=d);var f=u.reduce((function(e,i){return e[i]=ke(t,{placement:i,boundary:s,rootBoundary:o,padding:r})[Ut(i)],e}),{});return Object.keys(f).sort((function(t,e){return f[t]-f[e]}))}const xe={name:"flip",enabled:!0,phase:"main",fn:function(t){var e=t.state,i=t.options,n=t.name;if(!e.modifiersData[n]._skip){for(var s=i.mainAxis,o=void 0===s||s,r=i.altAxis,a=void 0===r||r,l=i.fallbackPlacements,c=i.padding,h=i.boundary,d=i.rootBoundary,u=i.altBoundary,f=i.flipVariations,p=void 0===f||f,m=i.allowedAutoPlacements,g=e.options.placement,_=Ut(g),b=l||(_!==g&&p?function(t){if(Ut(t)===vt)return[];var e=ge(t);return[be(t),e,be(e)]}(g):[ge(g)]),v=[g].concat(b).reduce((function(t,i){return 
t.concat(Ut(i)===vt?Le(e,{placement:i,boundary:h,rootBoundary:d,padding:c,flipVariations:p,allowedAutoPlacements:m}):i)}),[]),y=e.rects.reference,w=e.rects.popper,E=new Map,A=!0,T=v[0],O=0;O=0,D=x?"width":"height",S=ke(e,{placement:C,boundary:h,rootBoundary:d,altBoundary:u,padding:c}),N=x?L?_t:bt:L?gt:mt;y[D]>w[D]&&(N=ge(N));var I=ge(N),P=[];if(o&&P.push(S[k]<=0),a&&P.push(S[N]<=0,S[I]<=0),P.every((function(t){return t}))){T=C,A=!1;break}E.set(C,P)}if(A)for(var j=function(t){var e=v.find((function(e){var i=E.get(e);if(i)return i.slice(0,t).every((function(t){return t}))}));if(e)return T=e,"break"},M=p?3:1;M>0&&"break"!==j(M);M--);e.placement!==T&&(e.modifiersData[n]._skip=!0,e.placement=T,e.reset=!0)}},requiresIfExists:["offset"],data:{_skip:!1}};function De(t,e,i){return void 0===i&&(i={x:0,y:0}),{top:t.top-e.height-i.y,right:t.right-e.width+i.x,bottom:t.bottom-e.height+i.y,left:t.left-e.width-i.x}}function Se(t){return[mt,_t,gt,bt].some((function(e){return t[e]>=0}))}const Ne={name:"hide",enabled:!0,phase:"main",requiresIfExists:["preventOverflow"],fn:function(t){var e=t.state,i=t.name,n=e.rects.reference,s=e.rects.popper,o=e.modifiersData.preventOverflow,r=ke(e,{elementContext:"reference"}),a=ke(e,{altBoundary:!0}),l=De(r,n),c=De(a,s,o),h=Se(l),d=Se(c);e.modifiersData[i]={referenceClippingOffsets:l,popperEscapeOffsets:c,isReferenceHidden:h,hasPopperEscaped:d},e.attributes.popper=Object.assign({},e.attributes.popper,{"data-popper-reference-hidden":h,"data-popper-escaped":d})}},Ie={name:"offset",enabled:!0,phase:"main",requires:["popperOffsets"],fn:function(t){var e=t.state,i=t.options,n=t.name,s=i.offset,o=void 0===s?[0,0]:s,r=Lt.reduce((function(t,i){return t[i]=function(t,e,i){var n=Ut(t),s=[bt,mt].indexOf(n)>=0?-1:1,o="function"==typeof i?i(Object.assign({},e,{placement:t})):i,r=o[0],a=o[1];return r=r||0,a=(a||0)*s,[bt,_t].indexOf(n)>=0?{x:a,y:r}:{x:r,y:a}}(i,e.rects,o),t}),{}),a=r[e.placement],l=a.x,c=a.y;null!=e.modifiersData.popperOffsets&&(e.modifiersData.popperOffsets.x+=l,e.modifiersData.popperOffsets.y+=c),e.modifiersData[n]=r}},Pe={name:"popperOffsets",enabled:!0,phase:"read",fn:function(t){var e=t.state,i=t.name;e.modifiersData[i]=Ce({reference:e.rects.reference,element:e.rects.popper,strategy:"absolute",placement:e.placement})},data:{}},je={name:"preventOverflow",enabled:!0,phase:"main",fn:function(t){var e=t.state,i=t.options,n=t.name,s=i.mainAxis,o=void 0===s||s,r=i.altAxis,a=void 0!==r&&r,l=i.boundary,c=i.rootBoundary,h=i.altBoundary,d=i.padding,u=i.tether,f=void 0===u||u,p=i.tetherOffset,m=void 0===p?0:p,g=ke(e,{boundary:l,rootBoundary:c,padding:d,altBoundary:h}),_=Ut(e.placement),b=ce(e.placement),v=!b,y=ee(_),w="x"===y?"y":"x",E=e.modifiersData.popperOffsets,A=e.rects.reference,T=e.rects.popper,O="function"==typeof m?m(Object.assign({},e.rects,{placement:e.placement})):m,C={x:0,y:0};if(E){if(o||a){var k="y"===y?mt:bt,L="y"===y?gt:_t,x="y"===y?"height":"width",D=E[y],S=E[y]+g[k],N=E[y]-g[L],I=f?-T[x]/2:0,P=b===wt?A[x]:T[x],j=b===wt?-T[x]:-A[x],M=e.elements.arrow,H=f&&M?Kt(M):{width:0,height:0},B=e.modifiersData["arrow#persistent"]?e.modifiersData["arrow#persistent"].padding:{top:0,right:0,bottom:0,left:0},R=B[k],W=B[L],$=oe(0,A[x],H[x]),z=v?A[x]/2-I-$-R-O:P-$-R-O,q=v?-A[x]/2+I+$+W+O:j+$+W+O,F=e.elements.arrow&&te(e.elements.arrow),U=F?"y"===y?F.clientTop||0:F.clientLeft||0:0,V=e.modifiersData.offset?e.modifiersData.offset[e.placement][y]:0,K=E[y]+z-V-U,X=E[y]+q-V;if(o){var Y=oe(f?ne(S,K):S,D,f?ie(N,X):N);E[y]=Y,C[y]=Y-D}if(a){var 
Q="x"===y?mt:bt,G="x"===y?gt:_t,Z=E[w],J=Z+g[Q],tt=Z-g[G],et=oe(f?ne(J,K):J,Z,f?ie(tt,X):tt);E[w]=et,C[w]=et-Z}}e.modifiersData[n]=C}},requiresIfExists:["offset"]};function Me(t,e,i){void 0===i&&(i=!1);var n=zt(e);zt(e)&&function(t){var e=t.getBoundingClientRect();e.width,t.offsetWidth,e.height,t.offsetHeight}(e);var s,o,r=Gt(e),a=Vt(t),l={scrollLeft:0,scrollTop:0},c={x:0,y:0};return(n||!n&&!i)&&(("body"!==Rt(e)||we(r))&&(l=(s=e)!==Wt(s)&&zt(s)?{scrollLeft:(o=s).scrollLeft,scrollTop:o.scrollTop}:ve(s)),zt(e)?((c=Vt(e)).x+=e.clientLeft,c.y+=e.clientTop):r&&(c.x=ye(r))),{x:a.left+l.scrollLeft-c.x,y:a.top+l.scrollTop-c.y,width:a.width,height:a.height}}function He(t){var e=new Map,i=new Set,n=[];function s(t){i.add(t.name),[].concat(t.requires||[],t.requiresIfExists||[]).forEach((function(t){if(!i.has(t)){var n=e.get(t);n&&s(n)}})),n.push(t)}return t.forEach((function(t){e.set(t.name,t)})),t.forEach((function(t){i.has(t.name)||s(t)})),n}var Be={placement:"bottom",modifiers:[],strategy:"absolute"};function Re(){for(var t=arguments.length,e=new Array(t),i=0;ij.on(t,"mouseover",d))),this._element.focus(),this._element.setAttribute("aria-expanded",!0),this._menu.classList.add(Je),this._element.classList.add(Je),j.trigger(this._element,"shown.bs.dropdown",t)}hide(){if(c(this._element)||!this._isShown(this._menu))return;const t={relatedTarget:this._element};this._completeHide(t)}dispose(){this._popper&&this._popper.destroy(),super.dispose()}update(){this._inNavbar=this._detectNavbar(),this._popper&&this._popper.update()}_completeHide(t){j.trigger(this._element,"hide.bs.dropdown",t).defaultPrevented||("ontouchstart"in document.documentElement&&[].concat(...document.body.children).forEach((t=>j.off(t,"mouseover",d))),this._popper&&this._popper.destroy(),this._menu.classList.remove(Je),this._element.classList.remove(Je),this._element.setAttribute("aria-expanded","false"),U.removeDataAttribute(this._menu,"popper"),j.trigger(this._element,"hidden.bs.dropdown",t))}_getConfig(t){if(t={...this.constructor.Default,...U.getDataAttributes(this._element),...t},a(Ue,t,this.constructor.DefaultType),"object"==typeof t.reference&&!o(t.reference)&&"function"!=typeof t.reference.getBoundingClientRect)throw new TypeError(`${Ue.toUpperCase()}: Option "reference" provided type "object" without a required "getBoundingClientRect" method.`);return t}_createPopper(t){if(void 0===Fe)throw new TypeError("Bootstrap's dropdowns require Popper (https://popper.js.org)");let e=this._element;"parent"===this._config.reference?e=t:o(this._config.reference)?e=r(this._config.reference):"object"==typeof this._config.reference&&(e=this._config.reference);const i=this._getPopperConfig(),n=i.modifiers.find((t=>"applyStyles"===t.name&&!1===t.enabled));this._popper=qe(e,this._menu,i),n&&U.setDataAttribute(this._menu,"popper","static")}_isShown(t=this._element){return t.classList.contains(Je)}_getMenuElement(){return V.next(this._element,ei)[0]}_getPlacement(){const t=this._element.parentNode;if(t.classList.contains("dropend"))return ri;if(t.classList.contains("dropstart"))return ai;const e="end"===getComputedStyle(this._menu).getPropertyValue("--bs-position").trim();return t.classList.contains("dropup")?e?ni:ii:e?oi:si}_detectNavbar(){return null!==this._element.closest(".navbar")}_getOffset(){const{offset:t}=this._config;return"string"==typeof t?t.split(",").map((t=>Number.parseInt(t,10))):"function"==typeof t?e=>t(e,this._element):t}_getPopperConfig(){const 
t={placement:this._getPlacement(),modifiers:[{name:"preventOverflow",options:{boundary:this._config.boundary}},{name:"offset",options:{offset:this._getOffset()}}]};return"static"===this._config.display&&(t.modifiers=[{name:"applyStyles",enabled:!1}]),{...t,..."function"==typeof this._config.popperConfig?this._config.popperConfig(t):this._config.popperConfig}}_selectMenuItem({key:t,target:e}){const i=V.find(".dropdown-menu .dropdown-item:not(.disabled):not(:disabled)",this._menu).filter(l);i.length&&v(i,e,t===Ye,!i.includes(e)).focus()}static jQueryInterface(t){return this.each((function(){const e=hi.getOrCreateInstance(this,t);if("string"==typeof t){if(void 0===e[t])throw new TypeError(`No method named "${t}"`);e[t]()}}))}static clearMenus(t){if(t&&(2===t.button||"keyup"===t.type&&"Tab"!==t.key))return;const e=V.find(ti);for(let i=0,n=e.length;ie+t)),this._setElementAttributes(di,"paddingRight",(e=>e+t)),this._setElementAttributes(ui,"marginRight",(e=>e-t))}_disableOverFlow(){this._saveInitialAttribute(this._element,"overflow"),this._element.style.overflow="hidden"}_setElementAttributes(t,e,i){const n=this.getWidth();this._applyManipulationCallback(t,(t=>{if(t!==this._element&&window.innerWidth>t.clientWidth+n)return;this._saveInitialAttribute(t,e);const s=window.getComputedStyle(t)[e];t.style[e]=`${i(Number.parseFloat(s))}px`}))}reset(){this._resetElementAttributes(this._element,"overflow"),this._resetElementAttributes(this._element,"paddingRight"),this._resetElementAttributes(di,"paddingRight"),this._resetElementAttributes(ui,"marginRight")}_saveInitialAttribute(t,e){const i=t.style[e];i&&U.setDataAttribute(t,e,i)}_resetElementAttributes(t,e){this._applyManipulationCallback(t,(t=>{const i=U.getDataAttribute(t,e);void 0===i?t.style.removeProperty(e):(U.removeDataAttribute(t,e),t.style[e]=i)}))}_applyManipulationCallback(t,e){o(t)?e(t):V.find(t,this._element).forEach(e)}isOverflowing(){return this.getWidth()>0}}const pi={className:"modal-backdrop",isVisible:!0,isAnimated:!1,rootElement:"body",clickCallback:null},mi={className:"string",isVisible:"boolean",isAnimated:"boolean",rootElement:"(element|string)",clickCallback:"(function|null)"},gi="show",_i="mousedown.bs.backdrop";class bi{constructor(t){this._config=this._getConfig(t),this._isAppended=!1,this._element=null}show(t){this._config.isVisible?(this._append(),this._config.isAnimated&&u(this._getElement()),this._getElement().classList.add(gi),this._emulateAnimation((()=>{_(t)}))):_(t)}hide(t){this._config.isVisible?(this._getElement().classList.remove(gi),this._emulateAnimation((()=>{this.dispose(),_(t)}))):_(t)}_getElement(){if(!this._element){const t=document.createElement("div");t.className=this._config.className,this._config.isAnimated&&t.classList.add("fade"),this._element=t}return this._element}_getConfig(t){return(t={...pi,..."object"==typeof t?t:{}}).rootElement=r(t.rootElement),a("backdrop",t,mi),t}_append(){this._isAppended||(this._config.rootElement.append(this._getElement()),j.on(this._getElement(),_i,(()=>{_(this._config.clickCallback)})),this._isAppended=!0)}dispose(){this._isAppended&&(j.off(this._element,_i),this._element.remove(),this._isAppended=!1)}_emulateAnimation(t){b(t,this._getElement(),this._config.isAnimated)}}const vi={trapElement:null,autofocus:!0},yi={trapElement:"element",autofocus:"boolean"},wi=".bs.focustrap",Ei="backward";class 
[… remainder of the single-line minified bootstrap.bundle.min.js payload (vendored Bootstrap 5.1.3 bundle) omitted …]
+//# sourceMappingURL=bootstrap.bundle.min.js.map
\ No newline at end of file
diff --git a/docs/dev/deps/bootstrap-5.1.3/bootstrap.bundle.min.js.map b/docs/dev/deps/bootstrap-5.1.3/bootstrap.bundle.min.js.map
new file mode 100644
index 00000000..7d78e32a
--- /dev/null
+++ b/docs/dev/deps/bootstrap-5.1.3/bootstrap.bundle.min.js.map
@@ -0,0 +1 @@
Listeners.js","../../node_modules/@popperjs/core/lib/utils/getOppositePlacement.js","../../node_modules/@popperjs/core/lib/utils/getOppositeVariationPlacement.js","../../node_modules/@popperjs/core/lib/dom-utils/getWindowScroll.js","../../node_modules/@popperjs/core/lib/dom-utils/getWindowScrollBarX.js","../../node_modules/@popperjs/core/lib/dom-utils/isScrollParent.js","../../node_modules/@popperjs/core/lib/dom-utils/getScrollParent.js","../../node_modules/@popperjs/core/lib/dom-utils/listScrollParents.js","../../node_modules/@popperjs/core/lib/utils/rectToClientRect.js","../../node_modules/@popperjs/core/lib/dom-utils/getClippingRect.js","../../node_modules/@popperjs/core/lib/dom-utils/getViewportRect.js","../../node_modules/@popperjs/core/lib/dom-utils/getDocumentRect.js","../../node_modules/@popperjs/core/lib/utils/computeOffsets.js","../../node_modules/@popperjs/core/lib/utils/detectOverflow.js","../../node_modules/@popperjs/core/lib/utils/computeAutoPlacement.js","../../node_modules/@popperjs/core/lib/modifiers/flip.js","../../node_modules/@popperjs/core/lib/modifiers/hide.js","../../node_modules/@popperjs/core/lib/modifiers/offset.js","../../node_modules/@popperjs/core/lib/modifiers/popperOffsets.js","../../node_modules/@popperjs/core/lib/modifiers/preventOverflow.js","../../node_modules/@popperjs/core/lib/utils/getAltAxis.js","../../node_modules/@popperjs/core/lib/dom-utils/getCompositeRect.js","../../node_modules/@popperjs/core/lib/dom-utils/getNodeScroll.js","../../node_modules/@popperjs/core/lib/dom-utils/getHTMLElementScroll.js","../../node_modules/@popperjs/core/lib/utils/orderModifiers.js","../../node_modules/@popperjs/core/lib/createPopper.js","../../node_modules/@popperjs/core/lib/utils/debounce.js","../../node_modules/@popperjs/core/lib/utils/mergeByName.js","../../node_modules/@popperjs/core/lib/popper-lite.js","../../node_modules/@popperjs/core/lib/popper.js","../../js/src/dropdown.js","../../js/src/util/scrollbar.js","../../js/src/util/backdrop.js","../../js/src/util/focustrap.js","../../js/src/modal.js","../../js/src/offcanvas.js","../../js/src/util/sanitizer.js","../../js/src/tooltip.js","../../js/src/popover.js","../../js/src/scrollspy.js","../../js/src/tab.js","../../js/src/toast.js","../../js/index.umd.js"],"names":["TRANSITION_END","getSelector","element","selector","getAttribute","hrefAttr","includes","startsWith","split","trim","getSelectorFromElement","document","querySelector","getElementFromSelector","triggerTransitionEnd","dispatchEvent","Event","isElement","obj","jquery","nodeType","getElement","length","typeCheckConfig","componentName","config","configTypes","Object","keys","forEach","property","expectedTypes","value","valueType","toString","call","match","toLowerCase","RegExp","test","TypeError","toUpperCase","isVisible","getClientRects","getComputedStyle","getPropertyValue","isDisabled","Node","ELEMENT_NODE","classList","contains","disabled","hasAttribute","findShadowRoot","documentElement","attachShadow","getRootNode","root","ShadowRoot","parentNode","noop","reflow","offsetHeight","getjQuery","jQuery","window","body","DOMContentLoadedCallbacks","isRTL","dir","defineJQueryPlugin","plugin","callback","$","name","NAME","JQUERY_NO_CONFLICT","fn","jQueryInterface","Constructor","noConflict","readyState","addEventListener","push","execute","executeAfterTransition","transitionElement","waitForTransition","emulatedDuration","transitionDuration","transitionDelay","floatTransitionDuration","Number","parseFloat","floatTransitionDelay","getTransitionDurationFromElem
ent","called","handler","target","removeEventListener","setTimeout","getNextActiveElement","list","activeElement","shouldGetNext","isCycleAllowed","index","indexOf","listLength","Math","max","min","namespaceRegex","stripNameRegex","stripUidRegex","eventRegistry","uidEvent","customEvents","mouseenter","mouseleave","customEventsRegex","nativeEvents","Set","getUidEvent","uid","getEvent","findHandler","events","delegationSelector","uidEventList","i","len","event","originalHandler","normalizeParams","originalTypeEvent","delegationFn","delegation","typeEvent","getTypeEvent","has","addHandler","oneOff","wrapFn","relatedTarget","delegateTarget","this","handlers","previousFn","replace","domElements","querySelectorAll","EventHandler","off","type","apply","bootstrapDelegationHandler","bootstrapHandler","removeHandler","Boolean","on","one","inNamespace","isNamespace","elementEvent","namespace","storeElementEvent","handlerKey","removeNamespacedHandlers","slice","keyHandlers","trigger","args","isNative","jQueryEvent","bubbles","nativeDispatch","defaultPrevented","evt","isPropagationStopped","isImmediatePropagationStopped","isDefaultPrevented","createEvent","initEvent","CustomEvent","cancelable","key","defineProperty","get","preventDefault","elementMap","Map","Data","set","instance","instanceMap","size","console","error","Array","from","remove","delete","BaseComponent","constructor","_element","DATA_KEY","dispose","EVENT_KEY","getOwnPropertyNames","propertyName","_queueCallback","isAnimated","static","getInstance","VERSION","Error","enableDismissTrigger","component","method","clickEvent","tagName","closest","getOrCreateInstance","Alert","close","_destroyElement","each","data","undefined","SELECTOR_DATA_TOGGLE","Button","toggle","setAttribute","normalizeData","val","normalizeDataKey","chr","button","Manipulator","setDataAttribute","removeDataAttribute","removeAttribute","getDataAttributes","attributes","dataset","filter","pureKey","charAt","getDataAttribute","offset","rect","getBoundingClientRect","top","pageYOffset","left","pageXOffset","position","offsetTop","offsetLeft","SelectorEngine","find","concat","Element","prototype","findOne","children","child","matches","parents","ancestor","prev","previous","previousElementSibling","next","nextElementSibling","focusableChildren","focusables","map","join","el","Default","interval","keyboard","slide","pause","wrap","touch","DefaultType","ORDER_NEXT","ORDER_PREV","DIRECTION_LEFT","DIRECTION_RIGHT","KEY_TO_DIRECTION","ArrowLeft","ArrowRight","EVENT_SLID","CLASS_NAME_ACTIVE","SELECTOR_ACTIVE_ITEM","Carousel","super","_items","_interval","_activeElement","_isPaused","_isSliding","touchTimeout","touchStartX","touchDeltaX","_config","_getConfig","_indicatorsElement","_touchSupported","navigator","maxTouchPoints","_pointerEvent","PointerEvent","_addEventListeners","_slide","nextWhenVisible","hidden","cycle","clearInterval","_updateInterval","setInterval","visibilityState","bind","to","activeIndex","_getItemIndex","order","_handleSwipe","absDeltax","abs","direction","_keydown","_addTouchEventListeners","hasPointerPenTouch","pointerType","start","clientX","touches","move","end","clearTimeout","itemImg","add","_getItemByOrder","isNext","_triggerSlideEvent","eventDirectionName","targetIndex","fromIndex","_setActiveIndicatorElement","activeIndicator","indicators","parseInt","elementInterval","defaultInterval","directionOrOrder","_directionToOrder","activeElementIndex","nextElement","nextElementIndex","isCycling","directionalClassName","orderClassName","_orderToDirection","t
riggerSlidEvent","completeCallBack","action","ride","carouselInterface","slideIndex","dataApiClickHandler","carousels","parent","CLASS_NAME_SHOW","CLASS_NAME_COLLAPSE","CLASS_NAME_COLLAPSING","CLASS_NAME_COLLAPSED","CLASS_NAME_DEEPER_CHILDREN","Collapse","_isTransitioning","_triggerArray","toggleList","elem","filterElement","foundElem","_selector","_initializeChildren","_addAriaAndCollapsedClass","_isShown","hide","show","activesData","actives","container","tempActiveData","elemActive","dimension","_getDimension","style","scrollSize","triggerArrayLength","selected","triggerArray","isOpen","bottom","right","auto","basePlacements","clippingParents","viewport","popper","reference","variationPlacements","reduce","acc","placement","placements","beforeRead","read","afterRead","beforeMain","main","afterMain","beforeWrite","write","afterWrite","modifierPhases","getNodeName","nodeName","getWindow","node","ownerDocument","defaultView","isHTMLElement","HTMLElement","isShadowRoot","applyStyles$1","enabled","phase","_ref","state","elements","styles","assign","effect","_ref2","initialStyles","options","strategy","margin","arrow","hasOwnProperty","attribute","requires","getBasePlacement","includeScale","width","height","x","y","getLayoutRect","clientRect","offsetWidth","rootNode","isSameNode","host","isTableElement","getDocumentElement","getParentNode","assignedSlot","getTrueOffsetParent","offsetParent","getOffsetParent","isFirefox","userAgent","currentNode","css","transform","perspective","contain","willChange","getContainingBlock","getMainAxisFromPlacement","round","within","mathMax","mathMin","mergePaddingObject","paddingObject","expandToHashMap","hashMap","arrow$1","_state$modifiersData$","arrowElement","popperOffsets","modifiersData","basePlacement","axis","padding","rects","toPaddingObject","arrowRect","minProp","maxProp","endDiff","startDiff","arrowOffsetParent","clientSize","clientHeight","clientWidth","centerToReference","center","axisProp","centerOffset","_options$element","requiresIfExists","getVariation","unsetSides","mapToStyles","_Object$assign2","popperRect","variation","offsets","gpuAcceleration","adaptive","roundOffsets","_ref3","dpr","devicePixelRatio","roundOffsetsByDPR","_ref3$x","_ref3$y","hasX","hasY","sideX","sideY","win","heightProp","widthProp","_Object$assign","commonStyles","computeStyles$1","_ref4","_options$gpuAccelerat","_options$adaptive","_options$roundOffsets","passive","eventListeners","_options$scroll","scroll","_options$resize","resize","scrollParents","scrollParent","update","hash","getOppositePlacement","matched","getOppositeVariationPlacement","getWindowScroll","scrollLeft","scrollTop","getWindowScrollBarX","isScrollParent","_getComputedStyle","overflow","overflowX","overflowY","getScrollParent","listScrollParents","_element$ownerDocumen","isBody","visualViewport","updatedList","rectToClientRect","getClientRectFromMixedType","clippingParent","html","getViewportRect","clientTop","clientLeft","getInnerBoundingClientRect","winScroll","scrollWidth","scrollHeight","getDocumentRect","computeOffsets","commonX","commonY","mainAxis","detectOverflow","_options","_options$placement","_options$boundary","boundary","_options$rootBoundary","rootBoundary","_options$elementConte","elementContext","_options$altBoundary","altBoundary","_options$padding","altContext","clippingClientRect","mainClippingParents","clipperElement","getClippingParents","firstClippingParent","clippingRect","accRect","getClippingRect","contextElement","referenceClientRect","popperClientRect","elementClientRect
","overflowOffsets","offsetData","multiply","computeAutoPlacement","flipVariations","_options$allowedAutoP","allowedAutoPlacements","allPlacements","allowedPlacements","overflows","sort","a","b","flip$1","_skip","_options$mainAxis","checkMainAxis","_options$altAxis","altAxis","checkAltAxis","specifiedFallbackPlacements","fallbackPlacements","_options$flipVariatio","preferredPlacement","oppositePlacement","getExpandedFallbackPlacements","referenceRect","checksMap","makeFallbackChecks","firstFittingPlacement","_basePlacement","isStartVariation","isVertical","mainVariationSide","altVariationSide","checks","every","check","_loop","_i","fittingPlacement","reset","getSideOffsets","preventedOffsets","isAnySideFullyClipped","some","side","hide$1","preventOverflow","referenceOverflow","popperAltOverflow","referenceClippingOffsets","popperEscapeOffsets","isReferenceHidden","hasPopperEscaped","offset$1","_options$offset","invertDistance","skidding","distance","distanceAndSkiddingToXY","_data$state$placement","popperOffsets$1","preventOverflow$1","_options$tether","tether","_options$tetherOffset","tetherOffset","isBasePlacement","tetherOffsetValue","mainSide","altSide","additive","minLen","maxLen","arrowPaddingObject","arrowPaddingMin","arrowPaddingMax","arrowLen","minOffset","maxOffset","clientOffset","offsetModifierValue","tetherMin","tetherMax","preventedOffset","_mainSide","_altSide","_offset","_min","_max","_preventedOffset","getCompositeRect","elementOrVirtualElement","isFixed","isOffsetParentAnElement","isElementScaled","modifiers","visited","result","modifier","dep","depModifier","DEFAULT_OPTIONS","areValidElements","_len","arguments","_key","popperGenerator","generatorOptions","_generatorOptions","_generatorOptions$def","defaultModifiers","_generatorOptions$def2","defaultOptions","pending","orderedModifiers","effectCleanupFns","isDestroyed","setOptions","setOptionsAction","cleanupModifierEffects","merged","orderModifiers","current","existing","m","_ref3$options","cleanupFn","forceUpdate","_state$elements","_state$orderedModifie","_state$orderedModifie2","Promise","resolve","then","destroy","onFirstUpdate","createPopper","computeStyles","applyStyles","flip","ESCAPE_KEY","SPACE_KEY","ARROW_UP_KEY","ARROW_DOWN_KEY","REGEXP_KEYDOWN","EVENT_CLICK_DATA_API","EVENT_KEYDOWN_DATA_API","SELECTOR_MENU","PLACEMENT_TOP","PLACEMENT_TOPEND","PLACEMENT_BOTTOM","PLACEMENT_BOTTOMEND","PLACEMENT_RIGHT","PLACEMENT_LEFT","display","popperConfig","autoClose","Dropdown","_popper","_menu","_getMenuElement","_inNavbar","_detectNavbar","getParentFromElement","_createPopper","focus","_completeHide","Popper","referenceElement","_getPopperConfig","isDisplayStatic","_getPlacement","parentDropdown","isEnd","_getOffset","popperData","defaultBsPopperConfig","_selectMenuItem","items","toggles","context","composedPath","isMenuTarget","isActive","stopPropagation","getToggleButton","clearMenus","dataApiKeydownHandler","SELECTOR_FIXED_CONTENT","SELECTOR_STICKY_CONTENT","ScrollBarHelper","getWidth","documentWidth","innerWidth","_disableOverFlow","_setElementAttributes","calculatedValue","_saveInitialAttribute","styleProp","scrollbarWidth","_applyManipulationCallback","_resetElementAttributes","actualValue","removeProperty","callBack","isOverflowing","className","rootElement","clickCallback","EVENT_MOUSEDOWN","Backdrop","_isAppended","_append","_getElement","_emulateAnimation","backdrop","createElement","append","trapElement","autofocus","TAB_NAV_BACKWARD","FocusTrap","_isActive","_lastTabNavDirection","activate","_handleFocusin","_
handleKeydown","deactivate","shiftKey","EVENT_HIDDEN","EVENT_SHOW","EVENT_RESIZE","EVENT_CLICK_DISMISS","EVENT_KEYDOWN_DISMISS","EVENT_MOUSEDOWN_DISMISS","CLASS_NAME_OPEN","CLASS_NAME_STATIC","Modal","_dialog","_backdrop","_initializeBackDrop","_focustrap","_initializeFocusTrap","_ignoreBackdropClick","_scrollBar","_isAnimated","_adjustDialog","_setEscapeEvent","_setResizeEvent","_showBackdrop","_showElement","_hideModal","htmlElement","handleUpdate","modalBody","_triggerBackdropTransition","_resetAdjustments","currentTarget","isModalOverflowing","isBodyOverflowing","paddingLeft","paddingRight","showEvent","allReadyOpen","OPEN_SELECTOR","Offcanvas","visibility","blur","uriAttributes","SAFE_URL_PATTERN","DATA_URL_PATTERN","allowedAttribute","allowedAttributeList","attributeName","nodeValue","regExp","attributeRegex","sanitizeHtml","unsafeHtml","allowList","sanitizeFn","createdDocument","DOMParser","parseFromString","elementName","attributeList","allowedAttributes","innerHTML","DISALLOWED_ATTRIBUTES","animation","template","title","delay","customClass","sanitize","AttachmentMap","AUTO","TOP","RIGHT","BOTTOM","LEFT","area","br","col","code","div","em","hr","h1","h2","h3","h4","h5","h6","img","li","ol","p","pre","s","small","span","sub","sup","strong","u","ul","HIDE","HIDDEN","SHOW","SHOWN","INSERTED","CLICK","FOCUSIN","FOCUSOUT","MOUSEENTER","MOUSELEAVE","CLASS_NAME_FADE","HOVER_STATE_SHOW","HOVER_STATE_OUT","SELECTOR_TOOLTIP_INNER","SELECTOR_MODAL","EVENT_MODAL_HIDE","TRIGGER_HOVER","TRIGGER_FOCUS","Tooltip","_isEnabled","_timeout","_hoverState","_activeTrigger","tip","_setListeners","enable","disable","toggleEnabled","_initializeOnDelegatedTarget","click","_isWithActiveTrigger","_enter","_leave","getTipElement","_hideModalHandler","_disposePopper","isWithContent","shadowRoot","isInTheDom","getTitle","tipId","prefix","floor","random","getElementById","getUID","attachment","_getAttachment","_addAttachmentClass","_resolvePossibleFunction","prevHoverState","_cleanTipClass","setContent","_sanitizeAndSetContent","content","templateElement","setElementContent","textContent","updateAttachment","_getDelegateConfig","_handlePopperPlacementChange","_getBasicClassPrefix","eventIn","eventOut","_fixTitle","originalTitleType","dataAttributes","dataAttr","basicClassPrefixRegex","tabClass","token","tClass","Popover","_getContent","SELECTOR_LINK_ITEMS","METHOD_POSITION","ScrollSpy","_scrollElement","_offsets","_targets","_activeTarget","_scrollHeight","_process","refresh","autoMethod","offsetMethod","offsetBase","_getScrollTop","_getScrollHeight","targetSelector","targetBCR","item","_getOffsetHeight","innerHeight","maxScroll","_activate","_clear","queries","link","listGroup","navItem","spy","SELECTOR_ACTIVE","SELECTOR_ACTIVE_UL","Tab","listElement","itemSelector","hideEvent","complete","active","isTransitioning","_transitionComplete","dropdownChild","dropdownElement","dropdown","CLASS_NAME_HIDE","CLASS_NAME_SHOWING","autohide","Toast","_hasMouseInteraction","_hasKeyboardInteraction","_clearTimeout","_maybeScheduleHide","_onInteraction","isInteracting"],"mappings":";;;;;0OAOA,MAEMA,EAAiB,gBAyBjBC,EAAcC,IAClB,IAAIC,EAAWD,EAAQE,aAAa,kBAEpC,IAAKD,GAAyB,MAAbA,EAAkB,CACjC,IAAIE,EAAWH,EAAQE,aAAa,QAMpC,IAAKC,IAAcA,EAASC,SAAS,OAASD,EAASE,WAAW,KAChE,OAAO,KAILF,EAASC,SAAS,OAASD,EAASE,WAAW,OACjDF,EAAY,IAAGA,EAASG,MAAM,KAAK,MAGrCL,EAAWE,GAAyB,MAAbA,EAAmBA,EAASI,OAAS,KAG9D,OAAON,GAGHO,EAAyBR,IAC7B,MAAMC,EAAWF,EAAYC,GAE7B,OAAIC,GACKQ,SAASC,cAAcT,GAAYA,EAGrC,MAGHU,EAAyBX,IAC7B,MAAMC,EAAWF,EAAYC,GAE7B,OAAOC,EAAWQ,SAASC,cAAcT,
GAAY,MA0BjDW,EAAuBZ,IAC3BA,EAAQa,cAAc,IAAIC,MAAMhB,KAG5BiB,EAAYC,MACXA,GAAsB,iBAARA,UAIO,IAAfA,EAAIC,SACbD,EAAMA,EAAI,SAGmB,IAAjBA,EAAIE,UAGdC,EAAaH,GACbD,EAAUC,GACLA,EAAIC,OAASD,EAAI,GAAKA,EAGZ,iBAARA,GAAoBA,EAAII,OAAS,EACnCX,SAASC,cAAcM,GAGzB,KAGHK,EAAkB,CAACC,EAAeC,EAAQC,KAC9CC,OAAOC,KAAKF,GAAaG,SAAQC,IAC/B,MAAMC,EAAgBL,EAAYI,GAC5BE,EAAQP,EAAOK,GACfG,EAAYD,GAASf,EAAUe,GAAS,UArH5Cd,OADSA,EAsHsDc,GApHzD,GAAEd,IAGL,GAAGgB,SAASC,KAAKjB,GAAKkB,MAAM,eAAe,GAAGC,cALxCnB,IAAAA,EAwHX,IAAK,IAAIoB,OAAOP,GAAeQ,KAAKN,GAClC,MAAM,IAAIO,UACP,GAAEhB,EAAciB,0BAA0BX,qBAA4BG,yBAAiCF,WAM1GW,EAAYxC,MACXe,EAAUf,IAAgD,IAApCA,EAAQyC,iBAAiBrB,SAIgB,YAA7DsB,iBAAiB1C,GAAS2C,iBAAiB,cAG9CC,EAAa5C,IACZA,GAAWA,EAAQkB,WAAa2B,KAAKC,gBAItC9C,EAAQ+C,UAAUC,SAAS,mBAIC,IAArBhD,EAAQiD,SACVjD,EAAQiD,SAGVjD,EAAQkD,aAAa,aAAoD,UAArClD,EAAQE,aAAa,aAG5DiD,EAAiBnD,IACrB,IAAKS,SAAS2C,gBAAgBC,aAC5B,OAAO,KAIT,GAAmC,mBAAxBrD,EAAQsD,YAA4B,CAC7C,MAAMC,EAAOvD,EAAQsD,cACrB,OAAOC,aAAgBC,WAAaD,EAAO,KAG7C,OAAIvD,aAAmBwD,WACdxD,EAIJA,EAAQyD,WAINN,EAAenD,EAAQyD,YAHrB,MAMLC,EAAO,OAUPC,EAAS3D,IAEbA,EAAQ4D,cAGJC,EAAY,KAChB,MAAMC,OAAEA,GAAWC,OAEnB,OAAID,IAAWrD,SAASuD,KAAKd,aAAa,qBACjCY,EAGF,MAGHG,EAA4B,GAiB5BC,EAAQ,IAAuC,QAAjCzD,SAAS2C,gBAAgBe,IAEvCC,EAAqBC,IAjBAC,IAAAA,EAAAA,EAkBN,KACjB,MAAMC,EAAIV,IAEV,GAAIU,EAAG,CACL,MAAMC,EAAOH,EAAOI,KACdC,EAAqBH,EAAEI,GAAGH,GAChCD,EAAEI,GAAGH,GAAQH,EAAOO,gBACpBL,EAAEI,GAAGH,GAAMK,YAAcR,EACzBE,EAAEI,GAAGH,GAAMM,WAAa,KACtBP,EAAEI,GAAGH,GAAQE,EACNL,EAAOO,mBA3BQ,YAAxBnE,SAASsE,YAENd,EAA0B7C,QAC7BX,SAASuE,iBAAiB,oBAAoB,KAC5Cf,EAA0BtC,SAAQ2C,GAAYA,SAIlDL,EAA0BgB,KAAKX,IAE/BA,KAuBEY,EAAUZ,IACU,mBAAbA,GACTA,KAIEa,EAAyB,CAACb,EAAUc,EAAmBC,GAAoB,KAC/E,IAAKA,EAEH,YADAH,EAAQZ,GAIV,MACMgB,EA1LiCtF,CAAAA,IACvC,IAAKA,EACH,OAAO,EAIT,IAAIuF,mBAAEA,EAAFC,gBAAsBA,GAAoBzB,OAAOrB,iBAAiB1C,GAEtE,MAAMyF,EAA0BC,OAAOC,WAAWJ,GAC5CK,EAAuBF,OAAOC,WAAWH,GAG/C,OAAKC,GAA4BG,GAKjCL,EAAqBA,EAAmBjF,MAAM,KAAK,GACnDkF,EAAkBA,EAAgBlF,MAAM,KAAK,GArFf,KAuFtBoF,OAAOC,WAAWJ,GAAsBG,OAAOC,WAAWH,KAPzD,GA6KgBK,CAAiCT,GADlC,EAGxB,IAAIU,GAAS,EAEb,MAAMC,EAAU,EAAGC,OAAAA,MACbA,IAAWZ,IAIfU,GAAS,EACTV,EAAkBa,oBAAoBnG,EAAgBiG,GACtDb,EAAQZ,KAGVc,EAAkBJ,iBAAiBlF,EAAgBiG,GACnDG,YAAW,KACJJ,GACHlF,EAAqBwE,KAEtBE,IAYCa,EAAuB,CAACC,EAAMC,EAAeC,EAAeC,KAChE,IAAIC,EAAQJ,EAAKK,QAAQJ,GAGzB,IAAe,IAAXG,EACF,OAAOJ,GAAME,GAAiBC,EAAiBH,EAAKhF,OAAS,EAAI,GAGnE,MAAMsF,EAAaN,EAAKhF,OAQxB,OANAoF,GAASF,EAAgB,GAAK,EAE1BC,IACFC,GAASA,EAAQE,GAAcA,GAG1BN,EAAKO,KAAKC,IAAI,EAAGD,KAAKE,IAAIL,EAAOE,EAAa,MCrSjDI,EAAiB,qBACjBC,EAAiB,OACjBC,EAAgB,SAChBC,EAAgB,GACtB,IAAIC,EAAW,EACf,MAAMC,EAAe,CACnBC,WAAY,YACZC,WAAY,YAERC,EAAoB,4BACpBC,EAAe,IAAIC,IAAI,CAC3B,QACA,WACA,UACA,YACA,cACA,aACA,iBACA,YACA,WACA,YACA,cACA,YACA,UACA,WACA,QACA,oBACA,aACA,YACA,WACA,cACA,cACA,cACA,YACA,eACA,gBACA,eACA,gBACA,aACA,QACA,OACA,SACA,QACA,SACA,SACA,UACA,WACA,OACA,SACA,eACA,SACA,OACA,mBACA,mBACA,QACA,QACA,WASF,SAASC,EAAYzH,EAAS0H,GAC5B,OAAQA,GAAQ,GAAEA,MAAQR,OAAiBlH,EAAQkH,UAAYA,IAGjE,SAASS,EAAS3H,GAChB,MAAM0H,EAAMD,EAAYzH,GAKxB,OAHAA,EAAQkH,SAAWQ,EACnBT,EAAcS,GAAOT,EAAcS,IAAQ,GAEpCT,EAAcS,GAsCvB,SAASE,EAAYC,EAAQ9B,EAAS+B,EAAqB,MACzD,MAAMC,EAAetG,OAAOC,KAAKmG,GAEjC,IAAK,IAAIG,EAAI,EAAGC,EAAMF,EAAa3G,OAAQ4G,EAAIC,EAAKD,IAAK,CACvD,MAAME,EAAQL,EAAOE,EAAaC,IAElC,GAAIE,EAAMC,kBAAoBpC,GAAWmC,EAAMJ,qBAAuBA,EACpE,OAAOI,EAIX,OAAO,KAGT,SAASE,EAAgBC,EAAmBtC,EAASuC,GACnD,MAAMC,EAAgC,iBAAZxC,EACpBoC,EAAkBI,EAAaD,EAAevC,EAEpD,IAAIyC,EAAYC,EAAaJ,GAO7B,OANiBd,EAAamB,IAAIF,KAGhCA,EAAYH,GAGP,CAACE,EAAYJ,EAAiBK,GAGvC,SAASG,EAAW3I,EAASqI,EAAmBtC,EAASuC,EAAcM,GACrE,GAAiC,iBAAtBP,IAAmCrI,EAC5C,OAUF,GAPK+F,IACHA,EAAUuC,EACVA,EAAe,MAKbhB,EAAkBj
F,KAAKgG,GAAoB,CAC7C,MAAMQ,EAASlE,GACN,SAAUuD,GACf,IAAKA,EAAMY,eAAkBZ,EAAMY,gBAAkBZ,EAAMa,iBAAmBb,EAAMa,eAAe/F,SAASkF,EAAMY,eAChH,OAAOnE,EAAG1C,KAAK+G,KAAMd,IAKvBI,EACFA,EAAeO,EAAOP,GAEtBvC,EAAU8C,EAAO9C,GAIrB,MAAOwC,EAAYJ,EAAiBK,GAAaJ,EAAgBC,EAAmBtC,EAASuC,GACvFT,EAASF,EAAS3H,GAClBiJ,EAAWpB,EAAOW,KAAeX,EAAOW,GAAa,IACrDU,EAAatB,EAAYqB,EAAUd,EAAiBI,EAAaxC,EAAU,MAEjF,GAAImD,EAGF,YAFAA,EAAWN,OAASM,EAAWN,QAAUA,GAK3C,MAAMlB,EAAMD,EAAYU,EAAiBE,EAAkBc,QAAQrC,EAAgB,KAC7EnC,EAAK4D,EA3Fb,SAAoCvI,EAASC,EAAU0E,GACrD,OAAO,SAASoB,EAAQmC,GACtB,MAAMkB,EAAcpJ,EAAQqJ,iBAAiBpJ,GAE7C,IAAK,IAAI+F,OAAEA,GAAWkC,EAAOlC,GAAUA,IAAWgD,KAAMhD,EAASA,EAAOvC,WACtE,IAAK,IAAIuE,EAAIoB,EAAYhI,OAAQ4G,KAC/B,GAAIoB,EAAYpB,KAAOhC,EAOrB,OANAkC,EAAMa,eAAiB/C,EAEnBD,EAAQ6C,QACVU,EAAaC,IAAIvJ,EAASkI,EAAMsB,KAAMvJ,EAAU0E,GAG3CA,EAAG8E,MAAMzD,EAAQ,CAACkC,IAM/B,OAAO,MAyEPwB,CAA2B1J,EAAS+F,EAASuC,GAxGjD,SAA0BtI,EAAS2E,GACjC,OAAO,SAASoB,EAAQmC,GAOtB,OANAA,EAAMa,eAAiB/I,EAEnB+F,EAAQ6C,QACVU,EAAaC,IAAIvJ,EAASkI,EAAMsB,KAAM7E,GAGjCA,EAAG8E,MAAMzJ,EAAS,CAACkI,KAiG1ByB,CAAiB3J,EAAS+F,GAE5BpB,EAAGmD,mBAAqBS,EAAaxC,EAAU,KAC/CpB,EAAGwD,gBAAkBA,EACrBxD,EAAGiE,OAASA,EACZjE,EAAGuC,SAAWQ,EACduB,EAASvB,GAAO/C,EAEhB3E,EAAQgF,iBAAiBwD,EAAW7D,EAAI4D,GAG1C,SAASqB,EAAc5J,EAAS6H,EAAQW,EAAWzC,EAAS+B,GAC1D,MAAMnD,EAAKiD,EAAYC,EAAOW,GAAYzC,EAAS+B,GAE9CnD,IAIL3E,EAAQiG,oBAAoBuC,EAAW7D,EAAIkF,QAAQ/B,WAC5CD,EAAOW,GAAW7D,EAAGuC,WAe9B,SAASuB,EAAaP,GAGpB,OADAA,EAAQA,EAAMiB,QAAQpC,EAAgB,IAC/BI,EAAae,IAAUA,EAGhC,MAAMoB,EAAe,CACnBQ,GAAG9J,EAASkI,EAAOnC,EAASuC,GAC1BK,EAAW3I,EAASkI,EAAOnC,EAASuC,GAAc,IAGpDyB,IAAI/J,EAASkI,EAAOnC,EAASuC,GAC3BK,EAAW3I,EAASkI,EAAOnC,EAASuC,GAAc,IAGpDiB,IAAIvJ,EAASqI,EAAmBtC,EAASuC,GACvC,GAAiC,iBAAtBD,IAAmCrI,EAC5C,OAGF,MAAOuI,EAAYJ,EAAiBK,GAAaJ,EAAgBC,EAAmBtC,EAASuC,GACvF0B,EAAcxB,IAAcH,EAC5BR,EAASF,EAAS3H,GAClBiK,EAAc5B,EAAkBhI,WAAW,KAEjD,QAA+B,IAApB8H,EAAiC,CAE1C,IAAKN,IAAWA,EAAOW,GACrB,OAIF,YADAoB,EAAc5J,EAAS6H,EAAQW,EAAWL,EAAiBI,EAAaxC,EAAU,MAIhFkE,GACFxI,OAAOC,KAAKmG,GAAQlG,SAAQuI,KAhDlC,SAAkClK,EAAS6H,EAAQW,EAAW2B,GAC5D,MAAMC,EAAoBvC,EAAOW,IAAc,GAE/C/G,OAAOC,KAAK0I,GAAmBzI,SAAQ0I,IACrC,GAAIA,EAAWjK,SAAS+J,GAAY,CAClC,MAAMjC,EAAQkC,EAAkBC,GAEhCT,EAAc5J,EAAS6H,EAAQW,EAAWN,EAAMC,gBAAiBD,EAAMJ,wBA0CrEwC,CAAyBtK,EAAS6H,EAAQqC,EAAc7B,EAAkBkC,MAAM,OAIpF,MAAMH,EAAoBvC,EAAOW,IAAc,GAC/C/G,OAAOC,KAAK0I,GAAmBzI,SAAQ6I,IACrC,MAAMH,EAAaG,EAAYrB,QAAQnC,EAAe,IAEtD,IAAKgD,GAAe3B,EAAkBjI,SAASiK,GAAa,CAC1D,MAAMnC,EAAQkC,EAAkBI,GAEhCZ,EAAc5J,EAAS6H,EAAQW,EAAWN,EAAMC,gBAAiBD,EAAMJ,yBAK7E2C,QAAQzK,EAASkI,EAAOwC,GACtB,GAAqB,iBAAVxC,IAAuBlI,EAChC,OAAO,KAGT,MAAMuE,EAAIV,IACJ2E,EAAYC,EAAaP,GACzB8B,EAAc9B,IAAUM,EACxBmC,EAAWpD,EAAamB,IAAIF,GAElC,IAAIoC,EACAC,GAAU,EACVC,GAAiB,EACjBC,GAAmB,EACnBC,EAAM,KA4CV,OA1CIhB,GAAezF,IACjBqG,EAAcrG,EAAEzD,MAAMoH,EAAOwC,GAE7BnG,EAAEvE,GAASyK,QAAQG,GACnBC,GAAWD,EAAYK,uBACvBH,GAAkBF,EAAYM,gCAC9BH,EAAmBH,EAAYO,sBAG7BR,GACFK,EAAMvK,SAAS2K,YAAY,cAC3BJ,EAAIK,UAAU7C,EAAWqC,GAAS,IAElCG,EAAM,IAAIM,YAAYpD,EAAO,CAC3B2C,QAAAA,EACAU,YAAY,SAKI,IAATb,GACTjJ,OAAOC,KAAKgJ,GAAM/I,SAAQ6J,IACxB/J,OAAOgK,eAAeT,EAAKQ,EAAK,CAC9BE,IAAG,IACMhB,EAAKc,QAMhBT,GACFC,EAAIW,iBAGFb,GACF9K,EAAQa,cAAcmK,GAGpBA,EAAID,uBAA2C,IAAhBH,GACjCA,EAAYe,iBAGPX,IC1ULY,EAAa,IAAIC,IAEvBC,EAAe,CACbC,IAAI/L,EAASwL,EAAKQ,GACXJ,EAAWlD,IAAI1I,IAClB4L,EAAWG,IAAI/L,EAAS,IAAI6L,KAG9B,MAAMI,EAAcL,EAAWF,IAAI1L,GAI9BiM,EAAYvD,IAAI8C,IAA6B,IAArBS,EAAYC,KAMzCD,EAAYF,IAAIP,EAAKQ,GAJnBG,QAAQC,MAAO,+EAA8EC,MAAMC,KAAKL,EAAYvK,QAAQ,QAOhIgK,IAAG,CAAC1L,EAASwL,IACPI,EAAWlD,IAAI1I,IACV4L,EAAWF,IAAI1L,GAAS0L,IAAIF,IAG9B,KAGTe,OAAOvM,EAASwL,GACd,IAAKI,EAAWlD,IAAI1I,GAClB,OAGF,MAAMiM,EAAcL,EAAWF,
IAAI1L,GAEnCiM,EAAYO,OAAOhB,GAGM,IAArBS,EAAYC,MACdN,EAAWY,OAAOxM,KC/BxB,MAAMyM,EACJC,YAAY1M,IACVA,EAAUmB,EAAWnB,MAMrBgJ,KAAK2D,SAAW3M,EAChB8L,EAAKC,IAAI/C,KAAK2D,SAAU3D,KAAK0D,YAAYE,SAAU5D,OAGrD6D,UACEf,EAAKS,OAAOvD,KAAK2D,SAAU3D,KAAK0D,YAAYE,UAC5CtD,EAAaC,IAAIP,KAAK2D,SAAU3D,KAAK0D,YAAYI,WAEjDrL,OAAOsL,oBAAoB/D,MAAMrH,SAAQqL,IACvChE,KAAKgE,GAAgB,QAIzBC,eAAe3I,EAAUtE,EAASkN,GAAa,GAC7C/H,EAAuBb,EAAUtE,EAASkN,GAK1BC,mBAACnN,GACjB,OAAO8L,EAAKJ,IAAIvK,EAAWnB,GAAUgJ,KAAK4D,UAGlBO,2BAACnN,EAASuB,EAAS,IAC3C,OAAOyH,KAAKoE,YAAYpN,IAAY,IAAIgJ,KAAKhJ,EAA2B,iBAAXuB,EAAsBA,EAAS,MAGnF8L,qBACT,MAtCY,QAyCH5I,kBACT,MAAM,IAAI6I,MAAM,uEAGPV,sBACT,MAAQ,MAAK5D,KAAKvE,OAGTqI,uBACT,MAAQ,IAAG9D,KAAK4D,YC5DpB,MAAMW,EAAuB,CAACC,EAAWC,EAAS,UAChD,MAAMC,EAAc,gBAAeF,EAAUV,YACvCtI,EAAOgJ,EAAU/I,KAEvB6E,EAAaQ,GAAGrJ,SAAUiN,EAAa,qBAAoBlJ,OAAU,SAAU0D,GAK7E,GAJI,CAAC,IAAK,QAAQ9H,SAAS4I,KAAK2E,UAC9BzF,EAAMyD,iBAGJ/I,EAAWoG,MACb,OAGF,MAAMhD,EAASrF,EAAuBqI,OAASA,KAAK4E,QAAS,IAAGpJ,KAC/CgJ,EAAUK,oBAAoB7H,GAGtCyH,SCMb,MAAMK,UAAcrB,EAGPhI,kBACT,MAnBS,QAwBXsJ,QAGE,GAFmBzE,EAAamB,QAAQzB,KAAK2D,SArB5B,kBAuBF5B,iBACb,OAGF/B,KAAK2D,SAAS5J,UAAUwJ,OAxBJ,QA0BpB,MAAMW,EAAalE,KAAK2D,SAAS5J,UAAUC,SA3BvB,QA4BpBgG,KAAKiE,gBAAe,IAAMjE,KAAKgF,mBAAmBhF,KAAK2D,SAAUO,GAInEc,kBACEhF,KAAK2D,SAASJ,SACdjD,EAAamB,QAAQzB,KAAK2D,SAnCR,mBAoClB3D,KAAK6D,UAKeM,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACf,MAAMC,EAAOJ,EAAMD,oBAAoB7E,MAEvC,GAAsB,iBAAXzH,EAAX,CAIA,QAAqB4M,IAAjBD,EAAK3M,IAAyBA,EAAOlB,WAAW,MAAmB,gBAAXkB,EAC1D,MAAM,IAAIe,UAAW,oBAAmBf,MAG1C2M,EAAK3M,GAAQyH,WAWnBuE,EAAqBO,EAAO,SAS5B1J,EAAmB0J,GC/EnB,MAOMM,EAAuB,4BAU7B,MAAMC,UAAe5B,EAGRhI,kBACT,MArBS,SA0BX6J,SAEEtF,KAAK2D,SAAS4B,aAAa,eAAgBvF,KAAK2D,SAAS5J,UAAUuL,OAvB7C,WA4BFnB,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACf,MAAMC,EAAOG,EAAOR,oBAAoB7E,MAEzB,WAAXzH,GACF2M,EAAK3M,SChDb,SAASiN,EAAcC,GACrB,MAAY,SAARA,GAIQ,UAARA,IAIAA,IAAQ/I,OAAO+I,GAAKzM,WACf0D,OAAO+I,GAGJ,KAARA,GAAsB,SAARA,EACT,KAGFA,GAGT,SAASC,EAAiBlD,GACxB,OAAOA,EAAIrC,QAAQ,UAAUwF,GAAQ,IAAGA,EAAIxM,kBDuC9CmH,EAAaQ,GAAGrJ,SAzCc,2BAyCkB2N,GAAsBlG,IACpEA,EAAMyD,iBAEN,MAAMiD,EAAS1G,EAAMlC,OAAO4H,QAAQQ,GACvBC,EAAOR,oBAAoBe,GAEnCN,YAUPlK,EAAmBiK,GCpDnB,MAAMQ,EAAc,CAClBC,iBAAiB9O,EAASwL,EAAK1J,GAC7B9B,EAAQuO,aAAc,WAAUG,EAAiBlD,KAAQ1J,IAG3DiN,oBAAoB/O,EAASwL,GAC3BxL,EAAQgP,gBAAiB,WAAUN,EAAiBlD,OAGtDyD,kBAAkBjP,GAChB,IAAKA,EACH,MAAO,GAGT,MAAMkP,EAAa,GAUnB,OARAzN,OAAOC,KAAK1B,EAAQmP,SACjBC,QAAO5D,GAAOA,EAAInL,WAAW,QAC7BsB,SAAQ6J,IACP,IAAI6D,EAAU7D,EAAIrC,QAAQ,MAAO,IACjCkG,EAAUA,EAAQC,OAAO,GAAGnN,cAAgBkN,EAAQ9E,MAAM,EAAG8E,EAAQjO,QACrE8N,EAAWG,GAAWb,EAAcxO,EAAQmP,QAAQ3D,OAGjD0D,GAGTK,iBAAgB,CAACvP,EAASwL,IACjBgD,EAAcxO,EAAQE,aAAc,WAAUwO,EAAiBlD,OAGxEgE,OAAOxP,GACL,MAAMyP,EAAOzP,EAAQ0P,wBAErB,MAAO,CACLC,IAAKF,EAAKE,IAAM5L,OAAO6L,YACvBC,KAAMJ,EAAKI,KAAO9L,OAAO+L,cAI7BC,SAAS/P,IACA,CACL2P,IAAK3P,EAAQgQ,UACbH,KAAM7P,EAAQiQ,cCzDdC,EAAiB,CACrBC,KAAI,CAAClQ,EAAUD,EAAUS,SAAS2C,kBACzB,GAAGgN,UAAUC,QAAQC,UAAUjH,iBAAiBpH,KAAKjC,EAASC,IAGvEsQ,QAAO,CAACtQ,EAAUD,EAAUS,SAAS2C,kBAC5BiN,QAAQC,UAAU5P,cAAcuB,KAAKjC,EAASC,GAGvDuQ,SAAQ,CAACxQ,EAASC,IACT,GAAGmQ,UAAUpQ,EAAQwQ,UACzBpB,QAAOqB,GAASA,EAAMC,QAAQzQ,KAGnC0Q,QAAQ3Q,EAASC,GACf,MAAM0Q,EAAU,GAEhB,IAAIC,EAAW5Q,EAAQyD,WAEvB,KAAOmN,GAAYA,EAAS1P,WAAa2B,KAAKC,cArBhC,IAqBgD8N,EAAS1P,UACjE0P,EAASF,QAAQzQ,IACnB0Q,EAAQ1L,KAAK2L,GAGfA,EAAWA,EAASnN,WAGtB,OAAOkN,GAGTE,KAAK7Q,EAASC,GACZ,IAAI6Q,EAAW9Q,EAAQ+Q,uBAEvB,KAAOD,GAAU,CACf,GAAIA,EAASJ,QAAQzQ,GACnB,MAAO,CAAC6Q,GAGVA,EAAWA,EAASC,uBAGtB,MAAO,IAGTC,KAAKhR,EAASC,GACZ,IAAI+Q,EAAOhR,EAAQiR,mBAEnB,KAAOD,GAAM,CACX,GAAIA,EAAKN,QAAQzQ,GACf,MAAO,CAAC+Q,GAGVA,EAAOA,EAAKC,mBAGd,MAAO,IAGTC,kBAAkBlR,GA
ChB,MAAMmR,EAAa,CACjB,IACA,SACA,QACA,WACA,SACA,UACA,aACA,4BACAC,KAAInR,GAAa,GAAEA,2BAAiCoR,KAAK,MAE3D,OAAOrI,KAAKmH,KAAKgB,EAAYnR,GAASoP,QAAOkC,IAAO1O,EAAW0O,IAAO9O,EAAU8O,OC3D9E7M,EAAO,WAUP8M,EAAU,CACdC,SAAU,IACVC,UAAU,EACVC,OAAO,EACPC,MAAO,QACPC,MAAM,EACNC,OAAO,GAGHC,EAAc,CAClBN,SAAU,mBACVC,SAAU,UACVC,MAAO,mBACPC,MAAO,mBACPC,KAAM,UACNC,MAAO,WAGHE,EAAa,OACbC,EAAa,OACbC,EAAiB,OACjBC,EAAkB,QAElBC,GAAmB,CACvBC,UAAkBF,EAClBG,WAAmBJ,GAIfK,GAAc,mBAcdC,GAAoB,SASpBC,GAAuB,wBAiB7B,MAAMC,WAAiBhG,EACrBC,YAAY1M,EAASuB,GACnBmR,MAAM1S,GAENgJ,KAAK2J,OAAS,KACd3J,KAAK4J,UAAY,KACjB5J,KAAK6J,eAAiB,KACtB7J,KAAK8J,WAAY,EACjB9J,KAAK+J,YAAa,EAClB/J,KAAKgK,aAAe,KACpBhK,KAAKiK,YAAc,EACnBjK,KAAKkK,YAAc,EAEnBlK,KAAKmK,QAAUnK,KAAKoK,WAAW7R,GAC/ByH,KAAKqK,mBAAqBnD,EAAeK,QA3BjB,uBA2B8CvH,KAAK2D,UAC3E3D,KAAKsK,gBAAkB,iBAAkB7S,SAAS2C,iBAAmBmQ,UAAUC,eAAiB,EAChGxK,KAAKyK,cAAgB5J,QAAQ9F,OAAO2P,cAEpC1K,KAAK2K,qBAKIpC,qBACT,OAAOA,EAGE9M,kBACT,OAAOA,EAKTuM,OACEhI,KAAK4K,OAAO7B,GAGd8B,mBAGOpT,SAASqT,QAAUtR,EAAUwG,KAAK2D,WACrC3D,KAAKgI,OAITH,OACE7H,KAAK4K,OAAO5B,GAGdL,MAAMzJ,GACCA,IACHc,KAAK8J,WAAY,GAGf5C,EAAeK,QApEI,2CAoEwBvH,KAAK2D,YAClD/L,EAAqBoI,KAAK2D,UAC1B3D,KAAK+K,OAAM,IAGbC,cAAchL,KAAK4J,WACnB5J,KAAK4J,UAAY,KAGnBmB,MAAM7L,GACCA,IACHc,KAAK8J,WAAY,GAGf9J,KAAK4J,YACPoB,cAAchL,KAAK4J,WACnB5J,KAAK4J,UAAY,MAGf5J,KAAKmK,SAAWnK,KAAKmK,QAAQ3B,WAAaxI,KAAK8J,YACjD9J,KAAKiL,kBAELjL,KAAK4J,UAAYsB,aACdzT,SAAS0T,gBAAkBnL,KAAK6K,gBAAkB7K,KAAKgI,MAAMoD,KAAKpL,MACnEA,KAAKmK,QAAQ3B,WAKnB6C,GAAG7N,GACDwC,KAAK6J,eAAiB3C,EAAeK,QAAQiC,GAAsBxJ,KAAK2D,UACxE,MAAM2H,EAActL,KAAKuL,cAAcvL,KAAK6J,gBAE5C,GAAIrM,EAAQwC,KAAK2J,OAAOvR,OAAS,GAAKoF,EAAQ,EAC5C,OAGF,GAAIwC,KAAK+J,WAEP,YADAzJ,EAAaS,IAAIf,KAAK2D,SAAU2F,IAAY,IAAMtJ,KAAKqL,GAAG7N,KAI5D,GAAI8N,IAAgB9N,EAGlB,OAFAwC,KAAK2I,aACL3I,KAAK+K,QAIP,MAAMS,EAAQhO,EAAQ8N,EACpBvC,EACAC,EAEFhJ,KAAK4K,OAAOY,EAAOxL,KAAK2J,OAAOnM,IAKjC4M,WAAW7R,GAOT,OANAA,EAAS,IACJgQ,KACA1C,EAAYI,kBAAkBjG,KAAK2D,aAChB,iBAAXpL,EAAsBA,EAAS,IAE5CF,EAAgBoD,EAAMlD,EAAQuQ,GACvBvQ,EAGTkT,eACE,MAAMC,EAAY/N,KAAKgO,IAAI3L,KAAKkK,aAEhC,GAAIwB,GAnMgB,GAoMlB,OAGF,MAAME,EAAYF,EAAY1L,KAAKkK,YAEnClK,KAAKkK,YAAc,EAEd0B,GAIL5L,KAAK4K,OAAOgB,EAAY,EAAI1C,EAAkBD,GAGhD0B,qBACM3K,KAAKmK,QAAQ1B,UACfnI,EAAaQ,GAAGd,KAAK2D,SApLJ,uBAoL6BzE,GAASc,KAAK6L,SAAS3M,KAG5C,UAAvBc,KAAKmK,QAAQxB,QACfrI,EAAaQ,GAAGd,KAAK2D,SAvLD,0BAuL6BzE,GAASc,KAAK2I,MAAMzJ,KACrEoB,EAAaQ,GAAGd,KAAK2D,SAvLD,0BAuL6BzE,GAASc,KAAK+K,MAAM7L,MAGnEc,KAAKmK,QAAQtB,OAAS7I,KAAKsK,iBAC7BtK,KAAK8L,0BAITA,0BACE,MAAMC,EAAqB7M,GAClBc,KAAKyK,gBAnKO,QAoKhBvL,EAAM8M,aArKY,UAqKwB9M,EAAM8M,aAG/CC,EAAQ/M,IACR6M,EAAmB7M,GACrBc,KAAKiK,YAAc/K,EAAMgN,QACflM,KAAKyK,gBACfzK,KAAKiK,YAAc/K,EAAMiN,QAAQ,GAAGD,UAIlCE,EAAOlN,IAEXc,KAAKkK,YAAchL,EAAMiN,SAAWjN,EAAMiN,QAAQ/T,OAAS,EACzD,EACA8G,EAAMiN,QAAQ,GAAGD,QAAUlM,KAAKiK,aAG9BoC,EAAMnN,IACN6M,EAAmB7M,KACrBc,KAAKkK,YAAchL,EAAMgN,QAAUlM,KAAKiK,aAG1CjK,KAAKyL,eACsB,UAAvBzL,KAAKmK,QAAQxB,QASf3I,KAAK2I,QACD3I,KAAKgK,cACPsC,aAAatM,KAAKgK,cAGpBhK,KAAKgK,aAAe9M,YAAWgC,GAASc,KAAK+K,MAAM7L,IA3Q5B,IA2Q6Dc,KAAKmK,QAAQ3B,YAIrGtB,EAAeC,KAtNO,qBAsNiBnH,KAAK2D,UAAUhL,SAAQ4T,IAC5DjM,EAAaQ,GAAGyL,EAvOI,yBAuOuBrN,GAASA,EAAMyD,sBAGxD3C,KAAKyK,eACPnK,EAAaQ,GAAGd,KAAK2D,SA7OA,2BA6O6BzE,GAAS+M,EAAM/M,KACjEoB,EAAaQ,GAAGd,KAAK2D,SA7OF,yBA6O6BzE,GAASmN,EAAInN,KAE7Dc,KAAK2D,SAAS5J,UAAUyS,IAnOG,mBAqO3BlM,EAAaQ,GAAGd,KAAK2D,SArPD,0BAqP6BzE,GAAS+M,EAAM/M,KAChEoB,EAAaQ,GAAGd,KAAK2D,SArPF,yBAqP6BzE,GAASkN,EAAKlN,KAC9DoB,EAAaQ,GAAGd,KAAK2D,SArPH,wBAqP6BzE,GAASmN,EAAInN,MAIhE2M,SAAS3M,GACP,GAAI,kBAAkB7F,KAAK6F,EAAMlC,OAAO2H,SACtC,OAGF,MAAMiH,EAAYzC,GAAiBjK,EAAMsD,KACrCoJ,IACF1M,EAAMyD,iBACN3C,KAAK4K,
OAAOgB,IAIhBL,cAAcvU,GAKZ,OAJAgJ,KAAK2J,OAAS3S,GAAWA,EAAQyD,WAC/ByM,EAAeC,KArPC,iBAqPmBnQ,EAAQyD,YAC3C,GAEKuF,KAAK2J,OAAOlM,QAAQzG,GAG7ByV,gBAAgBjB,EAAOnO,GACrB,MAAMqP,EAASlB,IAAUzC,EACzB,OAAO5L,EAAqB6C,KAAK2J,OAAQtM,EAAeqP,EAAQ1M,KAAKmK,QAAQvB,MAG/E+D,mBAAmB7M,EAAe8M,GAChC,MAAMC,EAAc7M,KAAKuL,cAAczL,GACjCgN,EAAY9M,KAAKuL,cAAcrE,EAAeK,QAAQiC,GAAsBxJ,KAAK2D,WAEvF,OAAOrD,EAAamB,QAAQzB,KAAK2D,SA7RhB,oBA6RuC,CACtD7D,cAAAA,EACA8L,UAAWgB,EACXtJ,KAAMwJ,EACNzB,GAAIwB,IAIRE,2BAA2B/V,GACzB,GAAIgJ,KAAKqK,mBAAoB,CAC3B,MAAM2C,EAAkB9F,EAAeK,QAhRrB,UAgR8CvH,KAAKqK,oBAErE2C,EAAgBjT,UAAUwJ,OAAOgG,IACjCyD,EAAgBhH,gBAAgB,gBAEhC,MAAMiH,EAAa/F,EAAeC,KA/Qb,mBA+QsCnH,KAAKqK,oBAEhE,IAAK,IAAIrL,EAAI,EAAGA,EAAIiO,EAAW7U,OAAQ4G,IACrC,GAAItC,OAAOwQ,SAASD,EAAWjO,GAAG9H,aAAa,oBAAqB,MAAQ8I,KAAKuL,cAAcvU,GAAU,CACvGiW,EAAWjO,GAAGjF,UAAUyS,IAAIjD,IAC5B0D,EAAWjO,GAAGuG,aAAa,eAAgB,QAC3C,QAMR0F,kBACE,MAAMjU,EAAUgJ,KAAK6J,gBAAkB3C,EAAeK,QAAQiC,GAAsBxJ,KAAK2D,UAEzF,IAAK3M,EACH,OAGF,MAAMmW,EAAkBzQ,OAAOwQ,SAASlW,EAAQE,aAAa,oBAAqB,IAE9EiW,GACFnN,KAAKmK,QAAQiD,gBAAkBpN,KAAKmK,QAAQiD,iBAAmBpN,KAAKmK,QAAQ3B,SAC5ExI,KAAKmK,QAAQ3B,SAAW2E,GAExBnN,KAAKmK,QAAQ3B,SAAWxI,KAAKmK,QAAQiD,iBAAmBpN,KAAKmK,QAAQ3B,SAIzEoC,OAAOyC,EAAkBrW,GACvB,MAAMwU,EAAQxL,KAAKsN,kBAAkBD,GAC/BhQ,EAAgB6J,EAAeK,QAAQiC,GAAsBxJ,KAAK2D,UAClE4J,EAAqBvN,KAAKuL,cAAclO,GACxCmQ,EAAcxW,GAAWgJ,KAAKyM,gBAAgBjB,EAAOnO,GAErDoQ,EAAmBzN,KAAKuL,cAAciC,GACtCE,EAAY7M,QAAQb,KAAK4J,WAEzB8C,EAASlB,IAAUzC,EACnB4E,EAAuBjB,EAjUR,sBADF,oBAmUbkB,EAAiBlB,EAjUH,qBACA,qBAiUdE,EAAqB5M,KAAK6N,kBAAkBrC,GAElD,GAAIgC,GAAeA,EAAYzT,UAAUC,SAASuP,IAEhD,YADAvJ,KAAK+J,YAAa,GAIpB,GAAI/J,KAAK+J,WACP,OAIF,GADmB/J,KAAK2M,mBAAmBa,EAAaZ,GACzC7K,iBACb,OAGF,IAAK1E,IAAkBmQ,EAErB,OAGFxN,KAAK+J,YAAa,EAEd2D,GACF1N,KAAK2I,QAGP3I,KAAK+M,2BAA2BS,GAChCxN,KAAK6J,eAAiB2D,EAEtB,MAAMM,EAAmB,KACvBxN,EAAamB,QAAQzB,KAAK2D,SAAU2F,GAAY,CAC9CxJ,cAAe0N,EACf5B,UAAWgB,EACXtJ,KAAMiK,EACNlC,GAAIoC,KAIR,GAAIzN,KAAK2D,SAAS5J,UAAUC,SA5WP,SA4WmC,CACtDwT,EAAYzT,UAAUyS,IAAIoB,GAE1BjT,EAAO6S,GAEPnQ,EAActD,UAAUyS,IAAImB,GAC5BH,EAAYzT,UAAUyS,IAAImB,GAE1B,MAAMI,EAAmB,KACvBP,EAAYzT,UAAUwJ,OAAOoK,EAAsBC,GACnDJ,EAAYzT,UAAUyS,IAAIjD,IAE1BlM,EAActD,UAAUwJ,OAAOgG,GAAmBqE,EAAgBD,GAElE3N,KAAK+J,YAAa,EAElB7M,WAAW4Q,EAAkB,IAG/B9N,KAAKiE,eAAe8J,EAAkB1Q,GAAe,QAErDA,EAActD,UAAUwJ,OAAOgG,IAC/BiE,EAAYzT,UAAUyS,IAAIjD,IAE1BvJ,KAAK+J,YAAa,EAClB+D,IAGEJ,GACF1N,KAAK+K,QAITuC,kBAAkB1B,GAChB,MAAK,CAAC1C,EAAiBD,GAAgB7R,SAASwU,GAI5C1Q,IACK0Q,IAAc3C,EAAiBD,EAAaD,EAG9C6C,IAAc3C,EAAiBF,EAAaC,EAP1C4C,EAUXiC,kBAAkBrC,GAChB,MAAK,CAACzC,EAAYC,GAAY5R,SAASoU,GAInCtQ,IACKsQ,IAAUxC,EAAaC,EAAiBC,EAG1CsC,IAAUxC,EAAaE,EAAkBD,EAPvCuC,EAYarH,yBAACnN,EAASuB,GAChC,MAAM2M,EAAOuE,GAAS5E,oBAAoB7N,EAASuB,GAEnD,IAAI4R,QAAEA,GAAYjF,EACI,iBAAX3M,IACT4R,EAAU,IACLA,KACA5R,IAIP,MAAMyV,EAA2B,iBAAXzV,EAAsBA,EAAS4R,EAAQzB,MAE7D,GAAsB,iBAAXnQ,EACT2M,EAAKmG,GAAG9S,QACH,GAAsB,iBAAXyV,EAAqB,CACrC,QAA4B,IAAjB9I,EAAK8I,GACd,MAAM,IAAI1U,UAAW,oBAAmB0U,MAG1C9I,EAAK8I,UACI7D,EAAQ3B,UAAY2B,EAAQ8D,OACrC/I,EAAKyD,QACLzD,EAAK6F,SAIa5G,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACfwE,GAASyE,kBAAkBlO,KAAMzH,MAIX4L,2BAACjF,GACzB,MAAMlC,EAASrF,EAAuBqI,MAEtC,IAAKhD,IAAWA,EAAOjD,UAAUC,SA7cT,YA8ctB,OAGF,MAAMzB,EAAS,IACVsN,EAAYI,kBAAkBjJ,MAC9B6I,EAAYI,kBAAkBjG,OAE7BmO,EAAanO,KAAK9I,aAAa,oBAEjCiX,IACF5V,EAAOiQ,UAAW,GAGpBiB,GAASyE,kBAAkBlR,EAAQzE,GAE/B4V,GACF1E,GAASrF,YAAYpH,GAAQqO,GAAG8C,GAGlCjP,EAAMyD,kBAUVrC,EAAaQ,GAAGrJ,SA7ec,6BAkBF,sCA2dyCgS,GAAS2E,qBAE9E9N,EAAaQ,GAAG/F,OAhfa,6BAgfgB,KAC3C,MAAMsT,EAAYnH,EAAeC,KA7dR,6BA+dzB,IAAK,IAAInI,EAAI,EAAGC,EAAMoP,EAAUjW,OAAQ4G,EAAIC,EAAKD,IAC/CyK,GAASyE,kBAAkBG,EAAUrP,GAAIyK,GAASrF,YAAYi
K,EAAUrP,QAW5E5D,EAAmBqO,ICjjBnB,MAAMhO,GAAO,WAKP8M,GAAU,CACdjD,QAAQ,EACRgJ,OAAQ,MAGJxF,GAAc,CAClBxD,OAAQ,UACRgJ,OAAQ,kBASJC,GAAkB,OAClBC,GAAsB,WACtBC,GAAwB,aACxBC,GAAuB,YACvBC,GAA8B,6BAO9BvJ,GAAuB,8BAQ7B,MAAMwJ,WAAiBnL,EACrBC,YAAY1M,EAASuB,GACnBmR,MAAM1S,GAENgJ,KAAK6O,kBAAmB,EACxB7O,KAAKmK,QAAUnK,KAAKoK,WAAW7R,GAC/ByH,KAAK8O,cAAgB,GAErB,MAAMC,EAAa7H,EAAeC,KAAK/B,IAEvC,IAAK,IAAIpG,EAAI,EAAGC,EAAM8P,EAAW3W,OAAQ4G,EAAIC,EAAKD,IAAK,CACrD,MAAMgQ,EAAOD,EAAW/P,GAClB/H,EAAWO,EAAuBwX,GAClCC,EAAgB/H,EAAeC,KAAKlQ,GACvCmP,QAAO8I,GAAaA,IAAclP,KAAK2D,WAEzB,OAAb1M,GAAqBgY,EAAc7W,SACrC4H,KAAKmP,UAAYlY,EACjB+I,KAAK8O,cAAc7S,KAAK+S,IAI5BhP,KAAKoP,sBAEApP,KAAKmK,QAAQmE,QAChBtO,KAAKqP,0BAA0BrP,KAAK8O,cAAe9O,KAAKsP,YAGtDtP,KAAKmK,QAAQ7E,QACftF,KAAKsF,SAMEiD,qBACT,OAAOA,GAGE9M,kBACT,OAAOA,GAKT6J,SACMtF,KAAKsP,WACPtP,KAAKuP,OAELvP,KAAKwP,OAITA,OACE,GAAIxP,KAAK6O,kBAAoB7O,KAAKsP,WAChC,OAGF,IACIG,EADAC,EAAU,GAGd,GAAI1P,KAAKmK,QAAQmE,OAAQ,CACvB,MAAM9G,EAAWN,EAAeC,KAAKwH,GAA4B3O,KAAKmK,QAAQmE,QAC9EoB,EAAUxI,EAAeC,KAxEN,uCAwE6BnH,KAAKmK,QAAQmE,QAAQlI,QAAO4I,IAASxH,EAASpQ,SAAS4X,KAGzG,MAAMW,EAAYzI,EAAeK,QAAQvH,KAAKmP,WAC9C,GAAIO,EAAQtX,OAAQ,CAClB,MAAMwX,EAAiBF,EAAQvI,MAAK6H,GAAQW,IAAcX,IAG1D,GAFAS,EAAcG,EAAiBhB,GAASxK,YAAYwL,GAAkB,KAElEH,GAAeA,EAAYZ,iBAC7B,OAKJ,GADmBvO,EAAamB,QAAQzB,KAAK2D,SArG7B,oBAsGD5B,iBACb,OAGF2N,EAAQ/W,SAAQkX,IACVF,IAAcE,GAChBjB,GAAS/J,oBAAoBgL,EAAY,CAAEvK,QAAQ,IAASiK,OAGzDE,GACH3M,EAAKC,IAAI8M,EA9HA,cA8HsB,SAInC,MAAMC,EAAY9P,KAAK+P,gBAEvB/P,KAAK2D,SAAS5J,UAAUwJ,OAAOiL,IAC/BxO,KAAK2D,SAAS5J,UAAUyS,IAAIiC,IAE5BzO,KAAK2D,SAASqM,MAAMF,GAAa,EAEjC9P,KAAKqP,0BAA0BrP,KAAK8O,eAAe,GACnD9O,KAAK6O,kBAAmB,EAExB,MAYMoB,EAAc,SADSH,EAAU,GAAGvW,cAAgBuW,EAAUvO,MAAM,KAG1EvB,KAAKiE,gBAdY,KACfjE,KAAK6O,kBAAmB,EAExB7O,KAAK2D,SAAS5J,UAAUwJ,OAAOkL,IAC/BzO,KAAK2D,SAAS5J,UAAUyS,IAAIgC,GAAqBD,IAEjDvO,KAAK2D,SAASqM,MAAMF,GAAa,GAEjCxP,EAAamB,QAAQzB,KAAK2D,SArIX,uBA2Ia3D,KAAK2D,UAAU,GAC7C3D,KAAK2D,SAASqM,MAAMF,GAAc,GAAE9P,KAAK2D,SAASsM,OAGpDV,OACE,GAAIvP,KAAK6O,mBAAqB7O,KAAKsP,WACjC,OAIF,GADmBhP,EAAamB,QAAQzB,KAAK2D,SAnJ7B,oBAoJD5B,iBACb,OAGF,MAAM+N,EAAY9P,KAAK+P,gBAEvB/P,KAAK2D,SAASqM,MAAMF,GAAc,GAAE9P,KAAK2D,SAAS+C,wBAAwBoJ,OAE1EnV,EAAOqF,KAAK2D,UAEZ3D,KAAK2D,SAAS5J,UAAUyS,IAAIiC,IAC5BzO,KAAK2D,SAAS5J,UAAUwJ,OAAOiL,GAAqBD,IAEpD,MAAM2B,EAAqBlQ,KAAK8O,cAAc1W,OAC9C,IAAK,IAAI4G,EAAI,EAAGA,EAAIkR,EAAoBlR,IAAK,CAC3C,MAAMyC,EAAUzB,KAAK8O,cAAc9P,GAC7BgQ,EAAOrX,EAAuB8J,GAEhCuN,IAAShP,KAAKsP,SAASN,IACzBhP,KAAKqP,0BAA0B,CAAC5N,IAAU,GAI9CzB,KAAK6O,kBAAmB,EASxB7O,KAAK2D,SAASqM,MAAMF,GAAa,GAEjC9P,KAAKiE,gBATY,KACfjE,KAAK6O,kBAAmB,EACxB7O,KAAK2D,SAAS5J,UAAUwJ,OAAOkL,IAC/BzO,KAAK2D,SAAS5J,UAAUyS,IAAIgC,IAC5BlO,EAAamB,QAAQzB,KAAK2D,SAhLV,wBAqLY3D,KAAK2D,UAAU,GAG/C2L,SAAStY,EAAUgJ,KAAK2D,UACtB,OAAO3M,EAAQ+C,UAAUC,SAASuU,IAKpCnE,WAAW7R,GAST,OARAA,EAAS,IACJgQ,MACA1C,EAAYI,kBAAkBjG,KAAK2D,aACnCpL,IAEE+M,OAASzE,QAAQtI,EAAO+M,QAC/B/M,EAAO+V,OAASnW,EAAWI,EAAO+V,QAClCjW,EAAgBoD,GAAMlD,EAAQuQ,IACvBvQ,EAGTwX,gBACE,OAAO/P,KAAK2D,SAAS5J,UAAUC,SAnML,uBAEhB,QACC,SAmMboV,sBACE,IAAKpP,KAAKmK,QAAQmE,OAChB,OAGF,MAAM9G,EAAWN,EAAeC,KAAKwH,GAA4B3O,KAAKmK,QAAQmE,QAC9EpH,EAAeC,KAAK/B,GAAsBpF,KAAKmK,QAAQmE,QAAQlI,QAAO4I,IAASxH,EAASpQ,SAAS4X,KAC9FrW,SAAQ3B,IACP,MAAMmZ,EAAWxY,EAAuBX,GAEpCmZ,GACFnQ,KAAKqP,0BAA0B,CAACrY,GAAUgJ,KAAKsP,SAASa,OAKhEd,0BAA0Be,EAAcC,GACjCD,EAAahY,QAIlBgY,EAAazX,SAAQqW,IACfqB,EACFrB,EAAKjV,UAAUwJ,OAAOmL,IAEtBM,EAAKjV,UAAUyS,IAAIkC,IAGrBM,EAAKzJ,aAAa,gBAAiB8K,MAMjBlM,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACf,MAAMkF,EAAU,GACM,iBAAX5R,GAAuB,YAAYc,KAAKd,KACjD4R,EAAQ7E,QAAS,GAGnB,MAAMJ,EAAO0J,GAAS/J,oBAAoB7E,KAAMmK,GAEhD,GAAsB,iBAAX5R,EAAqB,CAC9B,Q
AA4B,IAAjB2M,EAAK3M,GACd,MAAM,IAAIe,UAAW,oBAAmBf,MAG1C2M,EAAK3M,UAYb+H,EAAaQ,GAAGrJ,SAzQc,6BAyQkB2N,IAAsB,SAAUlG,IAEjD,MAAzBA,EAAMlC,OAAO2H,SAAoBzF,EAAMa,gBAAmD,MAAjCb,EAAMa,eAAe4E,UAChFzF,EAAMyD,iBAGR,MAAM1L,EAAWO,EAAuBwI,MACfkH,EAAeC,KAAKlQ,GAE5B0B,SAAQ3B,IACvB4X,GAAS/J,oBAAoB7N,EAAS,CAAEsO,QAAQ,IAASA,eAW7DlK,EAAmBwT,IC5UZ,IAAIjI,GAAM,MACN2J,GAAS,SACTC,GAAQ,QACR1J,GAAO,OACP2J,GAAO,OACPC,GAAiB,CAAC9J,GAAK2J,GAAQC,GAAO1J,IACtCoF,GAAQ,QACRI,GAAM,MACNqE,GAAkB,kBAClBC,GAAW,WACXC,GAAS,SACTC,GAAY,YACZC,GAAmCL,GAAeM,QAAO,SAAUC,EAAKC,GACjF,OAAOD,EAAI5J,OAAO,CAAC6J,EAAY,IAAMhF,GAAOgF,EAAY,IAAM5E,OAC7D,IACQ6E,GAA0B,GAAG9J,OAAOqJ,GAAgB,CAACD,KAAOO,QAAO,SAAUC,EAAKC,GAC3F,OAAOD,EAAI5J,OAAO,CAAC6J,EAAWA,EAAY,IAAMhF,GAAOgF,EAAY,IAAM5E,OACxE,IAEQ8E,GAAa,aACbC,GAAO,OACPC,GAAY,YAEZC,GAAa,aACbC,GAAO,OACPC,GAAY,YAEZC,GAAc,cACdC,GAAQ,QACRC,GAAa,aACbC,GAAiB,CAACT,GAAYC,GAAMC,GAAWC,GAAYC,GAAMC,GAAWC,GAAaC,GAAOC,IC9B5F,SAASE,GAAY7a,GAClC,OAAOA,GAAWA,EAAQ8a,UAAY,IAAI3Y,cAAgB,KCD7C,SAAS4Y,GAAUC,GAChC,GAAY,MAARA,EACF,OAAOjX,OAGT,GAAwB,oBAApBiX,EAAKhZ,WAAkC,CACzC,IAAIiZ,EAAgBD,EAAKC,cACzB,OAAOA,GAAgBA,EAAcC,aAAwBnX,OAG/D,OAAOiX,ECRT,SAASja,GAAUia,GAEjB,OAAOA,aADUD,GAAUC,GAAM3K,SACI2K,aAAgB3K,QAGvD,SAAS8K,GAAcH,GAErB,OAAOA,aADUD,GAAUC,GAAMI,aACIJ,aAAgBI,YAGvD,SAASC,GAAaL,GAEpB,MAA0B,oBAAfxX,aAKJwX,aADUD,GAAUC,GAAMxX,YACIwX,aAAgBxX,YCyDvD,MAAA8X,GAAe,CACb9W,KAAM,cACN+W,SAAS,EACTC,MAAO,QACP7W,GA5EF,SAAqB8W,GACnB,IAAIC,EAAQD,EAAKC,MACjBja,OAAOC,KAAKga,EAAMC,UAAUha,SAAQ,SAAU6C,GAC5C,IAAIwU,EAAQ0C,EAAME,OAAOpX,IAAS,GAC9B0K,EAAawM,EAAMxM,WAAW1K,IAAS,GACvCxE,EAAU0b,EAAMC,SAASnX,GAExB2W,GAAcnb,IAAa6a,GAAY7a,KAO5CyB,OAAOoa,OAAO7b,EAAQgZ,MAAOA,GAC7BvX,OAAOC,KAAKwN,GAAYvN,SAAQ,SAAU6C,GACxC,IAAI1C,EAAQoN,EAAW1K,IAET,IAAV1C,EACF9B,EAAQgP,gBAAgBxK,GAExBxE,EAAQuO,aAAa/J,GAAgB,IAAV1C,EAAiB,GAAKA,WAwDvDga,OAlDF,SAAgBC,GACd,IAAIL,EAAQK,EAAML,MACdM,EAAgB,CAClBpC,OAAQ,CACN7J,SAAU2L,EAAMO,QAAQC,SACxBrM,KAAM,IACNF,IAAK,IACLwM,OAAQ,KAEVC,MAAO,CACLrM,SAAU,YAEZ8J,UAAW,IASb,OAPApY,OAAOoa,OAAOH,EAAMC,SAAS/B,OAAOZ,MAAOgD,EAAcpC,QACzD8B,EAAME,OAASI,EAEXN,EAAMC,SAASS,OACjB3a,OAAOoa,OAAOH,EAAMC,SAASS,MAAMpD,MAAOgD,EAAcI,OAGnD,WACL3a,OAAOC,KAAKga,EAAMC,UAAUha,SAAQ,SAAU6C,GAC5C,IAAIxE,EAAU0b,EAAMC,SAASnX,GACzB0K,EAAawM,EAAMxM,WAAW1K,IAAS,GAGvCwU,EAFkBvX,OAAOC,KAAKga,EAAME,OAAOS,eAAe7X,GAAQkX,EAAME,OAAOpX,GAAQwX,EAAcxX,IAE7EuV,QAAO,SAAUf,EAAOpX,GAElD,OADAoX,EAAMpX,GAAY,GACXoX,IACN,IAEEmC,GAAcnb,IAAa6a,GAAY7a,KAI5CyB,OAAOoa,OAAO7b,EAAQgZ,MAAOA,GAC7BvX,OAAOC,KAAKwN,GAAYvN,SAAQ,SAAU2a,GACxCtc,EAAQgP,gBAAgBsN,YAa9BC,SAAU,CAAC,kBCjFE,SAASC,GAAiBvC,GACvC,OAAOA,EAAU3Z,MAAM,KAAK,GCDf,SAASoP,GAAsB1P,EAC9Cyc,GAKE,IAAIhN,EAAOzP,EAAQ0P,wBAoBnB,MAAO,CACLgN,MAAOjN,EAAKiN,MApBD,EAqBXC,OAAQlN,EAAKkN,OApBF,EAqBXhN,IAAKF,EAAKE,IArBC,EAsBX4J,MAAO9J,EAAK8J,MAvBD,EAwBXD,OAAQ7J,EAAK6J,OAvBF,EAwBXzJ,KAAMJ,EAAKI,KAzBA,EA0BX+M,EAAGnN,EAAKI,KA1BG,EA2BXgN,EAAGpN,EAAKE,IA1BG,GCNA,SAASmN,GAAc9c,GACpC,IAAI+c,EAAarN,GAAsB1P,GAGnC0c,EAAQ1c,EAAQgd,YAChBL,EAAS3c,EAAQ4D,aAUrB,OARI+C,KAAKgO,IAAIoI,EAAWL,MAAQA,IAAU,IACxCA,EAAQK,EAAWL,OAGjB/V,KAAKgO,IAAIoI,EAAWJ,OAASA,IAAW,IAC1CA,EAASI,EAAWJ,QAGf,CACLC,EAAG5c,EAAQiQ,WACX4M,EAAG7c,EAAQgQ,UACX0M,MAAOA,EACPC,OAAQA,GCrBG,SAAS3Z,GAASsU,EAAQ7G,GACvC,IAAIwM,EAAWxM,EAAMnN,aAAemN,EAAMnN,cAE1C,GAAIgU,EAAOtU,SAASyN,GAClB,OAAO,EAEJ,GAAIwM,GAAY5B,GAAa4B,GAAW,CACzC,IAAIjM,EAAOP,EAEX,EAAG,CACD,GAAIO,GAAQsG,EAAO4F,WAAWlM,GAC5B,OAAO,EAITA,EAAOA,EAAKvN,YAAcuN,EAAKmM,WACxBnM,GAIb,OAAO,ECpBM,SAAStO,GAAiB1C,GACvC,OAAO+a,GAAU/a,GAAS0C,iBAAiB1C,GCD9B,SAASod,GAAepd,GACrC,MAAO,CAAC,QAAS,KAAM,MAAMyG,QAAQoU,GAAY7a,KAAa,ECDjD,SAASqd,GAAmBrd,GAEzC,QAASe,GAAUf,GAAWA,E
AAQib,cACtCjb,EAAQS,WAAasD,OAAOtD,UAAU2C,gBCDzB,SAASka,GAActd,GACpC,MAA6B,SAAzB6a,GAAY7a,GACPA,EAMPA,EAAQud,cACRvd,EAAQyD,aACR4X,GAAarb,GAAWA,EAAQmd,KAAO,OAEvCE,GAAmBrd,GCRvB,SAASwd,GAAoBxd,GAC3B,OAAKmb,GAAcnb,IACoB,UAAvC0C,GAAiB1C,GAAS+P,SAInB/P,EAAQyd,aAHN,KAwCI,SAASC,GAAgB1d,GAItC,IAHA,IAAI+D,EAASgX,GAAU/a,GACnByd,EAAeD,GAAoBxd,GAEhCyd,GAAgBL,GAAeK,IAA6D,WAA5C/a,GAAiB+a,GAAc1N,UACpF0N,EAAeD,GAAoBC,GAGrC,OAAIA,IAA+C,SAA9B5C,GAAY4C,IAA0D,SAA9B5C,GAAY4C,IAAwE,WAA5C/a,GAAiB+a,GAAc1N,UAC3HhM,EAGF0Z,GA5CT,SAA4Bzd,GAC1B,IAAI2d,GAAsE,IAA1DpK,UAAUqK,UAAUzb,cAAcsE,QAAQ,WAG1D,IAFuD,IAA5C8M,UAAUqK,UAAUnX,QAAQ,YAE3B0U,GAAcnb,IAII,UAFX0C,GAAiB1C,GAEnB+P,SACb,OAAO,KAMX,IAFA,IAAI8N,EAAcP,GAActd,GAEzBmb,GAAc0C,IAAgB,CAAC,OAAQ,QAAQpX,QAAQoU,GAAYgD,IAAgB,GAAG,CAC3F,IAAIC,EAAMpb,GAAiBmb,GAI3B,GAAsB,SAAlBC,EAAIC,WAA4C,SAApBD,EAAIE,aAA0C,UAAhBF,EAAIG,UAAiF,IAA1D,CAAC,YAAa,eAAexX,QAAQqX,EAAII,aAAsBP,GAAgC,WAAnBG,EAAII,YAA2BP,GAAaG,EAAI1O,QAAyB,SAAf0O,EAAI1O,OACjO,OAAOyO,EAEPA,EAAcA,EAAYpa,WAI9B,OAAO,KAiBgB0a,CAAmBne,IAAY+D,EC9DzC,SAASqa,GAAyBnE,GAC/C,MAAO,CAAC,MAAO,UAAUxT,QAAQwT,IAAc,EAAI,IAAM,ICDpD,IAAIrT,GAAMD,KAAKC,IACXC,GAAMF,KAAKE,IACXwX,GAAQ1X,KAAK0X,MCDT,SAASC,GAAOzX,EAAK/E,EAAO8E,GACzC,OAAO2X,GAAQ1X,EAAK2X,GAAQ1c,EAAO8E,ICDtB,SAAS6X,GAAmBC,GACzC,OAAOjd,OAAOoa,OAAO,GCDd,CACLlM,IAAK,EACL4J,MAAO,EACPD,OAAQ,EACRzJ,KAAM,GDHuC6O,GEFlC,SAASC,GAAgB7c,EAAOJ,GAC7C,OAAOA,EAAKqY,QAAO,SAAU6E,EAASpT,GAEpC,OADAoT,EAAQpT,GAAO1J,EACR8c,IACN,ICwFL,MAAAC,GAAe,CACbra,KAAM,QACN+W,SAAS,EACTC,MAAO,OACP7W,GA9EF,SAAe8W,GACb,IAAIqD,EAEApD,EAAQD,EAAKC,MACblX,EAAOiX,EAAKjX,KACZyX,EAAUR,EAAKQ,QACf8C,EAAerD,EAAMC,SAASS,MAC9B4C,EAAgBtD,EAAMuD,cAAcD,cACpCE,EAAgB1C,GAAiBd,EAAMzB,WACvCkF,EAAOf,GAAyBc,GAEhCjX,EADa,CAAC4H,GAAM0J,IAAO9S,QAAQyY,IAAkB,EAClC,SAAW,QAElC,GAAKH,GAAiBC,EAAtB,CAIA,IAAIN,EAxBgB,SAAyBU,EAAS1D,GAItD,OAAO+C,GAAsC,iBAH7CW,EAA6B,mBAAZA,EAAyBA,EAAQ3d,OAAOoa,OAAO,GAAIH,EAAM2D,MAAO,CAC/EpF,UAAWyB,EAAMzB,aACbmF,GACkDA,EAAUT,GAAgBS,EAAS3F,KAoBvE6F,CAAgBrD,EAAQmD,QAAS1D,GACjD6D,EAAYzC,GAAciC,GAC1BS,EAAmB,MAATL,EAAexP,GAAME,GAC/B4P,EAAmB,MAATN,EAAe7F,GAASC,GAClCmG,EAAUhE,EAAM2D,MAAMxF,UAAU5R,GAAOyT,EAAM2D,MAAMxF,UAAUsF,GAAQH,EAAcG,GAAQzD,EAAM2D,MAAMzF,OAAO3R,GAC9G0X,EAAYX,EAAcG,GAAQzD,EAAM2D,MAAMxF,UAAUsF,GACxDS,EAAoBlC,GAAgBqB,GACpCc,EAAaD,EAA6B,MAATT,EAAeS,EAAkBE,cAAgB,EAAIF,EAAkBG,aAAe,EAAI,EAC3HC,EAAoBN,EAAU,EAAIC,EAAY,EAG9C9Y,EAAM6X,EAAcc,GACpB5Y,EAAMiZ,EAAaN,EAAUtX,GAAOyW,EAAce,GAClDQ,EAASJ,EAAa,EAAIN,EAAUtX,GAAO,EAAI+X,EAC/CxQ,EAAS8O,GAAOzX,EAAKoZ,EAAQrZ,GAE7BsZ,EAAWf,EACfzD,EAAMuD,cAAcza,KAASsa,EAAwB,IAA0BoB,GAAY1Q,EAAQsP,EAAsBqB,aAAe3Q,EAASyQ,EAAQnB,KA6CzJhD,OA1CF,SAAgBC,GACd,IAAIL,EAAQK,EAAML,MAEd0E,EADUrE,EAAME,QACWjc,QAC3B+e,OAAoC,IAArBqB,EAA8B,sBAAwBA,EAErD,MAAhBrB,IAKwB,iBAAjBA,IACTA,EAAerD,EAAMC,SAAS/B,OAAOlZ,cAAcqe,MAahD/b,GAAS0Y,EAAMC,SAAS/B,OAAQmF,KAQrCrD,EAAMC,SAASS,MAAQ2C,IAUvBxC,SAAU,CAAC,iBACX8D,iBAAkB,CAAC,oBCnGN,SAASC,GAAarG,GACnC,OAAOA,EAAU3Z,MAAM,KAAK,GCQ9B,IAAIigB,GAAa,CACf5Q,IAAK,OACL4J,MAAO,OACPD,OAAQ,OACRzJ,KAAM,QAgBD,SAAS2Q,GAAYzE,GAC1B,IAAI0E,EAEA7G,EAASmC,EAAMnC,OACf8G,EAAa3E,EAAM2E,WACnBzG,EAAY8B,EAAM9B,UAClB0G,EAAY5E,EAAM4E,UAClBC,EAAU7E,EAAM6E,QAChB7Q,EAAWgM,EAAMhM,SACjB8Q,EAAkB9E,EAAM8E,gBACxBC,EAAW/E,EAAM+E,SACjBC,EAAehF,EAAMgF,aAErBC,GAAyB,IAAjBD,EAxBd,SAA2BtF,GACzB,IAAImB,EAAInB,EAAKmB,EACTC,EAAIpB,EAAKoB,EAEToE,EADMld,OACImd,kBAAoB,EAClC,MAAO,CACLtE,EAAGyB,GAAMA,GAAMzB,EAAIqE,GAAOA,IAAQ,EAClCpE,EAAGwB,GAAMA,GAAMxB,EAAIoE,GAAOA,IAAQ,GAiBAE,CAAkBP,GAAmC,mBAAjBG,EAA8BA,EAAaH,GAAWA,EAC1HQ,EAAUJ,EAAMpE,EAChBA,OAAgB,IAAZwE,EAAqB,EAAIA,EAC7BC,EAAUL,EAAMnE,EAChBA,OAAgB,IAAZwE,EAAqB,EAAIA,EAE7B
C,EAAOV,EAAQvE,eAAe,KAC9BkF,EAAOX,EAAQvE,eAAe,KAC9BmF,EAAQ3R,GACR4R,EAAQ9R,GACR+R,EAAM3d,OAEV,GAAI+c,EAAU,CACZ,IAAIrD,EAAeC,GAAgB9D,GAC/B+H,EAAa,eACbC,EAAY,cAEZnE,IAAiB1C,GAAUnB,IAGmB,WAA5ClX,GAFJ+a,EAAeJ,GAAmBzD,IAEC7J,UAAsC,aAAbA,IAC1D4R,EAAa,eACbC,EAAY,eAKhBnE,EAAeA,EAEXxD,IAActK,KAAQsK,IAAcpK,IAAQoK,IAAcV,IAAUoH,IAActL,MACpFoM,EAAQnI,GAERuD,GAAKY,EAAakE,GAAcjB,EAAW/D,OAC3CE,GAAKgE,EAAkB,GAAK,GAG1B5G,IAAcpK,KAASoK,IAActK,IAAOsK,IAAcX,IAAWqH,IAActL,MACrFmM,EAAQjI,GAERqD,GAAKa,EAAamE,GAAalB,EAAWhE,MAC1CE,GAAKiE,EAAkB,GAAK,GAIhC,IAKMgB,EALFC,EAAergB,OAAOoa,OAAO,CAC/B9L,SAAUA,GACT+Q,GAAYP,IAEf,OAAIM,EAGKpf,OAAOoa,OAAO,GAAIiG,IAAeD,EAAiB,IAAmBJ,GAASF,EAAO,IAAM,GAAIM,EAAeL,GAASF,EAAO,IAAM,GAAIO,EAAe9D,WAAa2D,EAAIR,kBAAoB,IAAM,EAAI,aAAetE,EAAI,OAASC,EAAI,MAAQ,eAAiBD,EAAI,OAASC,EAAI,SAAUgF,IAG5RpgB,OAAOoa,OAAO,GAAIiG,IAAerB,EAAkB,IAAoBgB,GAASF,EAAO1E,EAAI,KAAO,GAAI4D,EAAgBe,GAASF,EAAO1E,EAAI,KAAO,GAAI6D,EAAgB1C,UAAY,GAAI0C,IAuD9L,MAAAsB,GAAe,CACbvd,KAAM,gBACN+W,SAAS,EACTC,MAAO,cACP7W,GAxDF,SAAuBqd,GACrB,IAAItG,EAAQsG,EAAMtG,MACdO,EAAU+F,EAAM/F,QAChBgG,EAAwBhG,EAAQ4E,gBAChCA,OAA4C,IAA1BoB,GAA0CA,EAC5DC,EAAoBjG,EAAQ6E,SAC5BA,OAAiC,IAAtBoB,GAAsCA,EACjDC,EAAwBlG,EAAQ8E,aAChCA,OAAyC,IAA1BoB,GAA0CA,EAYzDL,EAAe,CACjB7H,UAAWuC,GAAiBd,EAAMzB,WAClC0G,UAAWL,GAAa5E,EAAMzB,WAC9BL,OAAQ8B,EAAMC,SAAS/B,OACvB8G,WAAYhF,EAAM2D,MAAMzF,OACxBiH,gBAAiBA,GAGsB,MAArCnF,EAAMuD,cAAcD,gBACtBtD,EAAME,OAAOhC,OAASnY,OAAOoa,OAAO,GAAIH,EAAME,OAAOhC,OAAQ4G,GAAY/e,OAAOoa,OAAO,GAAIiG,EAAc,CACvGlB,QAASlF,EAAMuD,cAAcD,cAC7BjP,SAAU2L,EAAMO,QAAQC,SACxB4E,SAAUA,EACVC,aAAcA,OAIe,MAA7BrF,EAAMuD,cAAc7C,QACtBV,EAAME,OAAOQ,MAAQ3a,OAAOoa,OAAO,GAAIH,EAAME,OAAOQ,MAAOoE,GAAY/e,OAAOoa,OAAO,GAAIiG,EAAc,CACrGlB,QAASlF,EAAMuD,cAAc7C,MAC7BrM,SAAU,WACV+Q,UAAU,EACVC,aAAcA,OAIlBrF,EAAMxM,WAAW0K,OAASnY,OAAOoa,OAAO,GAAIH,EAAMxM,WAAW0K,OAAQ,CACnE,wBAAyB8B,EAAMzB,aAUjC/L,KAAM,IC1JR,IAAIkU,GAAU,CACZA,SAAS,GAsCX,MAAAC,GAAe,CACb7d,KAAM,iBACN+W,SAAS,EACTC,MAAO,QACP7W,GAAI,aACJmX,OAxCF,SAAgBL,GACd,IAAIC,EAAQD,EAAKC,MACb1P,EAAWyP,EAAKzP,SAChBiQ,EAAUR,EAAKQ,QACfqG,EAAkBrG,EAAQsG,OAC1BA,OAA6B,IAApBD,GAAoCA,EAC7CE,EAAkBvG,EAAQwG,OAC1BA,OAA6B,IAApBD,GAAoCA,EAC7Cze,EAASgX,GAAUW,EAAMC,SAAS/B,QAClC8I,EAAgB,GAAGtS,OAAOsL,EAAMgH,cAAc7I,UAAW6B,EAAMgH,cAAc9I,QAYjF,OAVI2I,GACFG,EAAc/gB,SAAQ,SAAUghB,GAC9BA,EAAa3d,iBAAiB,SAAUgH,EAAS4W,OAAQR,OAIzDK,GACF1e,EAAOiB,iBAAiB,SAAUgH,EAAS4W,OAAQR,IAG9C,WACDG,GACFG,EAAc/gB,SAAQ,SAAUghB,GAC9BA,EAAa1c,oBAAoB,SAAU+F,EAAS4W,OAAQR,OAI5DK,GACF1e,EAAOkC,oBAAoB,SAAU+F,EAAS4W,OAAQR,MAY1DlU,KAAM,IC/CR,IAAI2U,GAAO,CACThT,KAAM,QACN0J,MAAO,OACPD,OAAQ,MACR3J,IAAK,UAEQ,SAASmT,GAAqB7I,GAC3C,OAAOA,EAAU9Q,QAAQ,0BAA0B,SAAU4Z,GAC3D,OAAOF,GAAKE,MCRhB,IAAIF,GAAO,CACT5N,MAAO,MACPI,IAAK,SAEQ,SAAS2N,GAA8B/I,GACpD,OAAOA,EAAU9Q,QAAQ,cAAc,SAAU4Z,GAC/C,OAAOF,GAAKE,MCLD,SAASE,GAAgBjI,GACtC,IAAI0G,EAAM3G,GAAUC,GAGpB,MAAO,CACLkI,WAHexB,EAAI5R,YAInBqT,UAHczB,EAAI9R,aCDP,SAASwT,GAAoBpjB,GAQ1C,OAAO0P,GAAsB2N,GAAmBrd,IAAU6P,KAAOoT,GAAgBjjB,GAASkjB,WCV7E,SAASG,GAAerjB,GAErC,IAAIsjB,EAAoB5gB,GAAiB1C,GACrCujB,EAAWD,EAAkBC,SAC7BC,EAAYF,EAAkBE,UAC9BC,EAAYH,EAAkBG,UAElC,MAAO,6BAA6BphB,KAAKkhB,EAAWE,EAAYD,GCJnD,SAASE,GAAgB1I,GACtC,MAAI,CAAC,OAAQ,OAAQ,aAAavU,QAAQoU,GAAYG,KAAU,EAEvDA,EAAKC,cAAcjX,KAGxBmX,GAAcH,IAASqI,GAAerI,GACjCA,EAGF0I,GAAgBpG,GAActC,ICHxB,SAAS2I,GAAkB3jB,EAASoG,GACjD,IAAIwd,OAES,IAATxd,IACFA,EAAO,IAGT,IAAIuc,EAAee,GAAgB1jB,GAC/B6jB,EAASlB,KAAqE,OAAlDiB,EAAwB5jB,EAAQib,oBAAyB,EAAS2I,EAAsB5f,MACpH0d,EAAM3G,GAAU4H,GAChB3c,EAAS6d,EAAS,CAACnC,GAAKtR,OAAOsR,EAAIoC,gBAAkB,GAAIT,GAAeV,GAAgBA,EAAe,IAAMA,EAC7GoB,EAAc3d,EAAKgK,OAAOpK,GAC9B,OAAO6d,EAASE,EAChBA,EAAY3T,OAAO
uT,GAAkBrG,GAActX,KCxBtC,SAASge,GAAiBvU,GACvC,OAAOhO,OAAOoa,OAAO,GAAIpM,EAAM,CAC7BI,KAAMJ,EAAKmN,EACXjN,IAAKF,EAAKoN,EACVtD,MAAO9J,EAAKmN,EAAInN,EAAKiN,MACrBpD,OAAQ7J,EAAKoN,EAAIpN,EAAKkN,SCuB1B,SAASsH,GAA2BjkB,EAASkkB,GAC3C,OAAOA,IAAmBvK,GAAWqK,GC1BxB,SAAyBhkB,GACtC,IAAI0hB,EAAM3G,GAAU/a,GAChBmkB,EAAO9G,GAAmBrd,GAC1B8jB,EAAiBpC,EAAIoC,eACrBpH,EAAQyH,EAAKpE,YACbpD,EAASwH,EAAKrE,aACdlD,EAAI,EACJC,EAAI,EAuBR,OAjBIiH,IACFpH,EAAQoH,EAAepH,MACvBC,EAASmH,EAAenH,OASnB,iCAAiCta,KAAKkR,UAAUqK,aACnDhB,EAAIkH,EAAe7T,WACnB4M,EAAIiH,EAAe9T,YAIhB,CACL0M,MAAOA,EACPC,OAAQA,EACRC,EAAGA,EAAIwG,GAAoBpjB,GAC3B6c,EAAGA,GDRiDuH,CAAgBpkB,IAAYmb,GAAc+I,GAdlG,SAAoClkB,GAClC,IAAIyP,EAAOC,GAAsB1P,GASjC,OARAyP,EAAKE,IAAMF,EAAKE,IAAM3P,EAAQqkB,UAC9B5U,EAAKI,KAAOJ,EAAKI,KAAO7P,EAAQskB,WAChC7U,EAAK6J,OAAS7J,EAAKE,IAAM3P,EAAQ8f,aACjCrQ,EAAK8J,MAAQ9J,EAAKI,KAAO7P,EAAQ+f,YACjCtQ,EAAKiN,MAAQ1c,EAAQ+f,YACrBtQ,EAAKkN,OAAS3c,EAAQ8f,aACtBrQ,EAAKmN,EAAInN,EAAKI,KACdJ,EAAKoN,EAAIpN,EAAKE,IACPF,EAI2G8U,CAA2BL,GAAkBF,GEtBlJ,SAAyBhkB,GACtC,IAAI4jB,EAEAO,EAAO9G,GAAmBrd,GAC1BwkB,EAAYvB,GAAgBjjB,GAC5BgE,EAA0D,OAAlD4f,EAAwB5jB,EAAQib,oBAAyB,EAAS2I,EAAsB5f,KAChG0Y,EAAQ9V,GAAIud,EAAKM,YAAaN,EAAKpE,YAAa/b,EAAOA,EAAKygB,YAAc,EAAGzgB,EAAOA,EAAK+b,YAAc,GACvGpD,EAAS/V,GAAIud,EAAKO,aAAcP,EAAKrE,aAAc9b,EAAOA,EAAK0gB,aAAe,EAAG1gB,EAAOA,EAAK8b,aAAe,GAC5GlD,GAAK4H,EAAUtB,WAAaE,GAAoBpjB,GAChD6c,GAAK2H,EAAUrB,UAMnB,MAJiD,QAA7CzgB,GAAiBsB,GAAQmgB,GAAMvP,YACjCgI,GAAKhW,GAAIud,EAAKpE,YAAa/b,EAAOA,EAAK+b,YAAc,GAAKrD,GAGrD,CACLA,MAAOA,EACPC,OAAQA,EACRC,EAAGA,EACHC,EAAGA,GFG2K8H,CAAgBtH,GAAmBrd,KGzBtM,SAAS4kB,GAAenJ,GACrC,IAOImF,EAPA/G,EAAY4B,EAAK5B,UACjB7Z,EAAUyb,EAAKzb,QACfia,EAAYwB,EAAKxB,UACjBiF,EAAgBjF,EAAYuC,GAAiBvC,GAAa,KAC1D0G,EAAY1G,EAAYqG,GAAarG,GAAa,KAClD4K,EAAUhL,EAAU+C,EAAI/C,EAAU6C,MAAQ,EAAI1c,EAAQ0c,MAAQ,EAC9DoI,EAAUjL,EAAUgD,EAAIhD,EAAU8C,OAAS,EAAI3c,EAAQ2c,OAAS,EAGpE,OAAQuC,GACN,KAAKvP,GACHiR,EAAU,CACRhE,EAAGiI,EACHhI,EAAGhD,EAAUgD,EAAI7c,EAAQ2c,QAE3B,MAEF,KAAKrD,GACHsH,EAAU,CACRhE,EAAGiI,EACHhI,EAAGhD,EAAUgD,EAAIhD,EAAU8C,QAE7B,MAEF,KAAKpD,GACHqH,EAAU,CACRhE,EAAG/C,EAAU+C,EAAI/C,EAAU6C,MAC3BG,EAAGiI,GAEL,MAEF,KAAKjV,GACH+Q,EAAU,CACRhE,EAAG/C,EAAU+C,EAAI5c,EAAQ0c,MACzBG,EAAGiI,GAEL,MAEF,QACElE,EAAU,CACRhE,EAAG/C,EAAU+C,EACbC,EAAGhD,EAAUgD,GAInB,IAAIkI,EAAW7F,EAAgBd,GAAyBc,GAAiB,KAEzE,GAAgB,MAAZ6F,EAAkB,CACpB,IAAI9c,EAAmB,MAAb8c,EAAmB,SAAW,QAExC,OAAQpE,GACN,KAAK1L,GACH2L,EAAQmE,GAAYnE,EAAQmE,IAAalL,EAAU5R,GAAO,EAAIjI,EAAQiI,GAAO,GAC7E,MAEF,KAAKoN,GACHuL,EAAQmE,GAAYnE,EAAQmE,IAAalL,EAAU5R,GAAO,EAAIjI,EAAQiI,GAAO,IAOnF,OAAO2Y,EC1DM,SAASoE,GAAetJ,EAAOO,QAC5B,IAAZA,IACFA,EAAU,IAGZ,IAAIgJ,EAAWhJ,EACXiJ,EAAqBD,EAAShL,UAC9BA,OAAmC,IAAvBiL,EAAgCxJ,EAAMzB,UAAYiL,EAC9DC,EAAoBF,EAASG,SAC7BA,OAAiC,IAAtBD,EAA+BzL,GAAkByL,EAC5DE,EAAwBJ,EAASK,aACjCA,OAAyC,IAA1BD,EAAmC1L,GAAW0L,EAC7DE,EAAwBN,EAASO,eACjCA,OAA2C,IAA1BD,EAAmC3L,GAAS2L,EAC7DE,EAAuBR,EAASS,YAChCA,OAAuC,IAAzBD,GAA0CA,EACxDE,EAAmBV,EAAS7F,QAC5BA,OAA+B,IAArBuG,EAA8B,EAAIA,EAC5CjH,EAAgBD,GAAsC,iBAAZW,EAAuBA,EAAUT,GAAgBS,EAAS3F,KACpGmM,EAAaJ,IAAmB5L,GAASC,GAAYD,GACrD8G,EAAahF,EAAM2D,MAAMzF,OACzB5Z,EAAU0b,EAAMC,SAAS+J,EAAcE,EAAaJ,GACpDK,EJoBS,SAAyB7lB,EAASolB,EAAUE,GACzD,IAAIQ,EAAmC,oBAAbV,EAlB5B,SAA4BplB,GAC1B,IAAI0Z,EAAkBiK,GAAkBrG,GAActd,IAElD+lB,EADoB,CAAC,WAAY,SAAStf,QAAQ/D,GAAiB1C,GAAS+P,WAAa,GACnDoL,GAAcnb,GAAW0d,GAAgB1d,GAAWA,EAE9F,OAAKe,GAAUglB,GAKRrM,EAAgBtK,QAAO,SAAU8U,GACtC,OAAOnjB,GAAUmjB,IAAmBlhB,GAASkhB,EAAgB6B,IAAmD,SAAhClL,GAAYqJ,MALrF,GAYkD8B,CAAmBhmB,GAAW,GAAGoQ,OAAOgV,GAC/F1L,EAAkB,GAAGtJ,OAAO0V,EAAqB,CAACR,IAClDW,EAAsBvM,EAAgB,GACtCwM,EAAexM,EAAgBK,QAAO,SAAUoM,EAASjC,GAC3D,IAAIzU,
EAAOwU,GAA2BjkB,EAASkkB,GAK/C,OAJAiC,EAAQxW,IAAM/I,GAAI6I,EAAKE,IAAKwW,EAAQxW,KACpCwW,EAAQ5M,MAAQ1S,GAAI4I,EAAK8J,MAAO4M,EAAQ5M,OACxC4M,EAAQ7M,OAASzS,GAAI4I,EAAK6J,OAAQ6M,EAAQ7M,QAC1C6M,EAAQtW,KAAOjJ,GAAI6I,EAAKI,KAAMsW,EAAQtW,MAC/BsW,IACNlC,GAA2BjkB,EAASimB,IAKvC,OAJAC,EAAaxJ,MAAQwJ,EAAa3M,MAAQ2M,EAAarW,KACvDqW,EAAavJ,OAASuJ,EAAa5M,OAAS4M,EAAavW,IACzDuW,EAAatJ,EAAIsJ,EAAarW,KAC9BqW,EAAarJ,EAAIqJ,EAAavW,IACvBuW,EIpCkBE,CAAgBrlB,GAAUf,GAAWA,EAAUA,EAAQqmB,gBAAkBhJ,GAAmB3B,EAAMC,SAAS/B,QAASwL,EAAUE,GACnJgB,EAAsB5W,GAAsBgM,EAAMC,SAAS9B,WAC3DmF,EAAgB4F,GAAe,CACjC/K,UAAWyM,EACXtmB,QAAS0gB,EACTxE,SAAU,WACVjC,UAAWA,IAETsM,EAAmBvC,GAAiBviB,OAAOoa,OAAO,GAAI6E,EAAY1B,IAClEwH,EAAoBhB,IAAmB5L,GAAS2M,EAAmBD,EAGnEG,EAAkB,CACpB9W,IAAKkW,EAAmBlW,IAAM6W,EAAkB7W,IAAM+O,EAAc/O,IACpE2J,OAAQkN,EAAkBlN,OAASuM,EAAmBvM,OAASoF,EAAcpF,OAC7EzJ,KAAMgW,EAAmBhW,KAAO2W,EAAkB3W,KAAO6O,EAAc7O,KACvE0J,MAAOiN,EAAkBjN,MAAQsM,EAAmBtM,MAAQmF,EAAcnF,OAExEmN,EAAahL,EAAMuD,cAAczP,OAErC,GAAIgW,IAAmB5L,IAAU8M,EAAY,CAC3C,IAAIlX,EAASkX,EAAWzM,GACxBxY,OAAOC,KAAK+kB,GAAiB9kB,SAAQ,SAAU6J,GAC7C,IAAImb,EAAW,CAACpN,GAAOD,IAAQ7S,QAAQ+E,IAAQ,EAAI,GAAK,EACpD2T,EAAO,CAACxP,GAAK2J,IAAQ7S,QAAQ+E,IAAQ,EAAI,IAAM,IACnDib,EAAgBjb,IAAQgE,EAAO2P,GAAQwH,KAI3C,OAAOF,ECzDM,SAASG,GAAqBlL,EAAOO,QAClC,IAAZA,IACFA,EAAU,IAGZ,IAAIgJ,EAAWhJ,EACXhC,EAAYgL,EAAShL,UACrBmL,EAAWH,EAASG,SACpBE,EAAeL,EAASK,aACxBlG,EAAU6F,EAAS7F,QACnByH,EAAiB5B,EAAS4B,eAC1BC,EAAwB7B,EAAS8B,sBACjCA,OAAkD,IAA1BD,EAAmCE,GAAgBF,EAC3EnG,EAAYL,GAAarG,GACzBC,EAAayG,EAAYkG,EAAiB/M,GAAsBA,GAAoB1K,QAAO,SAAU6K,GACvG,OAAOqG,GAAarG,KAAe0G,KAChClH,GACDwN,EAAoB/M,EAAW9K,QAAO,SAAU6K,GAClD,OAAO8M,EAAsBtgB,QAAQwT,IAAc,KAGpB,IAA7BgN,EAAkB7lB,SACpB6lB,EAAoB/M,GAQtB,IAAIgN,EAAYD,EAAkBlN,QAAO,SAAUC,EAAKC,GAOtD,OANAD,EAAIC,GAAa+K,GAAetJ,EAAO,CACrCzB,UAAWA,EACXmL,SAAUA,EACVE,aAAcA,EACdlG,QAASA,IACR5C,GAAiBvC,IACbD,IACN,IACH,OAAOvY,OAAOC,KAAKwlB,GAAWC,MAAK,SAAUC,EAAGC,GAC9C,OAAOH,EAAUE,GAAKF,EAAUG,MC6FpC,MAAAC,GAAe,CACb9iB,KAAM,OACN+W,SAAS,EACTC,MAAO,OACP7W,GA5HF,SAAc8W,GACZ,IAAIC,EAAQD,EAAKC,MACbO,EAAUR,EAAKQ,QACfzX,EAAOiX,EAAKjX,KAEhB,IAAIkX,EAAMuD,cAAcza,GAAM+iB,MAA9B,CAoCA,IAhCA,IAAIC,EAAoBvL,EAAQ8I,SAC5B0C,OAAsC,IAAtBD,GAAsCA,EACtDE,EAAmBzL,EAAQ0L,QAC3BC,OAAoC,IAArBF,GAAqCA,EACpDG,EAA8B5L,EAAQ6L,mBACtC1I,EAAUnD,EAAQmD,QAClBgG,EAAWnJ,EAAQmJ,SACnBE,EAAerJ,EAAQqJ,aACvBI,EAAczJ,EAAQyJ,YACtBqC,EAAwB9L,EAAQ4K,eAChCA,OAA2C,IAA1BkB,GAA0CA,EAC3DhB,EAAwB9K,EAAQ8K,sBAChCiB,EAAqBtM,EAAMO,QAAQhC,UACnCiF,EAAgB1C,GAAiBwL,GAEjCF,EAAqBD,IADH3I,IAAkB8I,GACqCnB,EAjC/E,SAAuC5M,GACrC,GAAIuC,GAAiBvC,KAAeT,GAClC,MAAO,GAGT,IAAIyO,EAAoBnF,GAAqB7I,GAC7C,MAAO,CAAC+I,GAA8B/I,GAAYgO,EAAmBjF,GAA8BiF,IA2BwCC,CAA8BF,GAA3E,CAAClF,GAAqBkF,KAChH9N,EAAa,CAAC8N,GAAoB5X,OAAO0X,GAAoB/N,QAAO,SAAUC,EAAKC,GACrF,OAAOD,EAAI5J,OAAOoM,GAAiBvC,KAAeT,GAAOoN,GAAqBlL,EAAO,CACnFzB,UAAWA,EACXmL,SAAUA,EACVE,aAAcA,EACdlG,QAASA,EACTyH,eAAgBA,EAChBE,sBAAuBA,IACpB9M,KACJ,IACCkO,EAAgBzM,EAAM2D,MAAMxF,UAC5B6G,EAAahF,EAAM2D,MAAMzF,OACzBwO,EAAY,IAAIvc,IAChBwc,GAAqB,EACrBC,EAAwBpO,EAAW,GAE9BlS,EAAI,EAAGA,EAAIkS,EAAW9Y,OAAQ4G,IAAK,CAC1C,IAAIiS,EAAYC,EAAWlS,GAEvBugB,EAAiB/L,GAAiBvC,GAElCuO,EAAmBlI,GAAarG,KAAehF,GAC/CwT,EAAa,CAAC9Y,GAAK2J,IAAQ7S,QAAQ8hB,IAAmB,EACtDtgB,EAAMwgB,EAAa,QAAU,SAC7BlF,EAAWyB,GAAetJ,EAAO,CACnCzB,UAAWA,EACXmL,SAAUA,EACVE,aAAcA,EACdI,YAAaA,EACbtG,QAASA,IAEPsJ,EAAoBD,EAAaD,EAAmBjP,GAAQ1J,GAAO2Y,EAAmBlP,GAAS3J,GAE/FwY,EAAclgB,GAAOyY,EAAWzY,KAClCygB,EAAoB5F,GAAqB4F,IAG3C,IAAIC,EAAmB7F,GAAqB4F,GACxCE,EAAS,GAUb,GARInB,GACFmB,EAAO3jB,KAAKse,EAASgF,IAAmB,GAGtCX,GACFgB,EAAO3jB,KAAKse,EAASmF,IAAsB,EAAGnF,EAASoF,IAAqB,GAG1EC,EAAOC,OAAM,SAAUC,GACzB,OAAOA,KACL
,CACFR,EAAwBrO,EACxBoO,GAAqB,EACrB,MAGFD,EAAUrc,IAAIkO,EAAW2O,GAG3B,GAAIP,EAqBF,IAnBA,IAEIU,EAAQ,SAAeC,GACzB,IAAIC,EAAmB/O,EAAW/J,MAAK,SAAU8J,GAC/C,IAAI2O,EAASR,EAAU1c,IAAIuO,GAE3B,GAAI2O,EACF,OAAOA,EAAOre,MAAM,EAAGye,GAAIH,OAAM,SAAUC,GACzC,OAAOA,QAKb,GAAIG,EAEF,OADAX,EAAwBW,EACjB,SAIFD,EAnBYnC,EAAiB,EAAI,EAmBZmC,EAAK,GAGpB,UAFFD,EAAMC,GADmBA,KAOpCtN,EAAMzB,YAAcqO,IACtB5M,EAAMuD,cAAcza,GAAM+iB,OAAQ,EAClC7L,EAAMzB,UAAYqO,EAClB5M,EAAMwN,OAAQ,KAUhB7I,iBAAkB,CAAC,UACnBnS,KAAM,CACJqZ,OAAO,IC7IX,SAAS4B,GAAe5F,EAAU9T,EAAM2Z,GAQtC,YAPyB,IAArBA,IACFA,EAAmB,CACjBxM,EAAG,EACHC,EAAG,IAIA,CACLlN,IAAK4T,EAAS5T,IAAMF,EAAKkN,OAASyM,EAAiBvM,EACnDtD,MAAOgK,EAAShK,MAAQ9J,EAAKiN,MAAQ0M,EAAiBxM,EACtDtD,OAAQiK,EAASjK,OAAS7J,EAAKkN,OAASyM,EAAiBvM,EACzDhN,KAAM0T,EAAS1T,KAAOJ,EAAKiN,MAAQ0M,EAAiBxM,GAIxD,SAASyM,GAAsB9F,GAC7B,MAAO,CAAC5T,GAAK4J,GAAOD,GAAQzJ,IAAMyZ,MAAK,SAAUC,GAC/C,OAAOhG,EAASgG,IAAS,KAiC7B,MAAAC,GAAe,CACbhlB,KAAM,OACN+W,SAAS,EACTC,MAAO,OACP6E,iBAAkB,CAAC,mBACnB1b,GAlCF,SAAc8W,GACZ,IAAIC,EAAQD,EAAKC,MACblX,EAAOiX,EAAKjX,KACZ2jB,EAAgBzM,EAAM2D,MAAMxF,UAC5B6G,EAAahF,EAAM2D,MAAMzF,OACzBwP,EAAmB1N,EAAMuD,cAAcwK,gBACvCC,EAAoB1E,GAAetJ,EAAO,CAC5C8J,eAAgB,cAEdmE,EAAoB3E,GAAetJ,EAAO,CAC5CgK,aAAa,IAEXkE,EAA2BT,GAAeO,EAAmBvB,GAC7D0B,EAAsBV,GAAeQ,EAAmBjJ,EAAY0I,GACpEU,EAAoBT,GAAsBO,GAC1CG,EAAmBV,GAAsBQ,GAC7CnO,EAAMuD,cAAcza,GAAQ,CAC1BolB,yBAA0BA,EAC1BC,oBAAqBA,EACrBC,kBAAmBA,EACnBC,iBAAkBA,GAEpBrO,EAAMxM,WAAW0K,OAASnY,OAAOoa,OAAO,GAAIH,EAAMxM,WAAW0K,OAAQ,CACnE,+BAAgCkQ,EAChC,sBAAuBC,MCH3BC,GAAe,CACbxlB,KAAM,SACN+W,SAAS,EACTC,MAAO,OACPe,SAAU,CAAC,iBACX5X,GA5BF,SAAgBoX,GACd,IAAIL,EAAQK,EAAML,MACdO,EAAUF,EAAME,QAChBzX,EAAOuX,EAAMvX,KACbylB,EAAkBhO,EAAQzM,OAC1BA,OAA6B,IAApBya,EAA6B,CAAC,EAAG,GAAKA,EAC/C/b,EAAOgM,GAAWH,QAAO,SAAUC,EAAKC,GAE1C,OADAD,EAAIC,GA5BD,SAAiCA,EAAWoF,EAAO7P,GACxD,IAAI0P,EAAgB1C,GAAiBvC,GACjCiQ,EAAiB,CAACra,GAAMF,IAAKlJ,QAAQyY,IAAkB,GAAK,EAAI,EAEhEzD,EAAyB,mBAAXjM,EAAwBA,EAAO/N,OAAOoa,OAAO,GAAIwD,EAAO,CACxEpF,UAAWA,KACPzK,EACF2a,EAAW1O,EAAK,GAChB2O,EAAW3O,EAAK,GAIpB,OAFA0O,EAAWA,GAAY,EACvBC,GAAYA,GAAY,GAAKF,EACtB,CAACra,GAAM0J,IAAO9S,QAAQyY,IAAkB,EAAI,CACjDtC,EAAGwN,EACHvN,EAAGsN,GACD,CACFvN,EAAGuN,EACHtN,EAAGuN,GAWcC,CAAwBpQ,EAAWyB,EAAM2D,MAAO7P,GAC1DwK,IACN,IACCsQ,EAAwBpc,EAAKwN,EAAMzB,WACnC2C,EAAI0N,EAAsB1N,EAC1BC,EAAIyN,EAAsBzN,EAEW,MAArCnB,EAAMuD,cAAcD,gBACtBtD,EAAMuD,cAAcD,cAAcpC,GAAKA,EACvClB,EAAMuD,cAAcD,cAAcnC,GAAKA,GAGzCnB,EAAMuD,cAAcza,GAAQ0J,ICxB9Bqc,GAAe,CACb/lB,KAAM,gBACN+W,SAAS,EACTC,MAAO,OACP7W,GApBF,SAAuB8W,GACrB,IAAIC,EAAQD,EAAKC,MACblX,EAAOiX,EAAKjX,KAKhBkX,EAAMuD,cAAcza,GAAQogB,GAAe,CACzC/K,UAAW6B,EAAM2D,MAAMxF,UACvB7Z,QAAS0b,EAAM2D,MAAMzF,OACrBsC,SAAU,WACVjC,UAAWyB,EAAMzB,aAUnB/L,KAAM,IC6FRsc,GAAe,CACbhmB,KAAM,kBACN+W,SAAS,EACTC,MAAO,OACP7W,GA5GF,SAAyB8W,GACvB,IAAIC,EAAQD,EAAKC,MACbO,EAAUR,EAAKQ,QACfzX,EAAOiX,EAAKjX,KACZgjB,EAAoBvL,EAAQ8I,SAC5B0C,OAAsC,IAAtBD,GAAsCA,EACtDE,EAAmBzL,EAAQ0L,QAC3BC,OAAoC,IAArBF,GAAsCA,EACrDtC,EAAWnJ,EAAQmJ,SACnBE,EAAerJ,EAAQqJ,aACvBI,EAAczJ,EAAQyJ,YACtBtG,EAAUnD,EAAQmD,QAClBqL,EAAkBxO,EAAQyO,OAC1BA,OAA6B,IAApBD,GAAoCA,EAC7CE,EAAwB1O,EAAQ2O,aAChCA,OAAyC,IAA1BD,EAAmC,EAAIA,EACtDpH,EAAWyB,GAAetJ,EAAO,CACnC0J,SAAUA,EACVE,aAAcA,EACdlG,QAASA,EACTsG,YAAaA,IAEXxG,EAAgB1C,GAAiBd,EAAMzB,WACvC0G,EAAYL,GAAa5E,EAAMzB,WAC/B4Q,GAAmBlK,EACnBoE,EAAW3G,GAAyBc,GACpCyI,ECrCY,MDqCS5C,ECrCH,IAAM,IDsCxB/F,EAAgBtD,EAAMuD,cAAcD,cACpCmJ,EAAgBzM,EAAM2D,MAAMxF,UAC5B6G,EAAahF,EAAM2D,MAAMzF,OACzBkR,EAA4C,mBAAjBF,EAA8BA,EAAanpB,OAAOoa,OAAO,GAAIH,EAAM2D,MAAO,CACvGpF,UAAWyB,EAAMzB,aACb2Q,EACF1c,EAAO,CACT0O,EAAG,EACHC,EAAG,GAGL,GAAKmC,EAAL,CAIA,GAAIyI,GAAiBG,EAAc,CACjC,IAAImD,EA
AwB,MAAbhG,EAAmBpV,GAAME,GACpCmb,EAAuB,MAAbjG,EAAmBzL,GAASC,GACtCtR,EAAmB,MAAb8c,EAAmB,SAAW,QACpCvV,EAASwP,EAAc+F,GACvBle,EAAMmY,EAAc+F,GAAYxB,EAASwH,GACzCnkB,EAAMoY,EAAc+F,GAAYxB,EAASyH,GACzCC,EAAWP,GAAUhK,EAAWzY,GAAO,EAAI,EAC3CijB,EAASvK,IAAc1L,GAAQkT,EAAclgB,GAAOyY,EAAWzY,GAC/DkjB,EAASxK,IAAc1L,IAASyL,EAAWzY,IAAQkgB,EAAclgB,GAGjE8W,EAAerD,EAAMC,SAASS,MAC9BmD,EAAYmL,GAAU3L,EAAejC,GAAciC,GAAgB,CACrErC,MAAO,EACPC,OAAQ,GAENyO,EAAqB1P,EAAMuD,cAAc,oBAAsBvD,EAAMuD,cAAc,oBAAoBG,QxBtEtG,CACLzP,IAAK,EACL4J,MAAO,EACPD,OAAQ,EACRzJ,KAAM,GwBmEFwb,EAAkBD,EAAmBL,GACrCO,EAAkBF,EAAmBJ,GAMrCO,EAAWjN,GAAO,EAAG6J,EAAclgB,GAAMsX,EAAUtX,IACnDujB,EAAYX,EAAkB1C,EAAclgB,GAAO,EAAIgjB,EAAWM,EAAWF,EAAkBP,EAAoBI,EAASK,EAAWF,EAAkBP,EACzJW,EAAYZ,GAAmB1C,EAAclgB,GAAO,EAAIgjB,EAAWM,EAAWD,EAAkBR,EAAoBK,EAASI,EAAWD,EAAkBR,EAC1JlL,EAAoBlE,EAAMC,SAASS,OAASsB,GAAgBhC,EAAMC,SAASS,OAC3EsP,EAAe9L,EAAiC,MAAbmF,EAAmBnF,EAAkByE,WAAa,EAAIzE,EAAkB0E,YAAc,EAAI,EAC7HqH,EAAsBjQ,EAAMuD,cAAczP,OAASkM,EAAMuD,cAAczP,OAAOkM,EAAMzB,WAAW8K,GAAY,EAC3G6G,EAAY5M,EAAc+F,GAAYyG,EAAYG,EAAsBD,EACxEG,EAAY7M,EAAc+F,GAAY0G,EAAYE,EAEtD,GAAIlE,EAAe,CACjB,IAAIqE,EAAkBxN,GAAOoM,EAASlM,GAAQ3X,EAAK+kB,GAAa/kB,EAAK2I,EAAQkb,EAASnM,GAAQ3X,EAAKilB,GAAajlB,GAChHoY,EAAc+F,GAAY+G,EAC1B5d,EAAK6W,GAAY+G,EAAkBtc,EAGrC,GAAIoY,EAAc,CAChB,IAAImE,EAAyB,MAAbhH,EAAmBpV,GAAME,GAErCmc,EAAwB,MAAbjH,EAAmBzL,GAASC,GAEvC0S,EAAUjN,EAAc2I,GAExBuE,EAAOD,EAAU1I,EAASwI,GAE1BI,GAAOF,EAAU1I,EAASyI,GAE1BI,GAAmB9N,GAAOoM,EAASlM,GAAQ0N,EAAMN,GAAaM,EAAMD,EAASvB,EAASnM,GAAQ4N,GAAMN,GAAaM,IAErHnN,EAAc2I,GAAWyE,GACzBle,EAAKyZ,GAAWyE,GAAmBH,GAIvCvQ,EAAMuD,cAAcza,GAAQ0J,IAS5BmS,iBAAkB,CAAC,WExGN,SAASgM,GAAiBC,EAAyB7O,EAAc8O,QAC9D,IAAZA,IACFA,GAAU,GAGZ,IAAIC,EAA0BrR,GAAcsC,GACjBtC,GAAcsC,IAf3C,SAAyBzd,GACvB,IAAIyP,EAAOzP,EAAQ0P,wBACND,EAAKiN,MAAQ1c,EAAQgd,YACrBvN,EAAKkN,OAAS3c,EAAQ4D,aAYuB6oB,CAAgBhP,GAC1E,ICpBoCzC,ECJOhb,EFwBvCoD,EAAkBia,GAAmBI,GACrChO,EAAOC,GAAsB4c,GAC7B/J,EAAS,CACXW,WAAY,EACZC,UAAW,GAETvC,EAAU,CACZhE,EAAG,EACHC,EAAG,GAkBL,OAfI2P,IAA4BA,IAA4BD,MACxB,SAA9B1R,GAAY4C,IAChB4F,GAAejgB,MACbmf,GClCgCvH,EDkCTyC,KCjCd1C,GAAUC,IAAUG,GAAcH,GCJxC,CACLkI,YAFyCljB,EDQbgb,GCNRkI,WACpBC,UAAWnjB,EAAQmjB,WDGZF,GAAgBjI,IDmCnBG,GAAcsC,KAChBmD,EAAUlR,GAAsB+N,IACxBb,GAAKa,EAAa6G,WAC1B1D,EAAQ/D,GAAKY,EAAa4G,WACjBjhB,IACTwd,EAAQhE,EAAIwG,GAAoBhgB,KAI7B,CACLwZ,EAAGnN,EAAKI,KAAO0S,EAAOW,WAAatC,EAAQhE,EAC3CC,EAAGpN,EAAKE,IAAM4S,EAAOY,UAAYvC,EAAQ/D,EACzCH,MAAOjN,EAAKiN,MACZC,OAAQlN,EAAKkN,QGpDjB,SAASnI,GAAMkY,GACb,IAAItb,EAAM,IAAIvF,IACV8gB,EAAU,IAAInlB,IACdolB,EAAS,GAKb,SAASzF,EAAK0F,GACZF,EAAQnX,IAAIqX,EAASroB,MACN,GAAG4L,OAAOyc,EAAStQ,UAAY,GAAIsQ,EAASxM,kBAAoB,IACtE1e,SAAQ,SAAUmrB,GACzB,IAAKH,EAAQjkB,IAAIokB,GAAM,CACrB,IAAIC,EAAc3b,EAAI1F,IAAIohB,GAEtBC,GACF5F,EAAK4F,OAIXH,EAAO3nB,KAAK4nB,GASd,OAzBAH,EAAU/qB,SAAQ,SAAUkrB,GAC1Bzb,EAAIrF,IAAI8gB,EAASroB,KAAMqoB,MAkBzBH,EAAU/qB,SAAQ,SAAUkrB,GACrBF,EAAQjkB,IAAImkB,EAASroB,OAExB2iB,EAAK0F,MAGFD,ECfT,IAAII,GAAkB,CACpB/S,UAAW,SACXyS,UAAW,GACXxQ,SAAU,YAGZ,SAAS+Q,KACP,IAAK,IAAIC,EAAOC,UAAU/rB,OAAQsJ,EAAO,IAAI2B,MAAM6gB,GAAOE,EAAO,EAAGA,EAAOF,EAAME,IAC/E1iB,EAAK0iB,GAAQD,UAAUC,GAGzB,OAAQ1iB,EAAK4e,MAAK,SAAUtpB,GAC1B,QAASA,GAAoD,mBAAlCA,EAAQ0P,0BAIhC,SAAS2d,GAAgBC,QACL,IAArBA,IACFA,EAAmB,IAGrB,IAAIC,EAAoBD,EACpBE,EAAwBD,EAAkBE,iBAC1CA,OAA6C,IAA1BD,EAAmC,GAAKA,EAC3DE,EAAyBH,EAAkBI,eAC3CA,OAA4C,IAA3BD,EAAoCV,GAAkBU,EAC3E,OAAO,SAAsB7T,EAAWD,EAAQqC,QAC9B,IAAZA,IACFA,EAAU0R,GAGZ,IC/C6BhpB,EAC3BipB,ED8CElS,EAAQ,CACVzB,UAAW,SACX4T,iBAAkB,GAClB5R,QAASxa,OAAOoa,OAAO,GAAImR,GAAiBW,GAC5C1O,cAAe,GACftD,SAAU,CACR9B,UAAWA,EACXD,OAAQA,GAEV1K,WAAY,GACZ0M,OAAQ,IAENkS,EAAmB,GACnBC
,GAAc,EACd/hB,EAAW,CACb0P,MAAOA,EACPsS,WAAY,SAAoBC,GAC9B,IAAIhS,EAAsC,mBAArBgS,EAAkCA,EAAiBvS,EAAMO,SAAWgS,EACzFC,IACAxS,EAAMO,QAAUxa,OAAOoa,OAAO,GAAI8R,EAAgBjS,EAAMO,QAASA,GACjEP,EAAMgH,cAAgB,CACpB7I,UAAW9Y,GAAU8Y,GAAa8J,GAAkB9J,GAAaA,EAAUwM,eAAiB1C,GAAkB9J,EAAUwM,gBAAkB,GAC1IzM,OAAQ+J,GAAkB/J,IAI5B,IEzE4B8S,EAC9ByB,EFwEMN,EDvCG,SAAwBnB,GAErC,IAAImB,EAAmBrZ,GAAMkY,GAE7B,OAAO9R,GAAeb,QAAO,SAAUC,EAAKwB,GAC1C,OAAOxB,EAAI5J,OAAOyd,EAAiBze,QAAO,SAAUyd,GAClD,OAAOA,EAASrR,QAAUA,QAE3B,IC+B0B4S,EEzEK1B,EFyEsB,GAAGtc,OAAOqd,EAAkB/R,EAAMO,QAAQyQ,WExE9FyB,EAASzB,EAAU3S,QAAO,SAAUoU,EAAQE,GAC9C,IAAIC,EAAWH,EAAOE,EAAQ7pB,MAK9B,OAJA2pB,EAAOE,EAAQ7pB,MAAQ8pB,EAAW7sB,OAAOoa,OAAO,GAAIyS,EAAUD,EAAS,CACrEpS,QAASxa,OAAOoa,OAAO,GAAIyS,EAASrS,QAASoS,EAAQpS,SACrD/N,KAAMzM,OAAOoa,OAAO,GAAIyS,EAASpgB,KAAMmgB,EAAQngB,QAC5CmgB,EACEF,IACN,IAEI1sB,OAAOC,KAAKysB,GAAQ/c,KAAI,SAAU5F,GACvC,OAAO2iB,EAAO3iB,QFuGV,OAvCAkQ,EAAMmS,iBAAmBA,EAAiBze,QAAO,SAAUmf,GACzD,OAAOA,EAAEhT,WAqJbG,EAAMmS,iBAAiBlsB,SAAQ,SAAUqf,GACvC,IAAIxc,EAAOwc,EAAMxc,KACbgqB,EAAgBxN,EAAM/E,QACtBA,OAA4B,IAAlBuS,EAA2B,GAAKA,EAC1C1S,EAASkF,EAAMlF,OAEnB,GAAsB,mBAAXA,EAAuB,CAChC,IAAI2S,EAAY3S,EAAO,CACrBJ,MAAOA,EACPlX,KAAMA,EACNwH,SAAUA,EACViQ,QAASA,IAKX6R,EAAiB7oB,KAAKwpB,GAFT,kBA7HRziB,EAAS4W,UAOlB8L,YAAa,WACX,IAAIX,EAAJ,CAIA,IAAIY,EAAkBjT,EAAMC,SACxB9B,EAAY8U,EAAgB9U,UAC5BD,EAAS+U,EAAgB/U,OAG7B,GAAKqT,GAAiBpT,EAAWD,GAAjC,CASA8B,EAAM2D,MAAQ,CACZxF,UAAWwS,GAAiBxS,EAAW6D,GAAgB9D,GAAoC,UAA3B8B,EAAMO,QAAQC,UAC9EtC,OAAQkD,GAAclD,IAOxB8B,EAAMwN,OAAQ,EACdxN,EAAMzB,UAAYyB,EAAMO,QAAQhC,UAKhCyB,EAAMmS,iBAAiBlsB,SAAQ,SAAUkrB,GACvC,OAAOnR,EAAMuD,cAAc4N,EAASroB,MAAQ/C,OAAOoa,OAAO,GAAIgR,EAAS3e,SAIzE,IAAK,IAAI1H,EAAQ,EAAGA,EAAQkV,EAAMmS,iBAAiBzsB,OAAQoF,IAUzD,IAAoB,IAAhBkV,EAAMwN,MAAV,CAMA,IAAI0F,EAAwBlT,EAAMmS,iBAAiBrnB,GAC/C7B,EAAKiqB,EAAsBjqB,GAC3BkqB,EAAyBD,EAAsB3S,QAC/CgJ,OAAsC,IAA3B4J,EAAoC,GAAKA,EACpDrqB,EAAOoqB,EAAsBpqB,KAEf,mBAAPG,IACT+W,EAAQ/W,EAAG,CACT+W,MAAOA,EACPO,QAASgJ,EACTzgB,KAAMA,EACNwH,SAAUA,KACN0P,QAjBNA,EAAMwN,OAAQ,EACd1iB,GAAS,KAsBfoc,QClM2Bje,EDkMV,WACf,OAAO,IAAImqB,SAAQ,SAAUC,GAC3B/iB,EAAS0iB,cACTK,EAAQrT,OCnMT,WAUL,OATKkS,IACHA,EAAU,IAAIkB,SAAQ,SAAUC,GAC9BD,QAAQC,UAAUC,MAAK,WACrBpB,OAAUzf,EACV4gB,EAAQpqB,YAKPipB,ID4LLqB,QAAS,WACPf,IACAH,GAAc,IAIlB,IAAKd,GAAiBpT,EAAWD,GAK/B,OAAO5N,EAmCT,SAASkiB,IACPJ,EAAiBnsB,SAAQ,SAAUgD,GACjC,OAAOA,OAETmpB,EAAmB,GAGrB,OAvCA9hB,EAASgiB,WAAW/R,GAAS+S,MAAK,SAAUtT,IACrCqS,GAAe9R,EAAQiT,eAC1BjT,EAAQiT,cAAcxT,MAqCnB1P,GAGJ,IAAImjB,GAA4B9B,KG1PnC8B,GAA4B9B,GAAgB,CAC9CI,iBAFqB,CAACpL,GAAgBrD,GAAeoQ,GAAeC,MCMlEF,GAA4B9B,GAAgB,CAC9CI,iBAFqB,CAACpL,GAAgBrD,GAAeoQ,GAAeC,GAAa7f,GAAQ8f,GAAM7F,GAAiBrN,GAAO7D,0iBCsBnH9T,GAAO,WAKP8qB,GAAa,SACbC,GAAY,QAEZC,GAAe,UACfC,GAAiB,YAGjBC,GAAiB,IAAIvtB,OAAQ,4BAM7BwtB,GAAwB,6BACxBC,GAA0B,+BAG1BtY,GAAkB,OAMlBnJ,GAAuB,8BACvB0hB,GAAgB,iBAIhBC,GAAgB7rB,IAAU,UAAY,YACtC8rB,GAAmB9rB,IAAU,YAAc,UAC3C+rB,GAAmB/rB,IAAU,aAAe,eAC5CgsB,GAAsBhsB,IAAU,eAAiB,aACjDisB,GAAkBjsB,IAAU,aAAe,cAC3CksB,GAAiBlsB,IAAU,cAAgB,aAE3CqN,GAAU,CACd/B,OAAQ,CAAC,EAAG,GACZ4V,SAAU,kBACVvL,UAAW,SACXwW,QAAS,UACTC,aAAc,KACdC,WAAW,GAGPze,GAAc,CAClBtC,OAAQ,0BACR4V,SAAU,mBACVvL,UAAW,0BACXwW,QAAS,SACTC,aAAc,yBACdC,UAAW,oBASb,MAAMC,WAAiB/jB,EACrBC,YAAY1M,EAASuB,GACnBmR,MAAM1S,GAENgJ,KAAKynB,QAAU,KACfznB,KAAKmK,QAAUnK,KAAKoK,WAAW7R,GAC/ByH,KAAK0nB,MAAQ1nB,KAAK2nB,kBAClB3nB,KAAK4nB,UAAY5nB,KAAK6nB,gBAKbtf,qBACT,OAAOA,GAGEO,yBACT,OAAOA,GAGErN,kBACT,OAAOA,GAKT6J,SACE,OAAOtF,KAAKsP,WAAatP,KAAKuP,OAASvP,KAAKwP,OAG9CA,OACE,GAAI5V,EAAWoG,KAAK2D,WAAa3D,KAAKsP,SAAStP,KAAK0nB,OAClD,OAGF,MAAM5nB,EAAgB,CACpBA,cAAeE,KAAK2D,UAKtB,GAFkBrD,EAAamB,QAAQzB,KA
AK2D,SAvF5B,mBAuFkD7D,GAEpDiC,iBACZ,OAGF,MAAMuM,EAASkZ,GAASM,qBAAqB9nB,KAAK2D,UAE9C3D,KAAK4nB,UACP/hB,EAAYC,iBAAiB9F,KAAK0nB,MAAO,SAAU,QAEnD1nB,KAAK+nB,cAAczZ,GAOjB,iBAAkB7W,SAAS2C,kBAC5BkU,EAAO1J,QA5Fc,gBA6FtB,GAAGwC,UAAU3P,SAASuD,KAAKwM,UACxB7O,SAAQqW,GAAQ1O,EAAaQ,GAAGkO,EAAM,YAAatU,KAGxDsF,KAAK2D,SAASqkB,QACdhoB,KAAK2D,SAAS4B,aAAa,iBAAiB,GAE5CvF,KAAK0nB,MAAM3tB,UAAUyS,IAAI+B,IACzBvO,KAAK2D,SAAS5J,UAAUyS,IAAI+B,IAC5BjO,EAAamB,QAAQzB,KAAK2D,SAnHT,oBAmHgC7D,GAGnDyP,OACE,GAAI3V,EAAWoG,KAAK2D,YAAc3D,KAAKsP,SAAStP,KAAK0nB,OACnD,OAGF,MAAM5nB,EAAgB,CACpBA,cAAeE,KAAK2D,UAGtB3D,KAAKioB,cAAcnoB,GAGrB+D,UACM7D,KAAKynB,SACPznB,KAAKynB,QAAQxB,UAGfvc,MAAM7F,UAGR+V,SACE5Z,KAAK4nB,UAAY5nB,KAAK6nB,gBAClB7nB,KAAKynB,SACPznB,KAAKynB,QAAQ7N,SAMjBqO,cAAcnoB,GACMQ,EAAamB,QAAQzB,KAAK2D,SAvJ5B,mBAuJkD7D,GACpDiC,mBAMV,iBAAkBtK,SAAS2C,iBAC7B,GAAGgN,UAAU3P,SAASuD,KAAKwM,UACxB7O,SAAQqW,GAAQ1O,EAAaC,IAAIyO,EAAM,YAAatU,KAGrDsF,KAAKynB,SACPznB,KAAKynB,QAAQxB,UAGfjmB,KAAK0nB,MAAM3tB,UAAUwJ,OAAOgL,IAC5BvO,KAAK2D,SAAS5J,UAAUwJ,OAAOgL,IAC/BvO,KAAK2D,SAAS4B,aAAa,gBAAiB,SAC5CM,EAAYE,oBAAoB/F,KAAK0nB,MAAO,UAC5CpnB,EAAamB,QAAQzB,KAAK2D,SA1KR,qBA0KgC7D,IAGpDsK,WAAW7R,GAST,GARAA,EAAS,IACJyH,KAAK0D,YAAY6E,WACjB1C,EAAYI,kBAAkBjG,KAAK2D,aACnCpL,GAGLF,EAAgBoD,GAAMlD,EAAQyH,KAAK0D,YAAYoF,aAEf,iBAArBvQ,EAAOsY,YAA2B9Y,EAAUQ,EAAOsY,YACV,mBAA3CtY,EAAOsY,UAAUnK,sBAGxB,MAAM,IAAIpN,UAAW,GAAEmC,GAAKlC,+GAG9B,OAAOhB,EAGTwvB,cAAczZ,GACZ,QAAsB,IAAX4Z,GACT,MAAM,IAAI5uB,UAAU,gEAGtB,IAAI6uB,EAAmBnoB,KAAK2D,SAEG,WAA3B3D,KAAKmK,QAAQ0G,UACfsX,EAAmB7Z,EACVvW,EAAUiI,KAAKmK,QAAQ0G,WAChCsX,EAAmBhwB,EAAW6H,KAAKmK,QAAQ0G,WACA,iBAA3B7Q,KAAKmK,QAAQ0G,YAC7BsX,EAAmBnoB,KAAKmK,QAAQ0G,WAGlC,MAAMyW,EAAetnB,KAAKooB,mBACpBC,EAAkBf,EAAa5D,UAAUvc,MAAK0c,GAA8B,gBAAlBA,EAASroB,OAA+C,IAArBqoB,EAAStR,UAE5GvS,KAAKynB,QAAUS,GAAoBC,EAAkBnoB,KAAK0nB,MAAOJ,GAE7De,GACFxiB,EAAYC,iBAAiB9F,KAAK0nB,MAAO,SAAU,UAIvDpY,SAAStY,EAAUgJ,KAAK2D,UACtB,OAAO3M,EAAQ+C,UAAUC,SAASuU,IAGpCoZ,kBACE,OAAOzgB,EAAec,KAAKhI,KAAK2D,SAAUmjB,IAAe,GAG3DwB,gBACE,MAAMC,EAAiBvoB,KAAK2D,SAASlJ,WAErC,GAAI8tB,EAAexuB,UAAUC,SA3NN,WA4NrB,OAAOmtB,GAGT,GAAIoB,EAAexuB,UAAUC,SA9NJ,aA+NvB,OAAOotB,GAIT,MAAMoB,EAAkF,QAA1E9uB,iBAAiBsG,KAAK0nB,OAAO/tB,iBAAiB,iBAAiBpC,OAE7E,OAAIgxB,EAAexuB,UAAUC,SAvOP,UAwObwuB,EAAQxB,GAAmBD,GAG7ByB,EAAQtB,GAAsBD,GAGvCY,gBACE,OAA0D,OAAnD7nB,KAAK2D,SAASiB,QAAS,WAGhC6jB,aACE,MAAMjiB,OAAEA,GAAWxG,KAAKmK,QAExB,MAAsB,iBAAX3D,EACFA,EAAOlP,MAAM,KAAK8Q,KAAI3C,GAAO/I,OAAOwQ,SAASzH,EAAK,MAGrC,mBAAXe,EACFkiB,GAAcliB,EAAOkiB,EAAY1oB,KAAK2D,UAGxC6C,EAGT4hB,mBACE,MAAMO,EAAwB,CAC5B1X,UAAWjR,KAAKsoB,gBAChB5E,UAAW,CAAC,CACVloB,KAAM,kBACNyX,QAAS,CACPmJ,SAAUpc,KAAKmK,QAAQiS,WAG3B,CACE5gB,KAAM,SACNyX,QAAS,CACPzM,OAAQxG,KAAKyoB,iBAanB,MAP6B,WAAzBzoB,KAAKmK,QAAQkd,UACfsB,EAAsBjF,UAAY,CAAC,CACjCloB,KAAM,cACN+W,SAAS,KAIN,IACFoW,KACsC,mBAA9B3oB,KAAKmK,QAAQmd,aAA8BtnB,KAAKmK,QAAQmd,aAAaqB,GAAyB3oB,KAAKmK,QAAQmd,cAI1HsB,iBAAgBpmB,IAAEA,EAAFxF,OAAOA,IACrB,MAAM6rB,EAAQ3hB,EAAeC,KAxRF,8DAwR+BnH,KAAK0nB,OAAOthB,OAAO5M,GAExEqvB,EAAMzwB,QAMX+E,EAAqB0rB,EAAO7rB,EAAQwF,IAAQkkB,IAAiBmC,EAAMzxB,SAAS4F,IAASgrB,QAKjE7jB,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACf,MAAMC,EAAOsiB,GAAS3iB,oBAAoB7E,KAAMzH,GAEhD,GAAsB,iBAAXA,EAAX,CAIA,QAA4B,IAAjB2M,EAAK3M,GACd,MAAM,IAAIe,UAAW,oBAAmBf,MAG1C2M,EAAK3M,SAIQ4L,kBAACjF,GAChB,GAAIA,IA3UmB,IA2UTA,EAAM0G,QAAiD,UAAf1G,EAAMsB,MA9UhD,QA8UoEtB,EAAMsD,KACpF,OAGF,MAAMsmB,EAAU5hB,EAAeC,KAAK/B,IAEpC,IAAK,IAAIpG,EAAI,EAAGC,EAAM6pB,EAAQ1wB,OAAQ4G,EAAIC,EAAKD,IAAK,CAClD,MAAM+pB,EAAUvB,GAASpjB,YAAY0kB,EAAQ9pB,IAC7C,IAAK+pB,IAAyC,IAA9BA,EAAQ5e,QAAQod,UAC9B,SAGF,IAAKwB,EAAQzZ,WACX,SAGF,MAAMxP,EAAgB,CACpBA,cAAeipB
,EAAQplB,UAGzB,GAAIzE,EAAO,CACT,MAAM8pB,EAAe9pB,EAAM8pB,eACrBC,EAAeD,EAAa5xB,SAAS2xB,EAAQrB,OACnD,GACEsB,EAAa5xB,SAAS2xB,EAAQplB,WACC,WAA9BolB,EAAQ5e,QAAQod,YAA2B0B,GACb,YAA9BF,EAAQ5e,QAAQod,WAA2B0B,EAE5C,SAIF,GAAIF,EAAQrB,MAAM1tB,SAASkF,EAAMlC,UAA4B,UAAfkC,EAAMsB,MA9W5C,QA8WgEtB,EAAMsD,KAAoB,qCAAqCnJ,KAAK6F,EAAMlC,OAAO2H,UACvJ,SAGiB,UAAfzF,EAAMsB,OACRV,EAAc4E,WAAaxF,GAI/B6pB,EAAQd,cAAcnoB,IAICqE,4BAACnN,GAC1B,OAAOW,EAAuBX,IAAYA,EAAQyD,WAGxB0J,6BAACjF,GAQ3B,GAAI,kBAAkB7F,KAAK6F,EAAMlC,OAAO2H,SACtCzF,EAAMsD,MAAQgkB,IAActnB,EAAMsD,MAAQ+jB,KACxCrnB,EAAMsD,MAAQkkB,IAAkBxnB,EAAMsD,MAAQikB,IAC9CvnB,EAAMlC,OAAO4H,QAAQkiB,MACtBH,GAAettB,KAAK6F,EAAMsD,KAC3B,OAGF,MAAM0mB,EAAWlpB,KAAKjG,UAAUC,SAASuU,IAEzC,IAAK2a,GAAYhqB,EAAMsD,MAAQ+jB,GAC7B,OAMF,GAHArnB,EAAMyD,iBACNzD,EAAMiqB,kBAEFvvB,EAAWoG,MACb,OAGF,MAAMopB,EAAkBppB,KAAK0H,QAAQtC,IAAwBpF,KAAOkH,EAAeW,KAAK7H,KAAMoF,IAAsB,GAC9GpC,EAAWwkB,GAAS3iB,oBAAoBukB,GAE9C,GAAIlqB,EAAMsD,MAAQ+jB,GAKlB,OAAIrnB,EAAMsD,MAAQikB,IAAgBvnB,EAAMsD,MAAQkkB,IACzCwC,GACHlmB,EAASwM,YAGXxM,EAAS4lB,gBAAgB1pB,SAItBgqB,GAAYhqB,EAAMsD,MAAQgkB,IAC7BgB,GAAS6B,cAdTrmB,EAASuM,QAyBfjP,EAAaQ,GAAGrJ,SAAUovB,GAAwBzhB,GAAsBoiB,GAAS8B,uBACjFhpB,EAAaQ,GAAGrJ,SAAUovB,GAAwBC,GAAeU,GAAS8B,uBAC1EhpB,EAAaQ,GAAGrJ,SAAUmvB,GAAsBY,GAAS6B,YACzD/oB,EAAaQ,GAAGrJ,SA/ac,6BA+akB+vB,GAAS6B,YACzD/oB,EAAaQ,GAAGrJ,SAAUmvB,GAAsBxhB,IAAsB,SAAUlG,GAC9EA,EAAMyD,iBACN6kB,GAAS3iB,oBAAoB7E,MAAMsF,YAUrClK,EAAmBosB,ICrenB,MAAM+B,GAAyB,oDACzBC,GAA0B,cAEhC,MAAMC,GACJ/lB,cACE1D,KAAK2D,SAAWlM,SAASuD,KAG3B0uB,WAEE,MAAMC,EAAgBlyB,SAAS2C,gBAAgB2c,YAC/C,OAAOpZ,KAAKgO,IAAI5Q,OAAO6uB,WAAaD,GAGtCpa,OACE,MAAMmE,EAAQ1T,KAAK0pB,WACnB1pB,KAAK6pB,mBAEL7pB,KAAK8pB,sBAAsB9pB,KAAK2D,SAAU,gBAAgBomB,GAAmBA,EAAkBrW,IAE/F1T,KAAK8pB,sBAAsBP,GAAwB,gBAAgBQ,GAAmBA,EAAkBrW,IACxG1T,KAAK8pB,sBAAsBN,GAAyB,eAAeO,GAAmBA,EAAkBrW,IAG1GmW,mBACE7pB,KAAKgqB,sBAAsBhqB,KAAK2D,SAAU,YAC1C3D,KAAK2D,SAASqM,MAAMuK,SAAW,SAGjCuP,sBAAsB7yB,EAAUgzB,EAAW3uB,GACzC,MAAM4uB,EAAiBlqB,KAAK0pB,WAW5B1pB,KAAKmqB,2BAA2BlzB,GAVHD,IAC3B,GAAIA,IAAYgJ,KAAK2D,UAAY5I,OAAO6uB,WAAa5yB,EAAQ+f,YAAcmT,EACzE,OAGFlqB,KAAKgqB,sBAAsBhzB,EAASizB,GACpC,MAAMF,EAAkBhvB,OAAOrB,iBAAiB1C,GAASizB,GACzDjzB,EAAQgZ,MAAMia,GAAc,GAAE3uB,EAASoB,OAAOC,WAAWotB,WAM7D7J,QACElgB,KAAKoqB,wBAAwBpqB,KAAK2D,SAAU,YAC5C3D,KAAKoqB,wBAAwBpqB,KAAK2D,SAAU,gBAC5C3D,KAAKoqB,wBAAwBb,GAAwB,gBACrDvpB,KAAKoqB,wBAAwBZ,GAAyB,eAGxDQ,sBAAsBhzB,EAASizB,GAC7B,MAAMI,EAAcrzB,EAAQgZ,MAAMia,GAC9BI,GACFxkB,EAAYC,iBAAiB9O,EAASizB,EAAWI,GAIrDD,wBAAwBnzB,EAAUgzB,GAWhCjqB,KAAKmqB,2BAA2BlzB,GAVHD,IAC3B,MAAM8B,EAAQ+M,EAAYU,iBAAiBvP,EAASizB,QAC/B,IAAVnxB,EACT9B,EAAQgZ,MAAMsa,eAAeL,IAE7BpkB,EAAYE,oBAAoB/O,EAASizB,GACzCjzB,EAAQgZ,MAAMia,GAAanxB,MAOjCqxB,2BAA2BlzB,EAAUszB,GAC/BxyB,EAAUd,GACZszB,EAAStzB,GAETiQ,EAAeC,KAAKlQ,EAAU+I,KAAK2D,UAAUhL,QAAQ4xB,GAIzDC,gBACE,OAAOxqB,KAAK0pB,WAAa,GClF7B,MAAMnhB,GAAU,CACdkiB,UAAW,iBACXjxB,WAAW,EACX0K,YAAY,EACZwmB,YAAa,OACbC,cAAe,MAGX7hB,GAAc,CAClB2hB,UAAW,SACXjxB,UAAW,UACX0K,WAAY,UACZwmB,YAAa,mBACbC,cAAe,mBAIXpc,GAAkB,OAElBqc,GAAmB,wBAEzB,MAAMC,GACJnnB,YAAYnL,GACVyH,KAAKmK,QAAUnK,KAAKoK,WAAW7R,GAC/ByH,KAAK8qB,aAAc,EACnB9qB,KAAK2D,SAAW,KAGlB6L,KAAKlU,GACE0E,KAAKmK,QAAQ3Q,WAKlBwG,KAAK+qB,UAED/qB,KAAKmK,QAAQjG,YACfvJ,EAAOqF,KAAKgrB,eAGdhrB,KAAKgrB,cAAcjxB,UAAUyS,IAAI+B,IAEjCvO,KAAKirB,mBAAkB,KACrB/uB,EAAQZ,OAbRY,EAAQZ,GAiBZiU,KAAKjU,GACE0E,KAAKmK,QAAQ3Q,WAKlBwG,KAAKgrB,cAAcjxB,UAAUwJ,OAAOgL,IAEpCvO,KAAKirB,mBAAkB,KACrBjrB,KAAK6D,UACL3H,EAAQZ,OARRY,EAAQZ,GAcZ0vB,cACE,IAAKhrB,KAAK2D,SAAU,CAClB,MAAMunB,EAAWzzB,SAAS0zB,cAAc,OACxCD,EAAST,UAAYzqB,KAAKmK,QAAQsgB,UAC9BzqB,KAAKmK,QAAQjG,YACfgnB,EAASnxB,UAAUyS,IApDH,QAuD
lBxM,KAAK2D,SAAWunB,EAGlB,OAAOlrB,KAAK2D,SAGdyG,WAAW7R,GAST,OARAA,EAAS,IACJgQ,MACmB,iBAAXhQ,EAAsBA,EAAS,KAIrCmyB,YAAcvyB,EAAWI,EAAOmyB,aACvCryB,EAtES,WAsEaE,EAAQuQ,IACvBvQ,EAGTwyB,UACM/qB,KAAK8qB,cAIT9qB,KAAKmK,QAAQugB,YAAYU,OAAOprB,KAAKgrB,eAErC1qB,EAAaQ,GAAGd,KAAKgrB,cAAeJ,IAAiB,KACnD1uB,EAAQ8D,KAAKmK,QAAQwgB,kBAGvB3qB,KAAK8qB,aAAc,GAGrBjnB,UACO7D,KAAK8qB,cAIVxqB,EAAaC,IAAIP,KAAK2D,SAAUinB,IAEhC5qB,KAAK2D,SAASJ,SACdvD,KAAK8qB,aAAc,GAGrBG,kBAAkB3vB,GAChBa,EAAuBb,EAAU0E,KAAKgrB,cAAehrB,KAAKmK,QAAQjG,aClHtE,MAAMqE,GAAU,CACd8iB,YAAa,KACbC,WAAW,GAGPxiB,GAAc,CAClBuiB,YAAa,UACbC,UAAW,WAKPxnB,GAAa,gBAMbynB,GAAmB,WAEzB,MAAMC,GACJ9nB,YAAYnL,GACVyH,KAAKmK,QAAUnK,KAAKoK,WAAW7R,GAC/ByH,KAAKyrB,WAAY,EACjBzrB,KAAK0rB,qBAAuB,KAG9BC,WACE,MAAMN,YAAEA,EAAFC,UAAeA,GAActrB,KAAKmK,QAEpCnK,KAAKyrB,YAILH,GACFD,EAAYrD,QAGd1nB,EAAaC,IAAI9I,SAAUqM,IAC3BxD,EAAaQ,GAAGrJ,SA1BG,wBA0BsByH,GAASc,KAAK4rB,eAAe1sB,KACtEoB,EAAaQ,GAAGrJ,SA1BO,4BA0BsByH,GAASc,KAAK6rB,eAAe3sB,KAE1Ec,KAAKyrB,WAAY,GAGnBK,aACO9rB,KAAKyrB,YAIVzrB,KAAKyrB,WAAY,EACjBnrB,EAAaC,IAAI9I,SAAUqM,KAK7B8nB,eAAe1sB,GACb,MAAMlC,OAAEA,GAAWkC,GACbmsB,YAAEA,GAAgBrrB,KAAKmK,QAE7B,GAAInN,IAAWvF,UAAYuF,IAAWquB,GAAeA,EAAYrxB,SAASgD,GACxE,OAGF,MAAM2V,EAAWzL,EAAegB,kBAAkBmjB,GAE1B,IAApB1Y,EAASva,OACXizB,EAAYrD,QACHhoB,KAAK0rB,uBAAyBH,GACvC5Y,EAASA,EAASva,OAAS,GAAG4vB,QAE9BrV,EAAS,GAAGqV,QAIhB6D,eAAe3sB,GA3DD,QA4DRA,EAAMsD,MAIVxC,KAAK0rB,qBAAuBxsB,EAAM6sB,SAAWR,GA/DzB,WAkEtBnhB,WAAW7R,GAMT,OALAA,EAAS,IACJgQ,MACmB,iBAAXhQ,EAAsBA,EAAS,IAE5CF,EA9ES,YA8EaE,EAAQuQ,IACvBvQ,GCtEX,MAAMkD,GAAO,QAIP8qB,GAAa,SAEbhe,GAAU,CACd2iB,UAAU,EACVziB,UAAU,EACVuf,OAAO,GAGHlf,GAAc,CAClBoiB,SAAU,mBACVziB,SAAU,UACVuf,MAAO,WAKHgE,GAAgB,kBAChBC,GAAc,gBAEdC,GAAgB,kBAChBC,GAAuB,yBACvBC,GAAyB,2BAEzBC,GAA2B,6BAG3BC,GAAkB,aAElB/d,GAAkB,OAClBge,GAAoB,eAa1B,MAAMC,WAAc/oB,EAClBC,YAAY1M,EAASuB,GACnBmR,MAAM1S,GAENgJ,KAAKmK,QAAUnK,KAAKoK,WAAW7R,GAC/ByH,KAAKysB,QAAUvlB,EAAeK,QAfV,gBAemCvH,KAAK2D,UAC5D3D,KAAK0sB,UAAY1sB,KAAK2sB,sBACtB3sB,KAAK4sB,WAAa5sB,KAAK6sB,uBACvB7sB,KAAKsP,UAAW,EAChBtP,KAAK8sB,sBAAuB,EAC5B9sB,KAAK6O,kBAAmB,EACxB7O,KAAK+sB,WAAa,IAAItD,GAKblhB,qBACT,OAAOA,GAGE9M,kBACT,OAAOA,GAKT6J,OAAOxF,GACL,OAAOE,KAAKsP,SAAWtP,KAAKuP,OAASvP,KAAKwP,KAAK1P,GAGjD0P,KAAK1P,GACCE,KAAKsP,UAAYtP,KAAK6O,kBAIRvO,EAAamB,QAAQzB,KAAK2D,SAAUsoB,GAAY,CAChEnsB,cAAAA,IAGYiC,mBAId/B,KAAKsP,UAAW,EAEZtP,KAAKgtB,gBACPhtB,KAAK6O,kBAAmB,GAG1B7O,KAAK+sB,WAAWxd,OAEhB9X,SAASuD,KAAKjB,UAAUyS,IAAI8f,IAE5BtsB,KAAKitB,gBAELjtB,KAAKktB,kBACLltB,KAAKmtB,kBAEL7sB,EAAaQ,GAAGd,KAAKysB,QAASJ,IAAyB,KACrD/rB,EAAaS,IAAIf,KAAK2D,SA/EG,4BA+E8BzE,IACjDA,EAAMlC,SAAWgD,KAAK2D,WACxB3D,KAAK8sB,sBAAuB,SAKlC9sB,KAAKotB,eAAc,IAAMptB,KAAKqtB,aAAavtB,MAG7CyP,OACE,IAAKvP,KAAKsP,UAAYtP,KAAK6O,iBACzB,OAKF,GAFkBvO,EAAamB,QAAQzB,KAAK2D,SAtG5B,iBAwGF5B,iBACZ,OAGF/B,KAAKsP,UAAW,EAChB,MAAMpL,EAAalE,KAAKgtB,cAEpB9oB,IACFlE,KAAK6O,kBAAmB,GAG1B7O,KAAKktB,kBACLltB,KAAKmtB,kBAELntB,KAAK4sB,WAAWd,aAEhB9rB,KAAK2D,SAAS5J,UAAUwJ,OAAOgL,IAE/BjO,EAAaC,IAAIP,KAAK2D,SAAUwoB,IAChC7rB,EAAaC,IAAIP,KAAKysB,QAASJ,IAE/BrsB,KAAKiE,gBAAe,IAAMjE,KAAKstB,cAActtB,KAAK2D,SAAUO,GAG9DL,UACE,CAAC9I,OAAQiF,KAAKysB,SACX9zB,SAAQ40B,GAAejtB,EAAaC,IAAIgtB,EAlJ5B,eAoJfvtB,KAAK0sB,UAAU7oB,UACf7D,KAAK4sB,WAAWd,aAChBpiB,MAAM7F,UAGR2pB,eACExtB,KAAKitB,gBAKPN,sBACE,OAAO,IAAI9B,GAAS,CAClBrxB,UAAWqH,QAAQb,KAAKmK,QAAQ+gB,UAChChnB,WAAYlE,KAAKgtB,gBAIrBH,uBACE,OAAO,IAAIrB,GAAU,CACnBH,YAAarrB,KAAK2D,WAItByG,WAAW7R,GAOT,OANAA,EAAS,IACJgQ,MACA1C,EAAYI,kBAAkBjG,KAAK2D,aAChB,iBAAXpL,EAAsBA,EAAS,IAE5CF,EAAgBoD,GAAMlD,EAAQuQ,IACvBvQ,EAGT80B,aAAavtB,GACX,MAAMoE,EAAalE,KAAKgtB,cAClBS,EAAYvmB,EAAeK,QArJT,cAqJsCvH,
KAAKysB,SAE9DzsB,KAAK2D,SAASlJ,YAAcuF,KAAK2D,SAASlJ,WAAWvC,WAAa2B,KAAKC,cAE1ErC,SAASuD,KAAKowB,OAAOprB,KAAK2D,UAG5B3D,KAAK2D,SAASqM,MAAMqX,QAAU,QAC9BrnB,KAAK2D,SAASqC,gBAAgB,eAC9BhG,KAAK2D,SAAS4B,aAAa,cAAc,GACzCvF,KAAK2D,SAAS4B,aAAa,OAAQ,UACnCvF,KAAK2D,SAASwW,UAAY,EAEtBsT,IACFA,EAAUtT,UAAY,GAGpBjW,GACFvJ,EAAOqF,KAAK2D,UAGd3D,KAAK2D,SAAS5J,UAAUyS,IAAI+B,IAa5BvO,KAAKiE,gBAXsB,KACrBjE,KAAKmK,QAAQ6d,OACfhoB,KAAK4sB,WAAWjB,WAGlB3rB,KAAK6O,kBAAmB,EACxBvO,EAAamB,QAAQzB,KAAK2D,SAjMX,iBAiMkC,CAC/C7D,cAAAA,MAIoCE,KAAKysB,QAASvoB,GAGxDgpB,kBACMltB,KAAKsP,SACPhP,EAAaQ,GAAGd,KAAK2D,SAAUyoB,IAAuBltB,IAChDc,KAAKmK,QAAQ1B,UAAYvJ,EAAMsD,MAAQ+jB,IACzCrnB,EAAMyD,iBACN3C,KAAKuP,QACKvP,KAAKmK,QAAQ1B,UAAYvJ,EAAMsD,MAAQ+jB,IACjDvmB,KAAK0tB,gCAITptB,EAAaC,IAAIP,KAAK2D,SAAUyoB,IAIpCe,kBACMntB,KAAKsP,SACPhP,EAAaQ,GAAG/F,OAAQmxB,IAAc,IAAMlsB,KAAKitB,kBAEjD3sB,EAAaC,IAAIxF,OAAQmxB,IAI7BoB,aACEttB,KAAK2D,SAASqM,MAAMqX,QAAU,OAC9BrnB,KAAK2D,SAAS4B,aAAa,eAAe,GAC1CvF,KAAK2D,SAASqC,gBAAgB,cAC9BhG,KAAK2D,SAASqC,gBAAgB,QAC9BhG,KAAK6O,kBAAmB,EACxB7O,KAAK0sB,UAAUnd,MAAK,KAClB9X,SAASuD,KAAKjB,UAAUwJ,OAAO+oB,IAC/BtsB,KAAK2tB,oBACL3tB,KAAK+sB,WAAW7M,QAChB5f,EAAamB,QAAQzB,KAAK2D,SAAUqoB,OAIxCoB,cAAc9xB,GACZgF,EAAaQ,GAAGd,KAAK2D,SAAUwoB,IAAqBjtB,IAC9Cc,KAAK8sB,qBACP9sB,KAAK8sB,sBAAuB,EAI1B5tB,EAAMlC,SAAWkC,EAAM0uB,iBAIG,IAA1B5tB,KAAKmK,QAAQ+gB,SACflrB,KAAKuP,OAC8B,WAA1BvP,KAAKmK,QAAQ+gB,UACtBlrB,KAAK0tB,iCAIT1tB,KAAK0sB,UAAUld,KAAKlU,GAGtB0xB,cACE,OAAOhtB,KAAK2D,SAAS5J,UAAUC,SA3PX,QA8PtB0zB,6BAEE,GADkBptB,EAAamB,QAAQzB,KAAK2D,SA3QlB,0BA4QZ5B,iBACZ,OAGF,MAAMhI,UAAEA,EAAF2hB,aAAaA,EAAb1L,MAA2BA,GAAUhQ,KAAK2D,SAC1CkqB,EAAqBnS,EAAejkB,SAAS2C,gBAAgB0c,cAG7D+W,GAA0C,WAApB7d,EAAMyK,WAA2B1gB,EAAUC,SAASuyB,MAI3EsB,IACH7d,EAAMyK,UAAY,UAGpB1gB,EAAUyS,IAAI+f,IACdvsB,KAAKiE,gBAAe,KAClBlK,EAAUwJ,OAAOgpB,IACZsB,GACH7tB,KAAKiE,gBAAe,KAClB+L,EAAMyK,UAAY,KACjBza,KAAKysB,WAETzsB,KAAKysB,SAERzsB,KAAK2D,SAASqkB,SAOhBiF,gBACE,MAAMY,EAAqB7tB,KAAK2D,SAAS+X,aAAejkB,SAAS2C,gBAAgB0c,aAC3EoT,EAAiBlqB,KAAK+sB,WAAWrD,WACjCoE,EAAoB5D,EAAiB,IAErC4D,GAAqBD,IAAuB3yB,KAAa4yB,IAAsBD,GAAsB3yB,OACzG8E,KAAK2D,SAASqM,MAAM+d,YAAe,GAAE7D,QAGlC4D,IAAsBD,IAAuB3yB,MAAc4yB,GAAqBD,GAAsB3yB,OACzG8E,KAAK2D,SAASqM,MAAMge,aAAgB,GAAE9D,OAI1CyD,oBACE3tB,KAAK2D,SAASqM,MAAM+d,YAAc,GAClC/tB,KAAK2D,SAASqM,MAAMge,aAAe,GAKf7pB,uBAAC5L,EAAQuH,GAC7B,OAAOE,KAAKiF,MAAK,WACf,MAAMC,EAAOsnB,GAAM3nB,oBAAoB7E,KAAMzH,GAE7C,GAAsB,iBAAXA,EAAX,CAIA,QAA4B,IAAjB2M,EAAK3M,GACd,MAAM,IAAIe,UAAW,oBAAmBf,MAG1C2M,EAAK3M,GAAQuH,QAWnBQ,EAAaQ,GAAGrJ,SAhVc,0BAUD,4BAsUyC,SAAUyH,GAC9E,MAAMlC,EAASrF,EAAuBqI,MAElC,CAAC,IAAK,QAAQ5I,SAAS4I,KAAK2E,UAC9BzF,EAAMyD,iBAGRrC,EAAaS,IAAI/D,EAAQivB,IAAYgC,IAC/BA,EAAUlsB,kBAKdzB,EAAaS,IAAI/D,EAAQgvB,IAAc,KACjCxyB,EAAUwG,OACZA,KAAKgoB,cAMX,MAAMkG,EAAehnB,EAAeK,QA9VhB,eA+VhB2mB,GACF1B,GAAMpoB,YAAY8pB,GAAc3e,OAGrBid,GAAM3nB,oBAAoB7H,GAElCsI,OAAOtF,SAGduE,EAAqBioB,IASrBpxB,EAAmBoxB,ICrZnB,MAAM/wB,GAAO,YAOP8M,GAAU,CACd2iB,UAAU,EACVziB,UAAU,EACV8Q,QAAQ,GAGJzQ,GAAc,CAClBoiB,SAAU,UACVziB,SAAU,UACV8Q,OAAQ,WAGJhL,GAAkB,OAElB4f,GAAgB,kBAKhBnC,GAAgB,sBAYtB,MAAMoC,WAAkB3qB,EACtBC,YAAY1M,EAASuB,GACnBmR,MAAM1S,GAENgJ,KAAKmK,QAAUnK,KAAKoK,WAAW7R,GAC/ByH,KAAKsP,UAAW,EAChBtP,KAAK0sB,UAAY1sB,KAAK2sB,sBACtB3sB,KAAK4sB,WAAa5sB,KAAK6sB,uBACvB7sB,KAAK2K,qBAKIlP,kBACT,OAAOA,GAGE8M,qBACT,OAAOA,GAKTjD,OAAOxF,GACL,OAAOE,KAAKsP,SAAWtP,KAAKuP,OAASvP,KAAKwP,KAAK1P,GAGjD0P,KAAK1P,GACCE,KAAKsP,UAIShP,EAAamB,QAAQzB,KAAK2D,SA/C5B,oBA+CkD,CAAE7D,cAAAA,IAEtDiC,mBAId/B,KAAKsP,UAAW,EAChBtP,KAAK2D,SAASqM,MAAMqe,WAAa,UAEjCruB,KAAK0sB,UAAUld,OAEVxP,KAAKmK,QAAQoP,SAChB,IAAIkQ,IAAkBla,OAGxBvP,KAAK2D,SAASqC,gBAAgB,eAC9BhG,K
AAK2D,SAAS4B,aAAa,cAAc,GACzCvF,KAAK2D,SAAS4B,aAAa,OAAQ,UACnCvF,KAAK2D,SAAS5J,UAAUyS,IAAI+B,IAU5BvO,KAAKiE,gBARoB,KAClBjE,KAAKmK,QAAQoP,QAChBvZ,KAAK4sB,WAAWjB,WAGlBrrB,EAAamB,QAAQzB,KAAK2D,SAvEX,qBAuEkC,CAAE7D,cAAAA,MAGfE,KAAK2D,UAAU,IAGvD4L,OACOvP,KAAKsP,WAIQhP,EAAamB,QAAQzB,KAAK2D,SAjF5B,qBAmFF5B,mBAId/B,KAAK4sB,WAAWd,aAChB9rB,KAAK2D,SAAS2qB,OACdtuB,KAAKsP,UAAW,EAChBtP,KAAK2D,SAAS5J,UAAUwJ,OAAOgL,IAC/BvO,KAAK0sB,UAAUnd,OAefvP,KAAKiE,gBAboB,KACvBjE,KAAK2D,SAAS4B,aAAa,eAAe,GAC1CvF,KAAK2D,SAASqC,gBAAgB,cAC9BhG,KAAK2D,SAASqC,gBAAgB,QAC9BhG,KAAK2D,SAASqM,MAAMqe,WAAa,SAE5BruB,KAAKmK,QAAQoP,SAChB,IAAIkQ,IAAkBvJ,QAGxB5f,EAAamB,QAAQzB,KAAK2D,SAAUqoB,MAGAhsB,KAAK2D,UAAU,KAGvDE,UACE7D,KAAK0sB,UAAU7oB,UACf7D,KAAK4sB,WAAWd,aAChBpiB,MAAM7F,UAKRuG,WAAW7R,GAOT,OANAA,EAAS,IACJgQ,MACA1C,EAAYI,kBAAkBjG,KAAK2D,aAChB,iBAAXpL,EAAsBA,EAAS,IAE5CF,EAAgBoD,GAAMlD,EAAQuQ,IACvBvQ,EAGTo0B,sBACE,OAAO,IAAI9B,GAAS,CAClBJ,UAtIsB,qBAuItBjxB,UAAWwG,KAAKmK,QAAQ+gB,SACxBhnB,YAAY,EACZwmB,YAAa1qB,KAAK2D,SAASlJ,WAC3BkwB,cAAe,IAAM3qB,KAAKuP,SAI9Bsd,uBACE,OAAO,IAAIrB,GAAU,CACnBH,YAAarrB,KAAK2D,WAItBgH,qBACErK,EAAaQ,GAAGd,KAAK2D,SA7IM,gCA6I2BzE,IAChDc,KAAKmK,QAAQ1B,UArKJ,WAqKgBvJ,EAAMsD,KACjCxC,KAAKuP,UAOWpL,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACf,MAAMC,EAAOkpB,GAAUvpB,oBAAoB7E,KAAMzH,GAEjD,GAAsB,iBAAXA,EAAX,CAIA,QAAqB4M,IAAjBD,EAAK3M,IAAyBA,EAAOlB,WAAW,MAAmB,gBAAXkB,EAC1D,MAAM,IAAIe,UAAW,oBAAmBf,MAG1C2M,EAAK3M,GAAQyH,WAWnBM,EAAaQ,GAAGrJ,SA9Kc,8BAGD,gCA2KyC,SAAUyH,GAC9E,MAAMlC,EAASrF,EAAuBqI,MAMtC,GAJI,CAAC,IAAK,QAAQ5I,SAAS4I,KAAK2E,UAC9BzF,EAAMyD,iBAGJ/I,EAAWoG,MACb,OAGFM,EAAaS,IAAI/D,EAAQgvB,IAAc,KAEjCxyB,EAAUwG,OACZA,KAAKgoB,WAKT,MAAMkG,EAAehnB,EAAeK,QAAQ4mB,IACxCD,GAAgBA,IAAiBlxB,GACnCoxB,GAAUhqB,YAAY8pB,GAAc3e,OAGzB6e,GAAUvpB,oBAAoB7H,GACtCsI,OAAOtF,SAGdM,EAAaQ,GAAG/F,OAjOa,8BAiOgB,IAC3CmM,EAAeC,KAAKgnB,IAAex1B,SAAQ2P,GAAM8lB,GAAUvpB,oBAAoByD,GAAIkH,WAGrFjL,EAAqB6pB,IAOrBhzB,EAAmBgzB,ICtQnB,MAAMG,GAAgB,IAAI/vB,IAAI,CAC5B,aACA,OACA,OACA,WACA,WACA,SACA,MACA,eAUIgwB,GAAmB,iEAOnBC,GAAmB,qIAEnBC,GAAmB,CAACpb,EAAWqb,KACnC,MAAMC,EAAgBtb,EAAUxB,SAAS3Y,cAEzC,GAAIw1B,EAAqBv3B,SAASw3B,GAChC,OAAIL,GAAc7uB,IAAIkvB,IACb/tB,QAAQ2tB,GAAiBn1B,KAAKia,EAAUub,YAAcJ,GAAiBp1B,KAAKia,EAAUub,YAMjG,MAAMC,EAASH,EAAqBvoB,QAAO2oB,GAAkBA,aAA0B31B,SAGvF,IAAK,IAAI4F,EAAI,EAAGC,EAAM6vB,EAAO12B,OAAQ4G,EAAIC,EAAKD,IAC5C,GAAI8vB,EAAO9vB,GAAG3F,KAAKu1B,GACjB,OAAO,EAIX,OAAO,GAqCF,SAASI,GAAaC,EAAYC,EAAWC,GAClD,IAAKF,EAAW72B,OACd,OAAO62B,EAGT,GAAIE,GAAoC,mBAAfA,EACvB,OAAOA,EAAWF,GAGpB,MACMG,GADY,IAAIr0B,OAAOs0B,WACKC,gBAAgBL,EAAY,aACxDtc,EAAW,GAAGvL,UAAUgoB,EAAgBp0B,KAAKqF,iBAAiB,MAEpE,IAAK,IAAIrB,EAAI,EAAGC,EAAM0T,EAASva,OAAQ4G,EAAIC,EAAKD,IAAK,CACnD,MAAMhI,EAAU2b,EAAS3T,GACnBuwB,EAAcv4B,EAAQ8a,SAAS3Y,cAErC,IAAKV,OAAOC,KAAKw2B,GAAW93B,SAASm4B,GAAc,CACjDv4B,EAAQuM,SAER,SAGF,MAAMisB,EAAgB,GAAGpoB,UAAUpQ,EAAQkP,YACrCupB,EAAoB,GAAGroB,OAAO8nB,EAAU,MAAQ,GAAIA,EAAUK,IAAgB,IAEpFC,EAAc72B,SAAQ2a,IACfob,GAAiBpb,EAAWmc,IAC/Bz4B,EAAQgP,gBAAgBsN,EAAUxB,aAKxC,OAAOsd,EAAgBp0B,KAAK00B,UC5F9B,MAAMj0B,GAAO,UAIPk0B,GAAwB,IAAInxB,IAAI,CAAC,WAAY,YAAa,eAE1DsK,GAAc,CAClB8mB,UAAW,UACXC,SAAU,SACVC,MAAO,4BACPruB,QAAS,SACTsuB,MAAO,kBACP5U,KAAM,UACNlkB,SAAU,mBACVga,UAAW,oBACXzK,OAAQ,0BACRmJ,UAAW,2BACXmP,mBAAoB,QACpB1C,SAAU,mBACV4T,YAAa,oBACbC,SAAU,UACVd,WAAY,kBACZD,UAAW,SACX5H,aAAc,0BAGV4I,GAAgB,CACpBC,KAAM,OACNC,IAAK,MACLC,MAAOn1B,IAAU,OAAS,QAC1Bo1B,OAAQ,SACRC,KAAMr1B,IAAU,QAAU,QAGtBqN,GAAU,CACdqnB,WAAW,EACXC,SAAU,+GAIVpuB,QAAS,cACTquB,MAAO,GACPC,MAAO,EACP5U,MAAM,EACNlkB,UAAU,EACVga,UAAW,MACXzK,OAAQ,CAAC,EAAG,GACZmJ,WAAW,EACXmP,mBAAoB,CAAC,MAAO,QAAS,SAAU,QAC/C1C,SAAU,kBACV4T,YAAa,GACbC,UAAU,
EACVd,WAAY,KACZD,UD5B8B,CAE9B,IAAK,CAAC,QAAS,MAAO,KAAM,OAAQ,OAzCP,kBA0C7B9Q,EAAG,CAAC,SAAU,OAAQ,QAAS,OAC/BoS,KAAM,GACNnS,EAAG,GACHoS,GAAI,GACJC,IAAK,GACLC,KAAM,GACNC,IAAK,GACLC,GAAI,GACJC,GAAI,GACJC,GAAI,GACJC,GAAI,GACJC,GAAI,GACJC,GAAI,GACJC,GAAI,GACJC,GAAI,GACJpyB,EAAG,GACHqyB,IAAK,CAAC,MAAO,SAAU,MAAO,QAAS,QAAS,UAChDC,GAAI,GACJC,GAAI,GACJC,EAAG,GACHC,IAAK,GACLC,EAAG,GACHC,MAAO,GACPC,KAAM,GACNC,IAAK,GACLC,IAAK,GACLC,OAAQ,GACRC,EAAG,GACHC,GAAI,ICFJ3K,aAAc,MAGVxvB,GAAQ,CACZo6B,KAAO,kBACPC,OAAS,oBACTC,KAAO,kBACPC,MAAQ,mBACRC,SAAW,sBACXC,MAAQ,mBACRC,QAAU,qBACVC,SAAW,sBACXC,WAAa,wBACbC,WAAa,yBAGTC,GAAkB,OAElBrkB,GAAkB,OAElBskB,GAAmB,OACnBC,GAAkB,MAElBC,GAAyB,iBACzBC,GAAkB,SAElBC,GAAmB,gBAEnBC,GAAgB,QAChBC,GAAgB,QAUtB,MAAMC,WAAgB3vB,EACpBC,YAAY1M,EAASuB,GACnB,QAAsB,IAAX2vB,GACT,MAAM,IAAI5uB,UAAU,+DAGtBoQ,MAAM1S,GAGNgJ,KAAKqzB,YAAa,EAClBrzB,KAAKszB,SAAW,EAChBtzB,KAAKuzB,YAAc,GACnBvzB,KAAKwzB,eAAiB,GACtBxzB,KAAKynB,QAAU,KAGfznB,KAAKmK,QAAUnK,KAAKoK,WAAW7R,GAC/ByH,KAAKyzB,IAAM,KAEXzzB,KAAK0zB,gBAKInrB,qBACT,OAAOA,GAGE9M,kBACT,OAAOA,GAGE3D,mBACT,OAAOA,GAGEgR,yBACT,OAAOA,GAKT6qB,SACE3zB,KAAKqzB,YAAa,EAGpBO,UACE5zB,KAAKqzB,YAAa,EAGpBQ,gBACE7zB,KAAKqzB,YAAcrzB,KAAKqzB,WAG1B/tB,OAAOpG,GACL,GAAKc,KAAKqzB,WAIV,GAAIn0B,EAAO,CACT,MAAM6pB,EAAU/oB,KAAK8zB,6BAA6B50B,GAElD6pB,EAAQyK,eAAeO,OAAShL,EAAQyK,eAAeO,MAEnDhL,EAAQiL,uBACVjL,EAAQkL,OAAO,KAAMlL,GAErBA,EAAQmL,OAAO,KAAMnL,OAElB,CACL,GAAI/oB,KAAKm0B,gBAAgBp6B,UAAUC,SAASuU,IAE1C,YADAvO,KAAKk0B,OAAO,KAAMl0B,MAIpBA,KAAKi0B,OAAO,KAAMj0B,OAItB6D,UACEyI,aAAatM,KAAKszB,UAElBhzB,EAAaC,IAAIP,KAAK2D,SAASiB,QAAQouB,IAAiBC,GAAkBjzB,KAAKo0B,mBAE3Ep0B,KAAKyzB,KACPzzB,KAAKyzB,IAAIlwB,SAGXvD,KAAKq0B,iBACL3qB,MAAM7F,UAGR2L,OACE,GAAoC,SAAhCxP,KAAK2D,SAASqM,MAAMqX,QACtB,MAAM,IAAI/iB,MAAM,uCAGlB,IAAMtE,KAAKs0B,kBAAmBt0B,KAAKqzB,WACjC,OAGF,MAAMpF,EAAY3tB,EAAamB,QAAQzB,KAAK2D,SAAU3D,KAAK0D,YAAY5L,MAAMs6B,MACvEmC,EAAap6B,EAAe6F,KAAK2D,UACjC6wB,EAA4B,OAAfD,EACjBv0B,KAAK2D,SAASsO,cAAc7X,gBAAgBJ,SAASgG,KAAK2D,UAC1D4wB,EAAWv6B,SAASgG,KAAK2D,UAE3B,GAAIsqB,EAAUlsB,mBAAqByyB,EACjC,OAK4B,YAA1Bx0B,KAAK0D,YAAYjI,MAAsBuE,KAAKyzB,KAAOzzB,KAAKy0B,aAAez0B,KAAKyzB,IAAI/7B,cAAcq7B,IAAwBrD,YACxH1vB,KAAKq0B,iBACLr0B,KAAKyzB,IAAIlwB,SACTvD,KAAKyzB,IAAM,MAGb,MAAMA,EAAMzzB,KAAKm0B,gBACXO,EvE3NKC,CAAAA,IACb,GACEA,GAAUh3B,KAAKi3B,MArBH,IAqBSj3B,KAAKk3B,gBACnBp9B,SAASq9B,eAAeH,IAEjC,OAAOA,GuEsNSI,CAAO/0B,KAAK0D,YAAYjI,MAEtCg4B,EAAIluB,aAAa,KAAMmvB,GACvB10B,KAAK2D,SAAS4B,aAAa,mBAAoBmvB,GAE3C10B,KAAKmK,QAAQylB,WACf6D,EAAI15B,UAAUyS,IAAIomB,IAGpB,MAAM3hB,EAA8C,mBAA3BjR,KAAKmK,QAAQ8G,UACpCjR,KAAKmK,QAAQ8G,UAAUhY,KAAK+G,KAAMyzB,EAAKzzB,KAAK2D,UAC5C3D,KAAKmK,QAAQ8G,UAET+jB,EAAah1B,KAAKi1B,eAAehkB,GACvCjR,KAAKk1B,oBAAoBF,GAEzB,MAAMrlB,UAAEA,GAAc3P,KAAKmK,QAC3BrH,EAAKC,IAAI0wB,EAAKzzB,KAAK0D,YAAYE,SAAU5D,MAEpCA,KAAK2D,SAASsO,cAAc7X,gBAAgBJ,SAASgG,KAAKyzB,OAC7D9jB,EAAUyb,OAAOqI,GACjBnzB,EAAamB,QAAQzB,KAAK2D,SAAU3D,KAAK0D,YAAY5L,MAAMw6B,WAGzDtyB,KAAKynB,QACPznB,KAAKynB,QAAQ7N,SAEb5Z,KAAKynB,QAAUS,GAAoBloB,KAAK2D,SAAU8vB,EAAKzzB,KAAKooB,iBAAiB4M,IAG/EvB,EAAI15B,UAAUyS,IAAI+B,IAElB,MAAMyhB,EAAchwB,KAAKm1B,yBAAyBn1B,KAAKmK,QAAQ6lB,aAC3DA,GACFyD,EAAI15B,UAAUyS,OAAOwjB,EAAY14B,MAAM,MAOrC,iBAAkBG,SAAS2C,iBAC7B,GAAGgN,UAAU3P,SAASuD,KAAKwM,UAAU7O,SAAQ3B,IAC3CsJ,EAAaQ,GAAG9J,EAAS,YAAa0D,MAI1C,MAWMwJ,EAAalE,KAAKyzB,IAAI15B,UAAUC,SAAS44B,IAC/C5yB,KAAKiE,gBAZY,KACf,MAAMmxB,EAAiBp1B,KAAKuzB,YAE5BvzB,KAAKuzB,YAAc,KACnBjzB,EAAamB,QAAQzB,KAAK2D,SAAU3D,KAAK0D,YAAY5L,MAAMu6B,OAEvD+C,IAAmBtC,IACrB9yB,KAAKk0B,OAAO,KAAMl0B,QAKQA,KAAKyzB,IAAKvvB,GAG1CqL,OACE,IAAKvP,KAAKynB,QACR,OAGF,MAAMgM,EAAMzzB,KAAKm0B,gBAkBjB,GADkB7zB,EAAamB,QAAQzB,
KAAK2D,SAAU3D,KAAK0D,YAAY5L,MAAMo6B,MAC/DnwB,iBACZ,OAGF0xB,EAAI15B,UAAUwJ,OAAOgL,IAIjB,iBAAkB9W,SAAS2C,iBAC7B,GAAGgN,UAAU3P,SAASuD,KAAKwM,UACxB7O,SAAQ3B,GAAWsJ,EAAaC,IAAIvJ,EAAS,YAAa0D,KAG/DsF,KAAKwzB,eAAL,OAAqC,EACrCxzB,KAAKwzB,eAAL,OAAqC,EACrCxzB,KAAKwzB,eAAL,OAAqC,EAErC,MAAMtvB,EAAalE,KAAKyzB,IAAI15B,UAAUC,SAAS44B,IAC/C5yB,KAAKiE,gBAnCY,KACXjE,KAAKg0B,yBAILh0B,KAAKuzB,cAAgBV,IACvBY,EAAIlwB,SAGNvD,KAAKq1B,iBACLr1B,KAAK2D,SAASqC,gBAAgB,oBAC9B1F,EAAamB,QAAQzB,KAAK2D,SAAU3D,KAAK0D,YAAY5L,MAAMq6B,QAE3DnyB,KAAKq0B,oBAsBuBr0B,KAAKyzB,IAAKvvB,GACxClE,KAAKuzB,YAAc,GAGrB3Z,SACuB,OAAjB5Z,KAAKynB,SACPznB,KAAKynB,QAAQ7N,SAMjB0a,gBACE,OAAOzzB,QAAQb,KAAKy0B,YAGtBN,gBACE,GAAIn0B,KAAKyzB,IACP,OAAOzzB,KAAKyzB,IAGd,MAAMz8B,EAAUS,SAAS0zB,cAAc,OACvCn0B,EAAQ04B,UAAY1vB,KAAKmK,QAAQ0lB,SAEjC,MAAM4D,EAAMz8B,EAAQwQ,SAAS,GAK7B,OAJAxH,KAAKs1B,WAAW7B,GAChBA,EAAI15B,UAAUwJ,OAAOqvB,GAAiBrkB,IAEtCvO,KAAKyzB,IAAMA,EACJzzB,KAAKyzB,IAGd6B,WAAW7B,GACTzzB,KAAKu1B,uBAAuB9B,EAAKzzB,KAAKy0B,WAAY1B,IAGpDwC,uBAAuB1F,EAAU2F,EAASv+B,GACxC,MAAMw+B,EAAkBvuB,EAAeK,QAAQtQ,EAAU44B,GAEpD2F,IAAWC,EAMhBz1B,KAAK01B,kBAAkBD,EAAiBD,GALtCC,EAAgBlyB,SAQpBmyB,kBAAkB1+B,EAASw+B,GACzB,GAAgB,OAAZx+B,EAIJ,OAAIe,EAAUy9B,IACZA,EAAUr9B,EAAWq9B,QAGjBx1B,KAAKmK,QAAQgR,KACXqa,EAAQ/6B,aAAezD,IACzBA,EAAQ04B,UAAY,GACpB14B,EAAQo0B,OAAOoK,IAGjBx+B,EAAQ2+B,YAAcH,EAAQG,mBAM9B31B,KAAKmK,QAAQgR,MACXnb,KAAKmK,QAAQ8lB,WACfuF,EAAUxG,GAAawG,EAASx1B,KAAKmK,QAAQ+kB,UAAWlvB,KAAKmK,QAAQglB,aAGvEn4B,EAAQ04B,UAAY8F,GAEpBx+B,EAAQ2+B,YAAcH,GAI1Bf,WACE,MAAM3E,EAAQ9vB,KAAK2D,SAASzM,aAAa,2BAA6B8I,KAAKmK,QAAQ2lB,MAEnF,OAAO9vB,KAAKm1B,yBAAyBrF,GAGvC8F,iBAAiBZ,GACf,MAAmB,UAAfA,EACK,MAGU,SAAfA,EACK,QAGFA,EAKTlB,6BAA6B50B,EAAO6pB,GAClC,OAAOA,GAAW/oB,KAAK0D,YAAYmB,oBAAoB3F,EAAMa,eAAgBC,KAAK61B,sBAGpFpN,aACE,MAAMjiB,OAAEA,GAAWxG,KAAKmK,QAExB,MAAsB,iBAAX3D,EACFA,EAAOlP,MAAM,KAAK8Q,KAAI3C,GAAO/I,OAAOwQ,SAASzH,EAAK,MAGrC,mBAAXe,EACFkiB,GAAcliB,EAAOkiB,EAAY1oB,KAAK2D,UAGxC6C,EAGT2uB,yBAAyBK,GACvB,MAA0B,mBAAZA,EAAyBA,EAAQv8B,KAAK+G,KAAK2D,UAAY6xB,EAGvEpN,iBAAiB4M,GACf,MAAMrM,EAAwB,CAC5B1X,UAAW+jB,EACXtR,UAAW,CACT,CACEloB,KAAM,OACNyX,QAAS,CACP6L,mBAAoB9e,KAAKmK,QAAQ2U,qBAGrC,CACEtjB,KAAM,SACNyX,QAAS,CACPzM,OAAQxG,KAAKyoB,eAGjB,CACEjtB,KAAM,kBACNyX,QAAS,CACPmJ,SAAUpc,KAAKmK,QAAQiS,WAG3B,CACE5gB,KAAM,QACNyX,QAAS,CACPjc,QAAU,IAAGgJ,KAAK0D,YAAYjI,eAGlC,CACED,KAAM,WACN+W,SAAS,EACTC,MAAO,aACP7W,GAAIuJ,GAAQlF,KAAK81B,6BAA6B5wB,KAGlDghB,cAAehhB,IACTA,EAAK+N,QAAQhC,YAAc/L,EAAK+L,WAClCjR,KAAK81B,6BAA6B5wB,KAKxC,MAAO,IACFyjB,KACsC,mBAA9B3oB,KAAKmK,QAAQmd,aAA8BtnB,KAAKmK,QAAQmd,aAAaqB,GAAyB3oB,KAAKmK,QAAQmd,cAI1H4N,oBAAoBF,GAClBh1B,KAAKm0B,gBAAgBp6B,UAAUyS,IAAK,GAAExM,KAAK+1B,0BAA0B/1B,KAAK41B,iBAAiBZ,MAG7FC,eAAehkB,GACb,OAAOif,GAAcjf,EAAU1X,eAGjCm6B,gBACmB1zB,KAAKmK,QAAQ1I,QAAQnK,MAAM,KAEnCqB,SAAQ8I,IACf,GAAgB,UAAZA,EACFnB,EAAaQ,GAAGd,KAAK2D,SAAU3D,KAAK0D,YAAY5L,MAAMy6B,MAAOvyB,KAAKmK,QAAQlT,UAAUiI,GAASc,KAAKsF,OAAOpG,UACpG,GA/ZU,WA+ZNuC,EAA4B,CACrC,MAAMu0B,EAAUv0B,IAAYyxB,GAC1BlzB,KAAK0D,YAAY5L,MAAM46B,WACvB1yB,KAAK0D,YAAY5L,MAAM06B,QACnByD,EAAWx0B,IAAYyxB,GAC3BlzB,KAAK0D,YAAY5L,MAAM66B,WACvB3yB,KAAK0D,YAAY5L,MAAM26B,SAEzBnyB,EAAaQ,GAAGd,KAAK2D,SAAUqyB,EAASh2B,KAAKmK,QAAQlT,UAAUiI,GAASc,KAAKi0B,OAAO/0B,KACpFoB,EAAaQ,GAAGd,KAAK2D,SAAUsyB,EAAUj2B,KAAKmK,QAAQlT,UAAUiI,GAASc,KAAKk0B,OAAOh1B,SAIzFc,KAAKo0B,kBAAoB,KACnBp0B,KAAK2D,UACP3D,KAAKuP,QAITjP,EAAaQ,GAAGd,KAAK2D,SAASiB,QAAQouB,IAAiBC,GAAkBjzB,KAAKo0B,mBAE1Ep0B,KAAKmK,QAAQlT,SACf+I,KAAKmK,QAAU,IACVnK,KAAKmK,QACR1I,QAAS,SACTxK,SAAU,IAGZ+I,KAAKk2B,YAITA,YACE,MAAMpG,EAAQ9vB,KAAK2D,SAASzM,aAAa,SACnCi/B,SAA2Bn2B,KAAK2D,SAASzM,aAAa,2BAExD44B,GAA+B,WAAtBqG,
KACXn2B,KAAK2D,SAAS4B,aAAa,yBAA0BuqB,GAAS,KAC1DA,GAAU9vB,KAAK2D,SAASzM,aAAa,eAAkB8I,KAAK2D,SAASgyB,aACvE31B,KAAK2D,SAAS4B,aAAa,aAAcuqB,GAG3C9vB,KAAK2D,SAAS4B,aAAa,QAAS,KAIxC0uB,OAAO/0B,EAAO6pB,GACZA,EAAU/oB,KAAK8zB,6BAA6B50B,EAAO6pB,GAE/C7pB,IACF6pB,EAAQyK,eACS,YAAft0B,EAAMsB,KAAqB2yB,GAAgBD,KACzC,GAGFnK,EAAQoL,gBAAgBp6B,UAAUC,SAASuU,KAAoBwa,EAAQwK,cAAgBV,GACzF9J,EAAQwK,YAAcV,IAIxBvmB,aAAayc,EAAQuK,UAErBvK,EAAQwK,YAAcV,GAEjB9J,EAAQ5e,QAAQ4lB,OAAUhH,EAAQ5e,QAAQ4lB,MAAMvgB,KAKrDuZ,EAAQuK,SAAWp2B,YAAW,KACxB6rB,EAAQwK,cAAgBV,IAC1B9J,EAAQvZ,SAETuZ,EAAQ5e,QAAQ4lB,MAAMvgB,MARvBuZ,EAAQvZ,QAWZ0kB,OAAOh1B,EAAO6pB,GACZA,EAAU/oB,KAAK8zB,6BAA6B50B,EAAO6pB,GAE/C7pB,IACF6pB,EAAQyK,eACS,aAAft0B,EAAMsB,KAAsB2yB,GAAgBD,IAC1CnK,EAAQplB,SAAS3J,SAASkF,EAAMY,gBAGlCipB,EAAQiL,yBAIZ1nB,aAAayc,EAAQuK,UAErBvK,EAAQwK,YAAcT,GAEjB/J,EAAQ5e,QAAQ4lB,OAAUhH,EAAQ5e,QAAQ4lB,MAAMxgB,KAKrDwZ,EAAQuK,SAAWp2B,YAAW,KACxB6rB,EAAQwK,cAAgBT,IAC1B/J,EAAQxZ,SAETwZ,EAAQ5e,QAAQ4lB,MAAMxgB,MARvBwZ,EAAQxZ,QAWZykB,uBACE,IAAK,MAAMvyB,KAAWzB,KAAKwzB,eACzB,GAAIxzB,KAAKwzB,eAAe/xB,GACtB,OAAO,EAIX,OAAO,EAGT2I,WAAW7R,GACT,MAAM69B,EAAiBvwB,EAAYI,kBAAkBjG,KAAK2D,UAqC1D,OAnCAlL,OAAOC,KAAK09B,GAAgBz9B,SAAQ09B,IAC9B1G,GAAsBjwB,IAAI22B,WACrBD,EAAeC,OAI1B99B,EAAS,IACJyH,KAAK0D,YAAY6E,WACjB6tB,KACmB,iBAAX79B,GAAuBA,EAASA,EAAS,KAG/CoX,WAAiC,IAArBpX,EAAOoX,UAAsBlY,SAASuD,KAAO7C,EAAWI,EAAOoX,WAEtD,iBAAjBpX,EAAOw3B,QAChBx3B,EAAOw3B,MAAQ,CACbvgB,KAAMjX,EAAOw3B,MACbxgB,KAAMhX,EAAOw3B,QAIW,iBAAjBx3B,EAAOu3B,QAChBv3B,EAAOu3B,MAAQv3B,EAAOu3B,MAAM92B,YAGA,iBAAnBT,EAAOi9B,UAChBj9B,EAAOi9B,QAAUj9B,EAAOi9B,QAAQx8B,YAGlCX,EAAgBoD,GAAMlD,EAAQyH,KAAK0D,YAAYoF,aAE3CvQ,EAAO03B,WACT13B,EAAOs3B,SAAWb,GAAaz2B,EAAOs3B,SAAUt3B,EAAO22B,UAAW32B,EAAO42B,aAGpE52B,EAGTs9B,qBACE,MAAMt9B,EAAS,GAEf,IAAK,MAAMiK,KAAOxC,KAAKmK,QACjBnK,KAAK0D,YAAY6E,QAAQ/F,KAASxC,KAAKmK,QAAQ3H,KACjDjK,EAAOiK,GAAOxC,KAAKmK,QAAQ3H,IAO/B,OAAOjK,EAGT88B,iBACE,MAAM5B,EAAMzzB,KAAKm0B,gBACXmC,EAAwB,IAAIl9B,OAAQ,UAAS4G,KAAK+1B,6BAA8B,KAChFQ,EAAW9C,EAAIv8B,aAAa,SAASgC,MAAMo9B,GAChC,OAAbC,GAAqBA,EAASn+B,OAAS,GACzCm+B,EAASnuB,KAAIouB,GAASA,EAAMj/B,SACzBoB,SAAQ89B,GAAUhD,EAAI15B,UAAUwJ,OAAOkzB,KAI9CV,uBACE,MAvqBiB,aA0qBnBD,6BAA6BpN,GAC3B,MAAMhW,MAAEA,GAAUgW,EAEbhW,IAIL1S,KAAKyzB,IAAM/gB,EAAMC,SAAS/B,OAC1B5Q,KAAKq1B,iBACLr1B,KAAKk1B,oBAAoBl1B,KAAKi1B,eAAeviB,EAAMzB,aAGrDojB,iBACMr0B,KAAKynB,UACPznB,KAAKynB,QAAQxB,UACbjmB,KAAKynB,QAAU,MAMGtjB,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACf,MAAMC,EAAOkuB,GAAQvuB,oBAAoB7E,KAAMzH,GAE/C,GAAsB,iBAAXA,EAAqB,CAC9B,QAA4B,IAAjB2M,EAAK3M,GACd,MAAM,IAAIe,UAAW,oBAAmBf,MAG1C2M,EAAK3M,UAab6C,EAAmBg4B,ICxuBnB,MAKM7qB,GAAU,IACX6qB,GAAQ7qB,QACX0I,UAAW,QACXzK,OAAQ,CAAC,EAAG,GACZ/E,QAAS,QACT+zB,QAAS,GACT3F,SAAU,+IAON/mB,GAAc,IACfsqB,GAAQtqB,YACX0sB,QAAS,6BAGL19B,GAAQ,CACZo6B,KAAO,kBACPC,OAAS,oBACTC,KAAO,kBACPC,MAAQ,mBACRC,SAAW,sBACXC,MAAQ,mBACRC,QAAU,qBACVC,SAAW,sBACXC,WAAa,wBACbC,WAAa,yBAYf,MAAM+D,WAAgBtD,GAGT7qB,qBACT,OAAOA,GAGE9M,kBACT,MArDS,UAwDA3D,mBACT,OAAOA,GAGEgR,yBACT,OAAOA,GAKTwrB,gBACE,OAAOt0B,KAAKy0B,YAAcz0B,KAAK22B,cAGjCrB,WAAW7B,GACTzzB,KAAKu1B,uBAAuB9B,EAAKzzB,KAAKy0B,WAnCnB,mBAoCnBz0B,KAAKu1B,uBAAuB9B,EAAKzzB,KAAK22B,cAnCjB,iBAwCvBA,cACE,OAAO32B,KAAKm1B,yBAAyBn1B,KAAKmK,QAAQqrB,SAGpDO,uBACE,MA/EiB,aAoFG5xB,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACf,MAAMC,EAAOwxB,GAAQ7xB,oBAAoB7E,KAAMzH,GAE/C,GAAsB,iBAAXA,EAAqB,CAC9B,QAA4B,IAAjB2M,EAAK3M,GACd,MAAM,IAAIe,UAAW,oBAAmBf,MAG1C2M,EAAK3M,UAab6C,EAAmBs7B,ICrGnB,MAAMj7B,GAAO,YAKP8M,GAAU,CACd/B,OAAQ,GACR/B,OAAQ,OACRzH,OAAQ,IAGJ8L,GAAc,CAClBtC,OAAQ,SACR/B,OAAQ,SACRzH,OAAQ,oBAQJuM,GAAoB,SAOpBqtB,GAAuB,8CAKvBC,GAAkB,WAQxB,MAAMC,WAAkBrzB,EACtBC,YAAY1
M,EAASuB,GACnBmR,MAAM1S,GACNgJ,KAAK+2B,eAA2C,SAA1B/2B,KAAK2D,SAASgB,QAAqB5J,OAASiF,KAAK2D,SACvE3D,KAAKmK,QAAUnK,KAAKoK,WAAW7R,GAC/ByH,KAAKg3B,SAAW,GAChBh3B,KAAKi3B,SAAW,GAChBj3B,KAAKk3B,cAAgB,KACrBl3B,KAAKm3B,cAAgB,EAErB72B,EAAaQ,GAAGd,KAAK+2B,eAlCH,uBAkCiC,IAAM/2B,KAAKo3B,aAE9Dp3B,KAAKq3B,UACLr3B,KAAKo3B,WAKI7uB,qBACT,OAAOA,GAGE9M,kBACT,OAAOA,GAKT47B,UACE,MAAMC,EAAat3B,KAAK+2B,iBAAmB/2B,KAAK+2B,eAAeh8B,OAtC7C,SAwChB87B,GAEIU,EAAuC,SAAxBv3B,KAAKmK,QAAQ1F,OAChC6yB,EACAt3B,KAAKmK,QAAQ1F,OAET+yB,EAAaD,IAAiBV,GAClC72B,KAAKy3B,gBACL,EAEFz3B,KAAKg3B,SAAW,GAChBh3B,KAAKi3B,SAAW,GAChBj3B,KAAKm3B,cAAgBn3B,KAAK03B,mBAEVxwB,EAAeC,KAAKyvB,GAAqB52B,KAAKmK,QAAQnN,QAE9DoL,KAAIpR,IACV,MAAM2gC,EAAiBngC,EAAuBR,GACxCgG,EAAS26B,EAAiBzwB,EAAeK,QAAQowB,GAAkB,KAEzE,GAAI36B,EAAQ,CACV,MAAM46B,EAAY56B,EAAO0J,wBACzB,GAAIkxB,EAAUlkB,OAASkkB,EAAUjkB,OAC/B,MAAO,CACL9N,EAAY0xB,GAAcv6B,GAAQ2J,IAAM6wB,EACxCG,GAKN,OAAO,QAENvxB,QAAOyxB,GAAQA,IACf1Z,MAAK,CAACC,EAAGC,IAAMD,EAAE,GAAKC,EAAE,KACxB1lB,SAAQk/B,IACP73B,KAAKg3B,SAAS/6B,KAAK47B,EAAK,IACxB73B,KAAKi3B,SAASh7B,KAAK47B,EAAK,OAI9Bh0B,UACEvD,EAAaC,IAAIP,KAAK+2B,eAhHP,iBAiHfrtB,MAAM7F,UAKRuG,WAAW7R,GAWT,OAVAA,EAAS,IACJgQ,MACA1C,EAAYI,kBAAkBjG,KAAK2D,aAChB,iBAAXpL,GAAuBA,EAASA,EAAS,KAG/CyE,OAAS7E,EAAWI,EAAOyE,SAAWvF,SAAS2C,gBAEtD/B,EAAgBoD,GAAMlD,EAAQuQ,IAEvBvQ,EAGTk/B,gBACE,OAAOz3B,KAAK+2B,iBAAmBh8B,OAC7BiF,KAAK+2B,eAAenwB,YACpB5G,KAAK+2B,eAAe5c,UAGxBud,mBACE,OAAO13B,KAAK+2B,eAAerb,cAAgB/d,KAAKC,IAC9CnG,SAASuD,KAAK0gB,aACdjkB,SAAS2C,gBAAgBshB,cAI7Boc,mBACE,OAAO93B,KAAK+2B,iBAAmBh8B,OAC7BA,OAAOg9B,YACP/3B,KAAK+2B,eAAerwB,wBAAwBiN,OAGhDyjB,WACE,MAAMjd,EAAYna,KAAKy3B,gBAAkBz3B,KAAKmK,QAAQ3D,OAChDkV,EAAe1b,KAAK03B,mBACpBM,EAAYh4B,KAAKmK,QAAQ3D,OAASkV,EAAe1b,KAAK83B,mBAM5D,GAJI93B,KAAKm3B,gBAAkBzb,GACzB1b,KAAKq3B,UAGHld,GAAa6d,EAAjB,CACE,MAAMh7B,EAASgD,KAAKi3B,SAASj3B,KAAKi3B,SAAS7+B,OAAS,GAEhD4H,KAAKk3B,gBAAkBl6B,GACzBgD,KAAKi4B,UAAUj7B,OAJnB,CAUA,GAAIgD,KAAKk3B,eAAiB/c,EAAYna,KAAKg3B,SAAS,IAAMh3B,KAAKg3B,SAAS,GAAK,EAG3E,OAFAh3B,KAAKk3B,cAAgB,UACrBl3B,KAAKk4B,SAIP,IAAK,IAAIl5B,EAAIgB,KAAKg3B,SAAS5+B,OAAQ4G,KACVgB,KAAKk3B,gBAAkBl3B,KAAKi3B,SAASj4B,IACxDmb,GAAana,KAAKg3B,SAASh4B,UACM,IAAzBgB,KAAKg3B,SAASh4B,EAAI,IAAsBmb,EAAYna,KAAKg3B,SAASh4B,EAAI,KAGhFgB,KAAKi4B,UAAUj4B,KAAKi3B,SAASj4B,KAKnCi5B,UAAUj7B,GACRgD,KAAKk3B,cAAgBl6B,EAErBgD,KAAKk4B,SAEL,MAAMC,EAAUvB,GAAoBt/B,MAAM,KACvC8Q,KAAInR,GAAa,GAAEA,qBAA4B+F,OAAY/F,WAAkB+F,QAE1Eo7B,EAAOlxB,EAAeK,QAAQ4wB,EAAQ9vB,KAAK,KAAMrI,KAAKmK,QAAQnN,QAEpEo7B,EAAKr+B,UAAUyS,IAAIjD,IACf6uB,EAAKr+B,UAAUC,SAnLU,iBAoL3BkN,EAAeK,QA1KY,mBA0KsB6wB,EAAKxzB,QA3KlC,cA4KjB7K,UAAUyS,IAAIjD,IAEjBrC,EAAeS,QAAQywB,EAnLG,qBAoLvBz/B,SAAQ0/B,IAGPnxB,EAAeW,KAAKwwB,EAAY,+BAC7B1/B,SAAQk/B,GAAQA,EAAK99B,UAAUyS,IAAIjD,MAGtCrC,EAAeW,KAAKwwB,EAzLH,aA0Ld1/B,SAAQ2/B,IACPpxB,EAAeM,SAAS8wB,EA5LX,aA6LV3/B,SAAQk/B,GAAQA,EAAK99B,UAAUyS,IAAIjD,YAKhDjJ,EAAamB,QAAQzB,KAAK+2B,eA3MN,wBA2MsC,CACxDj3B,cAAe9C,IAInBk7B,SACEhxB,EAAeC,KAAKyvB,GAAqB52B,KAAKmK,QAAQnN,QACnDoJ,QAAO4L,GAAQA,EAAKjY,UAAUC,SAASuP,MACvC5Q,SAAQqZ,GAAQA,EAAKjY,UAAUwJ,OAAOgG,MAKrBpF,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACf,MAAMC,EAAO4xB,GAAUjyB,oBAAoB7E,KAAMzH,GAEjD,GAAsB,iBAAXA,EAAX,CAIA,QAA4B,IAAjB2M,EAAK3M,GACd,MAAM,IAAIe,UAAW,oBAAmBf,MAG1C2M,EAAK3M,UAWX+H,EAAaQ,GAAG/F,OA7Oa,8BA6OgB,KAC3CmM,EAAeC,KAzOS,0BA0OrBxO,SAAQ4/B,GAAO,IAAIzB,GAAUyB,QAUlCn9B,EAAmB07B,IC7QnB,MAYMvtB,GAAoB,SACpBqpB,GAAkB,OAClBrkB,GAAkB,OAIlBiqB,GAAkB,UAClBC,GAAqB,wBAW3B,MAAMC,WAAYj1B,EAGLhI,kBACT,MAlCS,MAuCX+T,OACE,GAAKxP,KAAK2D,SAASlJ,YACjBuF,KAAK2D,SAASlJ,WAAWvC,WAAa2B,KAAKC,cAC3CkG,KAAK2D,SAAS5J,UAAUC,SAASuP,IACjC,OAGF,IAAIzB,EACJ,MAAM9K,EAAS
rF,EAAuBqI,KAAK2D,UACrCg1B,EAAc34B,KAAK2D,SAASiB,QA/BN,qBAiC5B,GAAI+zB,EAAa,CACf,MAAMC,EAAwC,OAAzBD,EAAY7mB,UAA8C,OAAzB6mB,EAAY7mB,SAAoB2mB,GAAqBD,GAC3G1wB,EAAWZ,EAAeC,KAAKyxB,EAAcD,GAC7C7wB,EAAWA,EAASA,EAAS1P,OAAS,GAGxC,MAAMygC,EAAY/wB,EAChBxH,EAAamB,QAAQqG,EApDP,cAoD6B,CACzChI,cAAeE,KAAK2D,WAEtB,KAMF,GAJkBrD,EAAamB,QAAQzB,KAAK2D,SAvD5B,cAuDkD,CAChE7D,cAAegI,IAGH/F,kBAAmC,OAAd82B,GAAsBA,EAAU92B,iBACjE,OAGF/B,KAAKi4B,UAAUj4B,KAAK2D,SAAUg1B,GAE9B,MAAMG,EAAW,KACfx4B,EAAamB,QAAQqG,EAnEL,gBAmE6B,CAC3ChI,cAAeE,KAAK2D,WAEtBrD,EAAamB,QAAQzB,KAAK2D,SApEX,eAoEkC,CAC/C7D,cAAegI,KAIf9K,EACFgD,KAAKi4B,UAAUj7B,EAAQA,EAAOvC,WAAYq+B,GAE1CA,IAMJb,UAAUjhC,EAAS2Y,EAAWrU,GAC5B,MAIMy9B,IAJiBppB,GAAqC,OAAvBA,EAAUmC,UAA4C,OAAvBnC,EAAUmC,SAE5E5K,EAAeM,SAASmI,EAAW6oB,IADnCtxB,EAAeC,KAAKsxB,GAAoB9oB,IAGZ,GACxBqpB,EAAkB19B,GAAay9B,GAAUA,EAAOh/B,UAAUC,SAAS44B,IAEnEkG,EAAW,IAAM94B,KAAKi5B,oBAAoBjiC,EAAS+hC,EAAQz9B,GAE7Dy9B,GAAUC,GACZD,EAAOh/B,UAAUwJ,OAAOgL,IACxBvO,KAAKiE,eAAe60B,EAAU9hC,GAAS,IAEvC8hC,IAIJG,oBAAoBjiC,EAAS+hC,EAAQz9B,GACnC,GAAIy9B,EAAQ,CACVA,EAAOh/B,UAAUwJ,OAAOgG,IAExB,MAAM2vB,EAAgBhyB,EAAeK,QA1FJ,kCA0F4CwxB,EAAOt+B,YAEhFy+B,GACFA,EAAcn/B,UAAUwJ,OAAOgG,IAGG,QAAhCwvB,EAAO7hC,aAAa,SACtB6hC,EAAOxzB,aAAa,iBAAiB,GAIzCvO,EAAQ+C,UAAUyS,IAAIjD,IACe,QAAjCvS,EAAQE,aAAa,SACvBF,EAAQuO,aAAa,iBAAiB,GAGxC5K,EAAO3D,GAEHA,EAAQ+C,UAAUC,SAAS44B,KAC7B57B,EAAQ+C,UAAUyS,IAAI+B,IAGxB,IAAID,EAAStX,EAAQyD,WAKrB,GAJI6T,GAA8B,OAApBA,EAAOwD,WACnBxD,EAASA,EAAO7T,YAGd6T,GAAUA,EAAOvU,UAAUC,SAhIF,iBAgIsC,CACjE,MAAMm/B,EAAkBniC,EAAQ4N,QA5HZ,aA8HhBu0B,GACFjyB,EAAeC,KA1HU,mBA0HqBgyB,GAC3CxgC,SAAQygC,GAAYA,EAASr/B,UAAUyS,IAAIjD,MAGhDvS,EAAQuO,aAAa,iBAAiB,GAGpCjK,GACFA,IAMkB6I,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACf,MAAMC,EAAOwzB,GAAI7zB,oBAAoB7E,MAErC,GAAsB,iBAAXzH,EAAqB,CAC9B,QAA4B,IAAjB2M,EAAK3M,GACd,MAAM,IAAIe,UAAW,oBAAmBf,MAG1C2M,EAAK3M,UAYb+H,EAAaQ,GAAGrJ,SAzKc,wBAWD,4EA8JyC,SAAUyH,GAC1E,CAAC,IAAK,QAAQ9H,SAAS4I,KAAK2E,UAC9BzF,EAAMyD,iBAGJ/I,EAAWoG,OAIF04B,GAAI7zB,oBAAoB7E,MAChCwP,UAUPpU,EAAmBs9B,ICtMnB,MAAMj9B,GAAO,QAcP49B,GAAkB,OAClB9qB,GAAkB,OAClB+qB,GAAqB,UAErBxwB,GAAc,CAClB8mB,UAAW,UACX2J,SAAU,UACVxJ,MAAO,UAGHxnB,GAAU,CACdqnB,WAAW,EACX2J,UAAU,EACVxJ,MAAO,KAST,MAAMyJ,WAAc/1B,EAClBC,YAAY1M,EAASuB,GACnBmR,MAAM1S,GAENgJ,KAAKmK,QAAUnK,KAAKoK,WAAW7R,GAC/ByH,KAAKszB,SAAW,KAChBtzB,KAAKy5B,sBAAuB,EAC5Bz5B,KAAK05B,yBAA0B,EAC/B15B,KAAK0zB,gBAKI5qB,yBACT,OAAOA,GAGEP,qBACT,OAAOA,GAGE9M,kBACT,OAAOA,GAKT+T,OACoBlP,EAAamB,QAAQzB,KAAK2D,SAtD5B,iBAwDF5B,mBAId/B,KAAK25B,gBAED35B,KAAKmK,QAAQylB,WACf5vB,KAAK2D,SAAS5J,UAAUyS,IA5DN,QAsEpBxM,KAAK2D,SAAS5J,UAAUwJ,OAAO81B,IAC/B1+B,EAAOqF,KAAK2D,UACZ3D,KAAK2D,SAAS5J,UAAUyS,IAAI+B,IAC5BvO,KAAK2D,SAAS5J,UAAUyS,IAAI8sB,IAE5Bt5B,KAAKiE,gBAZY,KACfjE,KAAK2D,SAAS5J,UAAUwJ,OAAO+1B,IAC/Bh5B,EAAamB,QAAQzB,KAAK2D,SAnEX,kBAqEf3D,KAAK45B,uBAQuB55B,KAAK2D,SAAU3D,KAAKmK,QAAQylB,YAG5DrgB,OACOvP,KAAK2D,SAAS5J,UAAUC,SAASuU,MAIpBjO,EAAamB,QAAQzB,KAAK2D,SAxF5B,iBA0FF5B,mBAWd/B,KAAK2D,SAAS5J,UAAUyS,IAAI8sB,IAC5Bt5B,KAAKiE,gBARY,KACfjE,KAAK2D,SAAS5J,UAAUyS,IAAI6sB,IAC5Br5B,KAAK2D,SAAS5J,UAAUwJ,OAAO+1B,IAC/Bt5B,KAAK2D,SAAS5J,UAAUwJ,OAAOgL,IAC/BjO,EAAamB,QAAQzB,KAAK2D,SAjGV,qBAqGY3D,KAAK2D,SAAU3D,KAAKmK,QAAQylB,aAG5D/rB,UACE7D,KAAK25B,gBAED35B,KAAK2D,SAAS5J,UAAUC,SAASuU,KACnCvO,KAAK2D,SAAS5J,UAAUwJ,OAAOgL,IAGjC7E,MAAM7F,UAKRuG,WAAW7R,GAST,OARAA,EAAS,IACJgQ,MACA1C,EAAYI,kBAAkBjG,KAAK2D,aAChB,iBAAXpL,GAAuBA,EAASA,EAAS,IAGtDF,EAAgBoD,GAAMlD,EAAQyH,KAAK0D,YAAYoF,aAExCvQ,EAGTqhC,qBACO55B,KAAKmK,QAAQovB,WAIdv5B,KAAKy5B,sBAAwBz5B,KAAK05B,0BAItC15B,KAAKszB,SAAWp2B,YAAW,KACzB8C,KAAKuP,SACJvP,KAAKmK,QAAQ4lB,SAGlB8J,eAAe36B,EAAO46B,GACpB,OAAQ
56B,EAAMsB,MACZ,IAAK,YACL,IAAK,WACHR,KAAKy5B,qBAAuBK,EAC5B,MACF,IAAK,UACL,IAAK,WACH95B,KAAK05B,wBAA0BI,EAMnC,GAAIA,EAEF,YADA95B,KAAK25B,gBAIP,MAAMnsB,EAActO,EAAMY,cACtBE,KAAK2D,WAAa6J,GAAexN,KAAK2D,SAAS3J,SAASwT,IAI5DxN,KAAK45B,qBAGPlG,gBACEpzB,EAAaQ,GAAGd,KAAK2D,SA/KA,sBA+K2BzE,GAASc,KAAK65B,eAAe36B,GAAO,KACpFoB,EAAaQ,GAAGd,KAAK2D,SA/KD,qBA+K2BzE,GAASc,KAAK65B,eAAe36B,GAAO,KACnFoB,EAAaQ,GAAGd,KAAK2D,SA/KF,oBA+K2BzE,GAASc,KAAK65B,eAAe36B,GAAO,KAClFoB,EAAaQ,GAAGd,KAAK2D,SA/KD,qBA+K2BzE,GAASc,KAAK65B,eAAe36B,GAAO,KAGrFy6B,gBACErtB,aAAatM,KAAKszB,UAClBtzB,KAAKszB,SAAW,KAKInvB,uBAAC5L,GACrB,OAAOyH,KAAKiF,MAAK,WACf,MAAMC,EAAOs0B,GAAM30B,oBAAoB7E,KAAMzH,GAE7C,GAAsB,iBAAXA,EAAqB,CAC9B,QAA4B,IAAjB2M,EAAK3M,GACd,MAAM,IAAIe,UAAW,oBAAmBf,MAG1C2M,EAAK3M,GAAQyH,kBAMrBuE,EAAqBi1B,IASrBp+B,EAAmBo+B,IC3NJ,CACb10B,MAAAA,EACAO,OAAAA,EACAoE,SAAAA,GACAmF,SAAAA,GACA4Y,SAAAA,GACAgF,MAAAA,GACA4B,UAAAA,GACAsI,QAAAA,GACAI,UAAAA,GACA4B,IAAAA,GACAc,MAAAA,GACApG,QAAAA","sourcesContent":["/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): util/index.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nconst MAX_UID = 1000000\nconst MILLISECONDS_MULTIPLIER = 1000\nconst TRANSITION_END = 'transitionend'\n\n// Shoutout AngusCroll (https://goo.gl/pxwQGp)\nconst toType = obj => {\n if (obj === null || obj === undefined) {\n return `${obj}`\n }\n\n return {}.toString.call(obj).match(/\\s([a-z]+)/i)[1].toLowerCase()\n}\n\n/**\n * --------------------------------------------------------------------------\n * Public Util Api\n * --------------------------------------------------------------------------\n */\n\nconst getUID = prefix => {\n do {\n prefix += Math.floor(Math.random() * MAX_UID)\n } while (document.getElementById(prefix))\n\n return prefix\n}\n\nconst getSelector = element => {\n let selector = element.getAttribute('data-bs-target')\n\n if (!selector || selector === '#') {\n let hrefAttr = element.getAttribute('href')\n\n // The only valid content that could double as a selector are IDs or classes,\n // so everything starting with `#` or `.`. If a \"real\" URL is used as the selector,\n // `document.querySelector` will rightfully complain it is invalid.\n // See https://github.com/twbs/bootstrap/issues/32273\n if (!hrefAttr || (!hrefAttr.includes('#') && !hrefAttr.startsWith('.'))) {\n return null\n }\n\n // Just in case some CMS puts out a full URL with the anchor appended\n if (hrefAttr.includes('#') && !hrefAttr.startsWith('#')) {\n hrefAttr = `#${hrefAttr.split('#')[1]}`\n }\n\n selector = hrefAttr && hrefAttr !== '#' ? hrefAttr.trim() : null\n }\n\n return selector\n}\n\nconst getSelectorFromElement = element => {\n const selector = getSelector(element)\n\n if (selector) {\n return document.querySelector(selector) ? selector : null\n }\n\n return null\n}\n\nconst getElementFromSelector = element => {\n const selector = getSelector(element)\n\n return selector ? 
document.querySelector(selector) : null\n}\n\nconst getTransitionDurationFromElement = element => {\n if (!element) {\n return 0\n }\n\n // Get transition-duration of the element\n let { transitionDuration, transitionDelay } = window.getComputedStyle(element)\n\n const floatTransitionDuration = Number.parseFloat(transitionDuration)\n const floatTransitionDelay = Number.parseFloat(transitionDelay)\n\n // Return 0 if element or transition duration is not found\n if (!floatTransitionDuration && !floatTransitionDelay) {\n return 0\n }\n\n // If multiple durations are defined, take the first\n transitionDuration = transitionDuration.split(',')[0]\n transitionDelay = transitionDelay.split(',')[0]\n\n return (Number.parseFloat(transitionDuration) + Number.parseFloat(transitionDelay)) * MILLISECONDS_MULTIPLIER\n}\n\nconst triggerTransitionEnd = element => {\n element.dispatchEvent(new Event(TRANSITION_END))\n}\n\nconst isElement = obj => {\n if (!obj || typeof obj !== 'object') {\n return false\n }\n\n if (typeof obj.jquery !== 'undefined') {\n obj = obj[0]\n }\n\n return typeof obj.nodeType !== 'undefined'\n}\n\nconst getElement = obj => {\n if (isElement(obj)) { // it's a jQuery object or a node element\n return obj.jquery ? obj[0] : obj\n }\n\n if (typeof obj === 'string' && obj.length > 0) {\n return document.querySelector(obj)\n }\n\n return null\n}\n\nconst typeCheckConfig = (componentName, config, configTypes) => {\n Object.keys(configTypes).forEach(property => {\n const expectedTypes = configTypes[property]\n const value = config[property]\n const valueType = value && isElement(value) ? 'element' : toType(value)\n\n if (!new RegExp(expectedTypes).test(valueType)) {\n throw new TypeError(\n `${componentName.toUpperCase()}: Option \"${property}\" provided type \"${valueType}\" but expected type \"${expectedTypes}\".`\n )\n }\n })\n}\n\nconst isVisible = element => {\n if (!isElement(element) || element.getClientRects().length === 0) {\n return false\n }\n\n return getComputedStyle(element).getPropertyValue('visibility') === 'visible'\n}\n\nconst isDisabled = element => {\n if (!element || element.nodeType !== Node.ELEMENT_NODE) {\n return true\n }\n\n if (element.classList.contains('disabled')) {\n return true\n }\n\n if (typeof element.disabled !== 'undefined') {\n return element.disabled\n }\n\n return element.hasAttribute('disabled') && element.getAttribute('disabled') !== 'false'\n}\n\nconst findShadowRoot = element => {\n if (!document.documentElement.attachShadow) {\n return null\n }\n\n // Can find the shadow root otherwise it'll return the document\n if (typeof element.getRootNode === 'function') {\n const root = element.getRootNode()\n return root instanceof ShadowRoot ? 
root : null\n }\n\n if (element instanceof ShadowRoot) {\n return element\n }\n\n // when we don't find a shadow root\n if (!element.parentNode) {\n return null\n }\n\n return findShadowRoot(element.parentNode)\n}\n\nconst noop = () => {}\n\n/**\n * Trick to restart an element's animation\n *\n * @param {HTMLElement} element\n * @return void\n *\n * @see https://www.charistheo.io/blog/2021/02/restart-a-css-animation-with-javascript/#restarting-a-css-animation\n */\nconst reflow = element => {\n // eslint-disable-next-line no-unused-expressions\n element.offsetHeight\n}\n\nconst getjQuery = () => {\n const { jQuery } = window\n\n if (jQuery && !document.body.hasAttribute('data-bs-no-jquery')) {\n return jQuery\n }\n\n return null\n}\n\nconst DOMContentLoadedCallbacks = []\n\nconst onDOMContentLoaded = callback => {\n if (document.readyState === 'loading') {\n // add listener on the first call when the document is in loading state\n if (!DOMContentLoadedCallbacks.length) {\n document.addEventListener('DOMContentLoaded', () => {\n DOMContentLoadedCallbacks.forEach(callback => callback())\n })\n }\n\n DOMContentLoadedCallbacks.push(callback)\n } else {\n callback()\n }\n}\n\nconst isRTL = () => document.documentElement.dir === 'rtl'\n\nconst defineJQueryPlugin = plugin => {\n onDOMContentLoaded(() => {\n const $ = getjQuery()\n /* istanbul ignore if */\n if ($) {\n const name = plugin.NAME\n const JQUERY_NO_CONFLICT = $.fn[name]\n $.fn[name] = plugin.jQueryInterface\n $.fn[name].Constructor = plugin\n $.fn[name].noConflict = () => {\n $.fn[name] = JQUERY_NO_CONFLICT\n return plugin.jQueryInterface\n }\n }\n })\n}\n\nconst execute = callback => {\n if (typeof callback === 'function') {\n callback()\n }\n}\n\nconst executeAfterTransition = (callback, transitionElement, waitForTransition = true) => {\n if (!waitForTransition) {\n execute(callback)\n return\n }\n\n const durationPadding = 5\n const emulatedDuration = getTransitionDurationFromElement(transitionElement) + durationPadding\n\n let called = false\n\n const handler = ({ target }) => {\n if (target !== transitionElement) {\n return\n }\n\n called = true\n transitionElement.removeEventListener(TRANSITION_END, handler)\n execute(callback)\n }\n\n transitionElement.addEventListener(TRANSITION_END, handler)\n setTimeout(() => {\n if (!called) {\n triggerTransitionEnd(transitionElement)\n }\n }, emulatedDuration)\n}\n\n/**\n * Return the previous/next element of a list.\n *\n * @param {array} list The list of elements\n * @param activeElement The active element\n * @param shouldGetNext Choose to get next or previous element\n * @param isCycleAllowed\n * @return {Element|elem} The proper element\n */\nconst getNextActiveElement = (list, activeElement, shouldGetNext, isCycleAllowed) => {\n let index = list.indexOf(activeElement)\n\n // if the element does not exist in the list return an element depending on the direction and if cycle is allowed\n if (index === -1) {\n return list[!shouldGetNext && isCycleAllowed ? list.length - 1 : 0]\n }\n\n const listLength = list.length\n\n index += shouldGetNext ? 
1 : -1\n\n if (isCycleAllowed) {\n index = (index + listLength) % listLength\n }\n\n return list[Math.max(0, Math.min(index, listLength - 1))]\n}\n\nexport {\n getElement,\n getUID,\n getSelectorFromElement,\n getElementFromSelector,\n getTransitionDurationFromElement,\n triggerTransitionEnd,\n isElement,\n typeCheckConfig,\n isVisible,\n isDisabled,\n findShadowRoot,\n noop,\n getNextActiveElement,\n reflow,\n getjQuery,\n onDOMContentLoaded,\n isRTL,\n defineJQueryPlugin,\n execute,\n executeAfterTransition\n}\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): dom/event-handler.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport { getjQuery } from '../util/index'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst namespaceRegex = /[^.]*(?=\\..*)\\.|.*/\nconst stripNameRegex = /\\..*/\nconst stripUidRegex = /::\\d+$/\nconst eventRegistry = {} // Events storage\nlet uidEvent = 1\nconst customEvents = {\n mouseenter: 'mouseover',\n mouseleave: 'mouseout'\n}\nconst customEventsRegex = /^(mouseenter|mouseleave)/i\nconst nativeEvents = new Set([\n 'click',\n 'dblclick',\n 'mouseup',\n 'mousedown',\n 'contextmenu',\n 'mousewheel',\n 'DOMMouseScroll',\n 'mouseover',\n 'mouseout',\n 'mousemove',\n 'selectstart',\n 'selectend',\n 'keydown',\n 'keypress',\n 'keyup',\n 'orientationchange',\n 'touchstart',\n 'touchmove',\n 'touchend',\n 'touchcancel',\n 'pointerdown',\n 'pointermove',\n 'pointerup',\n 'pointerleave',\n 'pointercancel',\n 'gesturestart',\n 'gesturechange',\n 'gestureend',\n 'focus',\n 'blur',\n 'change',\n 'reset',\n 'select',\n 'submit',\n 'focusin',\n 'focusout',\n 'load',\n 'unload',\n 'beforeunload',\n 'resize',\n 'move',\n 'DOMContentLoaded',\n 'readystatechange',\n 'error',\n 'abort',\n 'scroll'\n])\n\n/**\n * ------------------------------------------------------------------------\n * Private methods\n * ------------------------------------------------------------------------\n */\n\nfunction getUidEvent(element, uid) {\n return (uid && `${uid}::${uidEvent++}`) || element.uidEvent || uidEvent++\n}\n\nfunction getEvent(element) {\n const uid = getUidEvent(element)\n\n element.uidEvent = uid\n eventRegistry[uid] = eventRegistry[uid] || {}\n\n return eventRegistry[uid]\n}\n\nfunction bootstrapHandler(element, fn) {\n return function handler(event) {\n event.delegateTarget = element\n\n if (handler.oneOff) {\n EventHandler.off(element, event.type, fn)\n }\n\n return fn.apply(element, [event])\n }\n}\n\nfunction bootstrapDelegationHandler(element, selector, fn) {\n return function handler(event) {\n const domElements = element.querySelectorAll(selector)\n\n for (let { target } = event; target && target !== this; target = target.parentNode) {\n for (let i = domElements.length; i--;) {\n if (domElements[i] === target) {\n event.delegateTarget = target\n\n if (handler.oneOff) {\n EventHandler.off(element, event.type, selector, fn)\n }\n\n return fn.apply(target, [event])\n }\n }\n }\n\n // To please ESLint\n return null\n }\n}\n\nfunction findHandler(events, handler, delegationSelector = null) {\n const uidEventList = Object.keys(events)\n\n for (let i = 0, len = uidEventList.length; i < len; i++) {\n const event = events[uidEventList[i]]\n\n if (event.originalHandler === 
handler && event.delegationSelector === delegationSelector) {\n return event\n }\n }\n\n return null\n}\n\nfunction normalizeParams(originalTypeEvent, handler, delegationFn) {\n const delegation = typeof handler === 'string'\n const originalHandler = delegation ? delegationFn : handler\n\n let typeEvent = getTypeEvent(originalTypeEvent)\n const isNative = nativeEvents.has(typeEvent)\n\n if (!isNative) {\n typeEvent = originalTypeEvent\n }\n\n return [delegation, originalHandler, typeEvent]\n}\n\nfunction addHandler(element, originalTypeEvent, handler, delegationFn, oneOff) {\n if (typeof originalTypeEvent !== 'string' || !element) {\n return\n }\n\n if (!handler) {\n handler = delegationFn\n delegationFn = null\n }\n\n // in case of mouseenter or mouseleave wrap the handler within a function that checks for its DOM position\n // this prevents the handler from being dispatched the same way as mouseover or mouseout does\n if (customEventsRegex.test(originalTypeEvent)) {\n const wrapFn = fn => {\n return function (event) {\n if (!event.relatedTarget || (event.relatedTarget !== event.delegateTarget && !event.delegateTarget.contains(event.relatedTarget))) {\n return fn.call(this, event)\n }\n }\n }\n\n if (delegationFn) {\n delegationFn = wrapFn(delegationFn)\n } else {\n handler = wrapFn(handler)\n }\n }\n\n const [delegation, originalHandler, typeEvent] = normalizeParams(originalTypeEvent, handler, delegationFn)\n const events = getEvent(element)\n const handlers = events[typeEvent] || (events[typeEvent] = {})\n const previousFn = findHandler(handlers, originalHandler, delegation ? handler : null)\n\n if (previousFn) {\n previousFn.oneOff = previousFn.oneOff && oneOff\n\n return\n }\n\n const uid = getUidEvent(originalHandler, originalTypeEvent.replace(namespaceRegex, ''))\n const fn = delegation ?\n bootstrapDelegationHandler(element, handler, delegationFn) :\n bootstrapHandler(element, handler)\n\n fn.delegationSelector = delegation ? 
handler : null\n fn.originalHandler = originalHandler\n fn.oneOff = oneOff\n fn.uidEvent = uid\n handlers[uid] = fn\n\n element.addEventListener(typeEvent, fn, delegation)\n}\n\nfunction removeHandler(element, events, typeEvent, handler, delegationSelector) {\n const fn = findHandler(events[typeEvent], handler, delegationSelector)\n\n if (!fn) {\n return\n }\n\n element.removeEventListener(typeEvent, fn, Boolean(delegationSelector))\n delete events[typeEvent][fn.uidEvent]\n}\n\nfunction removeNamespacedHandlers(element, events, typeEvent, namespace) {\n const storeElementEvent = events[typeEvent] || {}\n\n Object.keys(storeElementEvent).forEach(handlerKey => {\n if (handlerKey.includes(namespace)) {\n const event = storeElementEvent[handlerKey]\n\n removeHandler(element, events, typeEvent, event.originalHandler, event.delegationSelector)\n }\n })\n}\n\nfunction getTypeEvent(event) {\n // allow to get the native events from namespaced events ('click.bs.button' --> 'click')\n event = event.replace(stripNameRegex, '')\n return customEvents[event] || event\n}\n\nconst EventHandler = {\n on(element, event, handler, delegationFn) {\n addHandler(element, event, handler, delegationFn, false)\n },\n\n one(element, event, handler, delegationFn) {\n addHandler(element, event, handler, delegationFn, true)\n },\n\n off(element, originalTypeEvent, handler, delegationFn) {\n if (typeof originalTypeEvent !== 'string' || !element) {\n return\n }\n\n const [delegation, originalHandler, typeEvent] = normalizeParams(originalTypeEvent, handler, delegationFn)\n const inNamespace = typeEvent !== originalTypeEvent\n const events = getEvent(element)\n const isNamespace = originalTypeEvent.startsWith('.')\n\n if (typeof originalHandler !== 'undefined') {\n // Simplest case: handler is passed, remove that listener ONLY.\n if (!events || !events[typeEvent]) {\n return\n }\n\n removeHandler(element, events, typeEvent, originalHandler, delegation ? 
handler : null)\n return\n }\n\n if (isNamespace) {\n Object.keys(events).forEach(elementEvent => {\n removeNamespacedHandlers(element, events, elementEvent, originalTypeEvent.slice(1))\n })\n }\n\n const storeElementEvent = events[typeEvent] || {}\n Object.keys(storeElementEvent).forEach(keyHandlers => {\n const handlerKey = keyHandlers.replace(stripUidRegex, '')\n\n if (!inNamespace || originalTypeEvent.includes(handlerKey)) {\n const event = storeElementEvent[keyHandlers]\n\n removeHandler(element, events, typeEvent, event.originalHandler, event.delegationSelector)\n }\n })\n },\n\n trigger(element, event, args) {\n if (typeof event !== 'string' || !element) {\n return null\n }\n\n const $ = getjQuery()\n const typeEvent = getTypeEvent(event)\n const inNamespace = event !== typeEvent\n const isNative = nativeEvents.has(typeEvent)\n\n let jQueryEvent\n let bubbles = true\n let nativeDispatch = true\n let defaultPrevented = false\n let evt = null\n\n if (inNamespace && $) {\n jQueryEvent = $.Event(event, args)\n\n $(element).trigger(jQueryEvent)\n bubbles = !jQueryEvent.isPropagationStopped()\n nativeDispatch = !jQueryEvent.isImmediatePropagationStopped()\n defaultPrevented = jQueryEvent.isDefaultPrevented()\n }\n\n if (isNative) {\n evt = document.createEvent('HTMLEvents')\n evt.initEvent(typeEvent, bubbles, true)\n } else {\n evt = new CustomEvent(event, {\n bubbles,\n cancelable: true\n })\n }\n\n // merge custom information in our event\n if (typeof args !== 'undefined') {\n Object.keys(args).forEach(key => {\n Object.defineProperty(evt, key, {\n get() {\n return args[key]\n }\n })\n })\n }\n\n if (defaultPrevented) {\n evt.preventDefault()\n }\n\n if (nativeDispatch) {\n element.dispatchEvent(evt)\n }\n\n if (evt.defaultPrevented && typeof jQueryEvent !== 'undefined') {\n jQueryEvent.preventDefault()\n }\n\n return evt\n }\n}\n\nexport default EventHandler\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): dom/data.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst elementMap = new Map()\n\nexport default {\n set(element, key, instance) {\n if (!elementMap.has(element)) {\n elementMap.set(element, new Map())\n }\n\n const instanceMap = elementMap.get(element)\n\n // make it clear we only want one instance per element\n // can be removed later when multiple key/instances are fine to be used\n if (!instanceMap.has(key) && instanceMap.size !== 0) {\n // eslint-disable-next-line no-console\n console.error(`Bootstrap doesn't allow more than one instance per element. 
Bound instance: ${Array.from(instanceMap.keys())[0]}.`)\n return\n }\n\n instanceMap.set(key, instance)\n },\n\n get(element, key) {\n if (elementMap.has(element)) {\n return elementMap.get(element).get(key) || null\n }\n\n return null\n },\n\n remove(element, key) {\n if (!elementMap.has(element)) {\n return\n }\n\n const instanceMap = elementMap.get(element)\n\n instanceMap.delete(key)\n\n // free up element references if there are no instances left for an element\n if (instanceMap.size === 0) {\n elementMap.delete(element)\n }\n }\n}\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): base-component.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport Data from './dom/data'\nimport {\n executeAfterTransition,\n getElement\n} from './util/index'\nimport EventHandler from './dom/event-handler'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst VERSION = '5.1.3'\n\nclass BaseComponent {\n constructor(element) {\n element = getElement(element)\n\n if (!element) {\n return\n }\n\n this._element = element\n Data.set(this._element, this.constructor.DATA_KEY, this)\n }\n\n dispose() {\n Data.remove(this._element, this.constructor.DATA_KEY)\n EventHandler.off(this._element, this.constructor.EVENT_KEY)\n\n Object.getOwnPropertyNames(this).forEach(propertyName => {\n this[propertyName] = null\n })\n }\n\n _queueCallback(callback, element, isAnimated = true) {\n executeAfterTransition(callback, element, isAnimated)\n }\n\n /** Static */\n\n static getInstance(element) {\n return Data.get(getElement(element), this.DATA_KEY)\n }\n\n static getOrCreateInstance(element, config = {}) {\n return this.getInstance(element) || new this(element, typeof config === 'object' ? 
config : null)\n }\n\n static get VERSION() {\n return VERSION\n }\n\n static get NAME() {\n throw new Error('You have to implement the static method \"NAME\", for each component!')\n }\n\n static get DATA_KEY() {\n return `bs.${this.NAME}`\n }\n\n static get EVENT_KEY() {\n return `.${this.DATA_KEY}`\n }\n}\n\nexport default BaseComponent\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): util/component-functions.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport EventHandler from '../dom/event-handler'\nimport { getElementFromSelector, isDisabled } from './index'\n\nconst enableDismissTrigger = (component, method = 'hide') => {\n const clickEvent = `click.dismiss${component.EVENT_KEY}`\n const name = component.NAME\n\n EventHandler.on(document, clickEvent, `[data-bs-dismiss=\"${name}\"]`, function (event) {\n if (['A', 'AREA'].includes(this.tagName)) {\n event.preventDefault()\n }\n\n if (isDisabled(this)) {\n return\n }\n\n const target = getElementFromSelector(this) || this.closest(`.${name}`)\n const instance = component.getOrCreateInstance(target)\n\n // Method argument is left, for Alert and only, as it doesn't implement the 'hide' method\n instance[method]()\n })\n}\n\nexport {\n enableDismissTrigger\n}\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): alert.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport { defineJQueryPlugin } from './util/index'\nimport EventHandler from './dom/event-handler'\nimport BaseComponent from './base-component'\nimport { enableDismissTrigger } from './util/component-functions'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'alert'\nconst DATA_KEY = 'bs.alert'\nconst EVENT_KEY = `.${DATA_KEY}`\n\nconst EVENT_CLOSE = `close${EVENT_KEY}`\nconst EVENT_CLOSED = `closed${EVENT_KEY}`\nconst CLASS_NAME_FADE = 'fade'\nconst CLASS_NAME_SHOW = 'show'\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Alert extends BaseComponent {\n // Getters\n\n static get NAME() {\n return NAME\n }\n\n // Public\n\n close() {\n const closeEvent = EventHandler.trigger(this._element, EVENT_CLOSE)\n\n if (closeEvent.defaultPrevented) {\n return\n }\n\n this._element.classList.remove(CLASS_NAME_SHOW)\n\n const isAnimated = this._element.classList.contains(CLASS_NAME_FADE)\n this._queueCallback(() => this._destroyElement(), this._element, isAnimated)\n }\n\n // Private\n _destroyElement() {\n this._element.remove()\n EventHandler.trigger(this._element, EVENT_CLOSED)\n this.dispose()\n }\n\n // Static\n\n static jQueryInterface(config) {\n return this.each(function () {\n const data = Alert.getOrCreateInstance(this)\n\n if (typeof config !== 'string') {\n return\n }\n\n if (data[config] === undefined || config.startsWith('_') || config === 'constructor') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n\n data[config](this)\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n 
* Data Api implementation\n * ------------------------------------------------------------------------\n */\n\nenableDismissTrigger(Alert, 'close')\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n * add .Alert to jQuery only if jQuery is present\n */\n\ndefineJQueryPlugin(Alert)\n\nexport default Alert\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): button.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport { defineJQueryPlugin } from './util/index'\nimport EventHandler from './dom/event-handler'\nimport BaseComponent from './base-component'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'button'\nconst DATA_KEY = 'bs.button'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\n\nconst CLASS_NAME_ACTIVE = 'active'\n\nconst SELECTOR_DATA_TOGGLE = '[data-bs-toggle=\"button\"]'\n\nconst EVENT_CLICK_DATA_API = `click${EVENT_KEY}${DATA_API_KEY}`\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Button extends BaseComponent {\n // Getters\n\n static get NAME() {\n return NAME\n }\n\n // Public\n\n toggle() {\n // Toggle class and sync the `aria-pressed` attribute with the return value of the `.toggle()` method\n this._element.setAttribute('aria-pressed', this._element.classList.toggle(CLASS_NAME_ACTIVE))\n }\n\n // Static\n\n static jQueryInterface(config) {\n return this.each(function () {\n const data = Button.getOrCreateInstance(this)\n\n if (config === 'toggle') {\n data[config]()\n }\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * Data Api implementation\n * ------------------------------------------------------------------------\n */\n\nEventHandler.on(document, EVENT_CLICK_DATA_API, SELECTOR_DATA_TOGGLE, event => {\n event.preventDefault()\n\n const button = event.target.closest(SELECTOR_DATA_TOGGLE)\n const data = Button.getOrCreateInstance(button)\n\n data.toggle()\n})\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n * add .Button to jQuery only if jQuery is present\n */\n\ndefineJQueryPlugin(Button)\n\nexport default Button\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): dom/manipulator.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nfunction normalizeData(val) {\n if (val === 'true') {\n return true\n }\n\n if (val === 'false') {\n return false\n }\n\n if (val === Number(val).toString()) {\n return Number(val)\n }\n\n if (val === '' || val === 'null') {\n return null\n }\n\n return val\n}\n\nfunction normalizeDataKey(key) {\n return key.replace(/[A-Z]/g, chr => `-${chr.toLowerCase()}`)\n}\n\nconst Manipulator = {\n setDataAttribute(element, key, value) {\n element.setAttribute(`data-bs-${normalizeDataKey(key)}`, value)\n 
},\n\n removeDataAttribute(element, key) {\n element.removeAttribute(`data-bs-${normalizeDataKey(key)}`)\n },\n\n getDataAttributes(element) {\n if (!element) {\n return {}\n }\n\n const attributes = {}\n\n Object.keys(element.dataset)\n .filter(key => key.startsWith('bs'))\n .forEach(key => {\n let pureKey = key.replace(/^bs/, '')\n pureKey = pureKey.charAt(0).toLowerCase() + pureKey.slice(1, pureKey.length)\n attributes[pureKey] = normalizeData(element.dataset[key])\n })\n\n return attributes\n },\n\n getDataAttribute(element, key) {\n return normalizeData(element.getAttribute(`data-bs-${normalizeDataKey(key)}`))\n },\n\n offset(element) {\n const rect = element.getBoundingClientRect()\n\n return {\n top: rect.top + window.pageYOffset,\n left: rect.left + window.pageXOffset\n }\n },\n\n position(element) {\n return {\n top: element.offsetTop,\n left: element.offsetLeft\n }\n }\n}\n\nexport default Manipulator\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): dom/selector-engine.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nimport { isDisabled, isVisible } from '../util/index'\n\nconst NODE_TEXT = 3\n\nconst SelectorEngine = {\n find(selector, element = document.documentElement) {\n return [].concat(...Element.prototype.querySelectorAll.call(element, selector))\n },\n\n findOne(selector, element = document.documentElement) {\n return Element.prototype.querySelector.call(element, selector)\n },\n\n children(element, selector) {\n return [].concat(...element.children)\n .filter(child => child.matches(selector))\n },\n\n parents(element, selector) {\n const parents = []\n\n let ancestor = element.parentNode\n\n while (ancestor && ancestor.nodeType === Node.ELEMENT_NODE && ancestor.nodeType !== NODE_TEXT) {\n if (ancestor.matches(selector)) {\n parents.push(ancestor)\n }\n\n ancestor = ancestor.parentNode\n }\n\n return parents\n },\n\n prev(element, selector) {\n let previous = element.previousElementSibling\n\n while (previous) {\n if (previous.matches(selector)) {\n return [previous]\n }\n\n previous = previous.previousElementSibling\n }\n\n return []\n },\n\n next(element, selector) {\n let next = element.nextElementSibling\n\n while (next) {\n if (next.matches(selector)) {\n return [next]\n }\n\n next = next.nextElementSibling\n }\n\n return []\n },\n\n focusableChildren(element) {\n const focusables = [\n 'a',\n 'button',\n 'input',\n 'textarea',\n 'select',\n 'details',\n '[tabindex]',\n '[contenteditable=\"true\"]'\n ].map(selector => `${selector}:not([tabindex^=\"-\"])`).join(', ')\n\n return this.find(focusables, element).filter(el => !isDisabled(el) && isVisible(el))\n }\n}\n\nexport default SelectorEngine\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): carousel.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport {\n defineJQueryPlugin,\n getElementFromSelector,\n isRTL,\n isVisible,\n getNextActiveElement,\n reflow,\n triggerTransitionEnd,\n typeCheckConfig\n} from './util/index'\nimport EventHandler from './dom/event-handler'\nimport 
Manipulator from './dom/manipulator'\nimport SelectorEngine from './dom/selector-engine'\nimport BaseComponent from './base-component'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'carousel'\nconst DATA_KEY = 'bs.carousel'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\n\nconst ARROW_LEFT_KEY = 'ArrowLeft'\nconst ARROW_RIGHT_KEY = 'ArrowRight'\nconst TOUCHEVENT_COMPAT_WAIT = 500 // Time for mouse compat events to fire after touch\nconst SWIPE_THRESHOLD = 40\n\nconst Default = {\n interval: 5000,\n keyboard: true,\n slide: false,\n pause: 'hover',\n wrap: true,\n touch: true\n}\n\nconst DefaultType = {\n interval: '(number|boolean)',\n keyboard: 'boolean',\n slide: '(boolean|string)',\n pause: '(string|boolean)',\n wrap: 'boolean',\n touch: 'boolean'\n}\n\nconst ORDER_NEXT = 'next'\nconst ORDER_PREV = 'prev'\nconst DIRECTION_LEFT = 'left'\nconst DIRECTION_RIGHT = 'right'\n\nconst KEY_TO_DIRECTION = {\n [ARROW_LEFT_KEY]: DIRECTION_RIGHT,\n [ARROW_RIGHT_KEY]: DIRECTION_LEFT\n}\n\nconst EVENT_SLIDE = `slide${EVENT_KEY}`\nconst EVENT_SLID = `slid${EVENT_KEY}`\nconst EVENT_KEYDOWN = `keydown${EVENT_KEY}`\nconst EVENT_MOUSEENTER = `mouseenter${EVENT_KEY}`\nconst EVENT_MOUSELEAVE = `mouseleave${EVENT_KEY}`\nconst EVENT_TOUCHSTART = `touchstart${EVENT_KEY}`\nconst EVENT_TOUCHMOVE = `touchmove${EVENT_KEY}`\nconst EVENT_TOUCHEND = `touchend${EVENT_KEY}`\nconst EVENT_POINTERDOWN = `pointerdown${EVENT_KEY}`\nconst EVENT_POINTERUP = `pointerup${EVENT_KEY}`\nconst EVENT_DRAG_START = `dragstart${EVENT_KEY}`\nconst EVENT_LOAD_DATA_API = `load${EVENT_KEY}${DATA_API_KEY}`\nconst EVENT_CLICK_DATA_API = `click${EVENT_KEY}${DATA_API_KEY}`\n\nconst CLASS_NAME_CAROUSEL = 'carousel'\nconst CLASS_NAME_ACTIVE = 'active'\nconst CLASS_NAME_SLIDE = 'slide'\nconst CLASS_NAME_END = 'carousel-item-end'\nconst CLASS_NAME_START = 'carousel-item-start'\nconst CLASS_NAME_NEXT = 'carousel-item-next'\nconst CLASS_NAME_PREV = 'carousel-item-prev'\nconst CLASS_NAME_POINTER_EVENT = 'pointer-event'\n\nconst SELECTOR_ACTIVE = '.active'\nconst SELECTOR_ACTIVE_ITEM = '.active.carousel-item'\nconst SELECTOR_ITEM = '.carousel-item'\nconst SELECTOR_ITEM_IMG = '.carousel-item img'\nconst SELECTOR_NEXT_PREV = '.carousel-item-next, .carousel-item-prev'\nconst SELECTOR_INDICATORS = '.carousel-indicators'\nconst SELECTOR_INDICATOR = '[data-bs-target]'\nconst SELECTOR_DATA_SLIDE = '[data-bs-slide], [data-bs-slide-to]'\nconst SELECTOR_DATA_RIDE = '[data-bs-ride=\"carousel\"]'\n\nconst POINTER_TYPE_TOUCH = 'touch'\nconst POINTER_TYPE_PEN = 'pen'\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\nclass Carousel extends BaseComponent {\n constructor(element, config) {\n super(element)\n\n this._items = null\n this._interval = null\n this._activeElement = null\n this._isPaused = false\n this._isSliding = false\n this.touchTimeout = null\n this.touchStartX = 0\n this.touchDeltaX = 0\n\n this._config = this._getConfig(config)\n this._indicatorsElement = SelectorEngine.findOne(SELECTOR_INDICATORS, this._element)\n this._touchSupported = 'ontouchstart' in document.documentElement || navigator.maxTouchPoints > 0\n this._pointerEvent = Boolean(window.PointerEvent)\n\n this._addEventListeners()\n }\n\n // Getters\n\n static get Default() {\n return 
Default\n }\n\n static get NAME() {\n return NAME\n }\n\n // Public\n\n next() {\n this._slide(ORDER_NEXT)\n }\n\n nextWhenVisible() {\n // Don't call next when the page isn't visible\n // or the carousel or its parent isn't visible\n if (!document.hidden && isVisible(this._element)) {\n this.next()\n }\n }\n\n prev() {\n this._slide(ORDER_PREV)\n }\n\n pause(event) {\n if (!event) {\n this._isPaused = true\n }\n\n if (SelectorEngine.findOne(SELECTOR_NEXT_PREV, this._element)) {\n triggerTransitionEnd(this._element)\n this.cycle(true)\n }\n\n clearInterval(this._interval)\n this._interval = null\n }\n\n cycle(event) {\n if (!event) {\n this._isPaused = false\n }\n\n if (this._interval) {\n clearInterval(this._interval)\n this._interval = null\n }\n\n if (this._config && this._config.interval && !this._isPaused) {\n this._updateInterval()\n\n this._interval = setInterval(\n (document.visibilityState ? this.nextWhenVisible : this.next).bind(this),\n this._config.interval\n )\n }\n }\n\n to(index) {\n this._activeElement = SelectorEngine.findOne(SELECTOR_ACTIVE_ITEM, this._element)\n const activeIndex = this._getItemIndex(this._activeElement)\n\n if (index > this._items.length - 1 || index < 0) {\n return\n }\n\n if (this._isSliding) {\n EventHandler.one(this._element, EVENT_SLID, () => this.to(index))\n return\n }\n\n if (activeIndex === index) {\n this.pause()\n this.cycle()\n return\n }\n\n const order = index > activeIndex ?\n ORDER_NEXT :\n ORDER_PREV\n\n this._slide(order, this._items[index])\n }\n\n // Private\n\n _getConfig(config) {\n config = {\n ...Default,\n ...Manipulator.getDataAttributes(this._element),\n ...(typeof config === 'object' ? config : {})\n }\n typeCheckConfig(NAME, config, DefaultType)\n return config\n }\n\n _handleSwipe() {\n const absDeltax = Math.abs(this.touchDeltaX)\n\n if (absDeltax <= SWIPE_THRESHOLD) {\n return\n }\n\n const direction = absDeltax / this.touchDeltaX\n\n this.touchDeltaX = 0\n\n if (!direction) {\n return\n }\n\n this._slide(direction > 0 ? 
DIRECTION_RIGHT : DIRECTION_LEFT)\n }\n\n _addEventListeners() {\n if (this._config.keyboard) {\n EventHandler.on(this._element, EVENT_KEYDOWN, event => this._keydown(event))\n }\n\n if (this._config.pause === 'hover') {\n EventHandler.on(this._element, EVENT_MOUSEENTER, event => this.pause(event))\n EventHandler.on(this._element, EVENT_MOUSELEAVE, event => this.cycle(event))\n }\n\n if (this._config.touch && this._touchSupported) {\n this._addTouchEventListeners()\n }\n }\n\n _addTouchEventListeners() {\n const hasPointerPenTouch = event => {\n return this._pointerEvent &&\n (event.pointerType === POINTER_TYPE_PEN || event.pointerType === POINTER_TYPE_TOUCH)\n }\n\n const start = event => {\n if (hasPointerPenTouch(event)) {\n this.touchStartX = event.clientX\n } else if (!this._pointerEvent) {\n this.touchStartX = event.touches[0].clientX\n }\n }\n\n const move = event => {\n // ensure swiping with one touch and not pinching\n this.touchDeltaX = event.touches && event.touches.length > 1 ?\n 0 :\n event.touches[0].clientX - this.touchStartX\n }\n\n const end = event => {\n if (hasPointerPenTouch(event)) {\n this.touchDeltaX = event.clientX - this.touchStartX\n }\n\n this._handleSwipe()\n if (this._config.pause === 'hover') {\n // If it's a touch-enabled device, mouseenter/leave are fired as\n // part of the mouse compatibility events on first tap - the carousel\n // would stop cycling until user tapped out of it;\n // here, we listen for touchend, explicitly pause the carousel\n // (as if it's the second time we tap on it, mouseenter compat event\n // is NOT fired) and after a timeout (to allow for mouse compatibility\n // events to fire) we explicitly restart cycling\n\n this.pause()\n if (this.touchTimeout) {\n clearTimeout(this.touchTimeout)\n }\n\n this.touchTimeout = setTimeout(event => this.cycle(event), TOUCHEVENT_COMPAT_WAIT + this._config.interval)\n }\n }\n\n SelectorEngine.find(SELECTOR_ITEM_IMG, this._element).forEach(itemImg => {\n EventHandler.on(itemImg, EVENT_DRAG_START, event => event.preventDefault())\n })\n\n if (this._pointerEvent) {\n EventHandler.on(this._element, EVENT_POINTERDOWN, event => start(event))\n EventHandler.on(this._element, EVENT_POINTERUP, event => end(event))\n\n this._element.classList.add(CLASS_NAME_POINTER_EVENT)\n } else {\n EventHandler.on(this._element, EVENT_TOUCHSTART, event => start(event))\n EventHandler.on(this._element, EVENT_TOUCHMOVE, event => move(event))\n EventHandler.on(this._element, EVENT_TOUCHEND, event => end(event))\n }\n }\n\n _keydown(event) {\n if (/input|textarea/i.test(event.target.tagName)) {\n return\n }\n\n const direction = KEY_TO_DIRECTION[event.key]\n if (direction) {\n event.preventDefault()\n this._slide(direction)\n }\n }\n\n _getItemIndex(element) {\n this._items = element && element.parentNode ?\n SelectorEngine.find(SELECTOR_ITEM, element.parentNode) :\n []\n\n return this._items.indexOf(element)\n }\n\n _getItemByOrder(order, activeElement) {\n const isNext = order === ORDER_NEXT\n return getNextActiveElement(this._items, activeElement, isNext, this._config.wrap)\n }\n\n _triggerSlideEvent(relatedTarget, eventDirectionName) {\n const targetIndex = this._getItemIndex(relatedTarget)\n const fromIndex = this._getItemIndex(SelectorEngine.findOne(SELECTOR_ACTIVE_ITEM, this._element))\n\n return EventHandler.trigger(this._element, EVENT_SLIDE, {\n relatedTarget,\n direction: eventDirectionName,\n from: fromIndex,\n to: targetIndex\n })\n }\n\n _setActiveIndicatorElement(element) {\n if (this._indicatorsElement) {\n 
from \"../enums.js\"; // source: https://stackoverflow.com/questions/49875255\n\nfunction order(modifiers) {\n var map = new Map();\n var visited = new Set();\n var result = [];\n modifiers.forEach(function (modifier) {\n map.set(modifier.name, modifier);\n }); // On visiting object, check for its dependencies and visit them recursively\n\n function sort(modifier) {\n visited.add(modifier.name);\n var requires = [].concat(modifier.requires || [], modifier.requiresIfExists || []);\n requires.forEach(function (dep) {\n if (!visited.has(dep)) {\n var depModifier = map.get(dep);\n\n if (depModifier) {\n sort(depModifier);\n }\n }\n });\n result.push(modifier);\n }\n\n modifiers.forEach(function (modifier) {\n if (!visited.has(modifier.name)) {\n // check for visited object\n sort(modifier);\n }\n });\n return result;\n}\n\nexport default function orderModifiers(modifiers) {\n // order based on dependencies\n var orderedModifiers = order(modifiers); // order based on phase\n\n return modifierPhases.reduce(function (acc, phase) {\n return acc.concat(orderedModifiers.filter(function (modifier) {\n return modifier.phase === phase;\n }));\n }, []);\n}","import getCompositeRect from \"./dom-utils/getCompositeRect.js\";\nimport getLayoutRect from \"./dom-utils/getLayoutRect.js\";\nimport listScrollParents from \"./dom-utils/listScrollParents.js\";\nimport getOffsetParent from \"./dom-utils/getOffsetParent.js\";\nimport getComputedStyle from \"./dom-utils/getComputedStyle.js\";\nimport orderModifiers from \"./utils/orderModifiers.js\";\nimport debounce from \"./utils/debounce.js\";\nimport validateModifiers from \"./utils/validateModifiers.js\";\nimport uniqueBy from \"./utils/uniqueBy.js\";\nimport getBasePlacement from \"./utils/getBasePlacement.js\";\nimport mergeByName from \"./utils/mergeByName.js\";\nimport detectOverflow from \"./utils/detectOverflow.js\";\nimport { isElement } from \"./dom-utils/instanceOf.js\";\nimport { auto } from \"./enums.js\";\nvar INVALID_ELEMENT_ERROR = 'Popper: Invalid reference or popper argument provided. They must be either a DOM element or virtual element.';\nvar INFINITE_LOOP_ERROR = 'Popper: An infinite loop in the modifiers cycle has been detected! The cycle has been interrupted to prevent a browser crash.';\nvar DEFAULT_OPTIONS = {\n placement: 'bottom',\n modifiers: [],\n strategy: 'absolute'\n};\n\nfunction areValidElements() {\n for (var _len = arguments.length, args = new Array(_len), _key = 0; _key < _len; _key++) {\n args[_key] = arguments[_key];\n }\n\n return !args.some(function (element) {\n return !(element && typeof element.getBoundingClientRect === 'function');\n });\n}\n\nexport function popperGenerator(generatorOptions) {\n if (generatorOptions === void 0) {\n generatorOptions = {};\n }\n\n var _generatorOptions = generatorOptions,\n _generatorOptions$def = _generatorOptions.defaultModifiers,\n defaultModifiers = _generatorOptions$def === void 0 ? [] : _generatorOptions$def,\n _generatorOptions$def2 = _generatorOptions.defaultOptions,\n defaultOptions = _generatorOptions$def2 === void 0 ? 
DEFAULT_OPTIONS : _generatorOptions$def2;\n return function createPopper(reference, popper, options) {\n if (options === void 0) {\n options = defaultOptions;\n }\n\n var state = {\n placement: 'bottom',\n orderedModifiers: [],\n options: Object.assign({}, DEFAULT_OPTIONS, defaultOptions),\n modifiersData: {},\n elements: {\n reference: reference,\n popper: popper\n },\n attributes: {},\n styles: {}\n };\n var effectCleanupFns = [];\n var isDestroyed = false;\n var instance = {\n state: state,\n setOptions: function setOptions(setOptionsAction) {\n var options = typeof setOptionsAction === 'function' ? setOptionsAction(state.options) : setOptionsAction;\n cleanupModifierEffects();\n state.options = Object.assign({}, defaultOptions, state.options, options);\n state.scrollParents = {\n reference: isElement(reference) ? listScrollParents(reference) : reference.contextElement ? listScrollParents(reference.contextElement) : [],\n popper: listScrollParents(popper)\n }; // Orders the modifiers based on their dependencies and `phase`\n // properties\n\n var orderedModifiers = orderModifiers(mergeByName([].concat(defaultModifiers, state.options.modifiers))); // Strip out disabled modifiers\n\n state.orderedModifiers = orderedModifiers.filter(function (m) {\n return m.enabled;\n }); // Validate the provided modifiers so that the consumer will get warned\n // if one of the modifiers is invalid for any reason\n\n if (process.env.NODE_ENV !== \"production\") {\n var modifiers = uniqueBy([].concat(orderedModifiers, state.options.modifiers), function (_ref) {\n var name = _ref.name;\n return name;\n });\n validateModifiers(modifiers);\n\n if (getBasePlacement(state.options.placement) === auto) {\n var flipModifier = state.orderedModifiers.find(function (_ref2) {\n var name = _ref2.name;\n return name === 'flip';\n });\n\n if (!flipModifier) {\n console.error(['Popper: \"auto\" placements require the \"flip\" modifier be', 'present and enabled to work.'].join(' '));\n }\n }\n\n var _getComputedStyle = getComputedStyle(popper),\n marginTop = _getComputedStyle.marginTop,\n marginRight = _getComputedStyle.marginRight,\n marginBottom = _getComputedStyle.marginBottom,\n marginLeft = _getComputedStyle.marginLeft; // We no longer take into account `margins` on the popper, and it can\n // cause bugs with positioning, so we'll warn the consumer\n\n\n if ([marginTop, marginRight, marginBottom, marginLeft].some(function (margin) {\n return parseFloat(margin);\n })) {\n console.warn(['Popper: CSS \"margin\" styles cannot be used to apply padding', 'between the popper and its reference element or boundary.', 'To replicate margin, use the `offset` modifier, as well as', 'the `padding` option in the `preventOverflow` and `flip`', 'modifiers.'].join(' '));\n }\n }\n\n runModifierEffects();\n return instance.update();\n },\n // Sync update – it will always be executed, even if not necessary. This\n // is useful for low frequency updates where sync behavior simplifies the\n // logic.\n // For high frequency updates (e.g. 
`resize` and `scroll` events), always\n // prefer the async Popper#update method\n forceUpdate: function forceUpdate() {\n if (isDestroyed) {\n return;\n }\n\n var _state$elements = state.elements,\n reference = _state$elements.reference,\n popper = _state$elements.popper; // Don't proceed if `reference` or `popper` are not valid elements\n // anymore\n\n if (!areValidElements(reference, popper)) {\n if (process.env.NODE_ENV !== \"production\") {\n console.error(INVALID_ELEMENT_ERROR);\n }\n\n return;\n } // Store the reference and popper rects to be read by modifiers\n\n\n state.rects = {\n reference: getCompositeRect(reference, getOffsetParent(popper), state.options.strategy === 'fixed'),\n popper: getLayoutRect(popper)\n }; // Modifiers have the ability to reset the current update cycle. The\n // most common use case for this is the `flip` modifier changing the\n // placement, which then needs to re-run all the modifiers, because the\n // logic was previously ran for the previous placement and is therefore\n // stale/incorrect\n\n state.reset = false;\n state.placement = state.options.placement; // On each update cycle, the `modifiersData` property for each modifier\n // is filled with the initial data specified by the modifier. This means\n // it doesn't persist and is fresh on each update.\n // To ensure persistent data, use `${name}#persistent`\n\n state.orderedModifiers.forEach(function (modifier) {\n return state.modifiersData[modifier.name] = Object.assign({}, modifier.data);\n });\n var __debug_loops__ = 0;\n\n for (var index = 0; index < state.orderedModifiers.length; index++) {\n if (process.env.NODE_ENV !== \"production\") {\n __debug_loops__ += 1;\n\n if (__debug_loops__ > 100) {\n console.error(INFINITE_LOOP_ERROR);\n break;\n }\n }\n\n if (state.reset === true) {\n state.reset = false;\n index = -1;\n continue;\n }\n\n var _state$orderedModifie = state.orderedModifiers[index],\n fn = _state$orderedModifie.fn,\n _state$orderedModifie2 = _state$orderedModifie.options,\n _options = _state$orderedModifie2 === void 0 ? {} : _state$orderedModifie2,\n name = _state$orderedModifie.name;\n\n if (typeof fn === 'function') {\n state = fn({\n state: state,\n options: _options,\n name: name,\n instance: instance\n }) || state;\n }\n }\n },\n // Async and optimistically optimized update – it will not be executed if\n // not necessary (debounced to run at most once-per-tick)\n update: debounce(function () {\n return new Promise(function (resolve) {\n instance.forceUpdate();\n resolve(state);\n });\n }),\n destroy: function destroy() {\n cleanupModifierEffects();\n isDestroyed = true;\n }\n };\n\n if (!areValidElements(reference, popper)) {\n if (process.env.NODE_ENV !== \"production\") {\n console.error(INVALID_ELEMENT_ERROR);\n }\n\n return instance;\n }\n\n instance.setOptions(options).then(function (state) {\n if (!isDestroyed && options.onFirstUpdate) {\n options.onFirstUpdate(state);\n }\n }); // Modifiers have the ability to execute arbitrary code before the first\n // update cycle runs. They will be executed in the same order as the update\n // cycle. This is useful when a modifier adds some persistent data that\n // other modifiers need to use, but the modifier is run after the dependent\n // one.\n\n function runModifierEffects() {\n state.orderedModifiers.forEach(function (_ref3) {\n var name = _ref3.name,\n _ref3$options = _ref3.options,\n options = _ref3$options === void 0 ? 
{} : _ref3$options,\n effect = _ref3.effect;\n\n if (typeof effect === 'function') {\n var cleanupFn = effect({\n state: state,\n name: name,\n instance: instance,\n options: options\n });\n\n var noopFn = function noopFn() {};\n\n effectCleanupFns.push(cleanupFn || noopFn);\n }\n });\n }\n\n function cleanupModifierEffects() {\n effectCleanupFns.forEach(function (fn) {\n return fn();\n });\n effectCleanupFns = [];\n }\n\n return instance;\n };\n}\nexport var createPopper = /*#__PURE__*/popperGenerator(); // eslint-disable-next-line import/no-unused-modules\n\nexport { detectOverflow };","export default function debounce(fn) {\n var pending;\n return function () {\n if (!pending) {\n pending = new Promise(function (resolve) {\n Promise.resolve().then(function () {\n pending = undefined;\n resolve(fn());\n });\n });\n }\n\n return pending;\n };\n}","export default function mergeByName(modifiers) {\n var merged = modifiers.reduce(function (merged, current) {\n var existing = merged[current.name];\n merged[current.name] = existing ? Object.assign({}, existing, current, {\n options: Object.assign({}, existing.options, current.options),\n data: Object.assign({}, existing.data, current.data)\n }) : current;\n return merged;\n }, {}); // IE11 does not support Object.values\n\n return Object.keys(merged).map(function (key) {\n return merged[key];\n });\n}","import { popperGenerator, detectOverflow } from \"./createPopper.js\";\nimport eventListeners from \"./modifiers/eventListeners.js\";\nimport popperOffsets from \"./modifiers/popperOffsets.js\";\nimport computeStyles from \"./modifiers/computeStyles.js\";\nimport applyStyles from \"./modifiers/applyStyles.js\";\nvar defaultModifiers = [eventListeners, popperOffsets, computeStyles, applyStyles];\nvar createPopper = /*#__PURE__*/popperGenerator({\n defaultModifiers: defaultModifiers\n}); // eslint-disable-next-line import/no-unused-modules\n\nexport { createPopper, popperGenerator, defaultModifiers, detectOverflow };","import { popperGenerator, detectOverflow } from \"./createPopper.js\";\nimport eventListeners from \"./modifiers/eventListeners.js\";\nimport popperOffsets from \"./modifiers/popperOffsets.js\";\nimport computeStyles from \"./modifiers/computeStyles.js\";\nimport applyStyles from \"./modifiers/applyStyles.js\";\nimport offset from \"./modifiers/offset.js\";\nimport flip from \"./modifiers/flip.js\";\nimport preventOverflow from \"./modifiers/preventOverflow.js\";\nimport arrow from \"./modifiers/arrow.js\";\nimport hide from \"./modifiers/hide.js\";\nvar defaultModifiers = [eventListeners, popperOffsets, computeStyles, applyStyles, offset, flip, preventOverflow, arrow, hide];\nvar createPopper = /*#__PURE__*/popperGenerator({\n defaultModifiers: defaultModifiers\n}); // eslint-disable-next-line import/no-unused-modules\n\nexport { createPopper, popperGenerator, defaultModifiers, detectOverflow }; // eslint-disable-next-line import/no-unused-modules\n\nexport { createPopper as createPopperLite } from \"./popper-lite.js\"; // eslint-disable-next-line import/no-unused-modules\n\nexport * from \"./modifiers/index.js\";","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): dropdown.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport * as Popper from '@popperjs/core'\n\nimport {\n defineJQueryPlugin,\n getElement,\n getElementFromSelector,\n 
getNextActiveElement,\n isDisabled,\n isElement,\n isRTL,\n isVisible,\n noop,\n typeCheckConfig\n} from './util/index'\nimport EventHandler from './dom/event-handler'\nimport Manipulator from './dom/manipulator'\nimport SelectorEngine from './dom/selector-engine'\nimport BaseComponent from './base-component'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'dropdown'\nconst DATA_KEY = 'bs.dropdown'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\n\nconst ESCAPE_KEY = 'Escape'\nconst SPACE_KEY = 'Space'\nconst TAB_KEY = 'Tab'\nconst ARROW_UP_KEY = 'ArrowUp'\nconst ARROW_DOWN_KEY = 'ArrowDown'\nconst RIGHT_MOUSE_BUTTON = 2 // MouseEvent.button value for the secondary button, usually the right button\n\nconst REGEXP_KEYDOWN = new RegExp(`${ARROW_UP_KEY}|${ARROW_DOWN_KEY}|${ESCAPE_KEY}`)\n\nconst EVENT_HIDE = `hide${EVENT_KEY}`\nconst EVENT_HIDDEN = `hidden${EVENT_KEY}`\nconst EVENT_SHOW = `show${EVENT_KEY}`\nconst EVENT_SHOWN = `shown${EVENT_KEY}`\nconst EVENT_CLICK_DATA_API = `click${EVENT_KEY}${DATA_API_KEY}`\nconst EVENT_KEYDOWN_DATA_API = `keydown${EVENT_KEY}${DATA_API_KEY}`\nconst EVENT_KEYUP_DATA_API = `keyup${EVENT_KEY}${DATA_API_KEY}`\n\nconst CLASS_NAME_SHOW = 'show'\nconst CLASS_NAME_DROPUP = 'dropup'\nconst CLASS_NAME_DROPEND = 'dropend'\nconst CLASS_NAME_DROPSTART = 'dropstart'\nconst CLASS_NAME_NAVBAR = 'navbar'\n\nconst SELECTOR_DATA_TOGGLE = '[data-bs-toggle=\"dropdown\"]'\nconst SELECTOR_MENU = '.dropdown-menu'\nconst SELECTOR_NAVBAR_NAV = '.navbar-nav'\nconst SELECTOR_VISIBLE_ITEMS = '.dropdown-menu .dropdown-item:not(.disabled):not(:disabled)'\n\nconst PLACEMENT_TOP = isRTL() ? 'top-end' : 'top-start'\nconst PLACEMENT_TOPEND = isRTL() ? 'top-start' : 'top-end'\nconst PLACEMENT_BOTTOM = isRTL() ? 'bottom-end' : 'bottom-start'\nconst PLACEMENT_BOTTOMEND = isRTL() ? 'bottom-start' : 'bottom-end'\nconst PLACEMENT_RIGHT = isRTL() ? 'left-start' : 'right-start'\nconst PLACEMENT_LEFT = isRTL() ? 'right-start' : 'left-start'\n\nconst Default = {\n offset: [0, 2],\n boundary: 'clippingParents',\n reference: 'toggle',\n display: 'dynamic',\n popperConfig: null,\n autoClose: true\n}\n\nconst DefaultType = {\n offset: '(array|string|function)',\n boundary: '(string|element)',\n reference: '(string|element|object)',\n display: 'string',\n popperConfig: '(null|object|function)',\n autoClose: '(boolean|string)'\n}\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Dropdown extends BaseComponent {\n constructor(element, config) {\n super(element)\n\n this._popper = null\n this._config = this._getConfig(config)\n this._menu = this._getMenuElement()\n this._inNavbar = this._detectNavbar()\n }\n\n // Getters\n\n static get Default() {\n return Default\n }\n\n static get DefaultType() {\n return DefaultType\n }\n\n static get NAME() {\n return NAME\n }\n\n // Public\n\n toggle() {\n return this._isShown() ? 
this.hide() : this.show()\n }\n\n show() {\n if (isDisabled(this._element) || this._isShown(this._menu)) {\n return\n }\n\n const relatedTarget = {\n relatedTarget: this._element\n }\n\n const showEvent = EventHandler.trigger(this._element, EVENT_SHOW, relatedTarget)\n\n if (showEvent.defaultPrevented) {\n return\n }\n\n const parent = Dropdown.getParentFromElement(this._element)\n // Totally disable Popper for Dropdowns in Navbar\n if (this._inNavbar) {\n Manipulator.setDataAttribute(this._menu, 'popper', 'none')\n } else {\n this._createPopper(parent)\n }\n\n // If this is a touch-enabled device we add extra\n // empty mouseover listeners to the body's immediate children;\n // only needed because of broken event delegation on iOS\n // https://www.quirksmode.org/blog/archives/2014/02/mouse_event_bub.html\n if ('ontouchstart' in document.documentElement &&\n !parent.closest(SELECTOR_NAVBAR_NAV)) {\n [].concat(...document.body.children)\n .forEach(elem => EventHandler.on(elem, 'mouseover', noop))\n }\n\n this._element.focus()\n this._element.setAttribute('aria-expanded', true)\n\n this._menu.classList.add(CLASS_NAME_SHOW)\n this._element.classList.add(CLASS_NAME_SHOW)\n EventHandler.trigger(this._element, EVENT_SHOWN, relatedTarget)\n }\n\n hide() {\n if (isDisabled(this._element) || !this._isShown(this._menu)) {\n return\n }\n\n const relatedTarget = {\n relatedTarget: this._element\n }\n\n this._completeHide(relatedTarget)\n }\n\n dispose() {\n if (this._popper) {\n this._popper.destroy()\n }\n\n super.dispose()\n }\n\n update() {\n this._inNavbar = this._detectNavbar()\n if (this._popper) {\n this._popper.update()\n }\n }\n\n // Private\n\n _completeHide(relatedTarget) {\n const hideEvent = EventHandler.trigger(this._element, EVENT_HIDE, relatedTarget)\n if (hideEvent.defaultPrevented) {\n return\n }\n\n // If this is a touch-enabled device we remove the extra\n // empty mouseover listeners we added for iOS support\n if ('ontouchstart' in document.documentElement) {\n [].concat(...document.body.children)\n .forEach(elem => EventHandler.off(elem, 'mouseover', noop))\n }\n\n if (this._popper) {\n this._popper.destroy()\n }\n\n this._menu.classList.remove(CLASS_NAME_SHOW)\n this._element.classList.remove(CLASS_NAME_SHOW)\n this._element.setAttribute('aria-expanded', 'false')\n Manipulator.removeDataAttribute(this._menu, 'popper')\n EventHandler.trigger(this._element, EVENT_HIDDEN, relatedTarget)\n }\n\n _getConfig(config) {\n config = {\n ...this.constructor.Default,\n ...Manipulator.getDataAttributes(this._element),\n ...config\n }\n\n typeCheckConfig(NAME, config, this.constructor.DefaultType)\n\n if (typeof config.reference === 'object' && !isElement(config.reference) &&\n typeof config.reference.getBoundingClientRect !== 'function'\n ) {\n // Popper virtual elements require a getBoundingClientRect method\n throw new TypeError(`${NAME.toUpperCase()}: Option \"reference\" provided type \"object\" without a required \"getBoundingClientRect\" method.`)\n }\n\n return config\n }\n\n _createPopper(parent) {\n if (typeof Popper === 'undefined') {\n throw new TypeError('Bootstrap\\'s dropdowns require Popper (https://popper.js.org)')\n }\n\n let referenceElement = this._element\n\n if (this._config.reference === 'parent') {\n referenceElement = parent\n } else if (isElement(this._config.reference)) {\n referenceElement = getElement(this._config.reference)\n } else if (typeof this._config.reference === 'object') {\n referenceElement = this._config.reference\n }\n\n const popperConfig = 
this._getPopperConfig()\n const isDisplayStatic = popperConfig.modifiers.find(modifier => modifier.name === 'applyStyles' && modifier.enabled === false)\n\n this._popper = Popper.createPopper(referenceElement, this._menu, popperConfig)\n\n if (isDisplayStatic) {\n Manipulator.setDataAttribute(this._menu, 'popper', 'static')\n }\n }\n\n _isShown(element = this._element) {\n return element.classList.contains(CLASS_NAME_SHOW)\n }\n\n _getMenuElement() {\n return SelectorEngine.next(this._element, SELECTOR_MENU)[0]\n }\n\n _getPlacement() {\n const parentDropdown = this._element.parentNode\n\n if (parentDropdown.classList.contains(CLASS_NAME_DROPEND)) {\n return PLACEMENT_RIGHT\n }\n\n if (parentDropdown.classList.contains(CLASS_NAME_DROPSTART)) {\n return PLACEMENT_LEFT\n }\n\n // We need to trim the value because custom properties can also include spaces\n const isEnd = getComputedStyle(this._menu).getPropertyValue('--bs-position').trim() === 'end'\n\n if (parentDropdown.classList.contains(CLASS_NAME_DROPUP)) {\n return isEnd ? PLACEMENT_TOPEND : PLACEMENT_TOP\n }\n\n return isEnd ? PLACEMENT_BOTTOMEND : PLACEMENT_BOTTOM\n }\n\n _detectNavbar() {\n return this._element.closest(`.${CLASS_NAME_NAVBAR}`) !== null\n }\n\n _getOffset() {\n const { offset } = this._config\n\n if (typeof offset === 'string') {\n return offset.split(',').map(val => Number.parseInt(val, 10))\n }\n\n if (typeof offset === 'function') {\n return popperData => offset(popperData, this._element)\n }\n\n return offset\n }\n\n _getPopperConfig() {\n const defaultBsPopperConfig = {\n placement: this._getPlacement(),\n modifiers: [{\n name: 'preventOverflow',\n options: {\n boundary: this._config.boundary\n }\n },\n {\n name: 'offset',\n options: {\n offset: this._getOffset()\n }\n }]\n }\n\n // Disable Popper if we have a static display\n if (this._config.display === 'static') {\n defaultBsPopperConfig.modifiers = [{\n name: 'applyStyles',\n enabled: false\n }]\n }\n\n return {\n ...defaultBsPopperConfig,\n ...(typeof this._config.popperConfig === 'function' ? this._config.popperConfig(defaultBsPopperConfig) : this._config.popperConfig)\n }\n }\n\n _selectMenuItem({ key, target }) {\n const items = SelectorEngine.find(SELECTOR_VISIBLE_ITEMS, this._menu).filter(isVisible)\n\n if (!items.length) {\n return\n }\n\n // if target isn't included in items (e.g. 
when expanding the dropdown)\n // allow cycling to get the last item in case key equals ARROW_UP_KEY\n getNextActiveElement(items, target, key === ARROW_DOWN_KEY, !items.includes(target)).focus()\n }\n\n // Static\n\n static jQueryInterface(config) {\n return this.each(function () {\n const data = Dropdown.getOrCreateInstance(this, config)\n\n if (typeof config !== 'string') {\n return\n }\n\n if (typeof data[config] === 'undefined') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n\n data[config]()\n })\n }\n\n static clearMenus(event) {\n if (event && (event.button === RIGHT_MOUSE_BUTTON || (event.type === 'keyup' && event.key !== TAB_KEY))) {\n return\n }\n\n const toggles = SelectorEngine.find(SELECTOR_DATA_TOGGLE)\n\n for (let i = 0, len = toggles.length; i < len; i++) {\n const context = Dropdown.getInstance(toggles[i])\n if (!context || context._config.autoClose === false) {\n continue\n }\n\n if (!context._isShown()) {\n continue\n }\n\n const relatedTarget = {\n relatedTarget: context._element\n }\n\n if (event) {\n const composedPath = event.composedPath()\n const isMenuTarget = composedPath.includes(context._menu)\n if (\n composedPath.includes(context._element) ||\n (context._config.autoClose === 'inside' && !isMenuTarget) ||\n (context._config.autoClose === 'outside' && isMenuTarget)\n ) {\n continue\n }\n\n // Tab navigation through the dropdown menu or events from contained inputs shouldn't close the menu\n if (context._menu.contains(event.target) && ((event.type === 'keyup' && event.key === TAB_KEY) || /input|select|option|textarea|form/i.test(event.target.tagName))) {\n continue\n }\n\n if (event.type === 'click') {\n relatedTarget.clickEvent = event\n }\n }\n\n context._completeHide(relatedTarget)\n }\n }\n\n static getParentFromElement(element) {\n return getElementFromSelector(element) || element.parentNode\n }\n\n static dataApiKeydownHandler(event) {\n // If not input/textarea:\n // - And not a key in REGEXP_KEYDOWN => not a dropdown command\n // If input/textarea:\n // - If space key => not a dropdown command\n // - If key is other than escape\n // - If key is not up or down => not a dropdown command\n // - If trigger inside the menu => not a dropdown command\n if (/input|textarea/i.test(event.target.tagName) ?\n event.key === SPACE_KEY || (event.key !== ESCAPE_KEY &&\n ((event.key !== ARROW_DOWN_KEY && event.key !== ARROW_UP_KEY) ||\n event.target.closest(SELECTOR_MENU))) :\n !REGEXP_KEYDOWN.test(event.key)) {\n return\n }\n\n const isActive = this.classList.contains(CLASS_NAME_SHOW)\n\n if (!isActive && event.key === ESCAPE_KEY) {\n return\n }\n\n event.preventDefault()\n event.stopPropagation()\n\n if (isDisabled(this)) {\n return\n }\n\n const getToggleButton = this.matches(SELECTOR_DATA_TOGGLE) ? 
this : SelectorEngine.prev(this, SELECTOR_DATA_TOGGLE)[0]\n const instance = Dropdown.getOrCreateInstance(getToggleButton)\n\n if (event.key === ESCAPE_KEY) {\n instance.hide()\n return\n }\n\n if (event.key === ARROW_UP_KEY || event.key === ARROW_DOWN_KEY) {\n if (!isActive) {\n instance.show()\n }\n\n instance._selectMenuItem(event)\n return\n }\n\n if (!isActive || event.key === SPACE_KEY) {\n Dropdown.clearMenus()\n }\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * Data Api implementation\n * ------------------------------------------------------------------------\n */\n\nEventHandler.on(document, EVENT_KEYDOWN_DATA_API, SELECTOR_DATA_TOGGLE, Dropdown.dataApiKeydownHandler)\nEventHandler.on(document, EVENT_KEYDOWN_DATA_API, SELECTOR_MENU, Dropdown.dataApiKeydownHandler)\nEventHandler.on(document, EVENT_CLICK_DATA_API, Dropdown.clearMenus)\nEventHandler.on(document, EVENT_KEYUP_DATA_API, Dropdown.clearMenus)\nEventHandler.on(document, EVENT_CLICK_DATA_API, SELECTOR_DATA_TOGGLE, function (event) {\n event.preventDefault()\n Dropdown.getOrCreateInstance(this).toggle()\n})\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n * add .Dropdown to jQuery only if jQuery is present\n */\n\ndefineJQueryPlugin(Dropdown)\n\nexport default Dropdown\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): util/scrollBar.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport SelectorEngine from '../dom/selector-engine'\nimport Manipulator from '../dom/manipulator'\nimport { isElement } from './index'\n\nconst SELECTOR_FIXED_CONTENT = '.fixed-top, .fixed-bottom, .is-fixed, .sticky-top'\nconst SELECTOR_STICKY_CONTENT = '.sticky-top'\n\nclass ScrollBarHelper {\n constructor() {\n this._element = document.body\n }\n\n getWidth() {\n // https://developer.mozilla.org/en-US/docs/Web/API/Window/innerWidth#usage_notes\n const documentWidth = document.documentElement.clientWidth\n return Math.abs(window.innerWidth - documentWidth)\n }\n\n hide() {\n const width = this.getWidth()\n this._disableOverFlow()\n // give padding to element to balance the hidden scrollbar width\n this._setElementAttributes(this._element, 'paddingRight', calculatedValue => calculatedValue + width)\n // trick: We adjust positive paddingRight and negative marginRight to sticky-top elements to keep showing fullwidth\n this._setElementAttributes(SELECTOR_FIXED_CONTENT, 'paddingRight', calculatedValue => calculatedValue + width)\n this._setElementAttributes(SELECTOR_STICKY_CONTENT, 'marginRight', calculatedValue => calculatedValue - width)\n }\n\n _disableOverFlow() {\n this._saveInitialAttribute(this._element, 'overflow')\n this._element.style.overflow = 'hidden'\n }\n\n _setElementAttributes(selector, styleProp, callback) {\n const scrollbarWidth = this.getWidth()\n const manipulationCallBack = element => {\n if (element !== this._element && window.innerWidth > element.clientWidth + scrollbarWidth) {\n return\n }\n\n this._saveInitialAttribute(element, styleProp)\n const calculatedValue = window.getComputedStyle(element)[styleProp]\n element.style[styleProp] = `${callback(Number.parseFloat(calculatedValue))}px`\n }\n\n this._applyManipulationCallback(selector, manipulationCallBack)\n }\n\n reset() {\n 
this._resetElementAttributes(this._element, 'overflow')\n this._resetElementAttributes(this._element, 'paddingRight')\n this._resetElementAttributes(SELECTOR_FIXED_CONTENT, 'paddingRight')\n this._resetElementAttributes(SELECTOR_STICKY_CONTENT, 'marginRight')\n }\n\n _saveInitialAttribute(element, styleProp) {\n const actualValue = element.style[styleProp]\n if (actualValue) {\n Manipulator.setDataAttribute(element, styleProp, actualValue)\n }\n }\n\n _resetElementAttributes(selector, styleProp) {\n const manipulationCallBack = element => {\n const value = Manipulator.getDataAttribute(element, styleProp)\n if (typeof value === 'undefined') {\n element.style.removeProperty(styleProp)\n } else {\n Manipulator.removeDataAttribute(element, styleProp)\n element.style[styleProp] = value\n }\n }\n\n this._applyManipulationCallback(selector, manipulationCallBack)\n }\n\n _applyManipulationCallback(selector, callBack) {\n if (isElement(selector)) {\n callBack(selector)\n } else {\n SelectorEngine.find(selector, this._element).forEach(callBack)\n }\n }\n\n isOverflowing() {\n return this.getWidth() > 0\n }\n}\n\nexport default ScrollBarHelper\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): util/backdrop.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport EventHandler from '../dom/event-handler'\nimport { execute, executeAfterTransition, getElement, reflow, typeCheckConfig } from './index'\n\nconst Default = {\n className: 'modal-backdrop',\n isVisible: true, // if false, we use the backdrop helper without adding any element to the dom\n isAnimated: false,\n rootElement: 'body', // give the choice to place backdrop under different elements\n clickCallback: null\n}\n\nconst DefaultType = {\n className: 'string',\n isVisible: 'boolean',\n isAnimated: 'boolean',\n rootElement: '(element|string)',\n clickCallback: '(function|null)'\n}\nconst NAME = 'backdrop'\nconst CLASS_NAME_FADE = 'fade'\nconst CLASS_NAME_SHOW = 'show'\n\nconst EVENT_MOUSEDOWN = `mousedown.bs.${NAME}`\n\nclass Backdrop {\n constructor(config) {\n this._config = this._getConfig(config)\n this._isAppended = false\n this._element = null\n }\n\n show(callback) {\n if (!this._config.isVisible) {\n execute(callback)\n return\n }\n\n this._append()\n\n if (this._config.isAnimated) {\n reflow(this._getElement())\n }\n\n this._getElement().classList.add(CLASS_NAME_SHOW)\n\n this._emulateAnimation(() => {\n execute(callback)\n })\n }\n\n hide(callback) {\n if (!this._config.isVisible) {\n execute(callback)\n return\n }\n\n this._getElement().classList.remove(CLASS_NAME_SHOW)\n\n this._emulateAnimation(() => {\n this.dispose()\n execute(callback)\n })\n }\n\n // Private\n\n _getElement() {\n if (!this._element) {\n const backdrop = document.createElement('div')\n backdrop.className = this._config.className\n if (this._config.isAnimated) {\n backdrop.classList.add(CLASS_NAME_FADE)\n }\n\n this._element = backdrop\n }\n\n return this._element\n }\n\n _getConfig(config) {\n config = {\n ...Default,\n ...(typeof config === 'object' ? 
config : {})\n }\n\n // use getElement() with the default \"body\" to get a fresh Element on each instantiation\n config.rootElement = getElement(config.rootElement)\n typeCheckConfig(NAME, config, DefaultType)\n return config\n }\n\n _append() {\n if (this._isAppended) {\n return\n }\n\n this._config.rootElement.append(this._getElement())\n\n EventHandler.on(this._getElement(), EVENT_MOUSEDOWN, () => {\n execute(this._config.clickCallback)\n })\n\n this._isAppended = true\n }\n\n dispose() {\n if (!this._isAppended) {\n return\n }\n\n EventHandler.off(this._element, EVENT_MOUSEDOWN)\n\n this._element.remove()\n this._isAppended = false\n }\n\n _emulateAnimation(callback) {\n executeAfterTransition(callback, this._getElement(), this._config.isAnimated)\n }\n}\n\nexport default Backdrop\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): util/focustrap.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport EventHandler from '../dom/event-handler'\nimport SelectorEngine from '../dom/selector-engine'\nimport { typeCheckConfig } from './index'\n\nconst Default = {\n trapElement: null, // The element to trap focus inside of\n autofocus: true\n}\n\nconst DefaultType = {\n trapElement: 'element',\n autofocus: 'boolean'\n}\n\nconst NAME = 'focustrap'\nconst DATA_KEY = 'bs.focustrap'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst EVENT_FOCUSIN = `focusin${EVENT_KEY}`\nconst EVENT_KEYDOWN_TAB = `keydown.tab${EVENT_KEY}`\n\nconst TAB_KEY = 'Tab'\nconst TAB_NAV_FORWARD = 'forward'\nconst TAB_NAV_BACKWARD = 'backward'\n\nclass FocusTrap {\n constructor(config) {\n this._config = this._getConfig(config)\n this._isActive = false\n this._lastTabNavDirection = null\n }\n\n activate() {\n const { trapElement, autofocus } = this._config\n\n if (this._isActive) {\n return\n }\n\n if (autofocus) {\n trapElement.focus()\n }\n\n EventHandler.off(document, EVENT_KEY) // guard against infinite focus loop\n EventHandler.on(document, EVENT_FOCUSIN, event => this._handleFocusin(event))\n EventHandler.on(document, EVENT_KEYDOWN_TAB, event => this._handleKeydown(event))\n\n this._isActive = true\n }\n\n deactivate() {\n if (!this._isActive) {\n return\n }\n\n this._isActive = false\n EventHandler.off(document, EVENT_KEY)\n }\n\n // Private\n\n _handleFocusin(event) {\n const { target } = event\n const { trapElement } = this._config\n\n if (target === document || target === trapElement || trapElement.contains(target)) {\n return\n }\n\n const elements = SelectorEngine.focusableChildren(trapElement)\n\n if (elements.length === 0) {\n trapElement.focus()\n } else if (this._lastTabNavDirection === TAB_NAV_BACKWARD) {\n elements[elements.length - 1].focus()\n } else {\n elements[0].focus()\n }\n }\n\n _handleKeydown(event) {\n if (event.key !== TAB_KEY) {\n return\n }\n\n this._lastTabNavDirection = event.shiftKey ? TAB_NAV_BACKWARD : TAB_NAV_FORWARD\n }\n\n _getConfig(config) {\n config = {\n ...Default,\n ...(typeof config === 'object' ? 
config : {})\n }\n typeCheckConfig(NAME, config, DefaultType)\n return config\n }\n}\n\nexport default FocusTrap\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): modal.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport {\n defineJQueryPlugin,\n getElementFromSelector,\n isRTL,\n isVisible,\n reflow,\n typeCheckConfig\n} from './util/index'\nimport EventHandler from './dom/event-handler'\nimport Manipulator from './dom/manipulator'\nimport SelectorEngine from './dom/selector-engine'\nimport ScrollBarHelper from './util/scrollbar'\nimport BaseComponent from './base-component'\nimport Backdrop from './util/backdrop'\nimport FocusTrap from './util/focustrap'\nimport { enableDismissTrigger } from './util/component-functions'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'modal'\nconst DATA_KEY = 'bs.modal'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\nconst ESCAPE_KEY = 'Escape'\n\nconst Default = {\n backdrop: true,\n keyboard: true,\n focus: true\n}\n\nconst DefaultType = {\n backdrop: '(boolean|string)',\n keyboard: 'boolean',\n focus: 'boolean'\n}\n\nconst EVENT_HIDE = `hide${EVENT_KEY}`\nconst EVENT_HIDE_PREVENTED = `hidePrevented${EVENT_KEY}`\nconst EVENT_HIDDEN = `hidden${EVENT_KEY}`\nconst EVENT_SHOW = `show${EVENT_KEY}`\nconst EVENT_SHOWN = `shown${EVENT_KEY}`\nconst EVENT_RESIZE = `resize${EVENT_KEY}`\nconst EVENT_CLICK_DISMISS = `click.dismiss${EVENT_KEY}`\nconst EVENT_KEYDOWN_DISMISS = `keydown.dismiss${EVENT_KEY}`\nconst EVENT_MOUSEUP_DISMISS = `mouseup.dismiss${EVENT_KEY}`\nconst EVENT_MOUSEDOWN_DISMISS = `mousedown.dismiss${EVENT_KEY}`\nconst EVENT_CLICK_DATA_API = `click${EVENT_KEY}${DATA_API_KEY}`\n\nconst CLASS_NAME_OPEN = 'modal-open'\nconst CLASS_NAME_FADE = 'fade'\nconst CLASS_NAME_SHOW = 'show'\nconst CLASS_NAME_STATIC = 'modal-static'\n\nconst OPEN_SELECTOR = '.modal.show'\nconst SELECTOR_DIALOG = '.modal-dialog'\nconst SELECTOR_MODAL_BODY = '.modal-body'\nconst SELECTOR_DATA_TOGGLE = '[data-bs-toggle=\"modal\"]'\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Modal extends BaseComponent {\n constructor(element, config) {\n super(element)\n\n this._config = this._getConfig(config)\n this._dialog = SelectorEngine.findOne(SELECTOR_DIALOG, this._element)\n this._backdrop = this._initializeBackDrop()\n this._focustrap = this._initializeFocusTrap()\n this._isShown = false\n this._ignoreBackdropClick = false\n this._isTransitioning = false\n this._scrollBar = new ScrollBarHelper()\n }\n\n // Getters\n\n static get Default() {\n return Default\n }\n\n static get NAME() {\n return NAME\n }\n\n // Public\n\n toggle(relatedTarget) {\n return this._isShown ? 
this.hide() : this.show(relatedTarget)\n }\n\n show(relatedTarget) {\n if (this._isShown || this._isTransitioning) {\n return\n }\n\n const showEvent = EventHandler.trigger(this._element, EVENT_SHOW, {\n relatedTarget\n })\n\n if (showEvent.defaultPrevented) {\n return\n }\n\n this._isShown = true\n\n if (this._isAnimated()) {\n this._isTransitioning = true\n }\n\n this._scrollBar.hide()\n\n document.body.classList.add(CLASS_NAME_OPEN)\n\n this._adjustDialog()\n\n this._setEscapeEvent()\n this._setResizeEvent()\n\n EventHandler.on(this._dialog, EVENT_MOUSEDOWN_DISMISS, () => {\n EventHandler.one(this._element, EVENT_MOUSEUP_DISMISS, event => {\n if (event.target === this._element) {\n this._ignoreBackdropClick = true\n }\n })\n })\n\n this._showBackdrop(() => this._showElement(relatedTarget))\n }\n\n hide() {\n if (!this._isShown || this._isTransitioning) {\n return\n }\n\n const hideEvent = EventHandler.trigger(this._element, EVENT_HIDE)\n\n if (hideEvent.defaultPrevented) {\n return\n }\n\n this._isShown = false\n const isAnimated = this._isAnimated()\n\n if (isAnimated) {\n this._isTransitioning = true\n }\n\n this._setEscapeEvent()\n this._setResizeEvent()\n\n this._focustrap.deactivate()\n\n this._element.classList.remove(CLASS_NAME_SHOW)\n\n EventHandler.off(this._element, EVENT_CLICK_DISMISS)\n EventHandler.off(this._dialog, EVENT_MOUSEDOWN_DISMISS)\n\n this._queueCallback(() => this._hideModal(), this._element, isAnimated)\n }\n\n dispose() {\n [window, this._dialog]\n .forEach(htmlElement => EventHandler.off(htmlElement, EVENT_KEY))\n\n this._backdrop.dispose()\n this._focustrap.deactivate()\n super.dispose()\n }\n\n handleUpdate() {\n this._adjustDialog()\n }\n\n // Private\n\n _initializeBackDrop() {\n return new Backdrop({\n isVisible: Boolean(this._config.backdrop), // 'static' option will be translated to true, and booleans will keep their value\n isAnimated: this._isAnimated()\n })\n }\n\n _initializeFocusTrap() {\n return new FocusTrap({\n trapElement: this._element\n })\n }\n\n _getConfig(config) {\n config = {\n ...Default,\n ...Manipulator.getDataAttributes(this._element),\n ...(typeof config === 'object' ? 
config : {})\n }\n typeCheckConfig(NAME, config, DefaultType)\n return config\n }\n\n _showElement(relatedTarget) {\n const isAnimated = this._isAnimated()\n const modalBody = SelectorEngine.findOne(SELECTOR_MODAL_BODY, this._dialog)\n\n if (!this._element.parentNode || this._element.parentNode.nodeType !== Node.ELEMENT_NODE) {\n // Don't move modal's DOM position\n document.body.append(this._element)\n }\n\n this._element.style.display = 'block'\n this._element.removeAttribute('aria-hidden')\n this._element.setAttribute('aria-modal', true)\n this._element.setAttribute('role', 'dialog')\n this._element.scrollTop = 0\n\n if (modalBody) {\n modalBody.scrollTop = 0\n }\n\n if (isAnimated) {\n reflow(this._element)\n }\n\n this._element.classList.add(CLASS_NAME_SHOW)\n\n const transitionComplete = () => {\n if (this._config.focus) {\n this._focustrap.activate()\n }\n\n this._isTransitioning = false\n EventHandler.trigger(this._element, EVENT_SHOWN, {\n relatedTarget\n })\n }\n\n this._queueCallback(transitionComplete, this._dialog, isAnimated)\n }\n\n _setEscapeEvent() {\n if (this._isShown) {\n EventHandler.on(this._element, EVENT_KEYDOWN_DISMISS, event => {\n if (this._config.keyboard && event.key === ESCAPE_KEY) {\n event.preventDefault()\n this.hide()\n } else if (!this._config.keyboard && event.key === ESCAPE_KEY) {\n this._triggerBackdropTransition()\n }\n })\n } else {\n EventHandler.off(this._element, EVENT_KEYDOWN_DISMISS)\n }\n }\n\n _setResizeEvent() {\n if (this._isShown) {\n EventHandler.on(window, EVENT_RESIZE, () => this._adjustDialog())\n } else {\n EventHandler.off(window, EVENT_RESIZE)\n }\n }\n\n _hideModal() {\n this._element.style.display = 'none'\n this._element.setAttribute('aria-hidden', true)\n this._element.removeAttribute('aria-modal')\n this._element.removeAttribute('role')\n this._isTransitioning = false\n this._backdrop.hide(() => {\n document.body.classList.remove(CLASS_NAME_OPEN)\n this._resetAdjustments()\n this._scrollBar.reset()\n EventHandler.trigger(this._element, EVENT_HIDDEN)\n })\n }\n\n _showBackdrop(callback) {\n EventHandler.on(this._element, EVENT_CLICK_DISMISS, event => {\n if (this._ignoreBackdropClick) {\n this._ignoreBackdropClick = false\n return\n }\n\n if (event.target !== event.currentTarget) {\n return\n }\n\n if (this._config.backdrop === true) {\n this.hide()\n } else if (this._config.backdrop === 'static') {\n this._triggerBackdropTransition()\n }\n })\n\n this._backdrop.show(callback)\n }\n\n _isAnimated() {\n return this._element.classList.contains(CLASS_NAME_FADE)\n }\n\n _triggerBackdropTransition() {\n const hideEvent = EventHandler.trigger(this._element, EVENT_HIDE_PREVENTED)\n if (hideEvent.defaultPrevented) {\n return\n }\n\n const { classList, scrollHeight, style } = this._element\n const isModalOverflowing = scrollHeight > document.documentElement.clientHeight\n\n // return if the following background transition hasn't yet completed\n if ((!isModalOverflowing && style.overflowY === 'hidden') || classList.contains(CLASS_NAME_STATIC)) {\n return\n }\n\n if (!isModalOverflowing) {\n style.overflowY = 'hidden'\n }\n\n classList.add(CLASS_NAME_STATIC)\n this._queueCallback(() => {\n classList.remove(CLASS_NAME_STATIC)\n if (!isModalOverflowing) {\n this._queueCallback(() => {\n style.overflowY = ''\n }, this._dialog)\n }\n }, this._dialog)\n\n this._element.focus()\n }\n\n // ----------------------------------------------------------------------\n // the following methods are used to handle overflowing modals\n // 
----------------------------------------------------------------------\n\n _adjustDialog() {\n const isModalOverflowing = this._element.scrollHeight > document.documentElement.clientHeight\n const scrollbarWidth = this._scrollBar.getWidth()\n const isBodyOverflowing = scrollbarWidth > 0\n\n if ((!isBodyOverflowing && isModalOverflowing && !isRTL()) || (isBodyOverflowing && !isModalOverflowing && isRTL())) {\n this._element.style.paddingLeft = `${scrollbarWidth}px`\n }\n\n if ((isBodyOverflowing && !isModalOverflowing && !isRTL()) || (!isBodyOverflowing && isModalOverflowing && isRTL())) {\n this._element.style.paddingRight = `${scrollbarWidth}px`\n }\n }\n\n _resetAdjustments() {\n this._element.style.paddingLeft = ''\n this._element.style.paddingRight = ''\n }\n\n // Static\n\n static jQueryInterface(config, relatedTarget) {\n return this.each(function () {\n const data = Modal.getOrCreateInstance(this, config)\n\n if (typeof config !== 'string') {\n return\n }\n\n if (typeof data[config] === 'undefined') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n\n data[config](relatedTarget)\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * Data Api implementation\n * ------------------------------------------------------------------------\n */\n\nEventHandler.on(document, EVENT_CLICK_DATA_API, SELECTOR_DATA_TOGGLE, function (event) {\n const target = getElementFromSelector(this)\n\n if (['A', 'AREA'].includes(this.tagName)) {\n event.preventDefault()\n }\n\n EventHandler.one(target, EVENT_SHOW, showEvent => {\n if (showEvent.defaultPrevented) {\n // only register focus restorer if modal will actually get shown\n return\n }\n\n EventHandler.one(target, EVENT_HIDDEN, () => {\n if (isVisible(this)) {\n this.focus()\n }\n })\n })\n\n // avoid conflict when clicking moddal toggler while another one is open\n const allReadyOpen = SelectorEngine.findOne(OPEN_SELECTOR)\n if (allReadyOpen) {\n Modal.getInstance(allReadyOpen).hide()\n }\n\n const data = Modal.getOrCreateInstance(target)\n\n data.toggle(this)\n})\n\nenableDismissTrigger(Modal)\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n * add .Modal to jQuery only if jQuery is present\n */\n\ndefineJQueryPlugin(Modal)\n\nexport default Modal\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): offcanvas.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport {\n defineJQueryPlugin,\n getElementFromSelector,\n isDisabled,\n isVisible,\n typeCheckConfig\n} from './util/index'\nimport ScrollBarHelper from './util/scrollbar'\nimport EventHandler from './dom/event-handler'\nimport BaseComponent from './base-component'\nimport SelectorEngine from './dom/selector-engine'\nimport Manipulator from './dom/manipulator'\nimport Backdrop from './util/backdrop'\nimport FocusTrap from './util/focustrap'\nimport { enableDismissTrigger } from './util/component-functions'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'offcanvas'\nconst DATA_KEY = 'bs.offcanvas'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\nconst EVENT_LOAD_DATA_API = 
`load${EVENT_KEY}${DATA_API_KEY}`\nconst ESCAPE_KEY = 'Escape'\n\nconst Default = {\n backdrop: true,\n keyboard: true,\n scroll: false\n}\n\nconst DefaultType = {\n backdrop: 'boolean',\n keyboard: 'boolean',\n scroll: 'boolean'\n}\n\nconst CLASS_NAME_SHOW = 'show'\nconst CLASS_NAME_BACKDROP = 'offcanvas-backdrop'\nconst OPEN_SELECTOR = '.offcanvas.show'\n\nconst EVENT_SHOW = `show${EVENT_KEY}`\nconst EVENT_SHOWN = `shown${EVENT_KEY}`\nconst EVENT_HIDE = `hide${EVENT_KEY}`\nconst EVENT_HIDDEN = `hidden${EVENT_KEY}`\nconst EVENT_CLICK_DATA_API = `click${EVENT_KEY}${DATA_API_KEY}`\nconst EVENT_KEYDOWN_DISMISS = `keydown.dismiss${EVENT_KEY}`\n\nconst SELECTOR_DATA_TOGGLE = '[data-bs-toggle=\"offcanvas\"]'\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Offcanvas extends BaseComponent {\n constructor(element, config) {\n super(element)\n\n this._config = this._getConfig(config)\n this._isShown = false\n this._backdrop = this._initializeBackDrop()\n this._focustrap = this._initializeFocusTrap()\n this._addEventListeners()\n }\n\n // Getters\n\n static get NAME() {\n return NAME\n }\n\n static get Default() {\n return Default\n }\n\n // Public\n\n toggle(relatedTarget) {\n return this._isShown ? this.hide() : this.show(relatedTarget)\n }\n\n show(relatedTarget) {\n if (this._isShown) {\n return\n }\n\n const showEvent = EventHandler.trigger(this._element, EVENT_SHOW, { relatedTarget })\n\n if (showEvent.defaultPrevented) {\n return\n }\n\n this._isShown = true\n this._element.style.visibility = 'visible'\n\n this._backdrop.show()\n\n if (!this._config.scroll) {\n new ScrollBarHelper().hide()\n }\n\n this._element.removeAttribute('aria-hidden')\n this._element.setAttribute('aria-modal', true)\n this._element.setAttribute('role', 'dialog')\n this._element.classList.add(CLASS_NAME_SHOW)\n\n const completeCallBack = () => {\n if (!this._config.scroll) {\n this._focustrap.activate()\n }\n\n EventHandler.trigger(this._element, EVENT_SHOWN, { relatedTarget })\n }\n\n this._queueCallback(completeCallBack, this._element, true)\n }\n\n hide() {\n if (!this._isShown) {\n return\n }\n\n const hideEvent = EventHandler.trigger(this._element, EVENT_HIDE)\n\n if (hideEvent.defaultPrevented) {\n return\n }\n\n this._focustrap.deactivate()\n this._element.blur()\n this._isShown = false\n this._element.classList.remove(CLASS_NAME_SHOW)\n this._backdrop.hide()\n\n const completeCallback = () => {\n this._element.setAttribute('aria-hidden', true)\n this._element.removeAttribute('aria-modal')\n this._element.removeAttribute('role')\n this._element.style.visibility = 'hidden'\n\n if (!this._config.scroll) {\n new ScrollBarHelper().reset()\n }\n\n EventHandler.trigger(this._element, EVENT_HIDDEN)\n }\n\n this._queueCallback(completeCallback, this._element, true)\n }\n\n dispose() {\n this._backdrop.dispose()\n this._focustrap.deactivate()\n super.dispose()\n }\n\n // Private\n\n _getConfig(config) {\n config = {\n ...Default,\n ...Manipulator.getDataAttributes(this._element),\n ...(typeof config === 'object' ? 
config : {})\n }\n typeCheckConfig(NAME, config, DefaultType)\n return config\n }\n\n _initializeBackDrop() {\n return new Backdrop({\n className: CLASS_NAME_BACKDROP,\n isVisible: this._config.backdrop,\n isAnimated: true,\n rootElement: this._element.parentNode,\n clickCallback: () => this.hide()\n })\n }\n\n _initializeFocusTrap() {\n return new FocusTrap({\n trapElement: this._element\n })\n }\n\n _addEventListeners() {\n EventHandler.on(this._element, EVENT_KEYDOWN_DISMISS, event => {\n if (this._config.keyboard && event.key === ESCAPE_KEY) {\n this.hide()\n }\n })\n }\n\n // Static\n\n static jQueryInterface(config) {\n return this.each(function () {\n const data = Offcanvas.getOrCreateInstance(this, config)\n\n if (typeof config !== 'string') {\n return\n }\n\n if (data[config] === undefined || config.startsWith('_') || config === 'constructor') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n\n data[config](this)\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * Data Api implementation\n * ------------------------------------------------------------------------\n */\n\nEventHandler.on(document, EVENT_CLICK_DATA_API, SELECTOR_DATA_TOGGLE, function (event) {\n const target = getElementFromSelector(this)\n\n if (['A', 'AREA'].includes(this.tagName)) {\n event.preventDefault()\n }\n\n if (isDisabled(this)) {\n return\n }\n\n EventHandler.one(target, EVENT_HIDDEN, () => {\n // focus on trigger when it is closed\n if (isVisible(this)) {\n this.focus()\n }\n })\n\n // avoid conflict when clicking a toggler of an offcanvas, while another is open\n const allReadyOpen = SelectorEngine.findOne(OPEN_SELECTOR)\n if (allReadyOpen && allReadyOpen !== target) {\n Offcanvas.getInstance(allReadyOpen).hide()\n }\n\n const data = Offcanvas.getOrCreateInstance(target)\n data.toggle(this)\n})\n\nEventHandler.on(window, EVENT_LOAD_DATA_API, () =>\n SelectorEngine.find(OPEN_SELECTOR).forEach(el => Offcanvas.getOrCreateInstance(el).show())\n)\n\nenableDismissTrigger(Offcanvas)\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n */\n\ndefineJQueryPlugin(Offcanvas)\n\nexport default Offcanvas\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): util/sanitizer.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nconst uriAttributes = new Set([\n 'background',\n 'cite',\n 'href',\n 'itemtype',\n 'longdesc',\n 'poster',\n 'src',\n 'xlink:href'\n])\n\nconst ARIA_ATTRIBUTE_PATTERN = /^aria-[\\w-]*$/i\n\n/**\n * A pattern that recognizes a commonly useful subset of URLs that are safe.\n *\n * Shoutout to Angular https://github.com/angular/angular/blob/12.2.x/packages/core/src/sanitization/url_sanitizer.ts\n */\nconst SAFE_URL_PATTERN = /^(?:(?:https?|mailto|ftp|tel|file|sms):|[^#&/:?]*(?:[#/?]|$))/i\n\n/**\n * A pattern that matches safe data URLs. 
Only matches image, video and audio types.\n *\n * Shoutout to Angular https://github.com/angular/angular/blob/12.2.x/packages/core/src/sanitization/url_sanitizer.ts\n */\nconst DATA_URL_PATTERN = /^data:(?:image\\/(?:bmp|gif|jpeg|jpg|png|tiff|webp)|video\\/(?:mpeg|mp4|ogg|webm)|audio\\/(?:mp3|oga|ogg|opus));base64,[\\d+/a-z]+=*$/i\n\nconst allowedAttribute = (attribute, allowedAttributeList) => {\n const attributeName = attribute.nodeName.toLowerCase()\n\n if (allowedAttributeList.includes(attributeName)) {\n if (uriAttributes.has(attributeName)) {\n return Boolean(SAFE_URL_PATTERN.test(attribute.nodeValue) || DATA_URL_PATTERN.test(attribute.nodeValue))\n }\n\n return true\n }\n\n const regExp = allowedAttributeList.filter(attributeRegex => attributeRegex instanceof RegExp)\n\n // Check if a regular expression validates the attribute.\n for (let i = 0, len = regExp.length; i < len; i++) {\n if (regExp[i].test(attributeName)) {\n return true\n }\n }\n\n return false\n}\n\nexport const DefaultAllowlist = {\n // Global attributes allowed on any supplied element below.\n '*': ['class', 'dir', 'id', 'lang', 'role', ARIA_ATTRIBUTE_PATTERN],\n a: ['target', 'href', 'title', 'rel'],\n area: [],\n b: [],\n br: [],\n col: [],\n code: [],\n div: [],\n em: [],\n hr: [],\n h1: [],\n h2: [],\n h3: [],\n h4: [],\n h5: [],\n h6: [],\n i: [],\n img: ['src', 'srcset', 'alt', 'title', 'width', 'height'],\n li: [],\n ol: [],\n p: [],\n pre: [],\n s: [],\n small: [],\n span: [],\n sub: [],\n sup: [],\n strong: [],\n u: [],\n ul: []\n}\n\nexport function sanitizeHtml(unsafeHtml, allowList, sanitizeFn) {\n if (!unsafeHtml.length) {\n return unsafeHtml\n }\n\n if (sanitizeFn && typeof sanitizeFn === 'function') {\n return sanitizeFn(unsafeHtml)\n }\n\n const domParser = new window.DOMParser()\n const createdDocument = domParser.parseFromString(unsafeHtml, 'text/html')\n const elements = [].concat(...createdDocument.body.querySelectorAll('*'))\n\n for (let i = 0, len = elements.length; i < len; i++) {\n const element = elements[i]\n const elementName = element.nodeName.toLowerCase()\n\n if (!Object.keys(allowList).includes(elementName)) {\n element.remove()\n\n continue\n }\n\n const attributeList = [].concat(...element.attributes)\n const allowedAttributes = [].concat(allowList['*'] || [], allowList[elementName] || [])\n\n attributeList.forEach(attribute => {\n if (!allowedAttribute(attribute, allowedAttributes)) {\n element.removeAttribute(attribute.nodeName)\n }\n })\n }\n\n return createdDocument.body.innerHTML\n}\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): tooltip.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport * as Popper from '@popperjs/core'\n\nimport {\n defineJQueryPlugin,\n findShadowRoot,\n getElement,\n getUID,\n isElement,\n isRTL,\n noop,\n typeCheckConfig\n} from './util/index'\nimport { DefaultAllowlist, sanitizeHtml } from './util/sanitizer'\nimport Data from './dom/data'\nimport EventHandler from './dom/event-handler'\nimport Manipulator from './dom/manipulator'\nimport SelectorEngine from './dom/selector-engine'\nimport BaseComponent from './base-component'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'tooltip'\nconst DATA_KEY = 'bs.tooltip'\nconst 
EVENT_KEY = `.${DATA_KEY}`\nconst CLASS_PREFIX = 'bs-tooltip'\nconst DISALLOWED_ATTRIBUTES = new Set(['sanitize', 'allowList', 'sanitizeFn'])\n\nconst DefaultType = {\n animation: 'boolean',\n template: 'string',\n title: '(string|element|function)',\n trigger: 'string',\n delay: '(number|object)',\n html: 'boolean',\n selector: '(string|boolean)',\n placement: '(string|function)',\n offset: '(array|string|function)',\n container: '(string|element|boolean)',\n fallbackPlacements: 'array',\n boundary: '(string|element)',\n customClass: '(string|function)',\n sanitize: 'boolean',\n sanitizeFn: '(null|function)',\n allowList: 'object',\n popperConfig: '(null|object|function)'\n}\n\nconst AttachmentMap = {\n AUTO: 'auto',\n TOP: 'top',\n RIGHT: isRTL() ? 'left' : 'right',\n BOTTOM: 'bottom',\n LEFT: isRTL() ? 'right' : 'left'\n}\n\nconst Default = {\n animation: true,\n template: '
' +\n '
' +\n '
' +\n '
',\n trigger: 'hover focus',\n title: '',\n delay: 0,\n html: false,\n selector: false,\n placement: 'top',\n offset: [0, 0],\n container: false,\n fallbackPlacements: ['top', 'right', 'bottom', 'left'],\n boundary: 'clippingParents',\n customClass: '',\n sanitize: true,\n sanitizeFn: null,\n allowList: DefaultAllowlist,\n popperConfig: null\n}\n\nconst Event = {\n HIDE: `hide${EVENT_KEY}`,\n HIDDEN: `hidden${EVENT_KEY}`,\n SHOW: `show${EVENT_KEY}`,\n SHOWN: `shown${EVENT_KEY}`,\n INSERTED: `inserted${EVENT_KEY}`,\n CLICK: `click${EVENT_KEY}`,\n FOCUSIN: `focusin${EVENT_KEY}`,\n FOCUSOUT: `focusout${EVENT_KEY}`,\n MOUSEENTER: `mouseenter${EVENT_KEY}`,\n MOUSELEAVE: `mouseleave${EVENT_KEY}`\n}\n\nconst CLASS_NAME_FADE = 'fade'\nconst CLASS_NAME_MODAL = 'modal'\nconst CLASS_NAME_SHOW = 'show'\n\nconst HOVER_STATE_SHOW = 'show'\nconst HOVER_STATE_OUT = 'out'\n\nconst SELECTOR_TOOLTIP_INNER = '.tooltip-inner'\nconst SELECTOR_MODAL = `.${CLASS_NAME_MODAL}`\n\nconst EVENT_MODAL_HIDE = 'hide.bs.modal'\n\nconst TRIGGER_HOVER = 'hover'\nconst TRIGGER_FOCUS = 'focus'\nconst TRIGGER_CLICK = 'click'\nconst TRIGGER_MANUAL = 'manual'\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Tooltip extends BaseComponent {\n constructor(element, config) {\n if (typeof Popper === 'undefined') {\n throw new TypeError('Bootstrap\\'s tooltips require Popper (https://popper.js.org)')\n }\n\n super(element)\n\n // private\n this._isEnabled = true\n this._timeout = 0\n this._hoverState = ''\n this._activeTrigger = {}\n this._popper = null\n\n // Protected\n this._config = this._getConfig(config)\n this.tip = null\n\n this._setListeners()\n }\n\n // Getters\n\n static get Default() {\n return Default\n }\n\n static get NAME() {\n return NAME\n }\n\n static get Event() {\n return Event\n }\n\n static get DefaultType() {\n return DefaultType\n }\n\n // Public\n\n enable() {\n this._isEnabled = true\n }\n\n disable() {\n this._isEnabled = false\n }\n\n toggleEnabled() {\n this._isEnabled = !this._isEnabled\n }\n\n toggle(event) {\n if (!this._isEnabled) {\n return\n }\n\n if (event) {\n const context = this._initializeOnDelegatedTarget(event)\n\n context._activeTrigger.click = !context._activeTrigger.click\n\n if (context._isWithActiveTrigger()) {\n context._enter(null, context)\n } else {\n context._leave(null, context)\n }\n } else {\n if (this.getTipElement().classList.contains(CLASS_NAME_SHOW)) {\n this._leave(null, this)\n return\n }\n\n this._enter(null, this)\n }\n }\n\n dispose() {\n clearTimeout(this._timeout)\n\n EventHandler.off(this._element.closest(SELECTOR_MODAL), EVENT_MODAL_HIDE, this._hideModalHandler)\n\n if (this.tip) {\n this.tip.remove()\n }\n\n this._disposePopper()\n super.dispose()\n }\n\n show() {\n if (this._element.style.display === 'none') {\n throw new Error('Please use show on visible elements')\n }\n\n if (!(this.isWithContent() && this._isEnabled)) {\n return\n }\n\n const showEvent = EventHandler.trigger(this._element, this.constructor.Event.SHOW)\n const shadowRoot = findShadowRoot(this._element)\n const isInTheDom = shadowRoot === null ?\n this._element.ownerDocument.documentElement.contains(this._element) :\n shadowRoot.contains(this._element)\n\n if (showEvent.defaultPrevented || !isInTheDom) {\n return\n }\n\n // A trick to recreate a tooltip in case a new title is given by using the NOT documented `data-bs-original-title`\n // This will be 
removed later in favor of a `setContent` method\n if (this.constructor.NAME === 'tooltip' && this.tip && this.getTitle() !== this.tip.querySelector(SELECTOR_TOOLTIP_INNER).innerHTML) {\n this._disposePopper()\n this.tip.remove()\n this.tip = null\n }\n\n const tip = this.getTipElement()\n const tipId = getUID(this.constructor.NAME)\n\n tip.setAttribute('id', tipId)\n this._element.setAttribute('aria-describedby', tipId)\n\n if (this._config.animation) {\n tip.classList.add(CLASS_NAME_FADE)\n }\n\n const placement = typeof this._config.placement === 'function' ?\n this._config.placement.call(this, tip, this._element) :\n this._config.placement\n\n const attachment = this._getAttachment(placement)\n this._addAttachmentClass(attachment)\n\n const { container } = this._config\n Data.set(tip, this.constructor.DATA_KEY, this)\n\n if (!this._element.ownerDocument.documentElement.contains(this.tip)) {\n container.append(tip)\n EventHandler.trigger(this._element, this.constructor.Event.INSERTED)\n }\n\n if (this._popper) {\n this._popper.update()\n } else {\n this._popper = Popper.createPopper(this._element, tip, this._getPopperConfig(attachment))\n }\n\n tip.classList.add(CLASS_NAME_SHOW)\n\n const customClass = this._resolvePossibleFunction(this._config.customClass)\n if (customClass) {\n tip.classList.add(...customClass.split(' '))\n }\n\n // If this is a touch-enabled device we add extra\n // empty mouseover listeners to the body's immediate children;\n // only needed because of broken event delegation on iOS\n // https://www.quirksmode.org/blog/archives/2014/02/mouse_event_bub.html\n if ('ontouchstart' in document.documentElement) {\n [].concat(...document.body.children).forEach(element => {\n EventHandler.on(element, 'mouseover', noop)\n })\n }\n\n const complete = () => {\n const prevHoverState = this._hoverState\n\n this._hoverState = null\n EventHandler.trigger(this._element, this.constructor.Event.SHOWN)\n\n if (prevHoverState === HOVER_STATE_OUT) {\n this._leave(null, this)\n }\n }\n\n const isAnimated = this.tip.classList.contains(CLASS_NAME_FADE)\n this._queueCallback(complete, this.tip, isAnimated)\n }\n\n hide() {\n if (!this._popper) {\n return\n }\n\n const tip = this.getTipElement()\n const complete = () => {\n if (this._isWithActiveTrigger()) {\n return\n }\n\n if (this._hoverState !== HOVER_STATE_SHOW) {\n tip.remove()\n }\n\n this._cleanTipClass()\n this._element.removeAttribute('aria-describedby')\n EventHandler.trigger(this._element, this.constructor.Event.HIDDEN)\n\n this._disposePopper()\n }\n\n const hideEvent = EventHandler.trigger(this._element, this.constructor.Event.HIDE)\n if (hideEvent.defaultPrevented) {\n return\n }\n\n tip.classList.remove(CLASS_NAME_SHOW)\n\n // If this is a touch-enabled device we remove the extra\n // empty mouseover listeners we added for iOS support\n if ('ontouchstart' in document.documentElement) {\n [].concat(...document.body.children)\n .forEach(element => EventHandler.off(element, 'mouseover', noop))\n }\n\n this._activeTrigger[TRIGGER_CLICK] = false\n this._activeTrigger[TRIGGER_FOCUS] = false\n this._activeTrigger[TRIGGER_HOVER] = false\n\n const isAnimated = this.tip.classList.contains(CLASS_NAME_FADE)\n this._queueCallback(complete, this.tip, isAnimated)\n this._hoverState = ''\n }\n\n update() {\n if (this._popper !== null) {\n this._popper.update()\n }\n }\n\n // Protected\n\n isWithContent() {\n return Boolean(this.getTitle())\n }\n\n getTipElement() {\n if (this.tip) {\n return this.tip\n }\n\n const element = 
document.createElement('div')\n element.innerHTML = this._config.template\n\n const tip = element.children[0]\n this.setContent(tip)\n tip.classList.remove(CLASS_NAME_FADE, CLASS_NAME_SHOW)\n\n this.tip = tip\n return this.tip\n }\n\n setContent(tip) {\n this._sanitizeAndSetContent(tip, this.getTitle(), SELECTOR_TOOLTIP_INNER)\n }\n\n _sanitizeAndSetContent(template, content, selector) {\n const templateElement = SelectorEngine.findOne(selector, template)\n\n if (!content && templateElement) {\n templateElement.remove()\n return\n }\n\n // we use append for html objects to maintain js events\n this.setElementContent(templateElement, content)\n }\n\n setElementContent(element, content) {\n if (element === null) {\n return\n }\n\n if (isElement(content)) {\n content = getElement(content)\n\n // content is a DOM node or a jQuery\n if (this._config.html) {\n if (content.parentNode !== element) {\n element.innerHTML = ''\n element.append(content)\n }\n } else {\n element.textContent = content.textContent\n }\n\n return\n }\n\n if (this._config.html) {\n if (this._config.sanitize) {\n content = sanitizeHtml(content, this._config.allowList, this._config.sanitizeFn)\n }\n\n element.innerHTML = content\n } else {\n element.textContent = content\n }\n }\n\n getTitle() {\n const title = this._element.getAttribute('data-bs-original-title') || this._config.title\n\n return this._resolvePossibleFunction(title)\n }\n\n updateAttachment(attachment) {\n if (attachment === 'right') {\n return 'end'\n }\n\n if (attachment === 'left') {\n return 'start'\n }\n\n return attachment\n }\n\n // Private\n\n _initializeOnDelegatedTarget(event, context) {\n return context || this.constructor.getOrCreateInstance(event.delegateTarget, this._getDelegateConfig())\n }\n\n _getOffset() {\n const { offset } = this._config\n\n if (typeof offset === 'string') {\n return offset.split(',').map(val => Number.parseInt(val, 10))\n }\n\n if (typeof offset === 'function') {\n return popperData => offset(popperData, this._element)\n }\n\n return offset\n }\n\n _resolvePossibleFunction(content) {\n return typeof content === 'function' ? content.call(this._element) : content\n }\n\n _getPopperConfig(attachment) {\n const defaultBsPopperConfig = {\n placement: attachment,\n modifiers: [\n {\n name: 'flip',\n options: {\n fallbackPlacements: this._config.fallbackPlacements\n }\n },\n {\n name: 'offset',\n options: {\n offset: this._getOffset()\n }\n },\n {\n name: 'preventOverflow',\n options: {\n boundary: this._config.boundary\n }\n },\n {\n name: 'arrow',\n options: {\n element: `.${this.constructor.NAME}-arrow`\n }\n },\n {\n name: 'onChange',\n enabled: true,\n phase: 'afterWrite',\n fn: data => this._handlePopperPlacementChange(data)\n }\n ],\n onFirstUpdate: data => {\n if (data.options.placement !== data.placement) {\n this._handlePopperPlacementChange(data)\n }\n }\n }\n\n return {\n ...defaultBsPopperConfig,\n ...(typeof this._config.popperConfig === 'function' ? 
this._config.popperConfig(defaultBsPopperConfig) : this._config.popperConfig)\n }\n }\n\n _addAttachmentClass(attachment) {\n this.getTipElement().classList.add(`${this._getBasicClassPrefix()}-${this.updateAttachment(attachment)}`)\n }\n\n _getAttachment(placement) {\n return AttachmentMap[placement.toUpperCase()]\n }\n\n _setListeners() {\n const triggers = this._config.trigger.split(' ')\n\n triggers.forEach(trigger => {\n if (trigger === 'click') {\n EventHandler.on(this._element, this.constructor.Event.CLICK, this._config.selector, event => this.toggle(event))\n } else if (trigger !== TRIGGER_MANUAL) {\n const eventIn = trigger === TRIGGER_HOVER ?\n this.constructor.Event.MOUSEENTER :\n this.constructor.Event.FOCUSIN\n const eventOut = trigger === TRIGGER_HOVER ?\n this.constructor.Event.MOUSELEAVE :\n this.constructor.Event.FOCUSOUT\n\n EventHandler.on(this._element, eventIn, this._config.selector, event => this._enter(event))\n EventHandler.on(this._element, eventOut, this._config.selector, event => this._leave(event))\n }\n })\n\n this._hideModalHandler = () => {\n if (this._element) {\n this.hide()\n }\n }\n\n EventHandler.on(this._element.closest(SELECTOR_MODAL), EVENT_MODAL_HIDE, this._hideModalHandler)\n\n if (this._config.selector) {\n this._config = {\n ...this._config,\n trigger: 'manual',\n selector: ''\n }\n } else {\n this._fixTitle()\n }\n }\n\n _fixTitle() {\n const title = this._element.getAttribute('title')\n const originalTitleType = typeof this._element.getAttribute('data-bs-original-title')\n\n if (title || originalTitleType !== 'string') {\n this._element.setAttribute('data-bs-original-title', title || '')\n if (title && !this._element.getAttribute('aria-label') && !this._element.textContent) {\n this._element.setAttribute('aria-label', title)\n }\n\n this._element.setAttribute('title', '')\n }\n }\n\n _enter(event, context) {\n context = this._initializeOnDelegatedTarget(event, context)\n\n if (event) {\n context._activeTrigger[\n event.type === 'focusin' ? TRIGGER_FOCUS : TRIGGER_HOVER\n ] = true\n }\n\n if (context.getTipElement().classList.contains(CLASS_NAME_SHOW) || context._hoverState === HOVER_STATE_SHOW) {\n context._hoverState = HOVER_STATE_SHOW\n return\n }\n\n clearTimeout(context._timeout)\n\n context._hoverState = HOVER_STATE_SHOW\n\n if (!context._config.delay || !context._config.delay.show) {\n context.show()\n return\n }\n\n context._timeout = setTimeout(() => {\n if (context._hoverState === HOVER_STATE_SHOW) {\n context.show()\n }\n }, context._config.delay.show)\n }\n\n _leave(event, context) {\n context = this._initializeOnDelegatedTarget(event, context)\n\n if (event) {\n context._activeTrigger[\n event.type === 'focusout' ? 
TRIGGER_FOCUS : TRIGGER_HOVER\n ] = context._element.contains(event.relatedTarget)\n }\n\n if (context._isWithActiveTrigger()) {\n return\n }\n\n clearTimeout(context._timeout)\n\n context._hoverState = HOVER_STATE_OUT\n\n if (!context._config.delay || !context._config.delay.hide) {\n context.hide()\n return\n }\n\n context._timeout = setTimeout(() => {\n if (context._hoverState === HOVER_STATE_OUT) {\n context.hide()\n }\n }, context._config.delay.hide)\n }\n\n _isWithActiveTrigger() {\n for (const trigger in this._activeTrigger) {\n if (this._activeTrigger[trigger]) {\n return true\n }\n }\n\n return false\n }\n\n _getConfig(config) {\n const dataAttributes = Manipulator.getDataAttributes(this._element)\n\n Object.keys(dataAttributes).forEach(dataAttr => {\n if (DISALLOWED_ATTRIBUTES.has(dataAttr)) {\n delete dataAttributes[dataAttr]\n }\n })\n\n config = {\n ...this.constructor.Default,\n ...dataAttributes,\n ...(typeof config === 'object' && config ? config : {})\n }\n\n config.container = config.container === false ? document.body : getElement(config.container)\n\n if (typeof config.delay === 'number') {\n config.delay = {\n show: config.delay,\n hide: config.delay\n }\n }\n\n if (typeof config.title === 'number') {\n config.title = config.title.toString()\n }\n\n if (typeof config.content === 'number') {\n config.content = config.content.toString()\n }\n\n typeCheckConfig(NAME, config, this.constructor.DefaultType)\n\n if (config.sanitize) {\n config.template = sanitizeHtml(config.template, config.allowList, config.sanitizeFn)\n }\n\n return config\n }\n\n _getDelegateConfig() {\n const config = {}\n\n for (const key in this._config) {\n if (this.constructor.Default[key] !== this._config[key]) {\n config[key] = this._config[key]\n }\n }\n\n // In the future can be replaced with:\n // const keysWithDifferentValues = Object.entries(this._config).filter(entry => this.constructor.Default[entry[0]] !== this._config[entry[0]])\n // `Object.fromEntries(keysWithDifferentValues)`\n return config\n }\n\n _cleanTipClass() {\n const tip = this.getTipElement()\n const basicClassPrefixRegex = new RegExp(`(^|\\\\s)${this._getBasicClassPrefix()}\\\\S+`, 'g')\n const tabClass = tip.getAttribute('class').match(basicClassPrefixRegex)\n if (tabClass !== null && tabClass.length > 0) {\n tabClass.map(token => token.trim())\n .forEach(tClass => tip.classList.remove(tClass))\n }\n }\n\n _getBasicClassPrefix() {\n return CLASS_PREFIX\n }\n\n _handlePopperPlacementChange(popperData) {\n const { state } = popperData\n\n if (!state) {\n return\n }\n\n this.tip = state.elements.popper\n this._cleanTipClass()\n this._addAttachmentClass(this._getAttachment(state.placement))\n }\n\n _disposePopper() {\n if (this._popper) {\n this._popper.destroy()\n this._popper = null\n }\n }\n\n // Static\n\n static jQueryInterface(config) {\n return this.each(function () {\n const data = Tooltip.getOrCreateInstance(this, config)\n\n if (typeof config === 'string') {\n if (typeof data[config] === 'undefined') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n\n data[config]()\n }\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n * add .Tooltip to jQuery only if jQuery is present\n */\n\ndefineJQueryPlugin(Tooltip)\n\nexport default Tooltip\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): popover.js\n * Licensed under MIT 
(https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport { defineJQueryPlugin } from './util/index'\nimport Tooltip from './tooltip'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'popover'\nconst DATA_KEY = 'bs.popover'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst CLASS_PREFIX = 'bs-popover'\n\nconst Default = {\n ...Tooltip.Default,\n placement: 'right',\n offset: [0, 8],\n trigger: 'click',\n content: '',\n template: '
' +\n '
' +\n '

' +\n '
' +\n '
'\n}\n\nconst DefaultType = {\n ...Tooltip.DefaultType,\n content: '(string|element|function)'\n}\n\nconst Event = {\n HIDE: `hide${EVENT_KEY}`,\n HIDDEN: `hidden${EVENT_KEY}`,\n SHOW: `show${EVENT_KEY}`,\n SHOWN: `shown${EVENT_KEY}`,\n INSERTED: `inserted${EVENT_KEY}`,\n CLICK: `click${EVENT_KEY}`,\n FOCUSIN: `focusin${EVENT_KEY}`,\n FOCUSOUT: `focusout${EVENT_KEY}`,\n MOUSEENTER: `mouseenter${EVENT_KEY}`,\n MOUSELEAVE: `mouseleave${EVENT_KEY}`\n}\n\nconst SELECTOR_TITLE = '.popover-header'\nconst SELECTOR_CONTENT = '.popover-body'\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass Popover extends Tooltip {\n // Getters\n\n static get Default() {\n return Default\n }\n\n static get NAME() {\n return NAME\n }\n\n static get Event() {\n return Event\n }\n\n static get DefaultType() {\n return DefaultType\n }\n\n // Overrides\n\n isWithContent() {\n return this.getTitle() || this._getContent()\n }\n\n setContent(tip) {\n this._sanitizeAndSetContent(tip, this.getTitle(), SELECTOR_TITLE)\n this._sanitizeAndSetContent(tip, this._getContent(), SELECTOR_CONTENT)\n }\n\n // Private\n\n _getContent() {\n return this._resolvePossibleFunction(this._config.content)\n }\n\n _getBasicClassPrefix() {\n return CLASS_PREFIX\n }\n\n // Static\n\n static jQueryInterface(config) {\n return this.each(function () {\n const data = Popover.getOrCreateInstance(this, config)\n\n if (typeof config === 'string') {\n if (typeof data[config] === 'undefined') {\n throw new TypeError(`No method named \"${config}\"`)\n }\n\n data[config]()\n }\n })\n }\n}\n\n/**\n * ------------------------------------------------------------------------\n * jQuery\n * ------------------------------------------------------------------------\n * add .Popover to jQuery only if jQuery is present\n */\n\ndefineJQueryPlugin(Popover)\n\nexport default Popover\n","/**\n * --------------------------------------------------------------------------\n * Bootstrap (v5.1.3): scrollspy.js\n * Licensed under MIT (https://github.com/twbs/bootstrap/blob/main/LICENSE)\n * --------------------------------------------------------------------------\n */\n\nimport {\n defineJQueryPlugin,\n getElement,\n getSelectorFromElement,\n typeCheckConfig\n} from './util/index'\nimport EventHandler from './dom/event-handler'\nimport Manipulator from './dom/manipulator'\nimport SelectorEngine from './dom/selector-engine'\nimport BaseComponent from './base-component'\n\n/**\n * ------------------------------------------------------------------------\n * Constants\n * ------------------------------------------------------------------------\n */\n\nconst NAME = 'scrollspy'\nconst DATA_KEY = 'bs.scrollspy'\nconst EVENT_KEY = `.${DATA_KEY}`\nconst DATA_API_KEY = '.data-api'\n\nconst Default = {\n offset: 10,\n method: 'auto',\n target: ''\n}\n\nconst DefaultType = {\n offset: 'number',\n method: 'string',\n target: '(string|element)'\n}\n\nconst EVENT_ACTIVATE = `activate${EVENT_KEY}`\nconst EVENT_SCROLL = `scroll${EVENT_KEY}`\nconst EVENT_LOAD_DATA_API = `load${EVENT_KEY}${DATA_API_KEY}`\n\nconst CLASS_NAME_DROPDOWN_ITEM = 'dropdown-item'\nconst CLASS_NAME_ACTIVE = 'active'\n\nconst SELECTOR_DATA_SPY = '[data-bs-spy=\"scroll\"]'\nconst SELECTOR_NAV_LIST_GROUP = '.nav, .list-group'\nconst SELECTOR_NAV_LINKS = '.nav-link'\nconst SELECTOR_NAV_ITEMS = '.nav-item'\nconst SELECTOR_LIST_ITEMS = '.list-group-item'\nconst SELECTOR_LINK_ITEMS 
= `${SELECTOR_NAV_LINKS}, ${SELECTOR_LIST_ITEMS}, .${CLASS_NAME_DROPDOWN_ITEM}`\nconst SELECTOR_DROPDOWN = '.dropdown'\nconst SELECTOR_DROPDOWN_TOGGLE = '.dropdown-toggle'\n\nconst METHOD_OFFSET = 'offset'\nconst METHOD_POSITION = 'position'\n\n/**\n * ------------------------------------------------------------------------\n * Class Definition\n * ------------------------------------------------------------------------\n */\n\nclass ScrollSpy extends BaseComponent {\n constructor(element, config) {\n super(element)\n this._scrollElement = this._element.tagName === 'BODY' ? window : this._element\n this._config = this._getConfig(config)\n this._offsets = []\n this._targets = []\n this._activeTarget = null\n this._scrollHeight = 0\n\n EventHandler.on(this._scrollElement, EVENT_SCROLL, () => this._process())\n\n this.refresh()\n this._process()\n }\n\n // Getters\n\n static get Default() {\n return Default\n }\n\n static get NAME() {\n return NAME\n }\n\n // Public\n\n refresh() {\n const autoMethod = this._scrollElement === this._scrollElement.window ?\n METHOD_OFFSET :\n METHOD_POSITION\n\n const offsetMethod = this._config.method === 'auto' ?\n autoMethod :\n this._config.method\n\n const offsetBase = offsetMethod === METHOD_POSITION ?\n this._getScrollTop() :\n 0\n\n this._offsets = []\n this._targets = []\n this._scrollHeight = this._getScrollHeight()\n\n const targets = SelectorEngine.find(SELECTOR_LINK_ITEMS, this._config.target)\n\n targets.map(element => {\n const targetSelector = getSelectorFromElement(element)\n const target = targetSelector ? SelectorEngine.findOne(targetSelector) : null\n\n if (target) {\n const targetBCR = target.getBoundingClientRect()\n if (targetBCR.width || targetBCR.height) {\n return [\n Manipulator[offsetMethod](target).top + offsetBase,\n targetSelector\n ]\n }\n }\n\n return null\n })\n .filter(item => item)\n .sort((a, b) => a[0] - b[0])\n .forEach(item => {\n this._offsets.push(item[0])\n this._targets.push(item[1])\n })\n }\n\n dispose() {\n EventHandler.off(this._scrollElement, EVENT_KEY)\n super.dispose()\n }\n\n // Private\n\n _getConfig(config) {\n config = {\n ...Default,\n ...Manipulator.getDataAttributes(this._element),\n ...(typeof config === 'object' && config ? 
config : {})\n }\n\n config.target = getElement(config.target) || document.documentElement\n\n typeCheckConfig(NAME, config, DefaultType)\n\n return config\n }\n\n _getScrollTop() {\n return this._scrollElement === window ?\n this._scrollElement.pageYOffset :\n this._scrollElement.scrollTop\n }\n\n _getScrollHeight() {\n return this._scrollElement.scrollHeight || Math.max(\n document.body.scrollHeight,\n document.documentElement.scrollHeight\n )\n }\n\n _getOffsetHeight() {\n return this._scrollElement === window ?\n window.innerHeight :\n this._scrollElement.getBoundingClientRect().height\n }\n\n _process() {\n const scrollTop = this._getScrollTop() + this._config.offset\n const scrollHeight = this._getScrollHeight()\n const maxScroll = this._config.offset + scrollHeight - this._getOffsetHeight()\n\n if (this._scrollHeight !== scrollHeight) {\n this.refresh()\n }\n\n if (scrollTop >= maxScroll) {\n const target = this._targets[this._targets.length - 1]\n\n if (this._activeTarget !== target) {\n this._activate(target)\n }\n\n return\n }\n\n if (this._activeTarget && scrollTop < this._offsets[0] && this._offsets[0] > 0) {\n this._activeTarget = null\n this._clear()\n return\n }\n\n for (let i = this._offsets.length; i--;) {\n const isActiveTarget = this._activeTarget !== this._targets[i] &&\n scrollTop >= this._offsets[i] &&\n (typeof this._offsets[i + 1] === 'undefined' || scrollTop < this._offsets[i + 1])\n\n if (isActiveTarget) {\n this._activate(this._targets[i])\n }\n }\n }\n\n _activate(target) {\n this._activeTarget = target\n\n this._clear()\n\n const queries = SELECTOR_LINK_ITEMS.split(',')\n .map(selector => `${selector}[data-bs-target=\"${target}\"],${selector}[href=\"${target}\"]`)\n\n const link = SelectorEngine.findOne(queries.join(','), this._config.target)\n\n link.classList.add(CLASS_NAME_ACTIVE)\n if (link.classList.contains(CLASS_NAME_DROPDOWN_ITEM)) {\n SelectorEngine.findOne(SELECTOR_DROPDOWN_TOGGLE, link.closest(SELECTOR_DROPDOWN))\n .classList.add(CLASS_NAME_ACTIVE)\n } else {\n SelectorEngine.parents(link, SELECTOR_NAV_LIST_GROUP)\n .forEach(listGroup => {\n // Set triggered links parents as active\n // With both

Installation

-You can install the latest release from CRAN:
+You can install the latest release from CRAN:

 install.packages('mikropml')

or the development version from GitHub:

 # install.packages("devtools")
 devtools::install_github("SchlossLab/mikropml")

-or install from a terminal using conda:

-conda install -c conda-forge r-mikropml
+or install from a terminal using conda or mamba:

+mamba install -c conda-forge r-mikropml
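
A minimal sketch for confirming the install from R, assuming one of the installation routes above completed without errors:

```
library(mikropml)
packageVersion("mikropml")
# see vignette("introduction") for a walk-through of the main run_ml() workflow
```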

Dependencies

@@ -260,7 +260,7 @@

diff --git a/docs/dev/news/index.html b/docs/dev/news/index.html index 6a4603a6..9f5c7fec 100644 --- a/docs/dev/news/index.html +++ b/docs/dev/news/index.html @@ -1,5 +1,5 @@ -Changelog • mikropmlChangelog • mikropml @@ -151,7 +151,7 @@

mikropml 0.0.1

diff --git a/docs/dev/pkgdown.yml b/docs/dev/pkgdown.yml index e4baea2d..a4652846 100644 --- a/docs/dev/pkgdown.yml +++ b/docs/dev/pkgdown.yml @@ -1,5 +1,5 @@ pandoc: 2.7.3 -pkgdown: 2.0.5 +pkgdown: 2.0.6 pkgdown_sha: ~ articles: introduction: introduction.html @@ -7,7 +7,7 @@ articles: parallel: parallel.html preprocess: preprocess.html tuning: tuning.html -last_built: 2022-07-13T20:42Z +last_built: 2022-09-28T23:51Z urls: reference: http://www.schlosslab.org/mikropml/reference article: http://www.schlosslab.org/mikropml/articles diff --git a/docs/dev/pull_request_template.html b/docs/dev/pull_request_template.html index 9ffd3afb..1309bc5a 100644 --- a/docs/dev/pull_request_template.html +++ b/docs/dev/pull_request_template.html @@ -1,5 +1,5 @@ -NA • mikropmlNA • mikropml @@ -92,7 +92,7 @@

Checklist -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/calc_perf_metrics.html b/docs/dev/reference/calc_perf_metrics.html index bd0a5503..99db58a5 100644 --- a/docs/dev/reference/calc_perf_metrics.html +++ b/docs/dev/reference/calc_perf_metrics.html @@ -1,5 +1,5 @@ -Get performance metrics for test data — calc_perf_metrics • mikropmlGet performance metrics for test data — calc_perf_metrics • mikropml @@ -134,7 +134,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/combine_hp_performance.html b/docs/dev/reference/combine_hp_performance.html index f8a6dbe8..64a82a9f 100644 --- a/docs/dev/reference/combine_hp_performance.html +++ b/docs/dev/reference/combine_hp_performance.html @@ -1,5 +1,5 @@ -Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance • mikropmlCombine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance • mikropml @@ -106,7 +106,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/compare_models.html b/docs/dev/reference/compare_models.html index 4a192ae1..746299d9 100644 --- a/docs/dev/reference/compare_models.html +++ b/docs/dev/reference/compare_models.html @@ -1,6 +1,6 @@ Perform permutation tests to compare the performance metric -across all pairs of a group variable. — compare_models • mikropmlDefine cross-validation scheme and training parameters — define_cv • mikropmlDefine cross-validation scheme and training parameters — define_cv • mikropml @@ -171,7 +171,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/get_caret_processed_df.html b/docs/dev/reference/get_caret_processed_df.html index b3da58c4..c1815611 100644 --- a/docs/dev/reference/get_caret_processed_df.html +++ b/docs/dev/reference/get_caret_processed_df.html @@ -1,5 +1,5 @@ -Get preprocessed dataframe for continuous variables — get_caret_processed_df • mikropmlGet preprocessed dataframe for continuous variables — get_caret_processed_df • mikropml @@ -2321,7 +2321,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/get_feature_importance.html b/docs/dev/reference/get_feature_importance.html index 207775db..ab5fe51b 100644 --- a/docs/dev/reference/get_feature_importance.html +++ b/docs/dev/reference/get_feature_importance.html @@ -1,6 +1,6 @@ Get feature importance using the permutation method — get_feature_importance • mikropmlGet feature importance using the permutation method — get_feature_importance • mikropmlGet hyperparameter performance metrics — get_hp_performance • mikropmlGet hyperparameter performance metrics — get_hp_performance • mikropml @@ -116,7 +116,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/get_hyperparams_list.html b/docs/dev/reference/get_hyperparams_list.html index 6bb0ec1f..ffc57c50 100644 --- a/docs/dev/reference/get_hyperparams_list.html +++ b/docs/dev/reference/get_hyperparams_list.html @@ -1,5 +1,5 @@ -Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list • mikropmlSet hyperparameters based on ML method and dataset characteristics — get_hyperparams_list • mikropml @@ -123,7 +123,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/get_outcome_type.html b/docs/dev/reference/get_outcome_type.html index 76146009..19b9b59c 100644 --- a/docs/dev/reference/get_outcome_type.html +++ b/docs/dev/reference/get_outcome_type.html @@ -1,7 +1,7 @@ Get outcome type. — get_outcome_type • mikropmlGet outcome type. — get_outcome_type • mikropmlSelect indices to partition the data into training & testing sets. — get_partition_indices • mikropmlSelect indices to partition the data into training & testing sets. — get_partition_indices • mikropml @@ -140,7 +140,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/get_perf_metric_fn.html b/docs/dev/reference/get_perf_metric_fn.html index fc0923f1..174a9316 100644 --- a/docs/dev/reference/get_perf_metric_fn.html +++ b/docs/dev/reference/get_perf_metric_fn.html @@ -1,5 +1,5 @@ -Get default performance metric function — get_perf_metric_fn • mikropmlGet default performance metric function — get_perf_metric_fn • mikropml @@ -93,7 +93,7 @@

Examples#> data$obs <- factor(data$obs, levels = lev) #> postResample(data[, "pred"], data[, "obs"]) #> } -#> <bytecode: 0x7fae55cad5f8> +#> <bytecode: 0x7f8221132138> #> <environment: namespace:caret> get_perf_metric_fn("binary") #> function (data, lev = NULL, model = NULL) @@ -151,7 +151,7 @@

Examples#> stats <- stats[c(stat_list)] #> return(stats) #> } -#> <bytecode: 0x7fae5cc18aa0> +#> <bytecode: 0x7f8220a98ab8> #> <environment: namespace:caret> get_perf_metric_fn("multiclass") #> function (data, lev = NULL, model = NULL) @@ -209,7 +209,7 @@

Examples#> stats <- stats[c(stat_list)] #> return(stats) #> } -#> <bytecode: 0x7fae5cc18aa0> +#> <bytecode: 0x7f8220a98ab8> #> <environment: namespace:caret> @@ -222,7 +222,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/get_perf_metric_name.html b/docs/dev/reference/get_perf_metric_name.html index 9030b8bd..7185b12d 100644 --- a/docs/dev/reference/get_perf_metric_name.html +++ b/docs/dev/reference/get_perf_metric_name.html @@ -1,5 +1,5 @@ -Get default performance metric name — get_perf_metric_name • mikropmlGet default performance metric name — get_perf_metric_name • mikropml @@ -103,7 +103,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/get_performance_tbl.html b/docs/dev/reference/get_performance_tbl.html index 1fc71f21..4dab24d9 100644 --- a/docs/dev/reference/get_performance_tbl.html +++ b/docs/dev/reference/get_performance_tbl.html @@ -1,5 +1,5 @@ -Get model performance metrics as a one-row tibble — get_performance_tbl • mikropmlGet model performance metrics as a one-row tibble — get_performance_tbl • mikropml @@ -163,7 +163,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/get_tuning_grid.html b/docs/dev/reference/get_tuning_grid.html index 85d5d78b..89da5d8e 100644 --- a/docs/dev/reference/get_tuning_grid.html +++ b/docs/dev/reference/get_tuning_grid.html @@ -1,5 +1,5 @@ -Generate the tuning grid for tuning hyperparameters — get_tuning_grid • mikropmlGenerate the tuning grid for tuning hyperparameters — get_tuning_grid • mikropml @@ -118,7 +118,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/group_correlated_features.html b/docs/dev/reference/group_correlated_features.html index 5e9c336c..01621e23 100644 --- a/docs/dev/reference/group_correlated_features.html +++ b/docs/dev/reference/group_correlated_features.html @@ -1,5 +1,5 @@ -Group correlated features — group_correlated_features • mikropmlGroup correlated features — group_correlated_features • mikropml @@ -124,7 +124,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/index.html b/docs/dev/reference/index.html index f883f0f1..2673d96a 100644 --- a/docs/dev/reference/index.html +++ b/docs/dev/reference/index.html @@ -1,5 +1,5 @@ -Function reference • mikropmlFunction reference • mikropml @@ -349,7 +349,7 @@

Pipeline customization diff --git a/docs/dev/reference/mikropml.html b/docs/dev/reference/mikropml.html index 9778690f..125d0a27 100644 --- a/docs/dev/reference/mikropml.html +++ b/docs/dev/reference/mikropml.html @@ -2,7 +2,7 @@ mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml • mikropmlmikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml • mikropmlMini OTU abundance dataset — otu_mini_bin • mikropmlMini OTU abundance dataset — otu_mini_bin • mikropmlResults from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet • mikropmlResults from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet • mikropml @@ -81,7 +81,7 @@

Format< diff --git a/docs/dev/reference/otu_mini_bin_results_rf.html b/docs/dev/reference/otu_mini_bin_results_rf.html index 4424736a..8949b240 100644 --- a/docs/dev/reference/otu_mini_bin_results_rf.html +++ b/docs/dev/reference/otu_mini_bin_results_rf.html @@ -1,5 +1,5 @@ -Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf • mikropmlResults from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf • mikropml @@ -81,7 +81,7 @@

Format< diff --git a/docs/dev/reference/otu_mini_bin_results_rpart2.html b/docs/dev/reference/otu_mini_bin_results_rpart2.html index d74a6ff9..a2cd81d0 100644 --- a/docs/dev/reference/otu_mini_bin_results_rpart2.html +++ b/docs/dev/reference/otu_mini_bin_results_rpart2.html @@ -1,5 +1,5 @@ -Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2 • mikropmlResults from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2 • mikropml @@ -81,7 +81,7 @@

Format< diff --git a/docs/dev/reference/otu_mini_bin_results_svmRadial.html b/docs/dev/reference/otu_mini_bin_results_svmRadial.html index 733ed804..f37b721d 100644 --- a/docs/dev/reference/otu_mini_bin_results_svmRadial.html +++ b/docs/dev/reference/otu_mini_bin_results_svmRadial.html @@ -1,5 +1,5 @@ -Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial • mikropmlResults from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial • mikropml @@ -81,7 +81,7 @@

Format< diff --git a/docs/dev/reference/otu_mini_bin_results_xgbTree.html b/docs/dev/reference/otu_mini_bin_results_xgbTree.html index e365c17e..73249e84 100644 --- a/docs/dev/reference/otu_mini_bin_results_xgbTree.html +++ b/docs/dev/reference/otu_mini_bin_results_xgbTree.html @@ -1,5 +1,5 @@ -Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree • mikropmlResults from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree • mikropml @@ -81,7 +81,7 @@

Format< diff --git a/docs/dev/reference/otu_mini_cont_results_glmnet.html b/docs/dev/reference/otu_mini_cont_results_glmnet.html index 22d508bc..a3bc221e 100644 --- a/docs/dev/reference/otu_mini_cont_results_glmnet.html +++ b/docs/dev/reference/otu_mini_cont_results_glmnet.html @@ -1,7 +1,7 @@ Results from running the pipeline with glmnet on otu_mini_bin with Otu00001 -as the outcome — otu_mini_cont_results_glmnet • mikropmlFormat< diff --git a/docs/dev/reference/otu_mini_cv.html b/docs/dev/reference/otu_mini_cv.html index e2bdb6bc..8721ceaa 100644 --- a/docs/dev/reference/otu_mini_cv.html +++ b/docs/dev/reference/otu_mini_cv.html @@ -1,5 +1,5 @@ -Cross validation on train_data_mini with grouped features. — otu_mini_cv • mikropmlCross validation on train_data_mini with grouped features. — otu_mini_cv • mikropml @@ -81,7 +81,7 @@

Format< diff --git a/docs/dev/reference/otu_mini_multi.html b/docs/dev/reference/otu_mini_multi.html index 50666993..2cb50809 100644 --- a/docs/dev/reference/otu_mini_multi.html +++ b/docs/dev/reference/otu_mini_multi.html @@ -1,5 +1,5 @@ -Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi • mikropmlMini OTU abundance dataset with 3 categorical variables — otu_mini_multi • mikropml @@ -83,7 +83,7 @@

Format< diff --git a/docs/dev/reference/otu_mini_multi_group.html b/docs/dev/reference/otu_mini_multi_group.html index 1e127d8e..f3384f02 100644 --- a/docs/dev/reference/otu_mini_multi_group.html +++ b/docs/dev/reference/otu_mini_multi_group.html @@ -1,5 +1,5 @@ -Groups for otu_mini_multi — otu_mini_multi_group • mikropmlGroups for otu_mini_multi — otu_mini_multi_group • mikropml @@ -81,7 +81,7 @@

Format< diff --git a/docs/dev/reference/otu_mini_multi_results_glmnet.html b/docs/dev/reference/otu_mini_multi_results_glmnet.html index 5ddfc06f..69e46dd2 100644 --- a/docs/dev/reference/otu_mini_multi_results_glmnet.html +++ b/docs/dev/reference/otu_mini_multi_results_glmnet.html @@ -1,7 +1,7 @@ Results from running the pipeline with glmnet on otu_mini_multi for -multiclass outcomes — otu_mini_multi_results_glmnet • mikropmlSmall OTU abundance dataset — otu_small • mikropmlSmall OTU abundance dataset — otu_small • mikropmlCalculated a permuted p-value comparing two models — permute_p_value • mikropmlCalculated a permuted p-value comparing two models — permute_p_value • mikropml @@ -132,7 +132,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/plot_hp_performance.html b/docs/dev/reference/plot_hp_performance.html index c2c28398..259cb42a 100644 --- a/docs/dev/reference/plot_hp_performance.html +++ b/docs/dev/reference/plot_hp_performance.html @@ -1,5 +1,5 @@ -Plot hyperparameter performance metrics — plot_hp_performance • mikropmlPlot hyperparameter performance metrics — plot_hp_performance • mikropml @@ -135,7 +135,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/plot_model_performance.html b/docs/dev/reference/plot_model_performance.html index 97ee46d7..a004902b 100644 --- a/docs/dev/reference/plot_model_performance.html +++ b/docs/dev/reference/plot_model_performance.html @@ -1,5 +1,5 @@ -Plot performance metrics for multiple ML runs with different parameters — plot_model_performance • mikropmlPlot performance metrics for multiple ML runs with different parameters — plot_model_performance • mikropml @@ -134,7 +134,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/preprocess_data.html b/docs/dev/reference/preprocess_data.html index eb5bd78c..b713d84a 100644 --- a/docs/dev/reference/preprocess_data.html +++ b/docs/dev/reference/preprocess_data.html @@ -1,5 +1,5 @@ -Preprocess data prior to running machine learning — preprocess_data • mikropmlPreprocess data prior to running machine learning — preprocess_data • mikropml @@ -151,19 +151,19 @@

Examples#> Using 'dx' as the outcome column. #> $dat_transformed #> # A tibble: 200 × 61 -#> dx Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 Otu00006 Otu00007 Otu00008 -#> <chr> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> -#> 1 norm… -0.420 -0.219 -0.174 -0.591 -0.0488 -0.167 -0.569 -0.0624 -#> 2 norm… -0.105 1.75 -0.718 0.0381 1.54 -0.573 -0.643 -0.132 -#> 3 norm… -0.708 0.696 1.43 0.604 -0.265 -0.0364 -0.612 -0.207 -#> 4 norm… -0.494 -0.665 2.02 -0.593 -0.676 -0.586 -0.552 -0.470 -#> 5 norm… 1.11 -0.395 -0.754 -0.586 -0.754 2.73 0.191 -0.676 -#> 6 norm… -0.685 0.614 -0.174 -0.584 0.376 0.804 -0.337 -0.00608 -#> 7 canc… -0.770 -0.496 -0.318 0.159 -0.658 2.20 -0.717 0.0636 -#> 8 norm… -0.424 -0.478 -0.397 -0.556 -0.391 -0.0620 0.376 -0.0222 -#> 9 norm… -0.556 1.14 1.62 -0.352 -0.275 -0.465 -0.804 0.294 -#> 10 canc… 1.46 -0.451 -0.694 -0.0567 -0.706 0.689 -0.370 1.59 -#> # … with 190 more rows, and 52 more variables: Otu00009 <dbl>, Otu00010 <dbl>, +#> dx Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 Otu00006 Otu00…¹ Otu00008 +#> <chr> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> +#> 1 normal -0.420 -0.219 -0.174 -0.591 -0.0488 -0.167 -0.569 -0.0624 +#> 2 normal -0.105 1.75 -0.718 0.0381 1.54 -0.573 -0.643 -0.132 +#> 3 normal -0.708 0.696 1.43 0.604 -0.265 -0.0364 -0.612 -0.207 +#> 4 normal -0.494 -0.665 2.02 -0.593 -0.676 -0.586 -0.552 -0.470 +#> 5 normal 1.11 -0.395 -0.754 -0.586 -0.754 2.73 0.191 -0.676 +#> 6 normal -0.685 0.614 -0.174 -0.584 0.376 0.804 -0.337 -0.00608 +#> 7 cancer -0.770 -0.496 -0.318 0.159 -0.658 2.20 -0.717 0.0636 +#> 8 normal -0.424 -0.478 -0.397 -0.556 -0.391 -0.0620 0.376 -0.0222 +#> 9 normal -0.556 1.14 1.62 -0.352 -0.275 -0.465 -0.804 0.294 +#> 10 cancer 1.46 -0.451 -0.694 -0.0567 -0.706 0.689 -0.370 1.59 +#> # … with 190 more rows, 52 more variables: Otu00009 <dbl>, Otu00010 <dbl>, #> # Otu00011 <dbl>, Otu00012 <dbl>, Otu00013 <dbl>, Otu00014 <dbl>, #> # Otu00015 <dbl>, Otu00016 <dbl>, Otu00017 <dbl>, Otu00018 <dbl>, #> # Otu00019 <dbl>, Otu00020 <dbl>, Otu00021 <dbl>, Otu00022 <dbl>, @@ -202,7 +202,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.
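
The preprocess_data() example output diffed above shows the `$dat_transformed` tibble (the 'dx' outcome column plus centered and scaled OTU features). A hedged sketch of a call that produces output of this shape, using the package's bundled otu_small data and default preprocessing options:

```
library(mikropml)
prepped <- preprocess_data(dataset = otu_small, outcome_colname = "dx")
# dat_transformed is the tibble printed in the example output above
head(prepped$dat_transformed)
```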

diff --git a/docs/dev/reference/randomize_feature_order.html b/docs/dev/reference/randomize_feature_order.html index 8b14ac1c..eb654d72 100644 --- a/docs/dev/reference/randomize_feature_order.html +++ b/docs/dev/reference/randomize_feature_order.html @@ -1,5 +1,5 @@ -Randomize feature order to eliminate any position-dependent effects — randomize_feature_order • mikropmlRandomize feature order to eliminate any position-dependent effects — randomize_feature_order • mikropml @@ -112,7 +112,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/reexports.html b/docs/dev/reference/reexports.html index e48f58c1..490f0eba 100644 --- a/docs/dev/reference/reexports.html +++ b/docs/dev/reference/reexports.html @@ -14,7 +14,7 @@ :=, !!, .data -">dplyr pipe — reexports • mikropmldplyr pipe — reexports • mikropmlVignettes

diff --git a/docs/dev/reference/remove_singleton_columns.html b/docs/dev/reference/remove_singleton_columns.html index 4dc7c1a1..ffb8eb41 100644 --- a/docs/dev/reference/remove_singleton_columns.html +++ b/docs/dev/reference/remove_singleton_columns.html @@ -1,5 +1,5 @@ -Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns • mikropmlRemove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns • mikropml @@ -143,7 +143,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/replace_spaces.html b/docs/dev/reference/replace_spaces.html index 597efb72..e3220fae 100644 --- a/docs/dev/reference/replace_spaces.html +++ b/docs/dev/reference/replace_spaces.html @@ -1,5 +1,5 @@ -Replace spaces in all elements of a character vector with underscores — replace_spaces • mikropmlReplace spaces in all elements of a character vector with underscores — replace_spaces • mikropml @@ -113,7 +113,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/run_ml.html b/docs/dev/reference/run_ml.html index 90352fd2..1a4f2da5 100644 --- a/docs/dev/reference/run_ml.html +++ b/docs/dev/reference/run_ml.html @@ -5,7 +5,7 @@ ). Required inputs are a dataframe with an outcome variable and other columns as features, as well as the ML method. -See vignette('introduction') for more details.">Run the machine learning pipeline — run_ml • mikropmlRun the machine learning pipeline — run_ml • mikropmlExamples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.
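
The run_ml() page diffed above notes that the required inputs are a dataframe with an outcome variable and other columns as features, plus the ML method (see vignette('introduction')). As an illustrative sketch only, using the bundled otu_mini_bin data, its 'dx' outcome column, and default tuning settings:

```
library(mikropml)
results <- run_ml(otu_mini_bin, "glmnet", outcome_colname = "dx", seed = 2019)
# the returned list includes the trained model and a one-row performance summary
results$performance
```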

diff --git a/docs/dev/reference/tidy_perf_data.html b/docs/dev/reference/tidy_perf_data.html index 534a6674..0e6cb7f1 100644 --- a/docs/dev/reference/tidy_perf_data.html +++ b/docs/dev/reference/tidy_perf_data.html @@ -1,5 +1,5 @@ -Tidy the performance dataframe — tidy_perf_data • mikropmlTidy the performance dataframe — tidy_perf_data • mikropml @@ -111,7 +111,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/reference/train_model.html b/docs/dev/reference/train_model.html index 2f12d214..d47ef4bf 100644 --- a/docs/dev/reference/train_model.html +++ b/docs/dev/reference/train_model.html @@ -1,5 +1,5 @@ -Train model using caret::train(). — train_model • mikropmlTrain model using caret::train(). — train_model • mikropml @@ -165,7 +165,7 @@

Examples -

Site built with pkgdown 2.0.5.

+

Site built with pkgdown 2.0.6.

diff --git a/docs/dev/search.json b/docs/dev/search.json index 563d0b86..5cd86855 100644 --- a/docs/dev/search.json +++ b/docs/dev/search.json @@ -1 +1 @@ -[{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":null,"dir":"","previous_headings":"","what":"Contributor Covenant Code of Conduct","title":"Contributor Covenant Code of Conduct","text":"document adapted Tidyverse Code Conduct.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"our-pledge","dir":"","previous_headings":"","what":"Our Pledge","title":"Contributor Covenant Code of Conduct","text":"members, contributors, leaders pledge make participation community harassment-free experience everyone, regardless age, body size, visible invisible disability, ethnicity, sex characteristics, gender identity expression, level experience, education, socio-economic status, nationality, personal appearance, race, religion, sexual identity orientation. pledge act interact ways contribute open, welcoming, diverse, inclusive, healthy community.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"our-standards","dir":"","previous_headings":"","what":"Our Standards","title":"Contributor Covenant Code of Conduct","text":"Examples behavior contributes positive environment community include: Demonstrating empathy kindness toward people respectful differing opinions, viewpoints, experiences Giving gracefully accepting constructive feedback Accepting responsibility apologizing affected mistakes, learning experience Focusing best just us individuals, overall community Examples unacceptable behavior include: use sexualized language imagery, sexual attention advances kind Trolling, insulting derogatory comments, personal political attacks Public private harassment Publishing others’ private information, physical email address, without explicit permission conduct reasonably considered inappropriate professional setting","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement-responsibilities","dir":"","previous_headings":"","what":"Enforcement Responsibilities","title":"Contributor Covenant Code of Conduct","text":"Community leaders responsible clarifying enforcing standards acceptable behavior take appropriate fair corrective action response behavior deem inappropriate, threatening, offensive, harmful. Community leaders right responsibility remove, edit, reject comments, commits, code, wiki edits, issues, contributions aligned Code Conduct, communicate reasons moderation decisions appropriate.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"scope","dir":"","previous_headings":"","what":"Scope","title":"Contributor Covenant Code of Conduct","text":"Code Conduct applies within community spaces, also applies individual officially representing community public spaces. Examples representing community include using official e-mail address, posting via official social media account, acting appointed representative online offline event.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement","dir":"","previous_headings":"","what":"Enforcement","title":"Contributor Covenant Code of Conduct","text":"Instances abusive, harassing, otherwise unacceptable behavior may reported community leaders responsible enforcement [INSERT CONTACT METHOD]. complaints reviewed investigated promptly fairly. 
community leaders obligated respect privacy security reporter incident.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement-guidelines","dir":"","previous_headings":"","what":"Enforcement Guidelines","title":"Contributor Covenant Code of Conduct","text":"Community leaders follow Community Impact Guidelines determining consequences action deem violation Code Conduct:","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"1-correction","dir":"","previous_headings":"Enforcement Guidelines","what":"1. Correction","title":"Contributor Covenant Code of Conduct","text":"Community Impact: Use inappropriate language behavior deemed unprofessional unwelcome community. Consequence: private, written warning community leaders, providing clarity around nature violation explanation behavior inappropriate. public apology may requested.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"2-warning","dir":"","previous_headings":"Enforcement Guidelines","what":"2. Warning","title":"Contributor Covenant Code of Conduct","text":"Community Impact: violation single incident series actions. Consequence: warning consequences continued behavior. interaction people involved, including unsolicited interaction enforcing Code Conduct, specified period time. includes avoiding interactions community spaces well external channels like social media. Violating terms may lead temporary permanent ban.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"3-temporary-ban","dir":"","previous_headings":"Enforcement Guidelines","what":"3. Temporary Ban","title":"Contributor Covenant Code of Conduct","text":"Community Impact: serious violation community standards, including sustained inappropriate behavior. Consequence: temporary ban sort interaction public communication community specified period time. public private interaction people involved, including unsolicited interaction enforcing Code Conduct, allowed period. Violating terms may lead permanent ban.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"4-permanent-ban","dir":"","previous_headings":"Enforcement Guidelines","what":"4. Permanent Ban","title":"Contributor Covenant Code of Conduct","text":"Community Impact: Demonstrating pattern violation community standards, including sustained inappropriate behavior, harassment individual, aggression toward disparagement classes individuals. Consequence: permanent ban sort public interaction within community.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"attribution","dir":"","previous_headings":"","what":"Attribution","title":"Contributor Covenant Code of Conduct","text":"Code Conduct adapted Contributor Covenant, version 2.0, available https://www.contributor-covenant.org/version/2/0/ code_of_conduct.html. Community Impact Guidelines inspired Mozilla’s code conduct enforcement ladder. answers common questions code conduct, see FAQ https://www.contributor-covenant.org/faq. 
Translations available https:// www.contributor-covenant.org/translations.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":null,"dir":"","previous_headings":"","what":"Contributing to mikropml","title":"Contributing to mikropml","text":"document adapted Tidyverse Contributing guide.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"fixing-typos","dir":"","previous_headings":"","what":"Fixing typos","title":"Contributing to mikropml","text":"can fix typos, spelling mistakes, grammatical errors documentation directly using GitHub web interface, long changes made source file. generally means ’ll need edit roxygen2 comments .R, .Rd file. can find .R file generates .Rd reading comment first line.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"bigger-changes","dir":"","previous_headings":"","what":"Bigger changes","title":"Contributing to mikropml","text":"want make bigger change, ’s good idea first file issue make sure someone team agrees ’s needed. ’ve found bug, please file issue illustrates bug minimal reprex (also help write unit test, needed).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"pull-request-process","dir":"","previous_headings":"Bigger changes","what":"Pull request process","title":"Contributing to mikropml","text":"Fork package clone onto computer. haven’t done , recommend using usethis::create_from_github(\"SchlossLab/mikropml\", fork = TRUE). Install development dependences devtools::install_dev_deps(), make sure package passes R CMD check running devtools::check(). R CMD check doesn’t pass cleanly, ’s good idea ask help continuing. Create Git branch pull request (PR). recommend using usethis::pr_init(\"brief-description--change\"). Make changes, commit git, create PR running usethis::pr_push(), following prompts browser. title PR briefly describe change. body PR contain Fixes #issue-number. user-facing changes, add bullet top NEWS.md (.e. just first header). Follow style described https://style.tidyverse.org/news.html.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"code-style","dir":"","previous_headings":"Bigger changes","what":"Code style","title":"Contributing to mikropml","text":"New code follow tidyverse style guide. can use styler package apply styles, please don’t restyle code nothing PR. use roxygen2, Markdown syntax, documentation. use testthat unit tests. Contributions test cases included easier accept.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"code-of-conduct","dir":"","previous_headings":"","what":"Code of Conduct","title":"Contributing to mikropml","text":"Please note mikropml project released Contributor Code Conduct. contributing project agree abide terms.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/LICENSE.html","id":null,"dir":"","previous_headings":"","what":"MIT License","title":"MIT License","text":"Copyright (c) 2019-2021 Begüm D. Topçuoğlu, Zena Lapp, Kelly L. Sovacool, Evan Snitkin, Jenna Wiens, Patrick D. 
Schloss Permission hereby granted, free charge, person obtaining copy software associated documentation files (“Software”), deal Software without restriction, including without limitation rights use, copy, modify, merge, publish, distribute, sublicense, /sell copies Software, permit persons Software furnished , subject following conditions: copyright notice permission notice shall included copies substantial portions Software. SOFTWARE PROVIDED “”, WITHOUT WARRANTY KIND, EXPRESS IMPLIED, INCLUDING LIMITED WARRANTIES MERCHANTABILITY, FITNESS PARTICULAR PURPOSE NONINFRINGEMENT. EVENT SHALL AUTHORS COPYRIGHT HOLDERS LIABLE CLAIM, DAMAGES LIABILITY, WHETHER ACTION CONTRACT, TORT OTHERWISE, ARISING , CONNECTION SOFTWARE USE DEALINGS SOFTWARE.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":null,"dir":"","previous_headings":"","what":"Getting help with mikropml","title":"Getting help with mikropml","text":"Thanks using mikropml! filing issue, places explore pieces put together make process smooth possible.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"make-a-reprex","dir":"","previous_headings":"","what":"Make a reprex","title":"Getting help with mikropml","text":"Start making minimal reproducible example using reprex package. haven’t heard used reprex , ’re treat! Seriously, reprex make R-question-asking endeavors easier (pretty insane ROI five ten minutes ’ll take learn ’s ). additional reprex pointers, check Get help! section tidyverse site.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"where-to-ask","dir":"","previous_headings":"","what":"Where to ask?","title":"Getting help with mikropml","text":"Armed reprex, next step figure ask. ’s question: start community.rstudio.com, /StackOverflow. people answer questions. ’s bug: ’re right place, file issue. ’re sure: let community help figure ! problem bug feature request, can easily return report . opening new issue, sure search issues pull requests make sure bug hasn’t reported /already fixed development version. default, search pre-populated :issue :open. can edit qualifiers (e.g. :pr, :closed) needed. example, ’d simply remove :open search issues repo, open closed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"what-happens-next","dir":"","previous_headings":"","what":"What happens next?","title":"Getting help with mikropml","text":"efficient possible, development tidyverse packages tends bursty, shouldn’t worry don’t get immediate response. Typically don’t look repo sufficient quantity issues accumulates, ’s burst intense activity focus efforts. makes development efficient avoids expensive context switching problems, cost taking longer get back . process makes good reprex particularly important might multiple months initial report start working . can’t reproduce bug, can’t fix !","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"its-running-so-slow","dir":"Articles","previous_headings":"","what":"It’s running so slow!","title":"Introduction to mikropml","text":"Since assume lot won’t read entire vignette, ’m going say beginning. run_ml() function running super slow, consider parallelizing. 
See vignette(\"parallel\") examples.","code":""},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-input-data","dir":"Articles","previous_headings":"Understanding the inputs","what":"The input data","title":"Introduction to mikropml","text":"input data run_ml() dataframe row sample observation. One column (assumed first) outcome interest, columns features. package otu_mini_bin small example dataset mikropml. , dx outcome column (normal cancer), 10 features (Otu00001 Otu00010). 2 outcomes, performing binary classification majority examples . bottom, also briefly provide examples multi-class continuous outcomes. ’ll see, run way binary classification! feature columns amount Operational Taxonomic Unit (OTU) microbiome samples patients cancer without cancer. goal predict dx, stands diagnosis. diagnosis can cancer based individual’s microbiome. need understand exactly means, ’re interested can read original paper (Topçuoğlu et al. 2020). real machine learning applications ’ll need use features, purposes vignette ’ll stick example dataset everything runs faster.","code":"#install.packages(\"devtools\") #devtools::install_github(\"SchlossLab/mikropml\") library(mikropml) head(otu_mini_bin) #> dx Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 Otu00006 Otu00007 #> 1 normal 350 268 213 1 208 230 70 #> 2 normal 568 1320 13 293 671 103 48 #> 3 normal 151 756 802 556 145 271 57 #> 4 normal 299 30 1018 0 25 99 75 #> 5 normal 1409 174 0 3 2 1136 296 #> 6 normal 167 712 213 4 332 534 139 #> Otu00008 Otu00009 Otu00010 #> 1 230 235 64 #> 2 204 119 115 #> 3 176 37 710 #> 4 78 255 197 #> 5 1 537 533 #> 6 251 155 122"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-methods-we-support","dir":"Articles","previous_headings":"Understanding the inputs","what":"The methods we support","title":"Introduction to mikropml","text":"methods use supported great ML wrapper package caret, use train machine learning models. methods tested (backend packages) : Logistic/multiclass/linear regression (\"glmnet\") Random forest (\"rf\") Decision tree (\"rpart2\") Support vector machine radial basis kernel (\"svmRadial\") xgboost (\"xgbTree\") documentation methods, well many others, can look available models (see list tag). vetted models used caret, function general enough others might work. can’t promise can help models, feel free open issue GitHub questions models might able help. first focus glmnet, default implementation L2-regularized logistic regression. cover examples towards end.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"before-running-ml","dir":"Articles","previous_headings":"","what":"Before running ML","title":"Introduction to mikropml","text":"execute run_ml(), consider preprocessing data, either preprocess_data() function. can learn preprocessing vignette: vignette(\"preprocess\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-simplest-way-to-run_ml","dir":"Articles","previous_headings":"","what":"The simplest way to run_ml()","title":"Introduction to mikropml","text":"mentioned , minimal input dataset (dataset) machine learning model want use (method). may also want provide: outcome column name. default run_ml() pick first column, ’s best practice specify column name explicitly. seed results reproducible, get results see (.e train/test split). Say want use logistic regression, method use glmnet. 
, run ML pipeline : ’ll notice things: takes little run. parameters use. message stating ‘dx’ used outcome column. want, ’s nice sanity check! warning. Don’t worry warning right now - just means hyperparameters aren’t good fit - ’re interested learning , see vignette(\"tuning\"). Now, let’s dig output bit. results list 4 things: trained_model trained model caret. bunch info won’t get , can learn caret::train() documentation. test_data partition dataset used testing. machine learning, ’s always important held-test dataset used training stage. pipeline using run_ml() split data training testing sets. training data used build model (e.g. tune hyperparameters, learn data) test data used evaluate well model performs. performance dataframe (mainly) performance metrics (1 column cross-validation performance metric, several test performance metrics, 2 columns end ML method seed): using logistic regression binary classification, area receiver-operator characteristic curve (AUC) useful metric evaluate model performance. , ’s default use mikropml. However, crucial evaluate model performance using multiple metrics. can find information performance metrics use package. cv_metric_AUC AUC cross-validation folds training data. gives us sense well model performs training data. columns performance metrics test data — data wasn’t used build model. , can see AUC test data much 0.5, suggesting model predict much better chance, model overfit cross-validation AUC (cv_metric_AUC, measured training) much higher testing AUC. isn’t surprising since ’re using features example dataset, don’t discouraged. default option also provides number performance metrics might interested , including area precision-recall curve (prAUC). last columns results$performance method seed (set one) help combining results multiple runs (see vignette(\"parallel\")). feature_importance information feature importance values find_feature_importance = TRUE (default FALSE). 
Since used defaults, ’s nothing :","code":"results <- run_ml(otu_mini_bin, 'glmnet', outcome_colname = 'dx', seed = 2019) names(results) #> [1] \"trained_model\" \"test_data\" \"performance\" #> [4] \"feature_importance\" names(results$trained_model) #> [1] \"method\" \"modelInfo\" \"modelType\" \"results\" \"pred\" #> [6] \"bestTune\" \"call\" \"dots\" \"metric\" \"control\" #> [11] \"finalModel\" \"preProcess\" \"trainingData\" \"ptype\" \"resample\" #> [16] \"resampledCM\" \"perfNames\" \"maximize\" \"yLimits\" \"times\" #> [21] \"levels\" \"terms\" \"coefnames\" \"xlevels\" head(results$test_data) #> dx Otu00009 Otu00005 Otu00010 Otu00001 Otu00008 Otu00004 Otu00003 #> 9 normal 119 142 248 256 363 112 871 #> 14 normal 60 209 70 86 96 1 123 #> 16 cancer 205 5 180 1668 95 22 3 #> 17 normal 188 356 107 381 1035 915 315 #> 27 normal 4 21 161 7 1 27 8 #> 30 normal 13 166 5 31 33 5 58 #> Otu00002 Otu00007 Otu00006 #> 9 995 0 137 #> 14 426 54 40 #> 16 20 590 570 #> 17 357 253 341 #> 27 25 322 5 #> 30 179 6 30 results$performance #> # A tibble: 1 × 17 #> cv_metric_AUC logLoss AUC prAUC Accuracy Kappa F1 Sensitivity Specificity #> #> 1 0.622 0.684 0.647 0.606 0.590 0.179 0.6 0.6 0.579 #> # … with 8 more variables: Pos_Pred_Value , Neg_Pred_Value , #> # Precision , Recall , Detection_Rate , #> # Balanced_Accuracy , method , seed results$feature_importance #> [1] \"Skipped feature importance\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"customizing-parameters","dir":"Articles","previous_headings":"","what":"Customizing parameters","title":"Introduction to mikropml","text":"arguments allow change execute run_ml(). ’ve chosen reasonable defaults , encourage change think something else better data.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"changing-kfold-cv_times-and-training_frac","dir":"Articles","previous_headings":"Customizing parameters","what":"Changing kfold, cv_times, and training_frac","title":"Introduction to mikropml","text":"kfold: number folds run cross-validation (default: 5). cv_times: number times run repeated cross-validation (default: 100). training_frac: fraction data training set (default: 0.8). rest data used testing. ’s example change default parameters: might noticed one ran faster — ’s reduced kfold cv_times. okay testing things may even necessary smaller datasets. general may better larger numbers parameters; think defaults good starting point (Topçuoğlu et al. 2020).","code":"results_custom <- run_ml(otu_mini_bin, 'glmnet', kfold = 2, cv_times = 5, training_frac = 0.5, seed = 2019) #> Using 'dx' as the outcome column. #> Training the model... #> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. 
#> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"custom-training-indices","dir":"Articles","previous_headings":"Customizing parameters > Changing kfold, cv_times, and training_frac","what":"Custom training indices","title":"Introduction to mikropml","text":"training_frac fraction 0 1, random sample observations dataset chosen training set satisfy training_frac. However, cases might wish control exactly observations training set. can instead assign training_frac vector indices correspond rows dataset go training set (remaining sequences go testing set).","code":"n_obs <- otu_mini_bin %>% nrow() training_size <- 0.8 * n_obs training_rows <- sample(n_obs, training_size) results_custom_train <- run_ml(otu_mini_bin, 'glmnet', kfold = 2, cv_times = 5, training_frac = training_rows, seed = 2019 ) #> Using 'dx' as the outcome column. #> Using the custom training set indices provided by `training_frac`. #> The fraction of data in the training set will be 0.8 #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"changing-the-performance-metric","dir":"Articles","previous_headings":"Customizing parameters","what":"Changing the performance metric","title":"Introduction to mikropml","text":"two arguments allow change performance metric use model evaluation, performance metrics calculate using test data. perf_metric_function function used calculate performance metrics. default classification caret::multiClassSummary() default regression caret::defaultSummary(). ’d suggest changing unless really know ’re . perf_metric_name column name output perf_metric_function. chose reasonable defaults (AUC binary, logLoss multiclass, RMSE continuous), default functions calculate bunch different performance metrics, can choose different one ’d like. default performance metrics available classification : default performance metrics available regression : ’s example using prAUC instead AUC: ’ll see cross-validation metric prAUC, instead default AUC:","code":"#> [1] \"logLoss\" \"AUC\" \"prAUC\" #> [4] \"Accuracy\" \"Kappa\" \"Mean_F1\" #> [7] \"Mean_Sensitivity\" \"Mean_Specificity\" \"Mean_Pos_Pred_Value\" #> [10] \"Mean_Neg_Pred_Value\" \"Mean_Precision\" \"Mean_Recall\" #> [13] \"Mean_Detection_Rate\" \"Mean_Balanced_Accuracy\" #> [1] \"RMSE\" \"Rsquared\" \"MAE\" results_pr <- run_ml(otu_mini_bin, 'glmnet', cv_times = 5, perf_metric_name = 'prAUC', seed = 2019) #> Using 'dx' as the outcome column. #> Training the model... #> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. #> Training complete. 
results_pr$performance #> # A tibble: 1 × 17 #> cv_metric_prAUC logLoss AUC prAUC Accuracy Kappa F1 Sensitivity #> #> 1 0.577 0.691 0.663 0.605 0.538 0.0539 0.690 1 #> # … with 9 more variables: Specificity , Pos_Pred_Value , #> # Neg_Pred_Value , Precision , Recall , Detection_Rate , #> # Balanced_Accuracy , method , seed "},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"using-groups","dir":"Articles","previous_headings":"Customizing parameters","what":"Using groups","title":"Introduction to mikropml","text":"optional groups vector groups keep together splitting data train test sets cross-validation. Sometimes ’s important split data based grouping instead just randomly. allows control similarities within groups don’t want skew predictions (.e. batch effects). example, biological data may samples collected multiple hospitals, might like keep observations hospital partition. ’s example split data train/test sets based groups: one difference run_ml() report much data training set run code chunk. can little finicky depending many samples groups . won’t exactly specify training_frac, since include one group either training set test set.","code":"# make random groups set.seed(2019) grps <- sample(LETTERS[1:8], nrow(otu_mini_bin), replace=TRUE) results_grp <- run_ml(otu_mini_bin, 'glmnet', cv_times = 2, training_frac = 0.8, groups = grps, seed = 2019) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.795 #> Groups in the training set: A B D F G H #> Groups in the testing set: C E #> Groups will be kept together in CV partitions #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"controlling-how-groups-are-assigned-to-partitions","dir":"Articles","previous_headings":"Customizing parameters > Using groups","what":"Controlling how groups are assigned to partitions","title":"Introduction to mikropml","text":"use groups parameter , default run_ml() assume want observations group placed partition train/test split. makes sense want use groups control batch effects. However, cases might prefer control exactly groups end partition, might even okay observations group assigned different partitions. example, say want groups B used training, C D testing, don’t preference happens groups. can give group_partitions parameter named list specify groups go training set go testing set. case, observations & B used training, C & D used testing, remaining groups randomly assigned one satisfy training_frac closely possible. another scenario, maybe want groups F used training, also want allow observations selected training F used testing: need even control , take look setting custom training indices. might also prefer provide train control scheme cross_val parameter run_ml().","code":"results_grp_part <- run_ml(otu_mini_bin, 'glmnet', cv_times = 2, training_frac = 0.8, groups = grps, group_partitions = list(train = c('A', 'B'), test = c('C', 'D') ), seed = 2019) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.785 #> Groups in the training set: A B E F G H #> Groups in the testing set: C D #> Groups will not be kept together in CV partitions because the number of groups in the training set is not larger than `kfold` #> Training the model... #> Training complete. 
results_grp_trainA <- run_ml(otu_mini_bin, 'glmnet', cv_times = 2, kfold = 2, training_frac = 0.5, groups = grps, group_partitions = list(train = c(\"A\", \"B\", \"C\", \"D\", \"E\", \"F\"), test = c(\"A\", \"B\", \"C\", \"D\", \"E\", \"F\", \"G\", \"H\") ), seed = 2019) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.5 #> Groups in the training set: A B C D E F #> Groups in the testing set: A B C D E F G H #> Groups will be kept together in CV partitions #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"finding-feature-importance","dir":"Articles","previous_headings":"","what":"Finding feature importance","title":"Introduction to mikropml","text":"find features contributing predictive power, can use find_feature_importance = TRUE. use permutation importance determine feature importance described (Topçuoğlu et al. 2020). Briefly, permutes features individually (correlated ones together) evaluates much performance metric decreases. performance decreases feature randomly shuffled, important feature . default FALSE takes run useful want know features important predicting outcome. Let’s look feature importance results: Now, can check feature importances: several columns: perf_metric: performance value permuted feature. perf_metric_diff: difference performance actual permuted data (.e. test performance minus permuted performance). Features larger perf_metric_diff important. pvalue: probability obtaining actual performance value null hypothesis. names: feature permuted. method: ML method used. perf_metric_name: performance metric used. seed: seed (set). can see , differences negligible (close zero), makes sense since model isn’t great. ’re interested feature importance, ’s especially useful run multiple different train/test splits, shown example snakemake workflow. can also choose permute correlated features together using corr_thresh (default: 1). features correlation threshold permuted together; .e. perfectly correlated features permuted together using default value. can see features permuted together names column. 3 features permuted together (doesn’t really make sense, ’s just example). previously executed run_ml() without feature importance now wish find feature importance fact, see example code get_feature_importance() documentation. get_feature_importance() can show live progress bar, see vignette(\"parallel\") examples.","code":"results_imp <- run_ml(otu_mini_bin, \"rf\", outcome_colname = \"dx\", find_feature_importance = TRUE, seed = 2019 ) results_imp$feature_importance #> perf_metric perf_metric_diff pvalue names method perf_metric_name #> 1 0.5542375 0.0082625 0.40594059 Otu00001 rf AUC #> 2 0.5731750 -0.0106750 0.62376238 Otu00002 rf AUC #> 3 0.5548750 0.0076250 0.43564356 Otu00003 rf AUC #> 4 0.6414750 -0.0789750 0.99009901 Otu00004 rf AUC #> 5 0.5049625 0.0575375 0.05940594 Otu00005 rf AUC #> 6 0.5444500 0.0180500 0.19801980 Otu00006 rf AUC #> 7 0.5417125 0.0207875 0.23762376 Otu00007 rf AUC #> 8 0.5257750 0.0367250 0.08910891 Otu00008 rf AUC #> 9 0.5395750 0.0229250 0.05940594 Otu00009 rf AUC #> 10 0.4977625 0.0647375 0.05940594 Otu00010 rf AUC #> seed #> 1 2019 #> 2 2019 #> 3 2019 #> 4 2019 #> 5 2019 #> 6 2019 #> 7 2019 #> 8 2019 #> 9 2019 #> 10 2019 results_imp_corr <- run_ml(otu_mini_bin, 'glmnet', cv_times = 5, find_feature_importance = TRUE, corr_thresh = 0.2, seed = 2019) #> Using 'dx' as the outcome column. #> Training the model... 
#> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. #> Training complete. #> Finding feature importance... #> Feature importance complete. results_imp_corr$feature_importance #> perf_metric perf_metric_diff pvalue #> 1 0.5502105 0.09715789 0.08910891 #> 2 0.6369474 0.01042105 0.44554455 #> 3 0.5951316 0.05223684 0.11881188 #> names #> 1 Otu00001|Otu00002|Otu00003|Otu00005|Otu00006|Otu00007|Otu00009|Otu00010 #> 2 Otu00004 #> 3 Otu00008 #> method perf_metric_name seed #> 1 glmnet AUC 2019 #> 2 glmnet AUC 2019 #> 3 glmnet AUC 2019"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"tuning-hyperparameters-using-the-hyperparameter-argument","dir":"Articles","previous_headings":"","what":"Tuning hyperparameters (using the hyperparameter argument)","title":"Introduction to mikropml","text":"important, whole vignette . bottom line provide default hyperparameters can start , ’s important tune hyperparameters. information default hyperparameters , tune hyperparameters, see vignette(\"tuning\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"other-models","dir":"Articles","previous_headings":"","what":"Other models","title":"Introduction to mikropml","text":"examples train evaluate models. output similar, won’t go details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"random-forest","dir":"Articles","previous_headings":"Other models","what":"Random forest","title":"Introduction to mikropml","text":"can also change number trees use random forest (ntree; default: 1000). can’t tuned using rf package implementation random forest. 
Please refer caret documentation interested packages random forest implementations.","code":"results_rf <- run_ml(otu_mini_bin, 'rf', cv_times = 5, seed = 2019) results_rf_nt <- run_ml(otu_mini_bin, 'rf', cv_times = 5, ntree = 10, seed = 2019)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"decision-tree","dir":"Articles","previous_headings":"Other models","what":"Decision tree","title":"Introduction to mikropml","text":"","code":"results_dt <- run_ml(otu_mini_bin, 'rpart2', cv_times = 5, seed = 2019)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"svm","dir":"Articles","previous_headings":"Other models","what":"SVM","title":"Introduction to mikropml","text":"get message “maximum number iterations reached”, see issue caret.","code":"results_svm <- run_ml(otu_mini_bin, 'svmRadial', cv_times = 5, seed = 2019)"},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"multiclass-data","dir":"Articles","previous_headings":"Other data","what":"Multiclass data","title":"Introduction to mikropml","text":"provide otu_mini_multi multiclass outcome (three outcomes): ’s example running multiclass data: performance metrics slightly different, format everything else :","code":"otu_mini_multi %>% dplyr::pull('dx') %>% unique() #> [1] \"adenoma\" \"carcinoma\" \"normal\" results_multi <- run_ml(otu_mini_multi, outcome_colname = \"dx\", seed = 2019 ) results_multi$performance #> # A tibble: 1 × 17 #> cv_metric_logLoss logLoss AUC prAUC Accuracy Kappa Mean_F1 Mean_Sensitivity #> #> 1 1.07 1.11 0.506 0.353 0.382 0.0449 NA 0.360 #> # … with 9 more variables: Mean_Specificity , Mean_Pos_Pred_Value , #> # Mean_Neg_Pred_Value , Mean_Precision , Mean_Recall , #> # Mean_Detection_Rate , Mean_Balanced_Accuracy , method , #> # seed "},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"continuous-data","dir":"Articles","previous_headings":"Other data","what":"Continuous data","title":"Introduction to mikropml","text":"’s example running continuous data, outcome column numerical: , performance metrics slightly different, format rest :","code":"results_cont <- run_ml(otu_mini_bin[, 2:11], 'glmnet', outcome_colname = 'Otu00001', seed = 2019) results_cont$performance #> # A tibble: 1 × 6 #> cv_metric_RMSE RMSE Rsquared MAE method seed #> #> 1 622. 731. 0.0893 472. glmnet 2019"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"references","dir":"Articles","previous_headings":"","what":"References","title":"Introduction to mikropml","text":"Tang, Shengpu, Parmida Davarmanesh, Yanmeng Song, Danai Koutra, Michael W. Sjoding, Jenna Wiens. 2020. “Democratizing EHR Analyses FIDDLE: Flexible Data-Driven Preprocessing Pipeline Structured Clinical Data.” J Med Inform Assoc, October. https://doi.org/10.1093/jamia/ocaa139. Topçuoğlu, Begüm D., Nicholas . Lesniak, Mack T. Ruffin, Jenna Wiens, Patrick D. Schloss. 2020. “Framework Effective Application Machine Learning Microbiome-Based Classification Problems.” mBio 11 (3). https://doi.org/10.1128/mBio.00434-20.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"summary","dir":"Articles","previous_headings":"","what":"Summary","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Machine learning (ML) classification prediction based set features used make decisions healthcare, economics, criminal justice . 
However, implementing ML pipeline including preprocessing, model selection, evaluation can time-consuming, confusing, difficult. , present mikropml (pronounced “meek-ROPE em el”), easy--use R package implements ML pipelines using regression, support vector machines, decision trees, random forest, gradient-boosted trees. package available GitHub, CRAN, conda.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"statement-of-need","dir":"Articles","previous_headings":"","what":"Statement of need","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"applications machine learning (ML) require reproducible steps data pre-processing, cross-validation, testing, model evaluation, often interpretation model makes particular predictions. Performing steps important, failure implement can result incorrect misleading results (Teschendorff 2019; Wiens et al. 2019). Supervised ML widely used recognize patterns large datasets make predictions outcomes interest. Several packages including caret (Kuhn 2008) tidymodels (Kuhn, Wickham, RStudio 2020) R, scikitlearn (Pedregosa et al. 2011) Python, H2O autoML platform (H2O.ai 2020) allow scientists train ML models variety algorithms. packages provide tools necessary ML step, implement complete ML pipeline according good practices literature. makes difficult practitioners new ML easily begin perform ML analyses. enable broader range researchers apply ML problem domains, created mikropml, easy--use R package (R Core Team 2020) implements ML pipeline created Topçuoğlu et al. (Topçuoğlu et al. 2020) single function returns trained model, model performance metrics feature importance. mikropml leverages caret package support several ML algorithms: linear regression, logistic regression, support vector machines radial basis kernel, decision trees, random forest, gradient boosted trees. incorporates good practices ML training, testing, model evaluation (Topçuoğlu et al. 2020; Teschendorff 2019). Furthermore, provides data preprocessing steps based FIDDLE (FlexIble Data-Driven pipeLinE) framework outlined Tang et al. (Tang et al. 2020) post-training permutation importance steps estimate importance feature models trained (Breiman 2001; Fisher, Rudin, Dominici 2018). mikropml can used starting point application ML datasets many different fields. already applied microbiome data categorize patients colorectal cancer (Topçuoğlu et al. 2020), identify differences genomic clinical features associated bacterial infections (Lapp et al. 2020), predict gender-based biases academic publishing (Hagan et al. 2020).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"mikropml-package","dir":"Articles","previous_headings":"","what":"mikropml package","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml package includes functionality preprocess data, train ML models, evaluate model performance, quantify feature importance (Figure 1). also provide vignettes example Snakemake workflow (Köster Rahmann 2012) showcase run ideal ML pipeline multiple different train/test data splits. results can visualized using helper functions use ggplot2 (Wickham 2016). mikropml allows users get started quickly facilitates reproducibility, replacement understanding ML workflow still necessary interpreting results (Pollard et al. 2019). 
facilitate understanding enable one tailor code application, heavily commented code provided supporting documentation can read online.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"preprocessing-data","dir":"Articles","previous_headings":"mikropml package","what":"Preprocessing data","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"provide function preprocess_data() preprocess features using several different functions caret package. preprocess_data() takes continuous categorical data, re-factors categorical data binary features, provides options normalize continuous data, remove features near-zero variance, keep one instance perfectly correlated features. set default options based implemented FIDDLE (Tang et al. 2020). details use preprocess_data() can found accompanying vignette.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"running-ml","dir":"Articles","previous_headings":"mikropml package","what":"Running ML","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"main function mikropml, run_ml(), minimally takes model choice data frame outcome column feature columns. model choice, mikropml currently supports logistic linear regression (glmnet: Friedman, Hastie, Tibshirani 2010), support vector machines radial basis kernel (kernlab: Karatzoglou et al. 2004), decision trees (rpart: Therneau et al. 2019), random forest (randomForest: Liaw Wiener 2002), gradient-boosted trees (xgboost: Chen et al. 2020). run_ml() randomly splits data train test sets maintaining distribution outcomes found full dataset. also provides option split data train test sets based categorical variables (e.g. batch, geographic location, etc.). mikropml uses caret package (Kuhn 2008) train evaluate models, optionally quantifies feature importance. output includes best model built based tuning hyperparameters internal repeated cross-validation step, model evaluation metrics, optional feature importances. Feature importances calculated using permutation test, breaks relationship feature true outcome test data, measures change model performance. provides intuitive metric individual features influence model performance comparable across model types, particularly useful model interpretation (Topçuoğlu et al. 2020). introductory vignette contains comprehensive tutorial use run_ml(). mikropml pipeline","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"ideal-workflow-for-running-mikropml-with-many-different-traintest-splits","dir":"Articles","previous_headings":"mikropml package","what":"Ideal workflow for running mikropml with many different train/test splits","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"investigate variation model performance depending train test set used (Topçuoğlu et al. 2020; Lapp et al. 2020), provide examples run_ml() many times different train/test splits get summary information model performance local computer high-performance computing cluster using Snakemake workflow.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"tuning-visualization","dir":"Articles","previous_headings":"mikropml package","what":"Tuning & visualization","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"One particularly important aspect ML hyperparameter tuning. 
provide reasonable range default hyperparameters model type. However practitioners explore whether range appropriate data, customize hyperparameter range. Therefore, provide function plot_hp_performance() plot cross-validation performance metric single model models built using different train/test splits. helps evaluate hyperparameter range searched exhaustively allows user pick ideal set. also provide summary plots test performance metrics many train/test splits different models using plot_model_performance(). Examples described accompanying vignette hyperparameter tuning.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"dependencies","dir":"Articles","previous_headings":"mikropml package","what":"Dependencies","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml written R (R Core Team 2020) depends several packages: dplyr (Wickham et al. 2020), rlang (Henry, Wickham, RStudio 2020) caret (Kuhn 2008). ML algorithms supported mikropml require: glmnet (Friedman, Hastie, Tibshirani 2010), e1071 (Meyer et al. 2020), MLmetrics (Yan 2016) logistic regression, rpart2 (Therneau et al. 2019) decision trees, randomForest (Liaw Wiener 2002) random forest, xgboost (Chen et al. 2020) xgboost, kernlab (Karatzoglou et al. 2004) support vector machines. also allow parallelization cross-validation steps using foreach, doFuture, future.apply, future packages (Bengtsson Team 2020). Finally, use ggplot2 plotting (Wickham 2016).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"acknowledgments","dir":"Articles","previous_headings":"","what":"Acknowledgments","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"thank members Schloss Lab participated code clubs related initial development pipeline, made documentation improvements, provided general feedback. also thank Nick Lesniak designing mikropml logo. thank US Research Software Sustainability Institute (NSF #1743188) providing training KLS Winter School Research Software Engineering.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"funding","dir":"Articles","previous_headings":"","what":"Funding","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Salary support PDS came NIH grant 1R01CA215574. KLS received support NIH Training Program Bioinformatics (T32 GM070449). ZL received support National Science Foundation Graduate Research Fellowship Program Grant . DGE 1256260. opinions, findings, conclusions recommendations expressed material authors necessarily reflect views National Science Foundation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"author-contributions","dir":"Articles","previous_headings":"","what":"Author contributions","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"BDT, ZL, KLS contributed equally. Author order among co-first authors determined time since joining project. BDT, ZL, KLS conceptualized study wrote code. KLS structured code R package form. BDT, ZL, JW, PDS developed methodology. PDS, ES, JW supervised project. BDT, ZL, KLS wrote original draft. 
authors reviewed edited manuscript.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"conflicts-of-interest","dir":"Articles","previous_headings":"","what":"Conflicts of interest","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"None.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"references","dir":"Articles","previous_headings":"","what":"References","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Bengtsson, Henrik, R Core Team. 2020. “Future.Apply: Apply Function Elements Parallel Using Futures,” July. Breiman, Leo. 2001. “Random Forests.” Machine Learning 45 (1): 5–32. https://doi.org/10.1023/:1010933404324. Chen, Tianqi, Tong , Michael Benesty, Vadim Khotilovich, Yuan Tang, Hyunsu Cho, Kailong Chen, et al. 2020. “Xgboost: Extreme Gradient Boosting,” June. Fisher, Aaron, Cynthia Rudin, Francesca Dominici. 2018. “Models Wrong, Many Useful: Learning Variable’s Importance Studying Entire Class Prediction Models Simultaneously.” Friedman, Jerome H., Trevor Hastie, Rob Tibshirani. 2010. “Regularization Paths Generalized Linear Models via Coordinate Descent.” Journal Statistical Software 33 (1): 1–22. https://doi.org/10.18637/jss.v033.i01. H2O.ai. 2020. H2O: Scalable Machine Learning Platform. Manual. Hagan, Ada K., Begüm D. Topçuoğlu, Mia E. Gregory, Hazel . Barton, Patrick D. Schloss. 2020. “Women Underrepresented Receive Differential Outcomes ASM Journals: Six-Year Retrospective Analysis.” mBio 11 (6). https://doi.org/10.1128/mBio.01680-20. Henry, Lionel, Hadley Wickham, RStudio. 2020. “Rlang: Functions Base Types Core R ’Tidyverse’ Features,” July. Karatzoglou, Alexandros, Alexandros Smola, Kurt Hornik, Achim Zeileis. 2004. “Kernlab - S4 Package Kernel Methods R.” Journal Statistical Software 11 (1): 1–20. https://doi.org/10.18637/jss.v011.i09. Köster, Johannes, Sven Rahmann. 2012. “Snakemakea Scalable Bioinformatics Workflow Engine.” Bioinformatics 28 (19): 2520–2. https://doi.org/10.1093/bioinformatics/bts480. Kuhn, Max. 2008. “Building Predictive Models R Using Caret Package.” Journal Statistical Software 28 (1): 1–26. https://doi.org/10.18637/jss.v028.i05. Kuhn, Max, Hadley Wickham, RStudio. 2020. “Tidymodels: Easily Install Load ’Tidymodels’ Packages,” July. Lapp, Zena, Jennifer Han, Jenna Wiens, Ellie JC Goldstein, Ebbing Lautenbach, Evan Snitkin. 2020. “Machine Learning Models Identify Patient Microbial Genetic Factors Associated Carbapenem-Resistant Klebsiella Pneumoniae Infection.” medRxiv, July, 2020.07.06.20147306. https://doi.org/10.1101/2020.07.06.20147306. Liaw, Andy, Matthew Wiener. 2002. “Classification Regression randomForest” 2: 5. Meyer, David, Evgenia Dimitriadou, Kurt Hornik, Andreas Weingessel, Friedrich Leisch, Chih-Chung Chang (libsvm C++-code), Chih-Chen Lin (libsvm C++-code). 2020. “E1071: Misc Functions Department Statistics, Probability Theory Group (Formerly: E1071), TU Wien.” Pedregosa, Fabian, Gaël Varoquaux, Alexandre Gramfort, Vincent Michel, Bertrand Thirion, Olivier Grisel, Mathieu Blondel, et al. 2011. “Scikit-Learn: Machine Learning Python.” Journal Machine Learning Research 12 (85): 2825–30. Pollard, Tom J., Irene Chen, Jenna Wiens, Steven Horng, Danny Wong, Marzyeh Ghassemi, Heather Mattie, Emily Lindemer, Trishan Panch. 2019. “Turning Crank Machine Learning: Ease, Expense?” Lancet Digital Health 1 (5): e198–e199. https://doi.org/10.1016/S2589-7500(19)30112-8. R Core Team. 2020. 
“R: Language Environment Statistical Computing.” Tang, Shengpu, Parmida Davarmanesh, Yanmeng Song, Danai Koutra, Michael W. Sjoding, Jenna Wiens. 2020. “Democratizing EHR Analyses FIDDLE: Flexible Data-Driven Preprocessing Pipeline Structured Clinical Data.” J Med Inform Assoc, October. https://doi.org/10.1093/jamia/ocaa139. Teschendorff, Andrew E. 2019. “Avoiding Common Pitfalls Machine Learning Omic Data Science.” Nature Materials 18 (5): 422–27. https://doi.org/10.1038/s41563-018-0241-z. Therneau, Terry, Beth Atkinson, Brian Ripley (producer initial R. port, maintainer 1999-2017). 2019. “Rpart: Recursive Partitioning Regression Trees,” April. Topçuoğlu, Begüm D., Nicholas . Lesniak, Mack T. Ruffin, Jenna Wiens, Patrick D. Schloss. 2020. “Framework Effective Application Machine Learning Microbiome-Based Classification Problems.” mBio 11 (3). https://doi.org/10.1128/mBio.00434-20. Wickham, Hadley. 2016. Ggplot2: Elegant Graphics Data Analysis. Use R! Cham: Springer International Publishing. https://doi.org/10.1007/978-3-319-24277-4. Wickham, Hadley, Romain François, Lionel Henry, Kirill Müller, RStudio. 2020. “Dplyr: Grammar Data Manipulation,” August. Wiens, Jenna, Suchi Saria, Mark Sendak, Marzyeh Ghassemi, Vincent X. Liu, Finale Doshi-Velez, Kenneth Jung, et al. 2019. “Harm: Roadmap Responsible Machine Learning Health Care.” Nat. Med. 25 (9): 1337–40. https://doi.org/10.1038/s41591-019-0548-6. Yan, Yachen. 2016. “MLmetrics: Machine Learning Evaluation Metrics.”","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"speed-up-single-runs","dir":"Articles","previous_headings":"","what":"Speed up single runs","title":"Parallel processing","text":"default, preprocess_data(), run_ml(), compare_models() use one process series. ’d like parallelize various steps pipeline make run faster, install foreach, future, future.apply, doFuture. , register future plan prior calling functions: , used multicore plan split work across 2 cores. See future documentation picking best plan use case. Notably, multicore work inside RStudio Windows; need use multisession instead cases. registering future plan, can call preprocess_data() run_ml() usual, run certain tasks parallel.","code":"doFuture::registerDoFuture() future::plan(future::multicore, workers = 2) otu_data_preproc <- preprocess_data(otu_mini_bin, 'dx')$dat_transformed #> Using 'dx' as the outcome column. result1 <- run_ml(otu_data_preproc, 'glmnet') #> Using 'dx' as the outcome column. #> Training the model... #> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"call-run_ml-multiple-times-in-parallel-in-r","dir":"Articles","previous_headings":"","what":"Call run_ml() multiple times in parallel in R","title":"Parallel processing","text":"can use functions future.apply package call run_ml() multiple times parallel different parameters. first need run future::plan() haven’t already. , call run_ml() multiple seeds using future_lapply(): call run_ml() different seed uses different random split data training testing sets. Since using seeds, must set future.seed TRUE (see future.apply documentation blog post details parallel-safe random seeds). example uses seeds speed simplicity, real data recommend using many seeds get better estimate model performance. 
examples, used functions future.apply package run_ml() parallel, can accomplish thing parallel versions purrr::map() functions using furrr package (e.g. furrr::future_map_dfr()). Extract performance results combine one dataframe seeds:","code":"# NOTE: use more seeds for real-world data results_multi <- future.apply::future_lapply(seq(100, 102), function(seed) { run_ml(otu_data_preproc, 'glmnet', seed = seed) }, future.seed = TRUE) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. perf_df <- future.apply::future_lapply(results_multi, function(result) { result[['performance']] %>% select(cv_metric_AUC, AUC, method) }, future.seed = TRUE) %>% dplyr::bind_rows() perf_df #> # A tibble: 3 × 3 #> cv_metric_AUC AUC method #> #> 1 0.630 0.634 glmnet #> 2 0.591 0.608 glmnet #> 3 0.671 0.471 glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"multiple-ml-methods","dir":"Articles","previous_headings":"Call run_ml() multiple times in parallel in R","what":"Multiple ML methods","title":"Parallel processing","text":"may also wish compare performance different ML methods. mapply() can iterate multiple lists vectors, future_mapply() works way: Extract combine performance results seeds methods: Visualize performance results (ggplot2 required): plot_model_performance() returns ggplot2 object. can add layers customize plot: can also create plots however like using performance results.","code":"# NOTE: use more seeds for real-world data param_grid <- expand.grid(seeds = seq(100, 102), methods = c('glmnet', 'rf')) results_mtx <- future.apply::future_mapply( function(seed, method) { run_ml(otu_data_preproc, method, seed = seed) }, param_grid$seeds, param_grid$methods %>% as.character(), future.seed = TRUE ) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. perf_df2 <- lapply(results_mtx['performance',], function(x) { x %>% select(cv_metric_AUC, AUC, method) }) %>% dplyr::bind_rows() perf_df2 #> # A tibble: 6 × 3 #> cv_metric_AUC AUC method #> #> 1 0.630 0.634 glmnet #> 2 0.591 0.608 glmnet #> 3 0.671 0.471 glmnet #> 4 0.665 0.708 rf #> 5 0.651 0.697 rf #> 6 0.701 0.592 rf perf_boxplot <- plot_model_performance(perf_df2) perf_boxplot perf_boxplot + theme_classic() + scale_color_brewer(palette = \"Dark2\") + coord_flip()"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"live-progress-updates","dir":"Articles","previous_headings":"","what":"Live progress updates","title":"Parallel processing","text":"preprocess_data() get_feature_importance() support reporting live progress updates using progressr package. format , recommend using progress bar like : Note future backends support “near-live” progress updates, meaning progress may reported immediately parallel processing futures. Read progressr vignette. 
progressr customize format progress updates, see progressr docs.","code":"# optionally, specify the progress bar format with the `progress` package. progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0)) # tell progressr to always report progress in any functions that use it. # set this to FALSE to turn it back off again. progressr::handlers(global = TRUE) # run your code and watch the live progress updates. dat <- preprocess_data(otu_mini_bin, 'dx')$dat_transformed #> Using 'dx' as the outcome column. #> preprocessing ========================>------- 78% | elapsed: 1s | eta: 0s results <- run_ml(dat, \"glmnet\", kfold = 2, cv_times = 2, find_feature_importance = TRUE) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Feature importance =========================== 100% | elapsed: 37s | eta: 0s"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"parallelizing-with-snakemake","dir":"Articles","previous_headings":"","what":"Parallelizing with Snakemake","title":"Parallel processing","text":"parallelizing multiple calls run_ml() R examples , results objects held memory. isn’t big deal small dataset run seeds. However, large datasets run parallel , say, 100 seeds (recommended), may run problems trying store objects memory . One solution write results files run_ml() call, concatenate end. show one way accomplish Snakemake example Snakemake workflow .","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"its-running-so-slow","dir":"Articles","previous_headings":"","what":"It’s running so slow!","title":"Preprocessing data","text":"Since assume lot won’t read entire vignette, ’m going say beginning. preprocess_data() function running super slow, consider parallelizing goes faster! preprocess_data() also can report live progress updates. See vignette(\"parallel\") details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"examples","dir":"Articles","previous_headings":"","what":"Examples","title":"Preprocessing data","text":"’re going start simple get complicated, want whole shebang , just scroll bottom. First, load mikropml:","code":"library(mikropml)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"binary-data","dir":"Articles","previous_headings":"Examples","what":"Binary data","title":"Preprocessing data","text":"Let’s start binary variables: addition dataframe , provide name outcome column preprocess_data(). ’s preprocessed data looks like: output list: dat_transformed transformed data, grp_feats list grouped features, removed_feats list features removed. , grp_feats NULL perfectly correlated features (e.g. c(0,1,0) c(0,1,0), c(0,1,0) c(1,0,1) - see details). first column (var1) dat_transformed character changed var1_yes zeros () ones (yes). values second column (var2) stay ’s already binary, name changes var2_1. third column (var3) factor also changed binary b 1 0, denoted new column name var3_b.","code":"# raw binary dataset bin_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 1), var3 = factor(c(\"a\",\"a\",\"b\")) ) bin_df #> outcome var1 var2 var3 #> 1 normal no 0 a #> 2 normal yes 1 a #> 3 cancer no 1 b # preprocess raw binary data preprocess_data(dataset = bin_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. 
#> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_yes var2_1 var3_b #> #> 1 normal 0 0 0 #> 2 normal 1 1 0 #> 3 cancer 0 1 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"categorical-data","dir":"Articles","previous_headings":"Examples","what":"Categorical data","title":"Preprocessing data","text":"non-binary categorical data: can see, variable split 3 different columns - one type (, b, c). , grp_feats NULL.","code":"# raw categorical dataset cat_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c('a','b','c') ) cat_df #> outcome var1 #> 1 normal a #> 2 normal b #> 3 cancer c # preprocess raw categorical data preprocess_data(dataset = cat_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_a var1_b var1_c #> #> 1 normal 1 0 0 #> 2 normal 0 1 0 #> 3 cancer 0 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"continuous-data","dir":"Articles","previous_headings":"Examples","what":"Continuous data","title":"Preprocessing data","text":"Now, looking continuous variables: Wow! numbers change? default normalize data using \"center\" \"scale\". often best practice, may want normalize data, may want normalize data different way. don’t want normalize data, can use method=NULL: can also normalize data different ways. can choose method supported method argument caret::preProcess() (see caret::preProcess() docs details). Note methods applied continuous variables. Another feature preprocess_data() provide continuous variables characters, converted numeric: don’t want happen, want character data remain character data even can converted numeric, can use to_numeric=FALSE kept categorical: can see output, case features treated groups rather numbers (e.g. normalized).","code":"# raw continuous dataset cont_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(1,2,3) ) cont_df #> outcome var1 #> 1 normal 1 #> 2 normal 2 #> 3 cancer 3 # preprocess raw continuous data preprocess_data(dataset = cont_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome var1 #> #> 1 normal -1 #> 2 normal 0 #> 3 cancer 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0) # preprocess raw continuous data, no normalization preprocess_data(dataset = cont_df, outcome_colname = \"outcome\", method = NULL) # raw continuous dataset as characters cont_char_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"1\",\"2\",\"3\") ) cont_char_df #> outcome var1 #> 1 normal 1 #> 2 normal 2 #> 3 cancer 3 # preprocess raw continuous character data as numeric preprocess_data(dataset = cont_char_df, outcome_colname = \"outcome\") # preprocess raw continuous character data as characters preprocess_data(dataset = cont_char_df, outcome_colname = \"outcome\", to_numeric = FALSE) #> Using 'outcome' as the outcome column. 
#> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_1 var1_2 var1_3 #> #> 1 normal 1 0 0 #> 2 normal 0 1 0 #> 3 cancer 0 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"collapse-perfectly-correlated-features","dir":"Articles","previous_headings":"Examples","what":"Collapse perfectly correlated features","title":"Preprocessing data","text":"default, preprocess_data() collapses features perfectly positively negatively correlated. multiple copies features add information machine learning, makes run_ml faster. can see, end one variable, 3 grouped together. Also, second element list longer NULL. Instead, tells grp1 contains var1, var2, var3. want group positively correlated features, negatively correlated features (e.g. interpretability, another downstream application), can using group_neg_corr=FALSE: , var3 kept ’s ’s negatively correlated var1 var2. can also choose keep features separate, even perfectly correlated, using collapse_corr_feats=FALSE: case, grp_feats always NULL.","code":"# raw correlated dataset corr_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 0), var3 = c(1,0,1) ) corr_df #> outcome var1 var2 var3 #> 1 normal no 0 1 #> 2 normal yes 1 0 #> 3 cancer no 0 1 # preprocess raw correlated dataset preprocess_data(dataset = corr_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome grp1 #> #> 1 normal 0 #> 2 normal 1 #> 3 cancer 0 #> #> $grp_feats #> $grp_feats$grp1 #> [1] \"var1_yes\" \"var3_1\" #> #> #> $removed_feats #> [1] \"var2\" # preprocess raw correlated dataset; don't group negatively correlated features preprocess_data(dataset = corr_df, outcome_colname = \"outcome\", group_neg_corr = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var3_1 #> #> 1 normal 0 1 #> 2 normal 1 0 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\" # preprocess raw correlated dataset; don't group negatively correlated features preprocess_data(dataset = corr_df, outcome_colname = \"outcome\", collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var3_1 #> #> 1 normal 0 1 #> 2 normal 1 0 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"data-with-near-zero-variance","dir":"Articles","previous_headings":"Examples","what":"Data with near-zero variance","title":"Preprocessing data","text":"variables zero, “”? ones won’t contribute information, remove : , var3, var4, var5 variability, variables removed preprocessing: can read caret::preProcess() documentation information. default, remove features “near-zero variance” (remove_var='nzv'). uses default arguments caret::nearZeroVar(). However, particularly smaller datasets, might want remove features near-zero variance. want remove features zero variance, can use remove_var='zv': want include features, can use argument remove_zv=NULL. work, collapse correlated features (otherwise errors underlying caret function use). want nuanced remove near-zero variance features (e.g. 
change default 10% cutoff percentage distinct values total number samples), can use caret::preProcess() function running preprocess_data remove_var=NULL (see caret::nearZeroVar() function information).","code":"# raw dataset with non-variable features nonvar_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 1), var3 = c(\"no\",\"no\",\"no\"), var4 = c(0,0,0), var5 = c(12,12,12) ) nonvar_df #> outcome var1 var2 var3 var4 var5 #> 1 normal no 0 no 0 12 #> 2 normal yes 1 no 0 12 #> 3 cancer no 1 no 0 12 # remove features with near-zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\" \"var3\" \"var5\" # remove features with zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\", remove_var = 'zv') #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\" \"var3\" \"var5\" # don't remove features with near-zero or zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\", remove_var = NULL, collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 5 #> outcome var1_yes var2_1 var3 var5 #> #> 1 normal 0 0 0 12 #> 2 normal 1 1 0 12 #> 3 cancer 0 1 0 12 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"missing-data","dir":"Articles","previous_headings":"Examples","what":"Missing data","title":"Preprocessing data","text":"preprocess_data() also deals missing data. : Removes missing outcome variables. Maintains zero variability feature already variability (.e. feature removed removing features near-zero variance). Replaces missing binary categorical variables zero (splitting multiple columns). Replaces missing continuous data median value feature. ’d like deal missing data different way, please prior inputting data preprocess_data().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"remove-missing-outcome-variables","dir":"Articles","previous_headings":"Examples > Missing data","what":"Remove missing outcome variables","title":"Preprocessing data","text":"","code":"# raw dataset with missing outcome value miss_oc_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\",NA), var1 = c(\"no\", \"yes\", \"no\",\"no\"), var2 = c(0, 1, 1,1) ) miss_oc_df #> outcome var1 var2 #> 1 normal no 0 #> 2 normal yes 1 #> 3 cancer no 1 #> 4 no 1 # preprocess raw dataset with missing outcome value preprocess_data(dataset = miss_oc_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> Removed 1/4 (25%) of samples because of missing outcome value (NA). 
#> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"maintain-zero-variability-in-a-feature-if-it-already-has-no-variability","dir":"Articles","previous_headings":"Examples > Missing data","what":"Maintain zero variability in a feature if it already has no variability","title":"Preprocessing data","text":", non-variable feature missing data removed removed features near-zero variance. maintained feature, ’d ones:","code":"# raw dataset with missing value in non-variable feature miss_nonvar_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(NA, 1, 1) ) miss_nonvar_df #> outcome var1 var2 #> 1 normal no NA #> 2 normal yes 1 #> 3 cancer no 1 # preprocess raw dataset with missing value in non-variable feature preprocess_data(dataset = miss_nonvar_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome var1_yes #> #> 1 normal 0 #> 2 normal 1 #> 3 cancer 0 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\" # preprocess raw dataset with missing value in non-variable feature preprocess_data(dataset = miss_nonvar_df, outcome_colname = \"outcome\", remove_var = NULL, collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2 #> #> 1 normal 0 1 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"replace-missing-binary-and-categorical-variables-with-zero","dir":"Articles","previous_headings":"Examples > Missing data","what":"Replace missing binary and categorical variables with zero","title":"Preprocessing data","text":"binary variable split two, missing value considered zero .","code":"# raw dataset with missing value in categorical feature miss_cat_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", NA), var2 = c(NA, 1, 0) ) miss_cat_df #> outcome var1 var2 #> 1 normal no NA #> 2 normal yes 1 #> 3 cancer 0 # preprocess raw dataset with missing value in non-variable feature preprocess_data(dataset = miss_cat_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> 2 categorical missing value(s) (NA) were replaced with 0. Note that the matrix is not full rank so missing values may be duplicated in separate columns. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_no var1_yes #> #> 1 normal 1 0 #> 2 normal 0 1 #> 3 cancer 0 0 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"replace-missing-continuous-data-with-the-median-value-of-that-feature","dir":"Articles","previous_headings":"Examples > Missing data","what":"Replace missing continuous data with the median value of that feature","title":"Preprocessing data","text":"’re normalizing continuous features ’s easier see ’s going (.e. 
median value used):","code":"# raw dataset with missing value in continuous feature miss_cont_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\",\"normal\"), var1 = c(1,2,2,NA), var2 = c(1,2,3,NA) ) miss_cont_df #> outcome var1 var2 #> 1 normal 1 1 #> 2 normal 2 2 #> 3 cancer 2 3 #> 4 normal NA NA # preprocess raw dataset with missing value in continuous feature preprocess_data(dataset = miss_cont_df, outcome_colname = \"outcome\", method = NULL) #> Using 'outcome' as the outcome column. #> 2 missing continuous value(s) were imputed using the median value of the feature. #> $dat_transformed #> # A tibble: 4 × 3 #> outcome var1 var2 #> #> 1 normal 1 1 #> 2 normal 2 2 #> 3 cancer 2 3 #> 4 normal 2 2 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"putting-it-all-together","dir":"Articles","previous_headings":"Examples","what":"Putting it all together","title":"Preprocessing data","text":"’s complicated example raw data puts everything discussed together: Let’s throw preprocessing function default values: can see, got several messages: One samples (row 4) removed outcome value missing. One variables feature variation missing value replaced non-varying value (var11). Four categorical missing values replaced zero (var9). 4 missing rather just 1 (like raw data) split categorical variable 4 different columns first. One missing continuous value imputed using median value feature (var8). Additionally, can see continuous variables normalized, categorical variables changed binary, several features grouped together. variables group can found grp_feats.","code":"test_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\", NA), var1 = 1:4, var2 = c(\"a\", \"b\", \"c\", \"d\"), var3 = c(\"no\", \"yes\", \"no\", \"no\"), var4 = c(0, 1, 0, 0), var5 = c(0, 0, 0, 0), var6 = c(\"no\", \"no\", \"no\", \"no\"), var7 = c(1, 1, 0, 0), var8 = c(5, 6, NA, 7), var9 = c(NA, \"x\", \"y\", \"z\"), var10 = c(1, 0, NA, NA), var11 = c(1, 1, NA, NA), var12 = c(\"1\", \"2\", \"3\", \"4\") ) test_df #> outcome var1 var2 var3 var4 var5 var6 var7 var8 var9 var10 var11 var12 #> 1 normal 1 a no 0 0 no 1 5 1 1 1 #> 2 normal 2 b yes 1 0 no 1 6 x 0 1 2 #> 3 cancer 3 c no 0 0 no 0 NA y NA NA 3 #> 4 4 d no 0 0 no 0 7 z NA NA 4 preprocess_data(dataset = test_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> Removed 1/4 (25%) of samples because of missing outcome value (NA). #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. #> 2 categorical missing value(s) (NA) were replaced with 0. Note that the matrix is not full rank so missing values may be duplicated in separate columns. #> 1 missing continuous value(s) were imputed using the median value of the feature. 
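# [added note, not part of the package output] The imputed values come from the
# feature medians over the observed samples: for var1, median(c(1, 2, 2)) is 2,
# and for var2, median(c(1, 2, 3)) is 2, which is why row 4 of dat_transformed
# below shows 2 for both features.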
#> $dat_transformed #> # A tibble: 3 × 6 #> outcome grp1 var2_a grp2 grp3 var8 #> #> 1 normal -1 1 0 0 -0.707 #> 2 normal 0 0 1 0 0.707 #> 3 cancer 1 0 0 1 0 #> #> $grp_feats #> $grp_feats$grp1 #> [1] \"var1\" \"var12\" #> #> $grp_feats$var2_a #> [1] \"var2_a\" #> #> $grp_feats$grp2 #> [1] \"var2_b\" \"var3_yes\" \"var9_x\" #> #> $grp_feats$grp3 #> [1] \"var2_c\" \"var7_1\" \"var9_y\" #> #> $grp_feats$var8 #> [1] \"var8\" #> #> #> $removed_feats #> [1] \"var4\" \"var5\" \"var10\" \"var6\" \"var11\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"next-step-train-and-evaluate-your-model","dir":"Articles","previous_headings":"Examples","what":"Next step: train and evaluate your model!","title":"Preprocessing data","text":"preprocess data (either using preprocess_data() preprocessing data ), ’re ready train evaluate machine learning models! Please see run_ml() information training models. Tang, Shengpu, Parmida Davarmanesh, Yanmeng Song, Danai Koutra, Michael W. Sjoding, Jenna Wiens. 2020. “Democratizing EHR Analyses FIDDLE: Flexible Data-Driven Preprocessing Pipeline Structured Clinical Data.” J Med Inform Assoc, October. https://doi.org/10.1093/jamia/ocaa139.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"the-simplest-way-to-run_ml","dir":"Articles","previous_headings":"","what":"The simplest way to run_ml()","title":"Hyperparameter tuning","text":"mentioned , minimal input dataset (dataset) machine learning model want use (method). run_ml(), default 100 times repeated, 5-fold cross-validation, evaluate hyperparameters 500 total iterations. Say want run L2 regularized logistic regression. : ’ll probably get warning run dataset small. want learn , check introductory vignette training evaluating ML model: vignette(\"introduction\"). default, run_ml() selects hyperparameters depending dataset method used. can see, alpha hyperparameter set 0, specifies L2 regularization. glmnet gives us option run L1 L2 regularization. change alpha 1, run L1-regularized logistic regression. can also tune alpha specifying variety values 0 1. use value 0 1, running elastic net. default hyperparameter lambda adjusts L2 regularization penalty range values 10^-4 10. look 100 repeated cross-validation performance metrics AUC, Accuracy, prAUC tested lambda value, see appropriate dataset better others.","code":"results <- run_ml(dat, 'glmnet', outcome_colname = 'dx', cv_times = 100, seed = 2019) #> Using 'dx' as the outcome column. #> Training the model... #> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Training complete. results$trained_model #> glmnet #> #> 161 samples #> 10 predictor #> 2 classes: 'cancer', 'normal' #> #> No pre-processing #> Resampling: Cross-Validated (5 fold, repeated 100 times) #> Summary of sample sizes: 128, 129, 129, 129, 129, 130, ... 
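# [added note, not part of the model print] The "Summary of sample sizes" values
# (128-130) are the training-fold sizes: with 161 samples and 5-fold
# cross-validation, each resample trains on roughly 4/5 of the data,
# i.e. about 129 samples.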
#> Resampling results across tuning parameters: #> #> lambda logLoss AUC prAUC Accuracy Kappa F1 #> 1e-04 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 1e-03 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 1e-02 0.7112738 0.6123883 0.5726478 0.5854514 0.17092470 0.5731635 #> 1e-01 0.6819806 0.6210744 0.5793961 0.5918756 0.18369829 0.5779616 #> 1e+00 0.6803749 0.6278273 0.5827655 0.5896356 0.17756961 0.5408139 #> 1e+01 0.6909820 0.6271894 0.5814202 0.5218000 0.02920942 0.1875293 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision #> 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 #> 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 #> 0.5789667 0.5921250 0.5797769 0.5977182 0.5797769 #> 0.5805917 0.6032353 0.5880165 0.6026963 0.5880165 #> 0.5057833 0.6715588 0.6005149 0.5887829 0.6005149 #> 0.0607250 0.9678676 0.7265246 0.5171323 0.7265246 #> Recall Detection_Rate Balanced_Accuracy #> 0.5789667 0.2839655 0.5854870 #> 0.5789667 0.2839655 0.5854870 #> 0.5789667 0.2839636 0.5855458 #> 0.5805917 0.2847195 0.5919135 #> 0.5057833 0.2478291 0.5886711 #> 0.0607250 0.0292613 0.5142963 #> #> Tuning parameter 'alpha' was held constant at a value of 0 #> AUC was used to select the optimal model using the largest value. #> The final values used for the model were alpha = 0 and lambda = 1. results$trained_model$results #> alpha lambda logLoss AUC prAUC Accuracy Kappa F1 #> 1 0 1e-04 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 2 0 1e-03 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 3 0 1e-02 0.7112738 0.6123883 0.5726478 0.5854514 0.17092470 0.5731635 #> 4 0 1e-01 0.6819806 0.6210744 0.5793961 0.5918756 0.18369829 0.5779616 #> 5 0 1e+00 0.6803749 0.6278273 0.5827655 0.5896356 0.17756961 0.5408139 #> 6 0 1e+01 0.6909820 0.6271894 0.5814202 0.5218000 0.02920942 0.1875293 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision Recall #> 1 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 0.5789667 #> 2 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 0.5789667 #> 3 0.5789667 0.5921250 0.5797769 0.5977182 0.5797769 0.5789667 #> 4 0.5805917 0.6032353 0.5880165 0.6026963 0.5880165 0.5805917 #> 5 0.5057833 0.6715588 0.6005149 0.5887829 0.6005149 0.5057833 #> 6 0.0607250 0.9678676 0.7265246 0.5171323 0.7265246 0.0607250 #> Detection_Rate Balanced_Accuracy logLossSD AUCSD prAUCSD AccuracySD #> 1 0.2839655 0.5854870 0.085315967 0.09115229 0.07296554 0.07628572 #> 2 0.2839655 0.5854870 0.085315967 0.09115229 0.07296554 0.07628572 #> 3 0.2839636 0.5855458 0.085276565 0.09122242 0.07301412 0.07637123 #> 4 0.2847195 0.5919135 0.048120032 0.09025695 0.07329214 0.07747312 #> 5 0.2478291 0.5886711 0.012189172 0.09111917 0.07505095 0.07771171 #> 6 0.0292613 0.5142963 0.001610008 0.09266875 0.07640896 0.03421597 #> KappaSD F1SD SensitivitySD SpecificitySD Pos_Pred_ValueSD #> 1 0.15265728 0.09353786 0.13091452 0.11988406 0.08316345 #> 2 0.15265728 0.09353786 0.13091452 0.11988406 0.08316345 #> 3 0.15281903 0.09350099 0.13073501 0.12002481 0.08329024 #> 4 0.15485134 0.09308733 0.12870031 0.12037225 0.08554483 #> 5 0.15563046 0.10525917 0.13381009 0.11639614 0.09957685 #> 6 0.06527242 0.09664720 0.08010494 0.06371495 0.31899811 #> Neg_Pred_ValueSD PrecisionSD RecallSD Detection_RateSD Balanced_AccuracySD #> 1 0.08384956 0.08316345 0.13091452 0.06394409 0.07640308 #> 2 0.08384956 0.08316345 0.13091452 0.06394409 0.07640308 #> 3 0.08385838 0.08329024 0.13073501 0.06384692 0.07648207 #> 4 0.08427362 0.08554483 0.12870031 
0.06272897 0.07748791 #> 5 0.07597766 0.09957685 0.13381009 0.06453637 0.07773039 #> 6 0.02292294 0.31899811 0.08010494 0.03803159 0.03184136"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"customizing-hyperparameters","dir":"Articles","previous_headings":"","what":"Customizing hyperparameters","title":"Hyperparameter tuning","text":"example, want change lambda values provide better range test cross-validation step. don’t want use defaults provide named list new values. example: Now let’s run L2 logistic regression new lambda values: time, cover larger different range lambda settings cross-validation. know lambda value best one? answer , need run ML pipeline multiple data splits look mean cross-validation performance lambda across modeling experiments. describe run pipeline multiple data splits vignette(\"parallel\"). train model new lambda range defined . run 3 times different seed, result different splits data training testing sets. can use plot_hp_performance see lambda gives us largest mean AUC value across modeling experiments. can see, get mean maxima 0.03 best lambda value dataset run 3 data splits. fact seeing maxima middle range edges, shows providing large enough range exhaust lambda search build model. recommend user use plot make sure best hyperparameter edges provided list. better understanding global maxima, better run data splits using seeds. picked 3 seeds keep runtime vignette, real-world data recommend using many seeds.","code":"new_hp <- list(alpha = 1, lambda = c(0.00001, 0.0001, 0.001, 0.01, 0.015, 0.02, 0.025, 0.03, 0.04, 0.05, 0.06, 0.1)) new_hp #> $alpha #> [1] 1 #> #> $lambda #> [1] 0.00001 0.00010 0.00100 0.01000 0.01500 0.02000 0.02500 0.03000 0.04000 #> [10] 0.05000 0.06000 0.10000 results <- run_ml(dat, 'glmnet', outcome_colname = 'dx', cv_times = 100, hyperparameters = new_hp, seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. results$trained_model #> glmnet #> #> 161 samples #> 10 predictor #> 2 classes: 'cancer', 'normal' #> #> No pre-processing #> Resampling: Cross-Validated (5 fold, repeated 100 times) #> Summary of sample sizes: 128, 129, 129, 129, 129, 130, ... 
#> Resampling results across tuning parameters: #> #> lambda logLoss AUC prAUC Accuracy Kappa F1 #> 0.00001 0.7215038 0.6112253 0.5720005 0.5842184 0.1684871 0.5726974 #> 0.00010 0.7215038 0.6112253 0.5720005 0.5842184 0.1684871 0.5726974 #> 0.00100 0.7209099 0.6112771 0.5719601 0.5845329 0.1691285 0.5730414 #> 0.01000 0.6984432 0.6156112 0.5758977 0.5830960 0.1665062 0.5759265 #> 0.01500 0.6913332 0.6169396 0.5770496 0.5839720 0.1683912 0.5786347 #> 0.02000 0.6870103 0.6177313 0.5779563 0.5833645 0.1673234 0.5796891 #> 0.02500 0.6846387 0.6169757 0.5769305 0.5831907 0.1669901 0.5792840 #> 0.03000 0.6834369 0.6154763 0.5754118 0.5821394 0.1649081 0.5786336 #> 0.04000 0.6833322 0.6124776 0.5724802 0.5786224 0.1578750 0.5735757 #> 0.05000 0.6850454 0.6069059 0.5668928 0.5732197 0.1468699 0.5624480 #> 0.06000 0.6880861 0.5974311 0.5596714 0.5620224 0.1240112 0.5375824 #> 0.10000 0.6944846 0.5123565 0.3034983 0.5120114 0.0110144 0.3852423 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision #> 0.5798500 0.5888162 0.5780748 0.5971698 0.5780748 #> 0.5798500 0.5888162 0.5780748 0.5971698 0.5780748 #> 0.5801167 0.5891912 0.5784544 0.5974307 0.5784544 #> 0.5883667 0.5783456 0.5755460 0.5977390 0.5755460 #> 0.5929750 0.5756471 0.5763123 0.5987220 0.5763123 #> 0.5967167 0.5708824 0.5748385 0.5990649 0.5748385 #> 0.5970250 0.5702721 0.5743474 0.5997928 0.5743474 #> 0.5964500 0.5687721 0.5734044 0.5982451 0.5734044 #> 0.5904500 0.5677353 0.5699817 0.5943308 0.5699817 #> 0.5734833 0.5736176 0.5668523 0.5864448 0.5668523 #> 0.5360333 0.5881250 0.5595918 0.5722851 0.5595918 #> 0.1145917 0.8963456 0.5255752 0.5132665 0.5255752 #> Recall Detection_Rate Balanced_Accuracy #> 0.5798500 0.28441068 0.5843331 #> 0.5798500 0.28441068 0.5843331 #> 0.5801167 0.28453770 0.5846539 #> 0.5883667 0.28860521 0.5833561 #> 0.5929750 0.29084305 0.5843110 #> 0.5967167 0.29264681 0.5837995 #> 0.5970250 0.29278708 0.5836485 #> 0.5964500 0.29248583 0.5826110 #> 0.5904500 0.28951992 0.5790926 #> 0.5734833 0.28119862 0.5735505 #> 0.5360333 0.26270204 0.5620792 #> 0.1145917 0.05585777 0.5054686 #> #> Tuning parameter 'alpha' was held constant at a value of 1 #> AUC was used to select the optimal model using the largest value. #> The final values used for the model were alpha = 1 and lambda = 0.02. results <- lapply(seq(100, 102), function(seed) { run_ml(dat, \"glmnet\", seed = seed, hyperparameters = new_hp) }) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. models <- lapply(results, function(x) x$trained_model) hp_metrics <- combine_hp_performance(models) plot_hp_performance(hp_metrics$dat, lambda, AUC)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"hyperparameter-options","dir":"Articles","previous_headings":"","what":"Hyperparameter options","title":"Hyperparameter tuning","text":"can see default hyperparameters used dataset get_hyperparams_list(). examples built-datasets provide: hyperparameters tuned modeling methods. 
output similar, won’t go details.","code":"get_hyperparams_list(otu_mini_bin, 'glmnet') #> $lambda #> [1] 1e-04 1e-03 1e-02 1e-01 1e+00 1e+01 #> #> $alpha #> [1] 0 get_hyperparams_list(otu_mini_bin, 'rf') #> $mtry #> [1] 2 3 6 get_hyperparams_list(otu_small, 'rf') #> $mtry #> [1] 4 8 16"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"regression","dir":"Articles","previous_headings":"Hyperparameter options","what":"Regression","title":"Hyperparameter tuning","text":"mentioned , glmnet uses alpha parameter lambda hyperparameter. alpha 0 L2 regularization (ridge). alpha 1 L1 regularization (lasso). alpha elastic net. can also tune alpha like hyperparameter. Please refer original glmnet documentation information: https://web.stanford.edu/~hastie/glmnet/glmnet_alpha.html default hyperparameters chosen run_ml() fixed glmnet.","code":"#> $lambda #> [1] 1e-04 1e-03 1e-02 1e-01 1e+00 1e+01 #> #> $alpha #> [1] 0"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"random-forest","dir":"Articles","previous_headings":"Hyperparameter options","what":"Random forest","title":"Hyperparameter tuning","text":"run rf using randomForest package implementation. tuning mtry hyperparameter. number features randomly collected sampled tree node. number needs less number features dataset. Please refer original documentation information: https://cran.r-project.org/web/packages/randomForest/randomForest.pdf default, take square root number features dataset provide range [sqrt_features / 2, sqrt_features, sqrt_features * 2]. example number features 1000: Similar glmnet method, can provide mtry range.","code":"#> $mtry #> [1] 16 32 64"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"decision-tree","dir":"Articles","previous_headings":"Hyperparameter options","what":"Decision tree","title":"Hyperparameter tuning","text":"run rpart2, running rpart package implementation decision tree. tuning maxdepth hyperparameter. maximum depth node final tree. Please refer original documentation information maxdepth: https://cran.r-project.org/web/packages/rpart/rpart.pdf default, provide range less number features dataset. example 1000 features: 10 features:","code":"#> $maxdepth #> [1] 1 2 4 8 16 30 #> $maxdepth #> [1] 1 2 4 8"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"svm-with-radial-basis-kernel","dir":"Articles","previous_headings":"Hyperparameter options","what":"SVM with radial basis kernel","title":"Hyperparameter tuning","text":"run svmRadial method, tuning C sigma hyperparameters. sigma defines far influence single training example reaches C behaves regularization parameter. Please refer great sklearn resource information hyperparameters: https://scikit-learn.org/stable/auto_examples/svm/plot_rbf_parameters.html default, provide 2 separate range values two hyperparameters.","code":"#> $C #> [1] 1e-03 1e-02 1e-01 1e+00 1e+01 1e+02 #> #> $sigma #> [1] 1e-06 1e-05 1e-04 1e-03 1e-02 1e-01"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"xgboost","dir":"Articles","previous_headings":"Hyperparameter options","what":"XGBoost","title":"Hyperparameter tuning","text":"run xgbTree method, tuning nrounds, gamma, eta max_depth, colsample_bytree, min_child_weight subsample hyperparameters. can read hyperparameters : https://xgboost.readthedocs.io/en/latest/parameter.html default, set nrounds, gamma, colsample_bytree min_child_weight fixed values provide range values eta, max_depth subsample. 
can changed optimized user supplying custom named list hyperparameters run_ml().","code":"#> $nrounds #> [1] 100 #> #> $gamma #> [1] 0 #> #> $eta #> [1] 0.001 0.010 0.100 1.000 #> #> $max_depth #> [1] 1 2 4 8 16 30 #> #> $colsample_bytree #> [1] 0.8 #> #> $min_child_weight #> [1] 1 #> #> $subsample #> [1] 0.4 0.5 0.6 0.7"},{"path":"http://www.schlosslab.org/mikropml/dev/authors.html","id":null,"dir":"","previous_headings":"","what":"Authors","title":"Authors and Citation","text":"Begüm Topçuoğlu. Author. Zena Lapp. Author. Kelly Sovacool. Author, maintainer. Evan Snitkin. Author. Jenna Wiens. Author. Patrick Schloss. Author. Nick Lesniak. Contributor. Courtney Armour. Contributor. Sarah Lucas. Contributor.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/authors.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"Authors and Citation","text":"Topçuoğlu et al., (2021). mikropml: User-Friendly R Package Supervised Machine Learning Pipelines. Journal Open Source Software, 6(61), 3073, https://doi.org/10.21105/joss.03073","code":"@Article{, title = {{mikropml}: User-Friendly R Package for Supervised Machine Learning Pipelines}, author = {Begüm D. Topçuoğlu and Zena Lapp and Kelly L. Sovacool and Evan Snitkin and Jenna Wiens and Patrick D. Schloss}, journal = {Journal of Open Source Software}, year = {2021}, month = {May}, volume = {6}, number = {61}, pages = {3073}, doi = {10.21105/joss.03073}, url = {https://joss.theoj.org/papers/10.21105/joss.03073}, }"},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"mikropml-","dir":"","previous_headings":"","what":"User-Friendly R Package for Supervised Machine Learning Pipelines","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"meek-ROPE em el User-Friendly R Package Supervised Machine Learning Pipelines interface build machine learning models classification regression problems. mikropml implements ML pipeline described Topçuoğlu et al. (2020) reasonable default options data preprocessing, hyperparameter tuning, cross-validation, testing, model evaluation, interpretation steps. See website information, documentation, examples.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"installation","dir":"","previous_headings":"","what":"Installation","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"can install latest release CRAN: development version GitHub: install terminal using conda:","code":"install.packages('mikropml') # install.packages(\"devtools\") devtools::install_github(\"SchlossLab/mikropml\") conda install -c conda-forge r-mikropml"},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"dependencies","dir":"","previous_headings":"Installation","what":"Dependencies","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Imports: caret, dplyr, e1071, glmnet, kernlab, MLmetrics, randomForest, rlang, rpart, stats, utils, xgboost Suggests: doFuture, foreach, future, future.apply, ggplot2, knitr, progress, progressr, purrr, rmarkdown, testthat, tidyr","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"usage","dir":"","previous_headings":"","what":"Usage","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Check introductory vignette quick start tutorial. -depth discussion, read vignettes /take look reference documentation. 
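Before turning to the vignettes, a minimal quick start might look like the sketch below; it uses the bundled otu_mini_bin dataset and package defaults, so treat it as an orientation example rather than a recommended analysis.

```
# Hedged quick-start sketch: preprocess the bundled otu_mini_bin data, then
# train and evaluate a model with run_ml() using default settings.
library(mikropml)
preproc <- preprocess_data(otu_mini_bin, outcome_colname = "dx")
results <- run_ml(preproc$dat_transformed, "glmnet",
  outcome_colname = "dx", seed = 2019
)
results$performance
```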
can watch Riffomonas Project series video tutorials covering mikropml skills related machine learning. also provide example Snakemake workflow running mikropml HPC.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"help--contributing","dir":"","previous_headings":"","what":"Help & Contributing","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"come across bug, open issue include minimal reproducible example. questions, create new post Discussions. ’d like contribute, see guidelines .","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"code-of-conduct","dir":"","previous_headings":"","what":"Code of Conduct","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Please note mikropml project released Contributor Code Conduct. contributing project, agree abide terms.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"license","dir":"","previous_headings":"","what":"License","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml package licensed MIT license. Text images included repository, including mikropml logo, licensed CC 4.0 license.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"cite mikropml publications, use: Topçuoğlu BD, Lapp Z, Sovacool KL, Snitkin E, Wiens J, Schloss PD (2021). “mikropml: User-Friendly R Package Supervised Machine Learning Pipelines.” Journal Open Source Software, 6(61), 3073. doi:10.21105/joss.03073, https://joss.theoj.org/papers/10.21105/joss.03073. BibTeX entry LaTeX users :","code":"@Article{, title = {{mikropml}: User-Friendly R Package for Supervised Machine Learning Pipelines}, author = {Begüm D. Topçuoğlu and Zena Lapp and Kelly L. Sovacool and Evan Snitkin and Jenna Wiens and Patrick D. Schloss}, journal = {Journal of Open Source Software}, year = {2021}, month = {May}, volume = {6}, number = {61}, pages = {3073}, doi = {10.21105/joss.03073}, url = {https://joss.theoj.org/papers/10.21105/joss.03073}, }"},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"why-the-name","dir":"","previous_headings":"","what":"Why the name?","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"word “mikrop” (pronounced “meek-ROPE”) Turkish “microbe”. package originally implemented machine learning pipeline microbiome-based classification problems (see Topçuoğlu et al. 2020). realized methods applicable many fields , stuck name like !","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/pull_request_template.html","id":"issues","dir":"","previous_headings":"","what":"Issues","title":"NA","text":"Resolves # .","code":""},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/pull_request_template.html","id":"checklist","dir":"","previous_headings":"","what":"Checklist","title":"NA","text":"(Strikethrough points applicable.) Write unit tests new functionality bug fixes. roxygen comments vignettes Update NEWS.md includes user-facing changes. check workflow succeeds recent commit. 
always required PR can merged.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":null,"dir":"Reference","previous_headings":"","what":"Get performance metrics for test data — calc_perf_metrics","title":"Get performance metrics for test data — calc_perf_metrics","text":"Get performance metrics test data","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get performance metrics for test data — calc_perf_metrics","text":"","code":"calc_perf_metrics( test_data, trained_model, outcome_colname, perf_metric_function, class_probs )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get performance metrics for test data — calc_perf_metrics","text":"test_data Held test data: dataframe outcome features. trained_model Trained model caret::train(). outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get performance metrics for test data — calc_perf_metrics","text":"Dataframe performance metrics.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get performance metrics for test data — calc_perf_metrics","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get performance metrics for test data — calc_perf_metrics","text":"","code":"if (FALSE) { results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) calc_perf_metrics(results$test_data, results$trained_model, \"dx\", multiClassSummary, class_probs = TRUE ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Combine hyperparameter performance metrics multiple train/test splits generated , instance, looping R using snakemake workflow high-performance computer.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"","code":"combine_hp_performance(trained_model_lst)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Combine 
hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"trained_model_lst List trained models.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Named list: dat: Dataframe performance metric group hyperparameters params: Hyperparameters tuned. Metric: Performance metric used.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"","code":"if (FALSE) { results <- lapply(seq(100, 102), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed, cv_times = 2, kfold = 2) }) models <- lapply(results, function(x) x$trained_model) combine_hp_performance(models) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":null,"dir":"Reference","previous_headings":"","what":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"wrapper permute_p_value().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"","code":"compare_models(merged_data, metric, group_name, nperm = 10000)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"merged_data concatenated performance data run_ml metric metric compare, must numeric group_name column group variables compare nperm number permutations, default=10000","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"table p-values pairs group variable","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. 
— compare_models","text":"Courtney R Armour, armourc@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"","code":"df <- dplyr::tibble( model = c(\"rf\", \"rf\", \"glmnet\", \"glmnet\", \"svmRadial\", \"svmRadial\"), AUC = c(.2, 0.3, 0.8, 0.9, 0.85, 0.95) ) set.seed(123) compare_models(df, \"AUC\", \"model\", nperm = 10) #> group1 group2 p_value #> 1 glmnet svmRadial 0.7272727 #> 2 rf glmnet 0.2727273 #> 3 rf svmRadial 0.5454545"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":null,"dir":"Reference","previous_headings":"","what":"Define cross-validation scheme and training parameters — define_cv","title":"Define cross-validation scheme and training parameters — define_cv","text":"Define cross-validation scheme training parameters","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Define cross-validation scheme and training parameters — define_cv","text":"","code":"define_cv( train_data, outcome_colname, hyperparams_list, perf_metric_function, class_probs, kfold = 5, cv_times = 100, groups = NULL, group_partitions = NULL )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Define cross-validation scheme and training parameters — define_cv","text":"train_data Dataframe training model. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). hyperparams_list Named list lists hyperparameters. perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes). kfold Fold number k-fold cross-validation (default: 5). cv_times Number cross-validation partitions create (default: 100). groups Vector groups keep together splitting data train test sets. number groups training set larger kfold, groups also kept together cross-validation. Length matches number rows dataset (default: NULL). group_partitions Specify assign groups training testing partitions (default: NULL). groups specifies samples belong group \"\" belong group \"B\", setting group_partitions = list(train = c(\"\", \"B\"), test = c(\"B\")) result samples group \"\" placed training set, samples \"B\" also training set, remaining samples \"B\" testing set. partition sizes close training_frac possible. 
number groups training set larger kfold, groups also kept together cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Define cross-validation scheme and training parameters — define_cv","text":"Caret object trainControl controls cross-validation","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Define cross-validation scheme and training parameters — define_cv","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Define cross-validation scheme and training parameters — define_cv","text":"","code":"training_inds <- get_partition_indices(otu_small %>% dplyr::pull(\"dx\"), training_frac = 0.8, groups = NULL ) train_data <- otu_small[training_inds, ] test_data <- otu_small[-training_inds, ] cv <- define_cv(train_data, outcome_colname = \"dx\", hyperparams_list = get_hyperparams_list(otu_small, \"glmnet\"), perf_metric_function = caret::multiClassSummary, class_probs = TRUE, kfold = 5 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":null,"dir":"Reference","previous_headings":"","what":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Get preprocessed dataframe continuous variables","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"","code":"get_caret_processed_df(features, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"features Dataframe features machine learning method Methods preprocess data, described caret::preProcess() (default: c(\"center\",\"scale\"), use NULL normalization).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Named list: processed: Dataframe processed features. 
removed: Names features removed preprocessing.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"","code":"get_caret_processed_df(mikropml::otu_small[, 2:ncol(otu_small)], c(\"center\", \"scale\")) #> $processed #> Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 #> 1 -0.4198476322 -0.218855527 -0.174296240 -0.59073845 -0.048774220 #> 2 -0.1045750483 1.754032339 -0.718419364 0.03805034 1.537072974 #> 3 -0.7076423302 0.696324396 1.428146361 0.60439092 -0.264559044 #> 4 -0.4936040623 -0.665193276 2.015799335 -0.59289184 -0.675577755 #> 5 1.1116829471 -0.395140184 -0.753787367 -0.58643168 -0.754356341 #> 6 -0.6845030580 0.613808173 -0.174296240 -0.58427829 0.375945115 #> 7 -0.7698291243 -0.496410093 -0.318488868 0.15863997 -0.658451975 #> 8 -0.4241862457 -0.477656406 -0.397386721 -0.55628427 -0.391289813 #> 9 -0.5557908564 1.144537514 1.615868839 -0.35171258 -0.274834512 #> 10 1.4573258257 -0.451401245 -0.693933823 -0.05669866 -0.706404158 #> 11 0.2931311927 -0.721454336 -0.753787367 3.03341063 -0.449517464 #> 12 1.1044519245 0.002437979 -0.473563958 -0.41846755 0.413621830 #> 13 -0.5933921737 -0.297621012 -0.340253793 -0.59289184 -0.655026820 #> 14 -0.8016456236 0.077452727 -0.419151646 -0.59073845 -0.045349064 #> 15 -0.7915221920 0.291244758 -0.269517787 -0.59289184 -0.220032017 #> 16 1.4862499159 -0.683946963 -0.745625520 -0.54551734 -0.744080874 #> 17 -0.3750152923 -0.051947713 0.103206554 1.37745659 0.458148857 #> 18 0.2135899445 0.325001395 0.478651509 -0.34309903 0.560903535 #> 19 -0.5181895390 -0.100707299 -0.073633462 -0.40770062 -0.237157796 #> 20 0.8745054069 -0.676445488 -0.560623658 -0.58212491 -0.154954054 #> 21 2.0184531767 -0.682071594 -0.740184289 -0.58643168 -0.720104782 #> 22 0.5867107089 -0.646439589 -0.560623658 0.09188499 -0.593374013 #> 23 -0.4603413585 -0.397015552 0.386150578 -0.42062094 -0.463218088 #> 24 -0.7553670792 1.401463025 0.829610924 -0.58858507 -0.295385447 #> 25 1.9316809059 0.334378238 -0.228708552 -0.42923448 -0.535146362 #> 26 1.2201482855 -0.108208774 -0.302165174 -0.58858507 0.358819335 #> 27 -0.9158957801 -0.674570119 -0.732022442 -0.53475041 -0.689278379 #> 28 -0.7597056927 -0.595804634 -0.375621796 -0.57566475 -0.730380250 #> 29 -0.5109585165 -0.558297260 -0.432754724 3.84093048 -0.672152599 #> 30 -0.8811868718 -0.385763340 -0.595991661 -0.58212491 -0.192630769 #> 31 0.3437483507 0.902614952 1.376454664 -0.59289184 1.396641581 #> 32 -0.5109585165 0.535042688 -0.484446421 -0.59289184 0.550628067 #> 33 1.2302717171 -0.582677053 0.007985007 -0.40554723 -0.672152599 #> 34 -0.0770971626 0.244360541 -0.313047636 -0.28711099 2.273481498 #> 35 -0.2275024319 2.211622300 1.515206061 -0.57781814 1.269910812 #> 36 0.0284757669 -0.663317907 -0.634080280 -0.57781814 -0.730380250 #> 37 -0.3157209072 -0.290119537 -0.231429168 -0.58643168 -0.233732640 #> 38 -0.1653156379 1.476477772 1.836238704 1.65309003 4.393653017 #> 39 -0.6859492625 -0.134463935 -0.258635324 0.68191283 0.399921206 #> 40 -0.3967083600 -0.126962461 -0.269517787 -0.57135798 0.304016840 #> 41 
#> # … output truncated: the remaining printed rows of the preprocessed feature
#> # table (200 samples, OTU columns Otu00001 through Otu00028, shown as
#> # centered-and-scaled values) are omitted here for brevity …
-0.53452084 0.37597337 -0.304921034 #> 170 -0.9068438359 -0.687083005 -0.39042654 -0.62132658 -0.41863444 -0.312341241 #> 171 -0.5390511079 0.617860505 -0.07532051 -0.37827050 -0.37633593 -0.314814643 #> 172 -0.4529719588 -0.626588935 -0.46157952 -0.26252951 2.99243865 -0.077368024 #> 173 -0.8207646868 -0.687083005 -0.40567361 -0.62132658 0.99836580 0.019094666 #> 174 0.4312956639 1.741321805 -0.39042654 -0.51137264 -0.15275807 -0.290080620 #> 175 -0.0695284764 0.107981915 -0.45649716 -0.50558559 -0.29778154 -0.295027425 #> 176 0.4547717955 4.307998775 1.64759798 -0.58660428 -0.37029328 -0.304921034 #> 177 -0.1321314939 -0.220414465 -0.24812059 0.70969483 -0.38842122 -0.319761448 #> 178 -0.9068438359 -0.410538685 -0.45649716 -0.62132658 -0.42165577 -0.299974229 #> 179 0.2982642517 -0.574736875 -0.16680290 -0.06576982 0.68414826 -0.319761448 #> 180 -0.5077495991 0.280822115 -0.44633245 -0.33776115 -0.37029328 0.244174274 #> 181 -0.6877332745 -0.522884815 0.01616189 0.77335237 -0.08931029 -0.302447632 #> 182 -0.5938287482 0.436378295 -0.46157952 1.04534371 -0.20109922 -0.196091333 #> 183 -0.4451465816 -0.367328635 -0.22779117 -0.19308491 -0.30684551 0.273855101 #> 184 -0.7738124236 0.151191965 0.03649131 -0.51137264 -0.36727196 1.483348819 #> 185 3.0997492864 -0.617946925 -0.42092068 -0.56924313 0.18260873 -0.314814643 #> 186 -0.8677169499 0.393168245 -0.47174423 0.21200856 -0.39144254 -0.069947817 #> 187 -0.9068438359 -0.609304915 -0.46157952 -0.61553953 -0.42165577 -0.309867838 #> 188 2.7710834443 -0.721651045 -0.34468534 -0.60396543 -0.08628897 0.773482363 #> 189 -0.8755423271 -0.047574265 -0.43108539 -0.43614099 -0.41863444 0.187286021 #> 190 -0.3355913009 -0.246340495 -0.40567361 1.58353932 -0.11650220 -0.302447632 #> 191 -0.6094795026 -0.479674765 -0.42092068 -0.45350214 -0.41259180 -0.245559379 #> 192 0.1104551991 -0.721651045 0.80900933 -0.59239133 -0.40957048 -0.307394436 #> 193 -0.5077495991 0.609218495 0.12289135 -0.56924313 -0.14671542 -0.297500827 #> 194 3.4518912600 -0.687083005 -0.40567361 1.55460407 0.06175583 -0.260399793 #> 195 -0.4842734675 0.315390155 2.58783373 -0.52873379 0.17958741 -0.282660413 #> 196 2.4658937338 -0.721651045 1.35282136 -0.16414966 -0.42467709 -0.322234850 #> 197 -0.0382269676 -0.669798985 -0.39550890 -0.58660428 -0.40352783 -0.161463701 #> 198 -0.9068438359 -0.721651045 0.15338549 -0.62132658 -0.41561312 -0.297500827 #> 199 -0.8598915727 0.107981915 0.40750326 -0.60396543 -0.27058964 -0.299974229 #> 200 -0.0304015904 0.004277795 -0.14647348 -0.55766903 -0.23131245 -0.317288045 #> Otu00029 Otu00030 Otu00031 Otu00032 Otu00033 #> 1 0.695821495 0.39193166 0.2730666130 1.850227727 -0.352365855 #> 2 -0.252260766 0.44720466 -0.1402887916 -0.493938512 0.152851091 #> 3 0.066720182 -0.59377025 -0.4629076438 -0.357825634 -0.288065517 #> 4 -0.473775313 -0.71352842 1.5937875395 -0.501500339 -0.435037719 #> 5 -0.571241714 0.33665866 -0.5637260352 -0.577118604 0.952012441 #> 6 -0.216818439 -0.52928508 -0.2411071829 0.337862411 0.079364989 #> 7 3.079318020 0.19847615 -0.3520074134 -0.395634767 -0.618752972 #> 8 0.031277854 -0.17001055 -0.3822529308 -0.357825634 -0.444223482 #> 9 -0.730732188 -0.11473754 0.3335576478 -0.070476224 -0.168650602 #> 10 0.137604837 -0.76880143 -0.4830713221 -0.516623992 0.740739900 #> 11 -0.305424257 0.16162748 -0.5939715526 -0.577118604 -0.600381447 #> 12 -0.730732188 -0.54770941 -0.5233986787 0.148816747 0.465167021 #> 13 -0.269981930 -0.62140675 -0.2209435046 0.103445788 -0.453409245 #> 14 -0.526938804 0.54853851 0.1420027042 
0.572279035 -0.646310260 #> 15 -0.535799386 -0.33582956 -0.2411071829 0.436166157 -0.655496023 #> 16 -0.340866585 -0.38189040 -0.4729894830 -0.569556778 1.071427356 #> 17 -0.181376111 1.20260239 -0.4427439656 1.071359589 -0.582009922 #> 18 0.279374147 0.65908451 0.0109387955 -0.100723530 0.106922277 #> 19 0.270513565 0.72356969 -0.0797977567 0.466413463 -0.232950941 #> 20 1.431249791 0.85254003 0.4646215565 -0.546871298 0.446795495 #> 21 -0.730732188 -0.76880143 -0.5939715526 -0.569556778 1.787916843 #> 22 2.937548710 -0.28055656 -0.5536441961 -0.456129379 -0.159464840 #> 23 -0.004164473 0.04186930 -0.3217618960 0.141254920 -0.673867548 #> 24 0.146465418 1.07363205 -0.5838897135 0.504222596 0.116108040 #> 25 -0.730732188 0.79726702 -0.1806161481 -0.577118604 -0.021678400 #> 26 -0.730732188 -0.70431626 -0.5637260352 -0.138532663 4.424230724 #> 27 -0.686429278 -0.76880143 -0.5838897135 -0.531747645 1.705244979 #> 28 0.562912767 -0.76880143 -0.5939715526 -0.577118604 -0.490152295 #> 29 0.279374147 -0.52928508 -0.1402887916 -0.357825634 1.098984644 #> 30 -0.721871606 7.25499635 -0.5637260352 0.020265695 -0.692239074 #> 31 -0.128212620 1.34078490 1.6643604135 -0.569556778 -0.012492637 #> 32 1.378086300 -0.06867671 -0.5838897135 2.530792119 -0.627938735 #> 33 0.075580763 -0.43716340 -0.5939715526 -0.577118604 0.428423970 #> 34 -0.243400184 -0.76880143 -0.5838897135 -0.577118604 -0.223765178 #> 35 0.199628910 0.76041836 0.3033121304 -0.441005726 -0.407480431 #> 36 2.388192634 3.49643206 -0.5939715526 -0.509062165 -0.407480431 #> 37 -0.695289860 -0.67667975 -0.4830713221 0.821819312 -0.701424836 #> 38 -0.721871606 -0.03182804 -0.5939715526 -0.577118604 -0.012492637 #> 39 -0.234539602 2.08697046 0.5251125913 -0.350263807 -0.591195684 #> 40 -0.323145421 0.04186930 -0.1402887916 0.065636655 -0.609567210 #> 41 1.316062227 -0.34504173 -0.5233986787 -0.448567553 0.290637530 #> 42 -0.367448331 -0.06867671 -0.2713527003 -0.123409010 -0.692239074 #> 43 -0.721871606 -0.76880143 -0.5738078743 -0.577118604 -0.609567210 #> 44 0.748984986 0.39193166 1.3316597220 -0.478814859 -0.379923143 #> 45 1.989466449 -0.75037709 -0.4931531613 -0.289769194 2.936137175 #> 46 -0.057327965 -0.76880143 -0.4729894830 -0.569556778 2.467663279 #> 47 -0.730732188 -0.73195276 -0.3217618960 -0.297331021 -0.141093314 #> 48 3.495765369 -0.20685922 -0.5435623569 -0.524185818 -0.058421450 #> 49 -0.385169494 -0.72274059 -0.2108616655 -0.229274582 0.492724309 #> 50 -0.624405205 -0.63983108 -0.4124984482 0.489098943 0.042621939 #> 51 -0.588962878 2.18830430 -0.4830713221 -0.561994951 3.110666665 #> 52 -0.137073202 0.12477881 0.6662583392 1.056235936 -0.232950941 #> 53 -0.730732188 -0.76880143 -0.5939715526 -0.561994951 -0.692239074 #> 54 -0.305424257 -0.75037709 -0.5738078743 -0.577118604 -0.398294669 #> 55 -0.535799386 -0.63983108 -0.4225802873 0.050513002 -0.591195684 #> 56 -0.730732188 0.92623737 -0.5536441961 -0.478814859 0.446795495 #> 57 -0.367448331 2.16066779 -0.2511890220 5.563084576 -0.600381447 #> 58 -0.721871606 -0.75037709 -0.5838897135 -0.546871298 0.042621939 #> 59 -0.721871606 -0.23449572 2.7128716834 -0.577118604 1.622573115 #> 60 0.376840547 0.43799250 -0.4024166090 -0.115847183 -0.122721789 #> 61 0.111023091 0.09714230 4.3360477841 -0.055352571 -0.582009922 #> 62 -0.562381132 0.13399097 -0.2209435046 -0.577118604 -0.021678400 #> 63 1.750230739 0.22611265 -0.5133168395 -0.463691206 -0.554452634 #> 64 -0.314284839 0.36429516 2.6422988095 0.254682319 0.079364989 #> 65 -0.721871606 -0.75958926 -0.3923347699 -0.577118604 
-0.085978738 #> 66 0.252792401 -0.54770941 -0.5939715526 -0.569556778 -0.333994330 #> 67 -0.358587749 -0.54770941 -0.4024166090 -0.554433125 -0.471780770 #> 68 -0.677568696 0.15241531 0.6965038566 0.012703869 -0.315622805 #> 69 0.642658004 -0.19764705 -0.0596340785 0.156378574 -0.517709583 #> 70 0.155326000 0.24453698 2.8741811096 -0.577118604 -0.499338058 #> 71 0.935057206 -0.48322424 -0.5939715526 0.942808538 -0.389108906 #> 72 -0.491496477 0.21690048 0.1117571868 -0.577118604 -0.343180093 #> 73 -0.730732188 -0.02261587 -0.4729894830 0.186625880 -0.673867548 #> 74 0.048999018 -0.46479990 -0.4225802873 -0.191465449 -0.425851957 #> 75 -0.145933784 1.34078490 -0.3217618960 0.436166157 -0.232950941 #> 76 -0.730732188 1.31314840 4.7393213494 0.141254920 -0.453409245 #> 77 -0.730732188 -0.05025237 4.3864569797 1.404079959 0.079364989 #> 78 -0.730732188 -0.76880143 -0.1302069524 -0.289769194 2.081861248 #> 79 -0.243400184 0.63144801 -0.3520074134 -0.168779969 -0.673867548 #> 80 6.614690190 0.31823432 -0.5939715526 -0.577118604 -0.389108906 #> 81 -0.394030076 -0.05025237 -0.5334805178 -0.342701980 -0.664681786 #> 82 1.759091320 -0.76880143 -0.5939715526 -0.577118604 0.162036853 #> 83 2.007187613 -0.28055656 -0.5334805178 -0.350263807 0.520281597 #> 84 -0.730732188 0.35508299 -0.5939715526 -0.478814859 -0.205393653 #> 85 -0.633265787 -0.08710104 -0.1201251133 -0.577118604 -0.710610599 #> 86 -0.101630874 0.08793014 -0.3419255742 -0.577118604 -0.269693992 #> 87 1.218595826 0.21690048 0.2125755781 1.094045069 -0.131907552 #> 88 -0.721871606 -0.40031473 -0.1906979872 -0.577118604 0.125293803 #> 89 -0.207957857 -0.45558774 -0.5939715526 -0.509062165 -0.425851957 #> 90 -0.730732188 -0.30819306 0.8376496045 -0.577118604 0.667253799 #> 91 -0.730732188 -0.76880143 1.7450151266 -0.093161703 -0.067607213 #> 92 -0.544659968 -0.17001055 -0.1503706307 -0.078038050 -0.582009922 #> 93 0.881893714 -0.76880143 -0.3520074134 -0.577118604 -0.398294669 #> 94 -0.137073202 -0.73195276 -0.1402887916 -0.577118604 -0.554452634 #> 95 -0.624405205 -0.29898089 -0.2612708612 0.383233371 -0.333994330 #> 96 -0.730732188 -0.76880143 -0.5939715526 2.349308281 -0.591195684 #> 97 0.243931819 -0.59377025 -0.5939715526 -0.577118604 2.807536497 #> 98 -0.482635895 0.42878033 1.4223962743 2.530792119 -0.159464840 #> 99 -0.730732188 -0.69510409 -0.5939715526 -0.561994951 -0.600381447 #> 100 -0.730732188 0.40114383 0.1420027042 -0.569556778 -0.600381447 #> 101 -0.704150442 0.91702520 -0.5637260352 -0.561994951 -0.389108906 #> 102 -0.491496477 2.38175981 -0.5939715526 -0.577118604 -0.683053311 #> 103 -0.243400184 -0.30819306 -0.4326621264 -0.569556778 -0.370737381 #> 104 1.316062227 -0.76880143 -0.5939715526 -0.009981611 -0.343180093 #> 105 0.040138436 0.56696284 -0.1201251133 0.156378574 -0.232950941 #> 106 -0.668708114 -0.23449572 -0.4528258047 0.020265695 -0.710610599 #> 107 0.261652983 1.19339022 0.4444578782 -0.138532663 -0.600381447 #> 108 -0.730732188 0.74199402 -0.5838897135 0.564717209 -0.582009922 #> 109 -0.704150442 -0.55692158 -0.4931531613 -0.561994951 -0.040049925 #> 110 -0.261121348 1.46975524 0.3133939695 -0.183903622 -0.288065517 #> 111 -0.367448331 -0.22528355 3.8823650230 -0.055352571 -0.572824159 #> 112 -0.721871606 -0.75958926 -0.5939715526 -0.531747645 -0.710610599 #> 113 -0.128212620 0.83411569 3.5496643316 0.678144607 -0.315622805 #> 114 -0.650986951 -0.10552538 -0.4830713221 -0.546871298 -0.664681786 #> 115 -0.500357059 0.99072254 3.0052450183 0.715953740 0.033436176 #> 116 -0.243400184 -0.56613375 -0.3419255742 
-0.259521888 -0.361551618 #> 117 0.917336042 -0.76880143 -0.4427439656 -0.365387460 2.100232773 #> 118 0.616076258 0.43799250 0.7569948914 3.377716696 -0.563638396 #> 119 -0.225679020 -0.76880143 1.0090408698 2.939130754 0.703996850 #> 120 2.512240780 0.53932634 -0.5838897135 -0.546871298 -0.131907552 #> 121 -0.394030076 0.44720466 -0.4830713221 -0.531747645 -0.683053311 #> 122 0.111023091 -0.41873907 1.2409231698 0.950370364 -0.333994330 #> 123 -0.721871606 -0.75037709 -0.2915163786 -0.448567553 -0.683053311 #> 124 0.261652983 0.06029364 -0.3520074134 -0.161218143 -0.609567210 #> 125 -0.721871606 0.94466170 -0.3822529308 0.247120493 -0.012492637 #> 126 0.137604837 -0.75958926 -0.4225802873 -0.569556778 -0.058421450 #> 127 -0.713011024 -0.56613375 0.1117571868 -0.554433125 -0.232950941 #> 128 0.075580763 -0.51086074 -0.5233986787 -0.168779969 3.955756829 #> 129 -0.500357059 -0.56613375 -0.4427439656 -0.463691206 -0.471780770 #> 130 -0.642126369 -0.05946454 -0.5939715526 -0.456129379 -0.333994330 #> 131 2.972991038 -0.66746759 -0.5233986787 0.050513002 1.493972438 #> 132 -0.730732188 0.35508299 -0.4024166090 -0.040228917 0.823411764 #> 133 2.078072268 -0.70431626 0.0109387955 -0.463691206 -0.040049925 #> 134 -0.473775313 -0.54770941 -0.1402887916 0.315176932 -0.517709583 #> 135 2.645149508 -0.53849724 -0.5838897135 -0.561994951 1.319442948 #> 136 0.350258802 -0.45558774 1.1804321350 1.313338040 -0.049235688 #> 137 -0.269981930 -0.20685922 3.0254086966 1.857789554 -0.591195684 #> 138 0.093301927 -0.54770941 -0.4528258047 2.583724905 -0.683053311 #> 139 0.607215676 -0.66746759 -0.2209435046 7.158629984 -0.517709583 #> 140 -0.730732188 0.83411569 2.2087797267 -0.577118604 3.312753443 #> 141 -0.110491456 1.50660391 0.2125755781 0.368109718 -0.600381447 #> 142 -0.305424257 -0.75037709 -0.1705343090 -0.569556778 -0.710610599 #> 143 -0.278842512 -0.06867671 -0.3217618960 0.179064053 -0.683053311 #> 144 -0.571241714 0.50247767 -0.0293885611 2.349308281 -0.582009922 #> 145 1.271759317 -0.29898089 -0.4427439656 -0.365387460 -0.710610599 #> 146 -0.110491456 0.47484117 0.0008569563 0.549593556 0.051807701 #> 147 -0.730732188 -0.76880143 -0.5838897135 -0.577118604 -0.673867548 #> 148 -0.367448331 0.19847615 1.9164063918 0.632773648 -0.710610599 #> 149 -0.642126369 -0.74116493 -0.4326621264 -0.569556778 -0.701424836 #> 150 -0.730732188 4.27025412 -0.5939715526 -0.577118604 -0.701424836 #> 151 -0.402890658 -0.38189040 -0.4629076438 -0.577118604 0.805040239 #> 152 0.740124404 -0.36346606 -0.2511890220 0.050513002 -0.609567210 #> 153 -0.580102296 -0.65825542 0.0109387955 1.162101508 1.025498543 #> 154 -0.704150442 -0.74116493 -0.2209435046 2.825703355 -0.655496023 #> 155 0.004696108 0.90781303 -0.5133168395 -0.448567553 0.005878888 #> 156 0.846451387 -0.07788888 -0.2612708612 -0.561994951 -0.664681786 #> 157 -0.713011024 -0.76880143 -0.5838897135 -0.561994951 -0.710610599 #> 158 -0.367448331 -0.76880143 -0.0797977567 0.156378574 -0.637124498 #> 159 -0.163654947 -0.40031473 2.0676339788 -0.569556778 -0.646310260 #> 160 0.004696108 -0.48322424 -0.5738078743 -0.539309471 -0.370737381 #> 161 1.094547680 -0.48322424 -0.3923347699 -0.433443899 -0.591195684 #> 162 -0.730732188 0.41956816 -0.5939715526 -0.577118604 1.319442948 #> 163 0.181907746 -0.61219458 -0.5637260352 -0.569556778 -0.444223482 #> 164 -0.721871606 -0.25292005 -0.4830713221 -0.501500339 0.465167021 #> 165 -0.030746219 0.01423280 -0.5838897135 -0.554433125 -0.223765178 #> 166 -0.713011024 -0.76880143 0.6662583392 -0.577118604 -0.710610599 #> 167 
-0.713011024 4.09522294 1.1602684568 -0.577118604 2.302319551 #> 168 2.388192634 -0.70431626 -0.5939715526 -0.577118604 1.007127017 #> 169 0.270513565 -0.76880143 -0.5738078743 -0.539309471 0.593767698 #> 170 -0.730732188 -0.76880143 0.1016753477 -0.569556778 -0.710610599 #> 171 -0.571241714 -0.61219458 -0.1100432742 0.534469902 -0.600381447 #> 172 -0.287703094 -0.48322424 -0.4225802873 -0.524185818 -0.407480431 #> 173 1.422389209 -0.61219458 -0.5738078743 -0.577118604 2.752421921 #> 174 0.456585784 0.14320314 -0.1705343090 -0.546871298 1.806288368 #> 175 -0.296563675 -0.39110257 -0.0697159176 -0.493938512 -0.627938735 #> 176 0.562912767 1.38684574 -0.5939715526 0.587402689 -0.012492637 #> 177 0.952778369 -0.48322424 -0.1604524698 -0.244398235 -0.683053311 #> 178 -0.721871606 -0.75037709 -0.5838897135 -0.214150929 1.705244979 #> 179 0.217350073 -0.52928508 -0.5435623569 -0.577118604 5.278506651 #> 180 -0.261121348 0.88017653 -0.1604524698 0.557155382 -0.673867548 #> 181 -0.039606801 -0.54770941 -0.1604524698 0.111007614 -0.627938735 #> 182 -0.083909710 -0.64904325 -0.2612708612 -0.577118604 -0.306437042 #> 183 -0.199097275 1.20260239 -0.2108616655 -0.123409010 -0.554452634 #> 184 -0.668708114 -0.30819306 -0.3116800568 1.600687450 -0.572824159 #> 185 0.297095310 2.55679099 -0.5939715526 -0.554433125 -0.627938735 #> 186 -0.713011024 -0.62140675 -0.0293885611 -0.380511113 -0.701424836 #> 187 -0.721871606 -0.75958926 -0.4225802873 -0.085599877 -0.609567210 #> 188 2.990712202 -0.41873907 -0.5939715526 -0.554433125 1.392929049 #> 189 -0.730732188 -0.56613375 -0.4326621264 -0.380511113 -0.710610599 #> 190 0.102162509 -0.25292005 0.0815116694 -0.304892848 -0.609567210 #> 191 -0.668708114 -0.25292005 -0.5133168395 -0.554433125 -0.343180093 #> 192 -0.730732188 -0.32661739 0.6158491435 -0.577118604 -0.205393653 #> 193 0.057859600 -0.63061892 -0.3822529308 0.413480677 -0.278879754 #> 194 -0.509217641 0.14320314 -0.4528258047 -0.577118604 0.162036853 #> 195 -0.668708114 0.11556664 -0.3721710916 0.526908076 -0.692239074 #> 196 -0.730732188 -0.76880143 -0.5838897135 -0.577118604 0.906083628 #> 197 -0.154794365 -0.47401207 2.1079613354 -0.093161703 -0.572824159 #> 198 -0.721871606 -0.67667975 -0.5939715526 -0.577118604 -0.627938735 #> 199 -0.713011024 -0.74116493 -0.4225802873 -0.161218143 -0.232950941 #> 200 -0.730732188 -0.47401207 -0.3217618960 0.511784423 -0.278879754 #> Otu00034 Otu00035 Otu00036 Otu00037 Otu00038 #> 1 -0.1482914828 -0.28857253 -0.337797955 -0.28026882 -0.269009738 #> 2 -0.1507314908 1.32771762 -0.337797955 -0.40104181 -0.269009738 #> 3 -0.1360914431 -0.09645535 -0.309626997 5.43380328 -0.251964926 #> 4 -0.1507314908 -0.24263146 -0.337797955 -0.28781713 -0.254805728 #> 5 0.0469091527 -0.38463111 -0.332163763 -0.55200805 -0.269009738 #> 6 -0.1507314908 -0.31363129 -0.337797955 -0.02362622 -0.269009738 #> 7 -0.1507314908 -0.38880757 3.099058896 -0.19723739 -0.269009738 #> 8 -0.1507314908 -0.25098438 -0.337797955 -0.13685089 -0.266168936 #> 9 -0.0775312524 -0.38880757 -0.337797955 0.32359613 -0.084357613 #> 10 -0.0604511968 -0.30110191 0.811577123 -0.51426649 -0.254805728 #> 11 -0.1507314908 1.31518824 -0.337797955 0.52740055 -0.269009738 #> 12 0.6935112580 -0.25098438 -0.337797955 -0.54445974 -0.266168936 #> 13 -0.1458514749 5.21182571 -0.337797955 -0.55200805 -0.257646530 #> 14 -0.1507314908 -0.31780775 -0.337797955 -0.43878337 -0.269009738 #> 15 -0.1507314908 -0.20921978 0.158010902 -0.40859012 -0.269009738 #> 16 -0.0824112683 -0.36792527 -0.337797955 1.16145875 -0.269009738 #> 17 
-0.1507314908 -0.38880757 0.963700295 -0.29536544 0.049160077 #> 18 -0.1507314908 -0.17580810 -0.337797955 0.01411534 -0.200830492 #> 19 -0.1458514749 0.28360254 -0.337797955 -0.43123506 -0.269009738 #> 20 -0.1482914828 -0.36792527 -0.337797955 1.87100007 -0.269009738 #> 21 0.3616701775 -0.38880757 -0.337797955 7.21520489 -0.251964926 #> 22 -0.1214513954 -0.38463111 -0.337797955 0.18772652 -0.232079313 #> 23 -0.1507314908 0.35460236 -0.337797955 -0.25007557 -0.269009738 #> 24 -0.1507314908 -0.38880757 -0.337797955 0.06695353 -0.260487332 #> 25 -0.1360914431 -0.23010208 1.746852922 -0.54445974 0.270742627 #> 26 0.9887522192 -0.38463111 -0.337797955 -0.51426649 -0.260487332 #> 27 13.8524741014 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 28 -0.1507314908 -0.38880757 -0.337797955 -0.55200805 -0.101402425 #> 29 -0.1507314908 0.05807368 -0.337797955 -0.31801038 -0.266168936 #> 30 -0.1458514749 -0.38880757 -0.337797955 -0.46897662 -0.260487332 #> 31 -0.1141313716 1.80383409 -0.320895380 0.42927250 0.301991448 #> 32 -0.1482914828 -0.38045465 -0.332163763 -0.33310700 -0.269009738 #> 33 -0.1507314908 -0.30945483 0.929895146 1.22184525 -0.269009738 #> 34 0.3836302490 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 35 -0.1434114669 -0.38880757 -0.337797955 0.05940521 -0.266168936 #> 36 0.0542291766 -0.38880757 -0.337797955 -0.55200805 -0.254805728 #> 37 -0.1068113478 -0.38880757 -0.337797955 -0.52936311 2.219532746 #> 38 0.0883892878 -0.38463111 -0.337797955 -0.55200805 0.196881777 #> 39 -0.1507314908 -0.31780775 -0.337797955 -0.20478570 -0.226397709 #> 40 -0.1507314908 -0.27604314 -0.337797955 -0.14439921 0.114498521 #> 41 -0.1385314510 -0.38463111 -0.332163763 0.98029927 -0.269009738 #> 42 -0.0848512763 -0.30945483 -0.072990952 -0.01607790 -0.146855255 #> 43 -0.0360511174 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 44 -0.1434114669 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 45 -0.1019313319 -0.38880757 -0.337797955 -0.46142831 -0.266168936 #> 46 -0.1409714590 -0.38880757 3.262450451 0.53494886 -0.266168936 #> 47 -0.0214110697 -0.38880757 -0.337797955 0.82933303 -0.269009738 #> 48 -0.1312114272 -0.35121943 -0.337797955 2.98060192 -0.266168936 #> 49 -0.1287714193 -0.38880757 2.969472490 -0.52936311 -0.192308086 #> 50 -0.0946113080 -0.38880757 -0.337797955 -0.49162155 -0.269009738 #> 51 -0.1458514749 -0.18833748 -0.337797955 -0.44633168 -0.135492048 #> 52 -0.1458514749 3.57047681 -0.337797955 -0.54445974 0.392897110 #> 53 0.0493491607 -0.38880757 -0.337797955 1.64455071 -0.229238511 #> 54 0.1249894069 -0.38880757 -0.337797955 -0.54445974 -0.149696057 #> 55 -0.1482914828 -0.19251394 -0.337797955 -0.41613843 -0.269009738 #> 56 -0.0311711015 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 57 -0.1507314908 -0.07139659 -0.337797955 -0.43123506 -0.254805728 #> 58 -0.0287310935 -0.37210173 -0.326529572 -0.54445974 -0.269009738 #> 59 -0.1092513557 -0.38880757 -0.337797955 -0.48407324 0.017911256 #> 60 -0.1507314908 -0.11733765 -0.337797955 -0.41613843 -0.269009738 #> 61 -0.1409714590 -0.38880757 -0.337797955 -0.32555869 0.071886493 #> 62 -0.1287714193 -0.28439607 -0.005380653 0.23301639 1.310476131 #> 63 -0.0458111492 -0.38880757 -0.332163763 -0.04627115 -0.007655961 #> 64 -0.1507314908 0.63442520 -0.281456039 0.48965899 -0.226397709 #> 65 -0.1507314908 -0.38880757 -0.337797955 -0.55200805 -0.220716105 #> 66 -0.1409714590 1.92912790 -0.337797955 -0.55200805 -0.090039217 #> 67 -0.1482914828 -0.32198421 -0.337797955 -0.09910934 -0.269009738 #> 68 -0.1507314908 0.04972076 
2.293369503 -0.53691142 -0.269009738 #> 69 -0.1507314908 -0.05469075 -0.337797955 -0.42368675 -0.266168936 #> 70 -0.0653312127 0.55507246 -0.337797955 -0.18968908 1.685461984 #> 71 -0.1068113478 -0.38880757 -0.332163763 0.24056470 -0.260487332 #> 72 -0.1482914828 0.44230803 -0.337797955 -0.40104181 -0.226397709 #> 73 -0.1482914828 -0.38880757 -0.337797955 -0.29536544 -0.217875303 #> 74 -0.1482914828 -0.38880757 -0.337797955 -0.25762388 -0.269009738 #> 75 -0.1458514749 -0.34704297 0.011521922 -0.48407324 -0.257646530 #> 76 -0.0897312922 -0.17998456 -0.337797955 -0.55200805 -0.232079313 #> 77 -0.1409714590 -0.25933730 -0.326529572 -0.46897662 0.032115266 #> 78 -0.1482914828 0.07895598 -0.337797955 -0.55200805 -0.246283323 #> 79 -0.1507314908 -0.29692545 -0.337797955 -0.50671818 -0.269009738 #> 80 0.1591495182 -0.38463111 -0.337797955 -0.55200805 -0.269009738 #> 81 -0.1507314908 -0.01292614 0.203084435 -0.53691142 -0.266168936 #> 82 -0.0287310935 -0.36374881 7.662754058 -0.55200805 -0.269009738 #> 83 -0.1190113875 -0.38045465 -0.337797955 2.54279983 -0.195148888 #> 84 -0.1434114669 0.12489705 -0.337797955 2.80699074 -0.266168936 #> 85 0.9009119332 1.03536539 -0.337797955 -0.52936311 -0.269009738 #> 86 -0.1507314908 -0.19669040 -0.337797955 -0.55200805 -0.269009738 #> 87 -0.1507314908 0.47989617 -0.337797955 0.46701406 -0.240601719 #> 88 -0.1141313716 0.53419016 2.304637886 -0.34820363 -0.192308086 #> 89 -0.1507314908 -0.38880757 -0.337797955 -0.29536544 0.398578714 #> 90 -0.0214110697 -0.38880757 -0.337797955 -0.07646440 -0.266168936 #> 91 -0.1434114669 -0.38880757 -0.332163763 -0.46897662 -0.246283323 #> 92 -0.1482914828 1.78712825 -0.337797955 -0.55200805 -0.169581671 #> 93 -0.1507314908 -0.38880757 -0.337797955 -0.39349350 -0.240601719 #> 94 -0.1482914828 -0.32616067 1.284849214 -0.29536544 -0.158218463 #> 95 -0.0824112683 -0.35121943 -0.337797955 -0.25007557 -0.269009738 #> 96 -0.0580111889 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 97 0.3909502729 -0.38880757 -0.337797955 -0.52936311 -0.266168936 #> 98 -0.1482914828 1.37365868 -0.337797955 -0.03117453 -0.266168936 #> 99 0.0005490018 -0.35539589 -0.337797955 -0.55200805 -0.269009738 #> 100 0.1786695817 -0.38463111 -0.337797955 -0.55200805 8.500545795 #> 101 -0.0946113080 -0.37210173 -0.247650890 -0.01607790 -0.266168936 #> 102 -0.1434114669 -0.38880757 -0.332163763 -0.42368675 -0.263328134 #> 103 -0.1019313319 -0.38880757 -0.337797955 0.73875328 -0.237760917 #> 104 -0.1482914828 0.41724927 1.160897000 -0.55200805 -0.251964926 #> 105 -0.1263314113 -0.38880757 -0.337797955 -0.52936311 -0.118447236 #> 106 0.5324707336 -0.38463111 0.496062396 -0.55200805 -0.269009738 #> 107 -0.1507314908 1.03954186 -0.337797955 0.11224340 -0.172422473 #> 108 -0.1385314510 -0.38880757 -0.337797955 -0.34820363 -0.095720821 #> 109 -0.1214513954 -0.38045465 -0.337797955 0.74630160 -0.269009738 #> 110 -0.1458514749 -0.38463111 -0.337797955 -0.47652493 -0.266168936 #> 111 -0.1507314908 -0.38463111 -0.337797955 -0.03872284 -0.269009738 #> 112 -0.0165310538 -0.17163164 -0.337797955 0.17262989 -0.263328134 #> 113 0.0200690653 -0.38880757 -0.337797955 -0.45387999 -0.200830492 #> 114 -0.1507314908 -0.32198421 -0.337797955 -0.42368675 -0.075835207 #> 115 -0.1507314908 -0.09645535 -0.337797955 -0.38594519 0.120180125 #> 116 0.1323094308 -0.35539589 -0.332163763 0.55759380 -0.206512096 #> 117 -0.1507314908 -0.30945483 1.476411727 -0.49162155 -0.260487332 #> 118 -0.1434114669 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 119 -0.1507314908 -0.38880757 
-0.337797955 0.57269042 -0.269009738 #> 120 -0.1409714590 -0.38045465 -0.332163763 0.88971952 -0.269009738 #> 121 -0.1507314908 -0.38880757 -0.332163763 -0.48407324 -0.269009738 #> 122 -0.1507314908 3.68741770 -0.337797955 -0.55200805 -0.030382377 #> 123 -0.1458514749 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 124 -0.1019313319 -0.10063181 -0.337797955 0.85952627 -0.215034501 #> 125 -0.1287714193 -0.29692545 -0.337797955 0.49720730 -0.217875303 #> 126 -0.1092513557 0.78477778 -0.337797955 -0.10665765 0.228130598 #> 127 -0.1434114669 -0.38880757 -0.337797955 0.17262989 0.151428946 #> 128 -0.1360914431 -0.38045465 -0.332163763 -0.37839688 0.012229652 #> 129 -0.1507314908 -0.38880757 -0.337797955 -0.53691142 0.179836966 #> 130 -0.1482914828 0.61354290 -0.337797955 -0.35575194 1.557625898 #> 131 -0.1409714590 -0.38880757 -0.337797955 1.72003383 -0.234920115 #> 132 -0.1190113875 -0.34286651 -0.332163763 0.27830626 -0.269009738 #> 133 -0.1385314510 0.68454273 6.113351379 0.40662756 -0.146855255 #> 134 -0.1507314908 -0.38880757 -0.337797955 -0.43878337 -0.269009738 #> 135 -0.1336514351 -0.37210173 -0.332163763 -0.53691142 -0.260487332 #> 136 -0.1507314908 0.21260271 -0.337797955 -0.35575194 -0.254805728 #> 137 -0.1360914431 -0.38880757 -0.281456039 -0.55200805 -0.269009738 #> 138 -0.1409714590 1.77042241 -0.332163763 0.11224340 -0.124128840 #> 139 -0.1507314908 0.57595476 0.056595454 -0.52181480 -0.254805728 #> 140 -0.0458111492 0.54254308 -0.337797955 -0.55200805 -0.237760917 #> 141 -0.1507314908 0.12489705 -0.337797955 -0.40104181 -0.192308086 #> 142 -0.1482914828 0.18336749 -0.315261189 -0.55200805 -0.183785680 #> 143 -0.1238914034 -0.36374881 -0.337797955 -0.45387999 -0.243442521 #> 144 -0.1482914828 -0.38880757 1.955318009 -0.24252726 0.441190742 #> 145 -0.1312114272 -0.35957235 -0.337797955 -0.55200805 -0.260487332 #> 146 -0.1507314908 -0.10898473 -0.270187656 -0.55200805 0.784927775 #> 147 -0.0580111889 -0.38880757 -0.332163763 -0.55200805 -0.269009738 #> 148 -0.1507314908 -0.36792527 1.521485259 -0.51426649 -0.001974357 #> 149 0.2201497168 -0.33869005 -0.337797955 0.32359613 -0.269009738 #> 150 -0.0677712207 -0.38880757 -0.337797955 0.21791976 0.509369989 #> 151 -0.1507314908 -0.23845500 -0.337797955 -0.49162155 0.023592860 #> 152 -0.1482914828 -0.38463111 -0.337797955 0.77649484 -0.263328134 #> 153 -0.1482914828 -0.38880757 -0.292724422 -0.06136778 0.162792154 #> 154 -0.1385314510 -0.36374881 -0.337797955 -0.55200805 4.418313433 #> 155 0.2665098677 -0.32198421 -0.337797955 1.95403150 0.091772106 #> 156 -0.1482914828 -0.16745518 -0.337797955 0.35378938 -0.254805728 #> 157 0.4812305668 -0.37210173 -0.332163763 -0.55200805 -0.223556907 #> 158 -0.0824112683 2.04606879 -0.337797955 -0.51426649 0.052000879 #> 159 -0.1263314113 -0.10063181 -0.337797955 -0.53691142 -0.263328134 #> 160 -0.1482914828 -0.38880757 0.203084435 4.20342844 -0.260487332 #> 161 -0.1507314908 -0.38880757 0.974968678 0.32359613 -0.269009738 #> 162 -0.0994913239 -0.38880757 -0.337797955 -0.55200805 -0.263328134 #> 163 -0.1507314908 -0.18416102 -0.337797955 0.35378938 -0.269009738 #> 164 0.1079093513 -0.37627819 -0.163138017 0.90481615 -0.266168936 #> 165 -0.1287714193 -0.37627819 -0.337797955 -0.50671818 -0.237760917 #> 166 0.0347091130 0.50495493 -0.337797955 -0.54445974 5.517703777 #> 167 -0.1507314908 0.04136784 -0.337797955 -0.55200805 -0.269009738 #> 168 -0.1482914828 -0.38463111 -0.337797955 -0.55200805 -0.266168936 #> 169 -0.1482914828 -0.38880757 2.535639740 -0.55200805 -0.240601719 #> 170 0.5861509084 
-0.38463111 -0.337797955 -0.55200805 0.941171881 #> 171 -0.1507314908 -0.29274899 -0.337797955 -0.50671818 -0.260487332 #> 172 -0.0799712604 -0.22592562 0.005887730 -0.35575194 -0.144014453 #> 173 0.0127490415 -0.33869005 -0.264553465 -0.12175427 -0.257646530 #> 174 -0.1507314908 -0.38463111 -0.208211549 -0.15949583 -0.001974357 #> 175 -0.1458514749 0.56342538 -0.298358614 0.11224340 -0.260487332 #> 176 -0.1312114272 1.81218701 -0.337797955 0.33869275 -0.266168936 #> 177 -0.1507314908 -0.31363129 1.279215022 -0.28781713 -0.269009738 #> 178 -0.0775312524 -0.38463111 -0.337797955 -0.55200805 -0.215034501 #> 179 0.1298694228 -0.33451359 -0.337797955 2.56544476 -0.269009738 #> 180 0.3445901219 -0.33033713 0.890455805 -0.37084856 0.091772106 #> 181 -0.1507314908 2.17136260 0.777771974 -0.43878337 -0.269009738 #> 182 -0.1507314908 5.69629511 -0.337797955 -0.50671818 -0.115606434 #> 183 -0.0994913239 -0.38045465 -0.337797955 -0.53691142 -0.269009738 #> 184 0.0371491210 -0.20086686 -0.095527718 -0.25762388 -0.223556907 #> 185 -0.1507314908 -0.38880757 2.259564353 0.05940521 -0.234920115 #> 186 -0.1385314510 -0.35957235 -0.089893526 -0.54445974 0.375852298 #> 187 -0.1360914431 -0.38880757 -0.337797955 -0.55200805 -0.246283323 #> 188 -0.1092513557 -0.38880757 -0.337797955 1.79551695 -0.266168936 #> 189 -0.1165713795 -0.36792527 0.417183714 -0.52936311 -0.246283323 #> 190 -0.1507314908 -0.35957235 -0.337797955 -0.34065532 -0.269009738 #> 191 -0.0628912048 -0.29692545 -0.337797955 0.72365666 -0.266168936 #> 192 -0.0189710618 -0.38463111 2.693397103 0.36888600 7.210821722 #> 193 -0.1360914431 -0.38880757 -0.337797955 0.26320964 -0.186626482 #> 194 0.0298290971 -0.38880757 -0.337797955 2.06725618 0.515051592 #> 195 -0.1458514749 -0.38880757 -0.337797955 -0.44633168 -0.269009738 #> 196 -0.1312114272 -0.38880757 -0.337797955 2.57299307 -0.269009738 #> 197 -0.1190113875 -0.34704297 2.225759204 -0.52936311 -0.257646530 #> 198 0.4446304476 -0.38880757 -0.332163763 0.83688134 -0.269009738 #> 199 0.0200690653 -0.38880757 -0.337797955 -0.54445974 0.128702531 #> 200 -0.1092513557 7.49217304 -0.337797955 -0.15194752 -0.269009738 #> Otu00039 Otu00040 Otu00041 Otu00042 Otu00043 #> 1 -0.369691676 -0.20704023 0.122728281 0.690525991 0.719828577 #> 2 0.504524822 -0.32139200 -0.630775883 -0.301679743 -0.243967502 #> 3 -0.439414464 0.35201286 0.855588495 -0.293479696 -0.461086399 #> 4 0.064734927 -0.33409775 -0.620453908 0.641325706 -0.127464679 #> 5 0.252450126 -0.85503359 4.860514738 2.211634782 -0.461086399 #> 6 -0.214156225 0.05978056 0.277557904 -0.301679743 0.545074343 #> 7 -0.385781550 -0.81691633 -0.424336386 -0.301679743 0.126723298 #> 8 -0.278515722 0.30118985 -0.661741808 -0.301679743 -0.381652656 #> 9 -0.133706855 -0.33409775 3.467048133 -0.297579720 -0.455790816 #> 10 -0.412598007 -0.46115527 0.071118407 -0.301679743 -0.461086399 #> 11 0.102277967 0.50448189 -0.661741808 -0.301679743 -0.461086399 #> 12 -0.417961299 -0.63903580 0.081440382 -0.301679743 0.312068697 #> 13 0.080824801 0.37742437 0.205304080 -0.010578061 -0.461086399 #> 14 -0.396508133 -0.55009554 0.298201853 4.581448478 -0.095691182 #> 15 -0.289242305 -0.37221501 1.712312408 3.257140824 -0.026848605 #> 16 -0.439414464 0.75859693 -0.651419833 -0.301679743 0.539778760 #> 17 -0.289242305 -0.33409775 0.659470973 -0.301679743 0.269704035 #> 18 -0.251699265 0.17413233 -0.155965040 -0.277079601 -0.005666274 #> 19 -0.058620775 -0.60091855 0.628505049 -0.256579483 -0.164533759 #> 20 1.362651445 1.52094206 -0.372726512 -0.297579720 -0.461086399 #> 21 
-0.439414464 4.04938672 -0.661741808 -0.301679743 -0.455790816 #> 22 -0.310695471 -0.85503359 -0.661741808 -0.256579483 -0.249263085 #> 23 -0.407234716 0.79671419 -0.021779367 -0.297579720 0.132018880 #> 24 -0.305332179 1.34306153 1.640058584 -0.236079364 -0.365765907 #> 25 -0.439414464 0.25036685 -0.651419833 -0.301679743 -0.461086399 #> 26 -0.434051173 -0.74068182 0.721402822 -0.289379672 0.010220475 #> 27 -0.439414464 -0.85503359 -0.641097858 -0.231979341 -0.424017319 #> 28 -0.230246100 -0.57550704 -0.558522059 -0.002378014 -0.418721736 #> 29 0.466981782 -0.72797607 -0.290150713 -0.301679743 -0.392243822 #> 30 8.093582148 -0.74068182 -0.455302311 -0.268879554 3.399393499 #> 31 -0.310695471 0.14872083 -0.661741808 -0.297579720 -0.455790816 #> 32 -0.439414464 -0.30868625 -0.661741808 -0.281179625 -0.424017319 #> 33 -0.192703060 1.16518100 -0.630775883 -0.301679743 1.180544285 #> 34 0.139821007 0.84753719 0.174338155 -0.289379672 -0.413426153 #> 35 -0.273152431 -0.10539421 -0.475946260 -0.301679743 -0.085100016 #> 36 -0.332148636 1.02541772 -0.661741808 -0.297579720 -0.413426153 #> 37 0.542067861 -0.63903580 -0.269506763 -0.301679743 -0.053326519 #> 38 -0.439414464 -0.85503359 -0.651419833 -0.301679743 -0.461086399 #> 39 -0.417961299 -0.14351147 1.412975137 -0.301679743 -0.249263085 #> 40 0.247086835 -0.29598050 -0.114677141 -0.297579720 0.184974709 #> 41 0.043281762 0.31389561 -0.434658361 -0.301679743 -0.238671919 #> 42 -0.412598007 0.14872083 -0.279828738 -0.260679507 -0.392243822 #> 43 -0.439414464 -0.85503359 -0.641097858 -0.301679743 -0.429312902 #> 44 -0.203429643 -0.85503359 0.287879879 -0.289379672 -0.344583576 #> 45 -0.428687881 -0.82962208 -0.475946260 -0.301679743 -0.339287993 #> 46 0.129094424 0.37742437 -0.506912185 -0.252479459 -0.461086399 #> 47 -0.428687881 -0.80421058 -0.032101342 -0.297579720 0.290886366 #> 48 0.123731133 -0.05457121 -0.166287015 -0.301679743 -0.461086399 #> 49 -0.230246100 -0.62633005 -0.424336386 -0.301679743 0.820444651 #> 50 -0.417961299 0.16142658 0.019508532 -0.297579720 0.449753851 #> 51 0.450891908 -0.43574377 -0.455302311 -0.297579720 -0.461086399 #> 52 0.214907086 -0.74068182 -0.465624286 4.749549449 -0.302218913 #> 53 -0.434051173 0.17413233 -0.620453908 0.973427626 -0.461086399 #> 54 -0.439414464 1.10165224 -0.661741808 -0.297579720 -0.450495233 #> 55 -0.037167609 -0.37221501 0.225948029 -0.301679743 0.412684771 #> 56 -0.439414464 -0.85503359 -0.661741808 1.563831038 -0.461086399 #> 57 -0.235609391 -0.51197828 -0.434658361 1.157928692 -0.386948239 #> 58 -0.369691676 -0.84232784 -0.641097858 -0.293479696 -0.445199650 #> 59 -0.026441027 1.69882259 2.032293628 -0.293479696 -0.445199650 #> 60 -0.305332179 0.13601508 -0.228218864 -0.277079601 -0.010961856 #> 61 -0.412598007 -0.48656678 2.352274849 -0.293479696 -0.445199650 #> 62 -0.026441027 0.19954384 -0.290150713 -0.289379672 -0.439904067 #> 63 0.096914676 2.25787568 -0.073389241 -0.293479696 -0.445199650 #> 64 1.389467902 -0.32139200 -0.651419833 -0.289379672 0.052585138 #> 65 -0.439414464 -0.85503359 -0.424336386 -0.301679743 5.326985656 #> 66 -0.010351152 1.20329825 0.143372231 -0.301679743 -0.461086399 #> 67 -0.407234716 -0.81691633 -0.506912185 3.232540682 2.599760488 #> 68 -0.396508133 -0.55009554 1.784566232 -0.301679743 -0.455790816 #> 69 -0.316058762 0.40283587 -0.661741808 -0.301679743 0.063176303 #> 70 -0.273152431 -0.20704023 -0.661741808 -0.297579720 -0.455790816 #> 71 1.603999558 0.40283587 -0.114677141 -0.301679743 -0.381652656 #> 72 -0.273152431 0.05978056 -0.661741808 
-0.301679743 -0.450495233 #> 73 -0.417961299 0.08519207 1.113637867 -0.301679743 -0.286332165 #> 74 0.048645053 0.26307260 -0.197252939 -0.297579720 0.211452623 #> 75 -0.310695471 -0.24515749 1.268467489 -0.297579720 0.788671154 #> 76 -0.257062557 -0.85503359 -0.114677141 -0.293479696 -0.116873513 #> 77 -0.358965093 -0.56280129 1.361365263 -0.289379672 -0.418721736 #> 78 -0.439414464 -0.43574377 1.144603791 -0.297579720 -0.461086399 #> 79 -0.396508133 -0.39762651 -0.052745291 -0.301679743 0.089654218 #> 80 -0.439414464 -0.81691633 -0.661741808 -0.301679743 -0.461086399 #> 81 -0.423324590 -0.23245173 -0.661741808 -0.301679743 -0.233376336 #> 82 -0.439414464 1.07624073 0.102084331 0.292823692 0.910469559 #> 83 3.760042699 0.92377171 -0.238540839 -0.297579720 -0.365765907 #> 84 2.816103414 3.09645532 -0.661741808 2.219834829 -0.450495233 #> 85 -0.439414464 -0.82962208 0.463353451 -0.100778582 0.274999617 #> 86 -0.439414464 -0.74068182 0.525285300 -0.297579720 -0.074508851 #> 87 0.820959014 -0.72797607 -0.279828738 -0.285279649 -0.402834987 #> 88 -0.273152431 -0.85503359 -0.651419833 -0.289379672 -0.333992410 #> 89 0.359715954 0.94918321 0.504641350 -0.293479696 -0.376357073 #> 90 -0.434051173 1.01271197 -0.661741808 -0.301679743 -0.461086399 #> 91 -0.391144842 -0.47386102 0.287879879 -0.301679743 -0.455790816 #> 92 -0.283879014 -0.84232784 -0.651419833 -0.301679743 -0.392243822 #> 93 -0.181976477 -0.85503359 -0.661741808 -0.297579720 -0.307514496 #> 94 -0.364328385 -0.85503359 -0.661741808 -0.297579720 -0.455790816 #> 95 -0.251699265 -0.34680350 0.463353451 -0.297579720 0.666872748 #> 96 -0.439414464 -0.09268846 0.153694206 -0.301679743 -0.461086399 #> 97 0.912134968 1.03812348 -0.641097858 -0.301679743 -0.439904067 #> 98 0.096914676 -0.51197828 0.834944546 -0.301679743 -0.461086399 #> 99 0.075461510 0.49177614 -0.661741808 -0.301679743 6.846817934 #> 100 -0.439414464 -0.85503359 -0.620453908 -0.289379672 4.109001601 #> 101 -0.294605596 -0.68985881 -0.372726512 -0.293479696 1.127588456 #> 102 -0.160523311 -0.65174155 -0.517234160 -0.244279412 -0.376357073 #> 103 -0.214156225 1.57176506 -0.589487984 -0.174579009 -0.386948239 #> 104 2.767833791 1.35576728 -0.383048487 -0.297579720 -0.450495233 #> 105 -0.407234716 -0.49927253 0.019508532 0.219023266 0.417980354 #> 106 1.051580544 -0.71527031 0.060796432 -0.301679743 2.864539631 #> 107 -0.396508133 -0.05457121 -0.444980336 -0.301679743 0.476231766 #> 108 -0.439414464 2.90586903 -0.661741808 0.145222839 -0.439904067 #> 109 -0.348238510 0.98730047 -0.630775883 -0.297579720 1.350002936 #> 110 0.134457715 -0.58821279 0.029830507 0.719226157 -0.016257439 #> 111 -0.364328385 -0.65174155 -0.661741808 -0.244279412 -0.445199650 #> 112 -0.439414464 4.51949955 0.339489753 -0.301679743 4.956294857 #> 113 -0.198066351 -0.85503359 -0.661741808 1.752432128 -0.455790816 #> 114 -0.171249894 -0.60091855 2.589680270 -0.297579720 -0.286332165 #> 115 -0.348238510 -0.04186545 -0.661741808 -0.301679743 0.089654218 #> 116 -0.181976477 -0.52468403 -0.001135417 -0.108978630 -0.291627748 #> 117 -0.396508133 0.04707481 0.969130219 -0.301679743 -0.461086399 #> 118 -0.439414464 -0.23245173 2.259377075 -0.301679743 -0.461086399 #> 119 0.107641258 -0.85503359 2.042615603 -0.293479696 -0.461086399 #> 120 6.806392213 1.94023187 -0.651419833 -0.297579720 -0.455790816 #> 121 -0.401871424 -0.65174155 1.113637867 0.018122105 -0.206898422 #> 122 0.745872935 -0.71527031 -0.661741808 1.756532152 -0.455790816 #> 123 -0.439414464 -0.85503359 -0.465624286 -0.297579720 -0.455790816 #> 124 
0.761962809 0.93647746 -0.661741808 -0.297579720 -0.461086399 #> 125 -0.428687881 0.94918321 -0.558522059 0.624925612 -0.429312902 #> 126 0.037918470 -0.42303802 0.422065552 0.895527176 -0.461086399 #> 127 -0.122980272 -0.84232784 1.825854131 -0.297579720 -0.445199650 #> 128 0.155910881 -0.56280129 -0.661741808 -0.301679743 -0.243967502 #> 129 0.649333689 -0.66444731 -0.537878109 -0.301679743 -0.281036582 #> 130 -0.385781550 0.36471861 -0.166287015 -0.301679743 -0.461086399 #> 131 -0.439414464 -0.85503359 -0.589487984 -0.256579483 -0.450495233 #> 132 0.155910881 -0.33409775 -0.599809959 0.268223550 1.662442324 #> 133 0.155910881 1.68611683 -0.661741808 -0.301679743 -0.455790816 #> 134 -0.326785345 0.12330932 0.463353451 -0.301679743 1.620077661 #> 135 -0.139070146 0.80941994 -0.651419833 -0.301679743 -0.434608484 #> 136 -0.149796729 -0.21974598 2.114869427 -0.281179625 0.073767469 #> 137 -0.353601802 0.46636463 -0.661741808 0.743826299 -0.058622102 #> 138 -0.101527106 -0.39762651 -0.661741808 2.387935801 -0.461086399 #> 139 -0.149796729 -0.21974598 0.277557904 -0.301679743 -0.217489588 #> 140 0.525977987 1.19059250 0.164016180 -0.301679743 -0.461086399 #> 141 -0.332148636 -0.74068182 0.618183074 1.990233502 0.184974709 #> 142 -0.434051173 -0.84232784 -0.641097858 -0.289379672 -0.333992410 #> 143 1.587909684 -0.66444731 -0.465624286 -0.297579720 -0.318105662 #> 144 -0.439414464 -0.21974598 -0.362404537 -0.301679743 0.492118514 #> 145 -0.321422053 -0.85503359 -0.444980336 -0.281179625 1.561826250 #> 146 -0.342875219 -0.76609332 -0.475946260 9.243175419 -0.450495233 #> 147 -0.439414464 -0.85503359 -0.455302311 -0.293479696 -0.461086399 #> 148 -0.434051173 0.40283587 2.909661491 -0.301679743 0.889287228 #> 149 -0.439414464 -0.52468403 -0.403692436 -0.301679743 -0.461086399 #> 150 -0.439414464 0.45365888 0.308523828 -0.297579720 -0.376357073 #> 151 0.032555179 -0.70256456 0.287879879 -0.301679743 -0.461086399 #> 152 -0.004987861 0.96188896 -0.300472688 -0.002378014 -0.461086399 #> 153 -0.358965093 -0.81691633 6.832011934 -0.293479696 -0.461086399 #> 154 -0.412598007 0.31389561 -0.269506763 -0.297579720 0.169087960 #> 155 0.102277967 0.59342215 -0.630775883 -0.100778582 0.121427715 #> 156 -0.439414464 4.15103274 -0.290150713 -0.301679743 -0.461086399 #> 157 -0.439414464 -0.85503359 -0.630775883 -0.301679743 -0.355174742 #> 158 0.107641258 -0.47386102 0.215626055 -0.301679743 -0.386948239 #> 159 -0.031804318 -0.13080572 0.153694206 -0.281179625 -0.318105662 #> 160 1.169572955 -0.77879908 -0.630775883 -0.301679743 -0.429312902 #> 161 -0.332148636 0.22495534 -0.630775883 -0.301679743 -0.461086399 #> 162 -0.417961299 -0.01645395 -0.661741808 -0.297579720 -0.450495233 #> 163 -0.042530901 0.21224959 -0.599809959 -0.301679743 -0.455790816 #> 164 -0.407234716 1.95293763 -0.114677141 -0.281179625 -0.445199650 #> 165 -0.364328385 2.10540665 -0.610131933 -0.301679743 0.592734588 #> 166 -0.439414464 -0.85503359 -0.661741808 -0.301679743 -0.455790816 #> 167 -0.439414464 -0.85503359 -0.434658361 -0.301679743 -0.461086399 #> 168 3.373885719 -0.06727696 -0.661741808 -0.223779293 -0.450495233 #> 169 0.359715954 -0.84232784 -0.589487984 0.124722721 -0.185716091 #> 170 -0.439414464 -0.85503359 -0.661741808 -0.297579720 -0.461086399 #> 171 -0.391144842 -0.28327474 0.525285300 -0.301679743 0.635099251 #> 172 -0.439414464 0.05978056 -0.465624286 -0.240179388 0.862809314 #> 173 -0.417961299 -0.76609332 -0.630775883 -0.301679743 3.341142087 #> 174 0.338262788 -0.15621722 0.680114923 -0.301679743 -0.085100016 #> 175 
0.005738722 -0.04186545 1.010418118 -0.277079601 -0.455790816 #> 176 -0.439414464 -0.85503359 -0.661741808 0.501924901 -0.461086399 #> 177 -0.391144842 -0.43574377 -0.032101342 -0.293479696 -0.058622102 #> 178 -0.439414464 1.39388453 -0.145643065 -0.301679743 -0.461086399 #> 179 -0.439414464 0.61883366 -0.661741808 -0.301679743 -0.445199650 #> 180 -0.369691676 -0.49927253 0.164016180 -0.301679743 -0.069213268 #> 181 -0.267789139 -0.39762651 0.081440382 0.961127555 -0.153942593 #> 182 3.111084440 1.03812348 -0.661741808 -0.178679033 -0.439904067 #> 183 -0.198066351 -0.51197828 -0.290150713 -0.301679743 -0.196307256 #> 184 -0.353601802 -0.70256456 2.486460522 -0.293479696 -0.408130570 #> 185 -0.439414464 1.22870976 -0.496590210 -0.281179625 -0.381652656 #> 186 -0.407234716 -0.85503359 -0.661741808 -0.293479696 -0.413426153 #> 187 -0.439414464 -0.85503359 0.607861099 -0.301679743 -0.455790816 #> 188 -0.439414464 3.94774071 -0.661741808 -0.268879554 -0.445199650 #> 189 -0.423324590 -0.84232784 -0.527556135 -0.256579483 -0.333992410 #> 190 -0.321422053 -0.41033226 1.805210182 -0.285279649 -0.397539405 #> 191 0.134457715 -0.62633005 -0.661741808 0.014022081 -0.386948239 #> 192 -0.439414464 1.52094206 -0.661741808 -0.301679743 -0.450495233 #> 193 -0.412598007 -0.09268846 -0.094033191 -0.289379672 0.455049434 #> 194 -0.423324590 0.98730047 -0.527556135 -0.133578772 -0.392243822 #> 195 -0.375054967 -0.15621722 0.236270004 -0.297579720 1.090519376 #> 196 -0.144433437 -0.85503359 -0.661741808 0.104222602 -0.450495233 #> 197 -0.439414464 -0.82962208 -0.001135417 -0.293479696 -0.376357073 #> 198 0.692240021 -0.81691633 -0.661741808 -0.301679743 -0.445199650 #> 199 -0.423324590 -0.75338757 -0.290150713 -0.293479696 -0.191011673 #> 200 0.445528616 0.11060357 0.494319376 -0.301679743 -0.392243822 #> Otu00044 Otu00045 Otu00046 Otu00047 Otu00048 Otu00049 #> 1 -0.611704260 -0.23391339 0.693551357 -0.203512195 -0.253544727 0.60651290 #> 2 -0.622709104 -0.23391339 -0.569110688 -0.208661143 -0.253544727 -0.42970775 #> 3 0.026576699 -0.23391339 -0.584323484 0.342276360 0.007337307 -0.42161228 #> 4 0.092605763 -0.23391339 -0.523472301 -0.208661143 -0.253544727 -0.43780323 #> 5 -0.303568625 -0.14075174 -0.584323484 -0.208661143 -0.194846269 0.42841248 #> 6 -0.259549248 -0.23391339 0.784828131 -0.208661143 -0.253544727 -0.43780323 #> 7 0.829930318 -0.23391339 -0.584323484 -0.033596890 -0.247022676 0.01554331 #> 8 -0.204525028 -0.23391339 0.221954690 -0.208661143 -0.253544727 -0.33256207 #> 9 -0.534670351 -0.23391339 -0.584323484 -0.208661143 -0.070927303 -0.31637112 #> 10 -0.446631598 -0.23391339 -0.584323484 0.501893767 -0.207890371 -0.42970775 #> 11 0.235668737 -0.23391339 1.895362219 -0.203512195 -0.247022676 -0.43780323 #> 12 -0.622709104 -0.23391339 -0.188790795 -0.208661143 -0.116581659 -0.40542133 #> 13 -0.314573469 -0.23391339 -0.584323484 -0.208661143 -0.227456524 -0.42161228 #> 14 -0.578689727 -0.18733256 0.298018668 -0.208661143 -0.253544727 -0.43780323 #> 15 0.884954539 -0.23391339 1.180360820 -0.208661143 -0.253544727 -0.40542133 #> 16 -0.611704260 -0.10348707 -0.584323484 -0.193214297 -0.253544727 8.67770035 #> 17 0.004567010 -0.23391339 0.678338561 -0.208661143 -0.207890371 -0.41351681 #> 18 -0.215529872 -0.23391339 0.632700174 -0.203512195 -0.253544727 -0.43780323 #> 19 0.169639672 -0.23391339 -0.386557139 -0.208661143 -0.253544727 0.68746764 #> 20 -0.402612222 0.55174991 -0.584323484 -0.208661143 -0.247022676 -0.43780323 #> 21 -0.600699416 -0.23391339 -0.477833914 -0.208661143 1.142174157 
-0.42161228 #> 22 0.488780151 -0.23391339 -0.234429182 -0.203512195 -0.227456524 -0.42970775 #> 23 -0.039452366 -0.23391339 -0.097514021 -0.208661143 -0.247022676 0.50936722 #> 24 6.431395968 -0.23391339 1.119509637 0.316531617 -0.253544727 -0.13017522 #> 25 -0.435626754 -0.23391339 -0.584323484 -0.208661143 -0.207890371 -0.43780323 #> 26 0.279688113 -0.23391339 -0.127939612 -0.203512195 0.626932139 -0.43780323 #> 27 -0.732757545 -0.23391339 -0.584323484 -0.208661143 5.707609757 1.02747754 #> 28 -0.380602533 -0.23391339 -0.584323484 -0.208661143 -0.253544727 -0.43780323 #> 29 -0.633713948 -0.23080800 -0.219216386 0.002445751 -0.253544727 0.03982973 #> 30 -0.545675195 -0.23080800 -0.295280365 -0.203512195 -0.253544727 -0.43780323 #> 31 -0.644718792 -0.23391339 -0.584323484 -0.208661143 -0.207890371 -0.26779828 #> 32 -0.226534716 5.84954278 -0.584323484 -0.208661143 -0.253544727 -0.43780323 #> 33 0.026576699 -0.23391339 1.073871250 -0.141724811 -0.253544727 0.52555816 #> 34 -0.655723636 -0.23391339 0.982594476 3.756029300 0.920424427 -0.02493406 #> 35 -0.347588001 -0.23080800 -0.264854773 -0.208661143 -0.240500625 0.26650300 #> 36 -0.721752701 -0.23391339 -0.584323484 -0.208661143 0.033425511 -0.28398922 #> 37 1.677303314 -0.23391339 0.510997808 -0.208661143 -0.097015507 -0.38113491 #> 38 0.829930318 -0.23391339 0.008975549 -0.208661143 -0.233978575 -0.12207975 #> 39 -0.006437834 7.04201198 0.754402540 -0.208661143 -0.253544727 0.12078447 #> 40 0.180644516 -0.23080800 1.256424799 -0.208661143 -0.253544727 -0.41351681 #> 41 -0.138495963 -0.23080800 0.008975549 -0.208661143 -0.247022676 0.48508079 #> 42 -0.292563781 -0.22459723 -0.493046709 -0.193214297 0.274741392 -0.41351681 #> 43 -0.523665507 -0.23391339 -0.584323484 -0.208661143 1.311747479 -0.34065754 #> 44 -0.094476587 -0.14385712 2.153979746 -0.208661143 -0.227456524 -0.36494396 #> 45 0.202654204 -0.23391339 -0.462621118 -0.208661143 1.279137225 0.19364374 #> 46 -0.380602533 -0.23391339 -0.569110688 -0.188065349 -0.194846269 -0.42161228 #> 47 3.206976645 -0.23391339 -0.386557139 0.661511175 0.079079867 -0.37303944 #> 48 -0.600699416 -0.23080800 -0.584323484 -0.208661143 -0.220934473 -0.43780323 #> 49 -0.380602533 -0.23391339 -0.386557139 -0.208661143 -0.207890371 -0.08969785 #> 50 -0.490650974 -0.23391339 0.100252324 -0.203512195 1.670460276 -0.31637112 #> 51 -0.215529872 -0.23391339 0.419721034 -0.208661143 -0.253544727 -0.43780323 #> 52 -0.688738168 -0.23391339 0.997807271 -0.208661143 -0.253544727 -0.43780323 #> 53 -0.721752701 -0.23391339 -0.584323484 -0.208661143 0.046469612 -0.43780323 #> 54 -0.534670351 -0.22770262 -0.188790795 -0.208661143 0.366050104 -0.42161228 #> 55 -0.248544404 -0.23391339 3.918664050 -0.208661143 -0.253544727 -0.43780323 #> 56 -0.732757545 -0.23391339 -0.584323484 -0.208661143 -0.220934473 -0.42970775 #> 57 -0.127491119 -0.02274697 -0.508259505 -0.208661143 -0.253544727 -0.17065259 #> 58 -0.721752701 -0.23391339 0.176316302 -0.198363246 -0.247022676 -0.34065754 #> 59 -0.325578313 -0.23391339 -0.371344344 -0.203512195 -0.240500625 -0.38923038 #> 60 0.323707489 0.39026971 -0.538685096 -0.208661143 -0.253544727 0.08840257 #> 61 1.226104706 -0.23391339 -0.584323484 -0.208661143 -0.253544727 -0.42161228 #> 62 -0.699743012 -0.23391339 -0.416982731 -0.203512195 0.079079867 0.25031205 #> 63 -0.501655819 2.33734833 -0.477833914 -0.203512195 -0.175280117 -0.42970775 #> 64 -0.567684883 -0.23391339 0.510997808 -0.203512195 -0.240500625 -0.41351681 #> 65 -0.468641286 -0.23080800 -0.219216386 -0.115980068 -0.253544727 
2.04750725 #> 66 0.983998136 -0.23391339 -0.082301225 -0.203512195 -0.149191913 -0.07350690 #> 67 -0.446631598 -0.23391339 -0.508259505 -0.018150044 -0.253544727 0.74413596 #> 68 1.435196744 -0.23391339 3.812174480 -0.208661143 -0.253544727 -0.43780323 #> 69 0.873949695 -0.23391339 5.303028460 -0.208661143 -0.227456524 1.06795491 #> 70 -0.534670351 -0.23391339 -0.584323484 -0.208661143 -0.083971405 -0.42970775 #> 71 0.433755930 -0.18422718 -0.553897892 -0.208661143 -0.240500625 0.54174911 #> 72 1.138065953 -0.23391339 -0.584323484 -0.208661143 -0.253544727 -0.07350690 #> 73 -0.369597689 -0.23391339 2.473448456 6.943228501 -0.227456524 -0.38923038 #> 74 -0.094476587 1.04550669 -0.386557139 -0.208661143 -0.253544727 -0.20303448 #> 75 -0.347588001 -0.23391339 0.374082647 -0.208661143 -0.253544727 -0.29208470 #> 76 -0.710747857 -0.23391339 -0.158365203 -0.208661143 0.646498291 -0.43780323 #> 77 0.510789839 -0.23080800 -0.553897892 -0.208661143 -0.253544727 -0.43780323 #> 78 -0.732757545 -0.23391339 -0.584323484 -0.208661143 0.033425511 -0.43780323 #> 79 0.048586387 4.98624476 -0.204003591 -0.208661143 -0.253544727 -0.08160238 #> 80 0.323707489 -0.23391339 -0.584323484 -0.208661143 -0.136147812 -0.43780323 #> 81 0.499784995 -0.23391339 0.997807271 -0.208661143 -0.253544727 0.09649805 #> 82 -0.732757545 -0.23391339 -0.584323484 -0.203512195 -0.129625761 -0.42161228 #> 83 0.147629984 -0.23080800 -0.356131548 -0.208661143 -0.240500625 -0.42161228 #> 84 -0.523665507 -0.23391339 -0.584323484 -0.208661143 -0.227456524 -0.30018017 #> 85 5.352921246 -0.19975412 -0.569110688 -0.208661143 -0.175280117 0.06411615 #> 86 -0.457636442 -0.23391339 -0.401769935 -0.208661143 0.248653189 -0.29208470 #> 87 0.081600919 -0.23391339 -0.553897892 -0.208661143 -0.240500625 -0.42970775 #> 88 -0.116486275 -0.23391339 -0.584323484 -0.208661143 -0.253544727 -0.34875301 #> 89 0.774906098 -0.23391339 1.773659853 -0.208661143 -0.253544727 -0.43780323 #> 90 -0.534670351 -0.22149184 -0.584323484 -0.208661143 0.666064444 -0.43780323 #> 91 -0.380602533 -0.23391339 1.682383079 -0.198363246 -0.253544727 -0.32446659 #> 92 0.499784995 -0.23391339 3.583982544 -0.208661143 -0.253544727 -0.39732586 #> 93 -0.633713948 -0.23391339 -0.538685096 -0.208661143 -0.253544727 -0.35684849 #> 94 -0.457636442 -0.23391339 0.419721034 -0.208661143 -0.253544727 -0.33256207 #> 95 -0.391607378 -0.23391339 0.298018668 -0.208661143 -0.083971405 -0.39732586 #> 96 -0.732757545 -0.23391339 -0.584323484 1.160959192 0.144300375 -0.43780323 #> 97 -0.369597689 -0.23080800 -0.584323484 -0.193214297 0.242131138 0.06411615 #> 98 -0.259549248 -0.23391339 0.434933830 -0.208661143 -0.253544727 -0.38113491 #> 99 -0.677733324 -0.23391339 -0.584323484 -0.208661143 -0.038317049 -0.39732586 #> 100 3.273005710 -0.23391339 -0.477833914 -0.208661143 -0.253544727 4.16042593 #> 101 0.554809216 -0.23391339 -0.553897892 -0.167469554 -0.057883201 1.04366849 #> 102 -0.710747857 0.20084100 -0.508259505 -0.208661143 -0.207890371 -0.42161228 #> 103 -0.435626754 -0.23391339 -0.584323484 -0.208661143 -0.175280117 0.14507089 #> 104 -0.600699416 -0.23080800 -0.584323484 -0.208661143 0.633454190 -0.33256207 #> 105 -0.281558936 -0.23391339 0.008975549 -0.208661143 -0.240500625 -0.36494396 #> 106 -0.479646130 -0.22770262 -0.097514021 -0.208661143 0.509535223 1.65892451 #> 107 0.213659048 -0.23391339 -0.569110688 -0.208661143 -0.253544727 1.18129155 #> 108 0.213659048 -0.23391339 -0.584323484 -0.208661143 -0.253544727 -0.36494396 #> 109 1.699313003 -0.22459723 1.210786411 -0.208661143 
-0.253544727 3.28611475 #> 110 2.260560052 -0.03206314 1.575893509 -0.208661143 -0.240500625 0.03173426 #> 111 1.908405041 -0.23391339 -0.462621118 -0.208661143 -0.253544727 -0.42161228 #> 112 0.686867345 -0.23391339 -0.584323484 0.120871569 -0.253544727 3.50469255 #> 113 0.466770463 -0.23391339 -0.584323484 -0.208661143 -0.233978575 -0.43780323 #> 114 0.653852813 6.28429718 1.560680713 -0.203512195 -0.253544727 -0.33256207 #> 115 -0.149500807 -0.23391339 1.530255122 -0.208661143 -0.247022676 2.12846199 #> 116 -0.314573469 -0.23391339 -0.493046709 -0.146873760 -0.207890371 -0.42970775 #> 117 -0.490650974 -0.23080800 -0.584323484 -0.208661143 -0.207890371 -0.42970775 #> 118 -0.710747857 -0.23080800 -0.584323484 2.715941677 -0.240500625 -0.43780323 #> 119 -0.380602533 -0.23391339 -0.584323484 3.169049157 -0.194846269 -0.41351681 #> 120 -0.600699416 -0.23080800 -0.584323484 -0.208661143 -0.253544727 -0.42970775 #> 121 -0.358592845 5.26883512 -0.584323484 -0.208661143 -0.253544727 0.08840257 #> 122 -0.501655819 -0.23080800 -0.432195526 -0.208661143 -0.253544727 -0.42970775 #> 123 -0.369597689 -0.22149184 -0.584323484 -0.038745838 -0.247022676 -0.43780323 #> 124 -0.402612222 -0.23391339 -0.569110688 -0.208661143 -0.247022676 -0.42970775 #> 125 0.664857657 -0.23391339 -0.508259505 -0.208661143 -0.227456524 -0.42161228 #> 126 -0.490650974 -0.23391339 1.438978347 -0.203512195 -0.201368320 -0.43780323 #> 127 -0.534670351 -0.23080800 -0.401769935 -0.203512195 -0.123103710 -0.34875301 #> 128 -0.644718792 -0.23391339 -0.523472301 -0.208661143 -0.253544727 0.36364869 #> 129 0.015571854 -0.23391339 -0.310493161 1.572875082 -0.253544727 0.71175406 #> 130 -0.094476587 -0.23391339 -0.584323484 -0.203512195 -0.253544727 -0.30827565 #> 131 -0.567684883 0.69770317 -0.584323484 -0.208661143 -0.025272947 -0.43780323 #> 132 -0.039452366 -0.23391339 0.860892110 -0.198363246 -0.253544727 1.01938207 #> 133 0.972993292 -0.23391339 -0.584323484 -0.208661143 -0.240500625 -0.37303944 #> 134 0.400741398 -0.23391339 1.895362219 -0.208661143 -0.253544727 -0.40542133 #> 135 -0.534670351 -0.22770262 -0.432195526 -0.208661143 -0.253544727 -0.10588880 #> 136 0.037581543 -0.23391339 -0.584323484 -0.208661143 -0.253544727 1.36748745 #> 137 -0.578689727 -0.23391339 -0.264854773 -0.208661143 -0.227456524 1.17319607 #> 138 0.928973915 -0.22770262 -0.584323484 -0.208661143 -0.201368320 -0.43780323 #> 139 -0.545675195 -0.11901402 -0.584323484 -0.208661143 -0.247022676 -0.21922543 #> 140 3.262000866 -0.23391339 -0.584323484 -0.203512195 -0.240500625 -0.43780323 #> 141 0.895959383 -0.22149184 -0.386557139 -0.208661143 -0.253544727 0.08840257 #> 142 -0.600699416 -0.23391339 -0.462621118 -0.208661143 -0.253544727 -0.42161228 #> 143 0.125620295 0.74428400 -0.584323484 -0.193214297 -0.240500625 0.82509070 #> 144 -0.468641286 -0.21217567 0.161103507 -0.136575862 -0.253544727 -0.34065754 #> 145 -0.160505651 -0.23391339 -0.584323484 -0.198363246 -0.240500625 -0.33256207 #> 146 -0.589694571 -0.22149184 4.146855986 -0.182916400 -0.253544727 -0.43780323 #> 147 -0.633713948 -0.23391339 -0.584323484 -0.208661143 0.137778324 -0.13017522 #> 148 -0.732757545 -0.23391339 -0.584323484 -0.208661143 -0.247022676 0.81699522 #> 149 -0.567684883 -0.23391339 0.298018668 -0.208661143 0.085601918 -0.42970775 #> 150 -0.732757545 -0.23391339 -0.553897892 -0.208661143 -0.162236015 -0.43780323 #> 151 -0.611704260 -0.23080800 -0.310493161 -0.208661143 -0.253544727 -0.43780323 #> 152 -0.281558936 -0.23391339 -0.584323484 0.980745990 -0.253544727 -0.43780323 #> 153 
-0.424621910 -0.23391339 2.777704371 9.152127462 -0.253544727 -0.31637112 #> 154 -0.699743012 -0.23391339 1.515042326 -0.208661143 -0.233978575 0.20983468 #> 155 -0.534670351 -0.23391339 -0.584323484 -0.208661143 -0.207890371 4.74330005 #> 156 -0.490650974 -0.23391339 -0.584323484 -0.208661143 0.020381409 -0.43780323 #> 157 -0.699743012 -0.22770262 -0.584323484 -0.208661143 11.623109885 -0.29208470 #> 158 2.271564896 -0.19975412 3.188449855 -0.208661143 -0.253544727 -0.43780323 #> 159 -0.622709104 -0.23391339 -0.584323484 -0.208661143 -0.175280117 -0.31637112 #> 160 -0.556680039 -0.23080800 -0.401769935 -0.208661143 -0.247022676 -0.43780323 #> 161 -0.567684883 0.65422773 -0.584323484 -0.208661143 -0.253544727 -0.43780323 #> 162 -0.501655819 -0.23391339 0.465359421 -0.208661143 -0.201368320 0.76032691 #> 163 1.369167679 0.46169364 1.241212003 -0.208661143 -0.253544727 -0.30018017 #> 164 -0.446631598 -0.23391339 -0.493046709 -0.198363246 0.222564986 -0.42970775 #> 165 0.400741398 -0.23080800 -0.553897892 -0.208661143 -0.240500625 -0.10588880 #> 166 -0.732757545 -0.23391339 -0.584323484 -0.208661143 1.540019259 -0.26779828 #> 167 -0.545675195 -0.23080800 0.480572217 0.337127411 -0.247022676 -0.39732586 #> 168 0.191649360 -0.23080800 -0.432195526 -0.208661143 -0.253544727 -0.43780323 #> 169 -0.512660663 -0.23391339 -0.432195526 -0.208661143 -0.175280117 0.88985449 #> 170 -0.721752701 -0.23080800 -0.584323484 -0.208661143 0.653020342 -0.36494396 #> 171 0.257678425 -0.23391339 1.362914369 -0.203512195 -0.181802168 -0.40542133 #> 172 -0.501655819 -0.19043795 -0.493046709 -0.208661143 -0.247022676 2.04750725 #> 173 -0.512660663 -0.23391339 -0.553897892 -0.208661143 0.326917799 2.76800443 #> 174 -0.677733324 1.07345519 -0.584323484 -0.208661143 -0.247022676 -0.37303944 #> 175 0.015571854 -0.23391339 -0.112726816 -0.203512195 -0.253544727 -0.43780323 #> 176 -0.358592845 -0.23391339 -0.569110688 -0.208661143 0.366050104 0.11268900 #> 177 0.059591231 0.80639177 -0.280067569 -0.208661143 -0.253544727 -0.43780323 #> 178 1.006007824 -0.23080800 -0.584323484 -0.208661143 0.561711630 -0.43780323 #> 179 -0.732757545 -0.23080800 -0.584323484 -0.208661143 -0.077449354 0.23412110 #> 180 -0.402612222 0.02693925 0.632700174 -0.188065349 -0.253544727 0.32317132 #> 181 -0.270554092 -0.23391339 0.008975549 0.450404281 -0.253544727 0.39603058 #> 182 0.609833436 -0.23391339 0.465359421 -0.208661143 -0.227456524 -0.42161228 #> 183 0.631843124 0.11389013 -0.401769935 -0.208661143 -0.253544727 -0.30018017 #> 184 -0.589694571 -0.22459723 -0.371344344 -0.172618503 0.222564986 -0.35684849 #> 185 -0.457636442 0.65112234 -0.553897892 -0.208661143 -0.253544727 -0.37303944 #> 186 -0.655723636 -0.23391339 -0.477833914 -0.208661143 -0.247022676 -0.32446659 #> 187 0.895959383 -0.23391339 -0.584323484 -0.208661143 0.092123968 -0.30827565 #> 188 -0.248544404 -0.23391339 -0.493046709 -0.208661143 -0.129625761 -0.18684354 #> 189 -0.666728480 -0.23080800 -0.553897892 4.682840053 0.150822426 -0.41351681 #> 190 -0.171510495 1.64484668 1.073871250 -0.110831119 -0.247022676 -0.42970775 #> 191 -0.369597689 -0.23391339 -0.553897892 -0.208661143 2.146569989 -0.30018017 #> 192 3.735209162 -0.22459723 -0.569110688 -0.208661143 -0.240500625 -0.43780323 #> 193 -0.369597689 -0.23080800 0.328444260 -0.208661143 -0.253544727 -0.31637112 #> 194 0.224663892 -0.23391339 -0.356131548 -0.208661143 -0.253544727 -0.32446659 #> 195 -0.204525028 -0.23080800 0.313231464 -0.177767451 -0.247022676 0.43650795 #> 196 -0.490650974 -0.23391339 -0.386557139 
-0.208661143 -0.188324219 -0.43780323 #> 197 -0.435626754 -0.23391339 -0.569110688 -0.208661143 -0.142669863 -0.42161228 #> 198 -0.666728480 -0.23391339 -0.553897892 -0.208661143 -0.103537557 -0.22732091 #> 199 -0.303568625 -0.23391339 -0.340918752 -0.208661143 1.983518717 -0.29208470 #> 200 2.876831322 -0.23391339 -0.584323484 -0.208661143 -0.253544727 -0.42970775 #> Otu00050 Otu00051 Otu00052 Otu00053 Otu00054 Otu00055 #> 1 -0.475385806 -0.20991733 0.19735560 -0.082761027 -0.18688626 -0.256009183 #> 2 -0.450642238 -0.20991733 -0.25745566 0.651532741 -0.45315341 -0.418554697 #> 3 0.304036595 -0.16859502 5.36271211 -0.189845534 1.12780781 -0.377918318 #> 4 1.380381816 -0.20991733 -0.25745566 -0.128654387 -0.08703608 -0.405009237 #> 5 -0.549616511 2.09035789 -0.25745566 -0.465205697 -0.53636190 -0.201827346 #> 6 -0.475385806 -0.20991733 -0.25745566 -0.342823403 -0.58628699 -0.283100102 #> 7 -0.524872942 -0.20991733 0.06740953 -0.082761027 -0.33666153 -0.432100156 #> 8 1.652561068 -0.20991733 -0.22496914 -0.388716763 -0.51972020 -0.418554697 #> 9 0.390639084 -0.20991733 -0.25745566 1.095168558 0.76169047 0.136809140 #> 10 -0.475385806 -0.20991733 -0.25745566 -0.373418976 0.26243956 0.096172762 #> 11 3.384610848 -0.20991733 -0.25745566 -0.465205697 -0.58628699 -0.296645562 #> 12 -0.549616511 -0.20991733 -0.25745566 -0.419312337 -0.38658662 -0.296645562 #> 13 -0.549616511 -0.20991733 -0.25745566 0.085514628 -0.30337814 -0.418554697 #> 14 -0.425898669 0.04490358 -0.25745566 -0.358121189 -0.50307850 -0.350827400 #> 15 0.192690538 -0.20991733 -0.25745566 0.024323481 -0.58628699 -0.296645562 #> 16 -0.203206555 0.84380156 -0.25745566 -0.465205697 -0.53636190 -0.432100156 #> 17 -0.549616511 -0.20991733 -0.25745566 -0.419312337 -0.03711098 -0.364372859 #> 18 -0.376411533 -0.20991733 -0.25745566 -0.312227829 0.16258938 0.245172816 #> 19 1.120574349 -0.20303028 -0.25745566 -0.281632255 -0.18688626 -0.405009237 #> 20 -0.524872942 0.91955912 -0.25745566 0.100812415 -0.58628699 -0.201827346 #> 21 -0.512501158 -0.20991733 -0.25745566 -0.465205697 -0.10367777 4.850629026 #> 22 -0.487757590 -0.20303028 -0.25745566 -0.449907910 2.24280151 -0.432100156 #> 23 -0.326924396 -0.20991733 -0.25745566 -0.388716763 -0.35330323 -0.432100156 #> 24 1.256663975 -0.20991733 7.27941672 -0.465205697 -0.51972020 -0.432100156 #> 25 -0.265065475 -0.20991733 -0.25745566 -0.006272093 2.12630963 -0.201827346 #> 26 -0.549616511 -0.20991733 0.58719383 -0.388716763 -0.43651171 0.475445626 #> 27 -0.512501158 -0.20991733 -0.25745566 -0.449907910 -0.58628699 2.547900921 #> 28 0.019485560 -0.20991733 -0.25745566 -0.434610124 -0.40322832 -0.405009237 #> 29 1.442240737 -0.18236913 -0.25745566 -0.449907910 -0.32001983 1.829991571 #> 30 -0.549616511 -0.20991733 -0.25745566 -0.465205697 -0.58628699 -0.405009237 #> 31 -0.549616511 -0.20303028 -0.24662682 -0.465205697 -0.07039438 2.209264435 #> 32 -0.005258008 -0.03774104 5.22193719 1.079870772 -0.10367777 -0.418554697 #> 33 -0.302180828 -0.20991733 -0.25745566 -0.327525616 -0.51972020 0.949536707 #> 34 3.533072258 -0.20991733 -0.25745566 -0.449907910 2.79197752 0.109718221 #> 35 -0.549616511 -0.20991733 -0.25745566 -0.312227829 -0.56964529 -0.323736481 #> 36 -0.537244727 -0.20991733 -0.24662682 -0.465205697 -0.41987002 -0.418554697 #> 37 -0.549616511 -0.20991733 -0.25745566 -0.419312337 -0.50307850 -0.147645508 #> 38 -0.524872942 -0.20991733 -0.25745566 -0.465205697 -0.58628699 -0.377918318 #> 39 -0.512501158 -0.08595040 -0.25745566 0.009025694 -0.58628699 -0.405009237 #> 40 1.937112103 
-0.20991733 -0.25745566 -0.465205697 0.11266429 -0.418554697 #> 41 -0.116604066 -0.20991733 -0.25745566 -0.465205697 -0.00382759 3.035537461 #> 42 -0.487757590 -0.18236913 -0.09502307 -0.189845534 0.27908126 -0.120554589 #> 43 0.897882235 -0.20303028 -0.23579798 -0.465205697 2.30936830 2.507264543 #> 44 -0.401155101 -0.20991733 -0.04087887 -0.159249961 -0.12031947 -0.377918318 #> 45 -0.549616511 -0.20991733 -0.25745566 -0.449907910 0.86154066 -0.256009183 #> 46 0.056600912 -0.20991733 -0.25745566 -0.434610124 -0.33666153 -0.432100156 #> 47 -0.500129374 -0.20991733 -0.25745566 -0.388716763 -0.33666153 -0.377918318 #> 48 -0.549616511 -0.20991733 -0.25745566 -0.465205697 -0.58628699 -0.432100156 #> 49 3.124803381 -0.20991733 -0.25745566 -0.465205697 2.22615982 -0.350827400 #> 50 -0.549616511 -0.20991733 -0.25745566 -0.342823403 -0.46979511 -0.323736481 #> 51 -0.549616511 -0.20991733 -0.25745566 -0.251036682 -0.51972020 -0.432100156 #> 52 -0.549616511 -0.20991733 -0.25745566 0.085514628 -0.56964529 -0.418554697 #> 53 -0.524872942 -0.20991733 -0.25745566 -0.465205697 -0.51972020 0.163900059 #> 54 6.564159374 -0.20991733 -0.21414030 -0.465205697 -0.30337814 -0.418554697 #> 55 0.242177675 -0.20991733 -0.25745566 -0.358121189 -0.51972020 -0.337281940 #> 56 -0.537244727 -0.19614323 -0.24662682 -0.312227829 -0.51972020 -0.418554697 #> 57 -0.388783317 0.25840217 -0.25745566 -0.404014550 -0.46979511 -0.405009237 #> 58 -0.549616511 -0.20991733 -0.17082495 -0.449907910 -0.58628699 0.123263681 #> 59 0.254549459 -0.20991733 -0.25745566 -0.465205697 -0.12031947 -0.391463778 #> 60 -0.091860497 2.84104651 -0.25745566 -0.388716763 -0.56964529 0.055536384 #> 61 -0.302180828 -0.20991733 -0.25745566 -0.449907910 -0.46979511 -0.350827400 #> 62 -0.487757590 -0.20991733 -0.25745566 -0.006272093 3.92361292 4.539083459 #> 63 -0.512501158 0.39614321 4.64800869 -0.296930042 -0.58628699 -0.418554697 #> 64 0.613331199 -0.20991733 -0.11668075 0.819808396 0.12930599 -0.432100156 #> 65 0.304036595 -0.20991733 -0.25745566 -0.296930042 -0.56964529 -0.405009237 #> 66 1.454612521 -0.20991733 -0.25745566 -0.465205697 -0.20352796 -0.432100156 #> 67 4.287751091 -0.20991733 -0.25745566 -0.327525616 -0.58628699 -0.432100156 #> 68 0.786536177 -0.20991733 2.70964640 0.223194710 -0.58628699 -0.432100156 #> 69 0.118459833 -0.20991733 -0.25745566 -0.449907910 -0.58628699 -0.418554697 #> 70 -0.537244727 -0.20991733 -0.25745566 -0.465205697 3.05824467 0.367081951 #> 71 -0.549616511 -0.19614323 -0.25745566 -0.465205697 -0.53636190 -0.432100156 #> 72 0.130831617 -0.20991733 -0.25745566 0.391470365 -0.51972020 -0.377918318 #> 73 0.922625803 -0.20991733 -0.25745566 0.116110202 -0.20352796 -0.432100156 #> 74 0.192690538 -0.18925618 -0.25745566 -0.419312337 -0.30337814 -0.432100156 #> 75 -0.524872942 -0.20991733 -0.25745566 -0.052165453 -0.48643681 -0.283100102 #> 76 -0.537244727 -0.20991733 1.99494298 0.529150446 -0.33666153 -0.418554697 #> 77 -0.512501158 -0.20303028 -0.25745566 -0.174547748 -0.58628699 0.055536384 #> 78 -0.351667964 -0.20991733 -0.25745566 1.033977411 -0.56964529 -0.242463724 #> 79 -0.425898669 2.84793356 -0.13833843 -0.419312337 -0.58628699 -0.405009237 #> 80 -0.549616511 -0.20991733 -0.24662682 -0.465205697 -0.28673644 -0.432100156 #> 81 -0.326924396 -0.20991733 -0.25745566 -0.449907910 -0.48643681 -0.432100156 #> 82 -0.549616511 -0.20991733 -0.25745566 -0.465205697 -0.50307850 -0.174736427 #> 83 -0.549616511 -0.20991733 -0.25745566 -0.404014550 -0.51972020 -0.323736481 #> 84 0.551472278 -0.20991733 -0.25745566 -0.388716763 
-0.40322832 -0.323736481 #> 85 1.528843226 -0.18925618 -0.25745566 -0.220441108 -0.43651171 -0.310191021 #> 86 1.256663975 -0.20991733 -0.25745566 -0.449907910 -0.45315341 -0.432100156 #> 87 -0.549616511 -0.20991733 -0.25745566 -0.251036682 0.36228975 0.908900329 #> 88 0.266921243 -0.20991733 -0.25745566 1.095168558 -0.56964529 -0.161190967 #> 89 -0.500129374 -0.20991733 -0.25745566 0.238492497 0.42885653 -0.432100156 #> 90 -0.475385806 -0.20991733 -0.25745566 4.078236988 1.95989266 -0.283100102 #> 91 -0.277437260 -0.18925618 -0.24662682 0.330279217 -0.58628699 -0.432100156 #> 92 -0.351667964 -0.20991733 -0.25745566 1.202253066 -0.56964529 -0.391463778 #> 93 -0.166091202 -0.20991733 -0.25745566 -0.465205697 -0.58628699 -0.377918318 #> 94 -0.166091202 -0.20991733 -0.25745566 -0.465205697 -0.40322832 -0.432100156 #> 95 -0.524872942 -0.20991733 0.77128410 -0.419312337 -0.03711098 -0.310191021 #> 96 -0.376411533 -0.20991733 -0.25745566 1.752973392 4.00682140 -0.350827400 #> 97 -0.537244727 -0.20991733 -0.24662682 -0.465205697 -0.46979511 1.071445842 #> 98 -0.104232281 -0.20991733 -0.25745566 -0.404014550 -0.56964529 0.082627303 #> 99 -0.401155101 -0.20991733 -0.25745566 -0.465205697 1.22765799 5.026719999 #> 100 -0.549616511 -0.20991733 -0.25745566 8.912337624 -0.58628699 -0.093463670 #> 101 -0.549616511 -0.20991733 -0.25745566 -0.128654387 -0.53636190 -0.418554697 #> 102 -0.549616511 0.14820935 -0.25745566 -0.358121189 -0.58628699 -0.418554697 #> 103 -0.227950123 -0.20991733 -0.25745566 -0.465205697 1.47728345 0.394172870 #> 104 -0.549616511 -0.20991733 -0.25745566 0.269088070 1.22765799 -0.350827400 #> 105 -0.364039749 -0.20991733 -0.25745566 -0.388716763 0.26243956 -0.174736427 #> 106 -0.524872942 -0.20991733 -0.25745566 -0.404014550 1.76019230 -0.418554697 #> 107 0.007113776 -0.20991733 -0.24662682 -0.067463240 -0.58628699 -0.418554697 #> 108 -0.190834770 -0.20991733 -0.24662682 -0.465205697 0.12930599 -0.432100156 #> 109 1.182433270 -0.20991733 -0.24662682 -0.465205697 -0.23681135 -0.405009237 #> 110 2.036086376 0.46501372 -0.24662682 -0.205143321 -0.12031947 -0.377918318 #> 111 -0.265065475 -0.20991733 -0.25745566 0.590341593 -0.55300359 -0.337281940 #> 112 -0.227950123 -0.20991733 -0.25745566 -0.465205697 -0.20352796 -0.432100156 #> 113 -0.450642238 -0.20991733 -0.24662682 -0.006272093 -0.28673644 -0.432100156 #> 114 -0.116604066 0.05179063 1.34521260 -0.082761027 -0.27009474 -0.418554697 #> 115 -0.339296180 -0.20991733 -0.25745566 0.162003562 -0.15360286 -0.052827292 #> 116 -0.537244727 0.38236910 -0.25745566 -0.174547748 -0.08703608 1.003718545 #> 117 -0.487757590 -0.20991733 -0.25745566 -0.205143321 -0.48643681 -0.174736427 #> 118 -0.549616511 -0.20991733 -0.25745566 1.951844620 -0.35330323 -0.391463778 #> 119 -0.289809044 -0.20991733 -0.25745566 1.538804376 0.06273920 -0.432100156 #> 120 -0.549616511 -0.20303028 -0.25745566 -0.465205697 -0.56964529 -0.432100156 #> 121 -0.463014022 11.54627967 -0.25745566 -0.205143321 -0.38658662 -0.432100156 #> 122 -0.326924396 -0.20991733 -0.25745566 2.915605190 -0.55300359 -0.432100156 #> 123 -0.463014022 -0.16170797 1.12863581 -0.342823403 -0.48643681 -0.432100156 #> 124 -0.549616511 -0.20991733 -0.25745566 -0.358121189 -0.43651171 0.597354761 #> 125 1.244292191 -0.20991733 -0.25745566 0.146705776 0.94474914 -0.418554697 #> 126 -0.537244727 -0.20991733 -0.25745566 4.185321496 -0.58628699 -0.432100156 #> 127 0.316408380 -0.20303028 -0.25745566 -0.281632255 0.42885653 -0.432100156 #> 128 -0.376411533 -0.20991733 -0.25745566 -0.281632255 -0.58628699 
-0.418554697 #> 129 0.588587631 -0.20991733 -0.25745566 -0.388716763 -0.35330323 -0.432100156 #> 130 -0.425898669 -0.20991733 -0.25745566 0.116110202 -0.51972020 -0.432100156 #> 131 -0.463014022 0.92644617 -0.25745566 -0.449907910 -0.43651171 6.354175024 #> 132 -0.537244727 -0.19614323 -0.25745566 -0.465205697 -0.56964529 -0.432100156 #> 133 0.514356926 -0.20991733 -0.25745566 -0.404014550 -0.56964529 0.407718329 #> 134 -0.549616511 -0.20991733 -0.25745566 -0.143952174 -0.51972020 -0.201827346 #> 135 -0.425898669 -0.20991733 -0.25745566 -0.465205697 -0.45315341 -0.364372859 #> 136 0.192690538 -0.20991733 -0.24662682 3.879365760 -0.36994493 -0.432100156 #> 137 -0.388783317 -0.20991733 -0.25745566 0.100812415 1.19437460 -0.405009237 #> 138 1.145317917 -0.20991733 -0.25745566 -0.251036682 0.31236465 -0.134100048 #> 139 0.019485560 -0.08595040 -0.24662682 -0.113356600 -0.56964529 -0.432100156 #> 140 -0.401155101 -0.20991733 -0.17082495 2.686138388 -0.51972020 -0.432100156 #> 141 -0.487757590 -0.20991733 -0.25745566 -0.052165453 0.02945580 -0.405009237 #> 142 -0.500129374 -0.20991733 -0.25745566 0.452661512 0.71176538 -0.432100156 #> 143 -0.425898669 -0.18925618 -0.25745566 0.024323481 -0.08703608 -0.432100156 #> 144 0.167946970 -0.19614323 1.64842011 -0.235738895 1.92660927 -0.432100156 #> 145 -0.537244727 -0.20991733 -0.25745566 -0.220441108 0.34564805 -0.012190913 #> 146 -0.252693691 -0.19614323 -0.25745566 0.054919055 -0.27009474 -0.296645562 #> 147 4.225892170 -0.20303028 -0.25745566 -0.465205697 0.06273920 0.231627356 #> 148 -0.376411533 -0.20991733 3.34854794 0.177301349 -0.10367777 -0.432100156 #> 149 0.761792609 -0.19614323 -0.24662682 -0.327525616 4.95539814 0.488991086 #> 150 -0.549616511 -0.20991733 -0.25745566 -0.220441108 3.50757049 -0.418554697 #> 151 -0.549616511 -0.20991733 -0.25745566 0.100812415 -0.55300359 -0.432100156 #> 152 -0.549616511 -0.20991733 -0.25745566 -0.373418976 -0.22016965 2.317628111 #> 153 -0.537244727 -0.20991733 -0.25745566 0.636234954 0.02945580 0.150354600 #> 154 1.083458997 -0.20991733 -0.25745566 -0.082761027 3.90697122 -0.377918318 #> 155 1.491727874 -0.20991733 -0.25745566 -0.388716763 -0.45315341 1.355900490 #> 156 -0.153719418 -0.20991733 -0.25745566 -0.327525616 -0.03711098 -0.337281940 #> 157 -0.549616511 -0.20991733 -0.15999611 -0.434610124 -0.58628699 3.726355893 #> 158 -0.500129374 -0.20991733 -0.25745566 -0.205143321 -0.13696117 -0.405009237 #> 159 -0.413526885 -0.20991733 -0.25745566 -0.465205697 3.57413728 -0.405009237 #> 160 -0.537244727 -0.20991733 -0.25745566 -0.449907910 -0.58628699 0.651536599 #> 161 -0.549616511 1.87685929 -0.25745566 -0.327525616 -0.58628699 1.667446057 #> 162 -0.425898669 -0.20991733 -0.25745566 2.303693717 -0.20352796 -0.283100102 #> 163 0.205062322 3.83278193 -0.25745566 -0.358121189 -0.58628699 0.001354546 #> 164 -0.500129374 -0.20991733 -0.23579798 -0.266334469 -0.15360286 0.312900113 #> 165 -0.487757590 -0.18236913 -0.23579798 -0.449907910 -0.56964529 -0.391463778 #> 166 -0.537244727 -0.20991733 -0.25745566 1.018679624 -0.15360286 -0.377918318 #> 167 -0.413526885 -0.20991733 -0.25745566 0.452661512 1.84340078 -0.337281940 #> 168 2.852624130 -0.20991733 -0.25745566 -0.434610124 -0.56964529 -0.432100156 #> 169 -0.227950123 -0.20991733 -0.25745566 -0.358121189 -0.56964529 -0.432100156 #> 170 -0.549616511 -0.20991733 -0.25745566 1.079870772 0.24579787 0.326445573 #> 171 -0.524872942 -0.20991733 -0.25745566 -0.296930042 -0.46979511 -0.337281940 #> 172 -0.463014022 -0.20991733 -0.25745566 -0.358121189 -0.56964529 
-0.432100156 #> 173 0.130831617 -0.20991733 -0.25745566 -0.465205697 -0.56964529 -0.188281886 #> 174 -0.524872942 -0.16859502 -0.25745566 -0.449907910 -0.50307850 -0.432100156 #> 175 -0.425898669 -0.20991733 0.19735560 0.620937167 -0.48643681 0.190990978 #> 176 -0.500129374 -0.20991733 -0.25745566 -0.434610124 -0.20352796 -0.161190967 #> 177 0.279293027 3.47465525 -0.01922119 -0.342823403 -0.56964529 -0.405009237 #> 178 -0.512501158 -0.19614323 -0.25745566 -0.342823403 0.29572296 0.231627356 #> 179 -0.401155101 -0.20991733 -0.25745566 -0.465205697 -0.30337814 -0.432100156 #> 180 -0.475385806 1.57382902 0.34895936 -0.128654387 -0.03711098 -0.405009237 #> 181 1.095830781 -0.20991733 -0.25745566 0.054919055 0.01281411 -0.256009183 #> 182 1.009228292 -0.20991733 -0.25745566 -0.404014550 -0.36994493 -0.391463778 #> 183 2.679419152 0.31349859 -0.25745566 -0.404014550 -0.55300359 -0.350827400 #> 184 -0.438270453 1.29834696 -0.24662682 1.538804376 -0.33666153 -0.215372805 #> 185 -0.549616511 0.24462807 -0.01922119 -0.358121189 -0.58628699 -0.405009237 #> 186 -0.524872942 -0.20991733 -0.24662682 -0.419312337 -0.33666153 -0.432100156 #> 187 -0.549616511 -0.20303028 3.85750340 1.095168558 -0.38658662 -0.269554643 #> 188 -0.537244727 -0.20991733 -0.23579798 -0.419312337 -0.55300359 0.475445626 #> 189 -0.425898669 -0.20991733 -0.23579798 -0.373418976 2.70876903 0.177445519 #> 190 0.254549459 -0.09972451 3.55429589 0.162003562 -0.33666153 -0.432100156 #> 191 -0.512501158 -0.20991733 -0.25745566 -0.465205697 0.21251447 -0.147645508 #> 192 -0.537244727 -0.20991733 -0.25745566 -0.465205697 0.54534841 -0.432100156 #> 193 -0.450642238 -0.20303028 -0.25745566 -0.358121189 -0.38658662 -0.310191021 #> 194 0.885510450 -0.20991733 -0.25745566 -0.388716763 -0.56964529 -0.432100156 #> 195 -0.104232281 -0.16170797 0.01326533 -0.388716763 -0.32001983 -0.269554643 #> 196 -0.549616511 -0.20991733 -0.25745566 -0.465205697 -0.22016965 -0.256009183 #> 197 -0.512501158 -0.05151515 0.31647284 1.768271179 0.91146575 -0.174736427 #> 198 0.167946970 -0.20991733 -0.25745566 -0.465205697 -0.58628699 -0.147645508 #> 199 -0.537244727 -0.20991733 -0.25745566 -0.434610124 -0.38658662 1.708082436 #> 200 -0.450642238 -0.20991733 -0.25745566 -0.342823403 -0.15360286 2.046718922 #> Otu00056 Otu00057 Otu00058 Otu00059 Otu00060 #> 1 -0.67302626 -0.063085238 0.244028438 -0.04265350 -0.41506494 #> 2 2.49956176 -0.378272648 0.956294184 -0.33573273 -0.41506494 #> 3 -0.80430576 2.658987854 -0.313396928 -0.40900254 -0.40518715 #> 4 0.18029052 -0.340068114 -0.065652321 -0.29386427 -0.41506494 #> 5 -0.80430576 -0.426028317 -0.561141535 -0.39853543 -0.40518715 #> 6 0.77104829 0.786965657 0.151124210 0.66911037 -0.41506494 #> 7 -0.82618568 -0.244556777 -0.545657497 -0.29386427 -0.41506494 #> 8 -0.62926642 -0.426028317 -0.406301156 7.84955171 -0.16812007 #> 9 0.24593027 -0.426028317 -0.483721345 -0.40900254 -0.41506494 #> 10 -0.23542791 -0.406926049 -0.576625573 -0.40900254 2.69644047 #> 11 -0.82618568 -0.406926049 2.798894699 -0.40900254 0.40479204 #> 12 -0.56362667 0.557738450 -0.205008662 0.09341901 0.04919142 #> 13 0.66164870 -0.426028317 1.730496081 -0.40900254 -0.41506494 #> 14 0.04901101 0.529085049 0.213060362 0.69004460 -0.41506494 #> 15 1.82128432 1.407789345 0.832421880 -0.05312061 -0.41506494 #> 16 1.66812490 -0.397374916 -0.158556549 -0.40900254 -0.41506494 #> 17 -0.41046725 0.519533915 -0.220492700 0.21902440 -0.41506494 #> 18 -0.30106766 1.073499667 -0.096620397 0.03061631 -0.38543156 #> 19 -0.69490618 0.147039703 0.569193235 
-0.21012735 -0.41506494 #> 20 -0.78242585 -0.359170381 -0.545657497 -0.23106158 -0.41506494 #> 21 -0.82618568 -0.406926049 -0.576625573 -0.40900254 -0.41506494 #> 22 1.88692408 -0.426028317 -0.530173459 2.16590791 -0.41506494 #> 23 0.46472945 -0.426028317 -0.205008662 0.76331441 -0.41506494 #> 24 1.05548722 -0.426028317 -0.375333080 -0.40900254 -0.41506494 #> 25 0.31157002 -0.426028317 -0.607593649 -0.40900254 -0.41506494 #> 26 -0.32294758 0.357164643 -0.081136359 -0.03218638 1.06660430 #> 27 -0.78242585 -0.426028317 -0.607593649 -0.40900254 -0.41506494 #> 28 -0.60738651 -0.426028317 -0.607593649 -0.40900254 2.67668488 #> 29 -0.76054593 -0.426028317 -0.437269232 0.75284729 -0.41506494 #> 30 -0.69490618 -0.053534104 -0.189524624 -0.13685754 -0.39530935 #> 31 -0.82618568 -0.426028317 1.482751474 -0.39853543 1.00733753 #> 32 -0.60738651 2.085919835 -0.375333080 -0.40900254 2.59766252 #> 33 -0.62926642 0.252102173 -0.592109611 -0.40900254 -0.31628699 #> 34 -0.82618568 0.242551039 0.770485728 -0.40900254 -0.41506494 #> 35 -0.71678609 8.532935052 0.878873994 -0.19966023 -0.33604258 #> 36 -0.49798692 -0.426028317 -0.607593649 -0.40900254 0.20723614 #> 37 2.23700275 0.280755574 -0.235976738 -0.04265350 -0.41506494 #> 38 -0.76054593 -0.426028317 0.383384780 -0.40900254 0.54308117 #> 39 -0.80430576 -0.426028317 0.615645349 -0.40900254 2.37047324 #> 40 -0.43234717 0.605494118 -0.143072511 0.03061631 0.12821378 #> 41 -0.60738651 -0.292312446 -0.437269232 -0.40900254 -0.41506494 #> 42 0.61788887 -0.416477183 -0.344365004 -0.35666697 -0.40518715 #> 43 -0.80430576 -0.426028317 -0.375333080 -0.40900254 1.60000523 #> 44 1.99632366 -0.063085238 0.042735945 -0.40900254 -0.41506494 #> 45 0.31157002 -0.034431837 -0.514689421 -0.29386427 0.39491424 #> 46 0.02713110 -0.406926049 -0.468237308 -0.40900254 -0.40518715 #> 47 -0.71678609 -0.015329570 -0.313396928 -0.14732465 -0.41506494 #> 48 -0.82618568 -0.387823782 -0.545657497 -0.40900254 -0.41506494 #> 49 -0.65114634 0.137488569 -0.266944814 -0.16825888 -0.41506494 #> 50 -0.54174675 0.634147519 0.305964590 0.28182709 -0.41506494 #> 51 0.37720978 -0.426028317 -0.561141535 4.57334451 -0.40518715 #> 52 -0.47610700 -0.177698842 -0.468237308 -0.25199581 -0.41506494 #> 53 -0.80430576 -0.416477183 -0.592109611 -0.40900254 -0.41506494 #> 54 -0.80430576 -0.426028317 -0.561141535 -0.40900254 -0.40518715 #> 55 -0.38858733 0.739209989 0.058219983 0.08295189 -0.40518715 #> 56 -0.82618568 -0.426028317 -0.607593649 -0.40900254 -0.41506494 #> 57 -0.76054593 -0.416477183 0.135640172 0.40743248 -0.41506494 #> 58 1.20864664 -0.416477183 -0.452753270 -0.40900254 -0.03970874 #> 59 -0.21354799 -0.426028317 0.166608248 0.83658422 -0.40518715 #> 60 -0.10414841 -0.129943173 -0.003716169 0.02014920 -0.41506494 #> 61 0.70540854 -0.426028317 1.157586677 -0.40900254 1.35306035 #> 62 -0.76054593 0.739209989 -0.514689421 -0.40900254 -0.39530935 #> 63 0.44284953 -0.235005644 -0.359849042 -0.39853543 -0.41506494 #> 64 -0.76054593 -0.426028317 -0.592109611 -0.40900254 -0.41506494 #> 65 -0.82618568 0.318960108 -0.468237308 -0.40900254 0.21711393 #> 66 0.48660936 -0.426028317 5.369244999 -0.40900254 -0.41506494 #> 67 1.29616631 -0.426028317 -0.561141535 0.54350498 0.82953722 #> 68 1.23052655 1.197664405 0.166608248 -0.19966023 2.07413939 #> 69 1.20864664 -0.426028317 1.064682449 -0.40900254 -0.41506494 #> 70 0.13653068 -0.426028317 -0.607593649 -0.40900254 -0.41506494 #> 71 -0.45422709 -0.349619247 -0.530173459 -0.38806831 6.91425892 #> 72 0.13653068 2.534823116 2.195017219 -0.07405484 1.57037184 
#> 73 0.50848928 0.242551039 -0.607593649 -0.40900254 -0.41506494 #> 74 4.62191375 0.013323831 0.182092286 0.63770902 3.72373115 #> 75 0.81480812 0.748761123 0.491773045 1.42274270 -0.41506494 #> 76 -0.82618568 -0.426028317 5.431181150 -0.40900254 0.02943583 #> 77 -0.69490618 -0.426028317 0.213060362 1.06686076 -0.40518715 #> 78 -0.56362667 -0.426028317 -0.607593649 -0.40900254 -0.41506494 #> 79 1.58060523 -0.091738639 0.940810146 1.19246615 -0.41506494 #> 80 -0.82618568 -0.426028317 -0.607593649 -0.40900254 -0.41506494 #> 81 0.59600895 -0.426028317 1.699528005 0.20855728 -0.41506494 #> 82 3.28723879 0.939783796 -0.607593649 -0.39853543 -0.41506494 #> 83 0.83668804 -0.034431837 -0.545657497 -0.25199581 -0.40518715 #> 84 -0.76054593 -0.426028317 -0.390817118 -0.40900254 -0.16812007 #> 85 -0.43234717 -0.426028317 2.427277789 -0.40900254 -0.41506494 #> 86 -0.82618568 -0.139494307 -0.251460776 -0.40900254 -0.40518715 #> 87 -0.06038857 0.051528366 -0.390817118 -0.36713408 -0.41506494 #> 88 1.01172738 -0.426028317 6.546031883 -0.40900254 -0.41506494 #> 89 1.79940441 -0.359170381 0.151124210 -0.31479850 -0.41506494 #> 90 0.13653068 6.603606053 -0.174040587 -0.28339716 -0.41506494 #> 91 -0.23542791 -0.378272648 -0.344365004 2.80440196 0.95794856 #> 92 -0.76054593 -0.426028317 2.009208764 -0.40900254 0.41466983 #> 93 -0.82618568 -0.426028317 -0.530173459 -0.40900254 -0.41506494 #> 94 -0.80430576 -0.426028317 0.228544400 2.50085561 -0.38543156 #> 95 1.03360730 1.054397400 0.274996514 0.55397210 -0.41506494 #> 96 -0.82618568 -0.426028317 -0.576625573 -0.40900254 -0.41506494 #> 97 -0.78242585 -0.426028317 -0.592109611 -0.40900254 -0.41506494 #> 98 -0.16978816 -0.426028317 -0.468237308 1.63208501 -0.41506494 #> 99 -0.78242585 -0.406926049 -0.592109611 -0.40900254 -0.41506494 #> 100 2.41204209 -0.397374916 -0.499205383 -0.39853543 -0.37555376 #> 101 1.79940441 -0.177698842 -0.576625573 -0.40900254 -0.41506494 #> 102 -0.80430576 -0.426028317 -0.607593649 -0.36713408 -0.41506494 #> 103 -0.19166808 -0.301863579 -0.421785194 -0.40900254 -0.41506494 #> 104 -0.82618568 1.025743999 0.011767869 -0.40900254 -0.39530935 #> 105 0.18029052 0.509982781 0.027251907 0.47023517 0.07882480 #> 106 0.04901101 0.309408975 -0.235976738 0.03061631 -0.39530935 #> 107 0.20217044 -0.426028317 -0.034684245 -0.40900254 0.33564747 #> 108 0.81480812 -0.426028317 1.838884347 -0.40900254 0.80978163 #> 109 -0.62926642 -0.129943173 -0.251460776 -0.38806831 -0.41506494 #> 110 2.08384333 -0.397374916 -0.205008662 -0.27293004 -0.40518715 #> 111 0.53036920 -0.426028317 -0.220492700 -0.40900254 -0.41506494 #> 112 0.50848928 -0.426028317 0.259512476 -0.40900254 0.13809157 #> 113 -0.21354799 -0.426028317 0.569193235 -0.38806831 -0.41506494 #> 114 0.35532986 -0.378272648 1.637591853 -0.15779177 1.13574887 #> 115 0.44284953 -0.426028317 1.467267436 -0.40900254 -0.06934212 #> 116 2.01820358 -0.215903376 -0.174040587 -0.40900254 -0.41506494 #> 117 -0.03850865 -0.426028317 -0.607593649 -0.40900254 2.64705149 #> 118 0.18029052 -0.426028317 -0.514689421 -0.40900254 -0.41506494 #> 119 -0.82618568 -0.426028317 -0.050168283 -0.40900254 -0.41506494 #> 120 -0.32294758 -0.387823782 -0.607593649 -0.38806831 -0.34592038 #> 121 -0.34482750 0.414471445 1.002746297 0.35509690 4.63248828 #> 122 0.24593027 -0.416477183 -0.576625573 -0.40900254 -0.41506494 #> 123 -0.82618568 -0.426028317 -0.545657497 -0.39853543 -0.41506494 #> 124 0.02713110 -0.426028317 -0.530173459 -0.40900254 -0.41506494 #> 125 -0.60738651 -0.426028317 0.089188059 3.14981678 2.73595165 #> 126 
0.63976878 -0.426028317 1.064682449 -0.40900254 -0.41506494 #> 127 -0.27918775 -0.378272648 -0.545657497 -0.31479850 -0.39530935 #> 128 -0.78242585 -0.426028317 -0.576625573 -0.40900254 -0.06934212 #> 129 -0.80430576 -0.110840906 -0.483721345 0.26089286 -0.41506494 #> 130 -0.47610700 -0.426028317 -0.344365004 -0.40900254 -0.40518715 #> 131 -0.56362667 -0.426028317 -0.390817118 -0.40900254 -0.41506494 #> 132 1.47120565 -0.426028317 -0.421785194 -0.40900254 -0.20763125 #> 133 -0.67302626 -0.426028317 -0.530173459 -0.26246293 -0.41506494 #> 134 0.46472945 0.739209989 1.869852422 1.54834808 -0.40518715 #> 135 -0.82618568 -0.406926049 -0.437269232 -0.39853543 -0.41506494 #> 136 0.85856796 -0.426028317 0.011767869 -0.40900254 -0.41506494 #> 137 -0.16978816 2.085919835 -0.468237308 -0.40900254 1.15550446 #> 138 0.88044788 -0.426028317 -0.220492700 -0.40900254 -0.40518715 #> 139 -0.71678609 -0.416477183 -0.468237308 0.11435324 -0.41506494 #> 140 -0.82618568 -0.426028317 -0.220492700 -0.40900254 -0.41506494 #> 141 -0.65114634 -0.426028317 -0.174040587 1.51694674 -0.03970874 #> 142 -0.56362667 1.617914285 0.693065539 -0.40900254 -0.41506494 #> 143 -0.73866601 -0.005778436 -0.607593649 -0.06358773 -0.41506494 #> 144 -0.58550659 1.149908736 -0.468237308 0.88891980 -0.41506494 #> 145 0.61788887 -0.196801109 -0.607593649 -0.40900254 -0.41506494 #> 146 0.81480812 -0.426028317 -0.592109611 -0.06358773 -0.40518715 #> 147 -0.82618568 -0.426028317 -0.592109611 -0.39853543 -0.41506494 #> 148 -0.73866601 -0.426028317 -0.359849042 -0.40900254 -0.41506494 #> 149 -0.71678609 0.185244237 -0.452753270 -0.40900254 -0.41506494 #> 150 -0.82618568 -0.426028317 -0.607593649 -0.40900254 -0.41506494 #> 151 1.66812490 0.834721326 0.878873994 -0.40900254 -0.41506494 #> 152 1.05548722 -0.168147708 -0.576625573 -0.40900254 -0.41506494 #> 153 -0.67302626 -0.426028317 0.058219983 0.45976806 -0.41506494 #> 154 -0.82618568 -0.426028317 -0.607593649 1.78909174 -0.41506494 #> 155 -0.69490618 -0.426028317 -0.545657497 5.65145742 -0.41506494 #> 156 -0.19166808 0.643698653 -0.483721345 -0.40900254 0.16772496 #> 157 -0.82618568 -0.416477183 -0.607593649 -0.40900254 -0.23726464 #> 158 1.53684540 -0.426028317 2.597602206 -0.40900254 -0.37555376 #> 159 -0.78242585 0.041977232 -0.437269232 -0.40900254 -0.41506494 #> 160 -0.80430576 -0.426028317 -0.592109611 -0.40900254 -0.41506494 #> 161 -0.65114634 -0.426028317 0.352416704 -0.40900254 -0.41506494 #> 162 -0.32294758 -0.426028317 -0.468237308 -0.40900254 0.28625850 #> 163 0.66164870 -0.378272648 0.816937842 3.22308659 -0.41506494 #> 164 -0.80430576 -0.416477183 -0.576625573 -0.40900254 2.05438380 #> 165 -0.71678609 -0.406926049 -0.576625573 -0.40900254 2.11365057 #> 166 -0.82618568 -0.426028317 -0.607593649 -0.40900254 -0.41506494 #> 167 0.48660936 3.585447818 -0.328880966 -0.40900254 -0.27677581 #> 168 -0.82618568 -0.426028317 -0.406301156 -0.40900254 -0.41506494 #> 169 -0.80430576 -0.426028317 -0.530173459 -0.38806831 1.61976082 #> 170 -0.82618568 -0.426028317 -0.607593649 -0.40900254 1.05672651 #> 171 -0.47610700 0.701005455 0.646613425 0.81564999 -0.41506494 #> 172 -0.76054593 -0.426028317 -0.437269232 -0.40900254 -0.01995315 #> 173 -0.82618568 -0.426028317 -0.592109611 -0.40900254 -0.39530935 #> 174 -0.78242585 -0.416477183 -0.421785194 -0.31479850 4.01018720 #> 175 2.43392201 -0.215903376 -0.034684245 -0.40900254 -0.40518715 #> 176 1.07736713 -0.426028317 -0.127588473 -0.39853543 -0.41506494 #> 177 0.20217044 -0.034431837 0.538225159 0.05155054 -0.41506494 #> 178 -0.82618568 
-0.426028317 0.182092286 -0.40900254 -0.41506494 #> 179 -0.80430576 -0.426028317 -0.607593649 -0.40900254 -0.41506494 #> 180 -0.25730783 0.844272459 -0.065652321 -0.10545619 -0.41506494 #> 181 -0.67302626 -0.416477183 -0.576625573 0.78424864 -0.41506494 #> 182 0.26781019 -0.426028317 -0.452753270 0.86798557 -0.41506494 #> 183 -0.41046725 -0.263659045 0.027251907 0.54350498 -0.41506494 #> 184 -0.36670742 -0.273210178 -0.174040587 -0.36713408 -0.30640920 #> 185 2.43392201 -0.378272648 -0.561141535 -0.40900254 -0.41506494 #> 186 -0.78242585 -0.416477183 -0.545657497 -0.37760120 -0.41506494 #> 187 0.31157002 0.548187316 -0.607593649 -0.40900254 -0.15824228 #> 188 -0.82618568 -0.426028317 -0.592109611 -0.40900254 -0.35579817 #> 189 -0.71678609 -0.340068114 -0.514689421 -0.40900254 -0.26689802 #> 190 0.81480812 0.739209989 -0.297912890 -0.25199581 -0.40518715 #> 191 0.00525118 -0.426028317 -0.499205383 -0.40900254 1.41232712 #> 192 1.12112697 -0.426028317 -0.561141535 -0.40900254 -0.41506494 #> 193 1.47120565 1.130806469 0.383384780 0.66911037 -0.05946433 #> 194 -0.56362667 -0.387823782 -0.576625573 0.02014920 0.52332558 #> 195 -0.21354799 0.901579261 0.491773045 0.50163652 -0.39530935 #> 196 -0.82618568 -0.426028317 -0.592109611 -0.40900254 -0.41506494 #> 197 -0.80430576 1.608363152 -0.514689421 -0.38806831 -0.37555376 #> 198 -0.80430576 -0.426028317 -0.530173459 -0.40900254 -0.25702023 #> 199 1.71188474 0.204346505 -0.421785194 -0.19966023 0.06894701 #> 200 3.72483714 -0.426028317 1.869852422 -0.40900254 -0.32616479 #> #> $removed #> character(0) #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":null,"dir":"Reference","previous_headings":"","what":"Get feature importance using the permutation method — get_feature_importance","title":"Get feature importance using the permutation method — get_feature_importance","text":"Calculates feature importance using trained model test data. Requires future.apply package.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get feature importance using the permutation method — get_feature_importance","text":"","code":"get_feature_importance( trained_model, train_data, test_data, outcome_colname, perf_metric_function, perf_metric_name, class_probs, method, seed = NA, corr_thresh = 1, groups = NULL, nperms = 100, corr_method = \"spearman\" )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get feature importance using the permutation method — get_feature_importance","text":"trained_model Trained model caret::train(). train_data Training data: dataframe outcome features. test_data Held test data: dataframe outcome features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". 
class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes). method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost seed Random seed (default: NA). results reproducible set seed. corr_thresh feature importance, group correlations equal corr_thresh (range 0 1; default: 1). groups Vector feature names group together permutation. element string feature names separated pipe character (|). NULL (default), correlated features grouped together based corr_thresh. nperms number permutations perform (default: 100). corr_method correlation method. options supported stats::cor: spearman, pearson, kendall. (default: spearman)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get feature importance using the permutation method — get_feature_importance","text":"Data frame performance metrics feature (group correlated features; names) permuted (perf_metric), differences actual test performance metric permuted performance metric (perf_metric_diff; test minus permuted performance), p-value (pvalue: probability obtaining actual performance value null hypothesis). Features larger perf_metric_diff important. performance metric name (perf_metric_name) seed (seed) also returned.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Get feature importance using the permutation method — get_feature_importance","text":"permutation tests, p-value number permutation statistics greater test statistic, divided number permutations. case, permutation statistic model performance (e.g. AUROC) randomizing order observations one feature, test statistic actual performance test data. default perform 100 permutations per feature; increasing increase precision estimating null distribution, also increases runtime. p-value represents probability obtaining actual performance event null hypothesis true, null hypothesis feature important model performance. strongly recommend providing multiple cores speed computation time. See vignette parallel processing details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get feature importance using the permutation method — get_feature_importance","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get feature importance using the permutation method — get_feature_importance","text":"","code":"if (FALSE) { # If you called `run_ml()` with `feature_importance = FALSE` (the default), # you can use `get_feature_importance()` later as long as you have the # trained model and test data. 
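# A rough, hypothetical sketch of the p-value described in Details above
# (toy numbers, not output from the package): the permutation statistic is the
# test performance after shuffling one feature, and the p-value is the number
# of permuted statistics greater than the actual test performance divided by
# the number of permutations.
perm_perfs <- c(0.61, 0.65, 0.59, 0.68) # hypothetical permuted test AUROCs for one feature
actual_perf <- 0.67 # hypothetical actual test AUROC
pvalue <- sum(perm_perfs > actual_perf) / length(perm_perfs)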
results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) names(results$trained_model$trainingData)[1] <- \"dx\" feat_imp <- get_feature_importance(results$trained_model, results$trained_model$trainingData, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) # We strongly recommend providing multiple cores to speed up computation time. # Do this before calling `get_feature_importance()`. doFuture::registerDoFuture() future::plan(future::multicore, workers = 2) # Optionally, you can group features together with a custom grouping feat_imp <- get_feature_importance(results$trained_model, results$trained_model$trainingData, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\", groups = c( \"Otu00007\", \"Otu00008\", \"Otu00009\", \"Otu00011\", \"Otu00012\", \"Otu00015\", \"Otu00016\", \"Otu00018\", \"Otu00019\", \"Otu00020\", \"Otu00022\", \"Otu00023\", \"Otu00025\", \"Otu00028\", \"Otu00029\", \"Otu00030\", \"Otu00035\", \"Otu00036\", \"Otu00037\", \"Otu00038\", \"Otu00039\", \"Otu00040\", \"Otu00047\", \"Otu00050\", \"Otu00052\", \"Otu00054\", \"Otu00055\", \"Otu00056\", \"Otu00060\", \"Otu00003|Otu00002|Otu00005|Otu00024|Otu00032|Otu00041|Otu00053\", \"Otu00014|Otu00021|Otu00017|Otu00031|Otu00057\", \"Otu00013|Otu00006\", \"Otu00026|Otu00001|Otu00034|Otu00048\", \"Otu00033|Otu00010\", \"Otu00042|Otu00004\", \"Otu00043|Otu00027|Otu00049\", \"Otu00051|Otu00045\", \"Otu00058|Otu00044\", \"Otu00059|Otu00046\" ) ) # the function can show a progress bar if you have the `progressr` package installed. ## optionally, specify the progress bar format: progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0 )) ## tell progressr to always report progress progressr::handlers(global = TRUE) ## run the function and watch the live progress updates feat_imp <- get_feature_importance(results$trained_model, results$trained_model$trainingData, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) # You can specify any correlation method supported by `stats::cor`: feat_imp <- get_feature_importance(results$trained_model, results$trained_model$trainingData, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\", corr_method = \"pearson\" ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Get hyperparameter performance metrics — get_hp_performance","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Get hyperparameter performance metrics","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get hyperparameter performance metrics — get_hp_performance","text":"","code":"get_hp_performance(trained_model)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get hyperparameter performance metrics — get_hp_performance","text":"trained_model trained model (e.g. 
run_ml())","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Named list: dat: Dataframe performance metric group hyperparameters. params: Hyperparameters tuned. metric: Performance metric used.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get hyperparameter performance metrics — get_hp_performance","text":"","code":"get_hp_performance(otu_mini_bin_results_glmnet$trained_model) #> $dat #> alpha lambda AUC #> 1 0 1e-04 0.6082552 #> 2 0 1e-03 0.6082552 #> 3 0 1e-02 0.6086458 #> 4 0 1e-01 0.6166789 #> 5 0 1e+00 0.6221737 #> 6 0 1e+01 0.6187408 #> #> $params #> [1] \"lambda\" #> #> $metric #> [1] \"AUC\" #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":null,"dir":"Reference","previous_headings":"","what":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"details see vignette hyperparameter tuning.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"","code":"get_hyperparams_list(dataset, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"dataset Dataframe outcome variable columns features. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). 
glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"Named list hyperparameters.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"","code":"get_hyperparams_list(otu_mini_bin, \"rf\") #> $mtry #> [1] 2 3 6 #> get_hyperparams_list(otu_small, \"rf\") #> $mtry #> [1] 4 8 16 #> get_hyperparams_list(otu_mini_bin, \"rpart2\") #> $maxdepth #> [1] 1 2 4 8 16 30 #> get_hyperparams_list(otu_small, \"rpart2\") #> $maxdepth #> [1] 1 2 4 8 16 30 #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":null,"dir":"Reference","previous_headings":"","what":"Get outcome type. — get_outcome_type","title":"Get outcome type. — get_outcome_type","text":"outcome numeric, type continuous. Otherwise, outcome type binary two outcomes, multiclass more than two outcomes.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get outcome type. — get_outcome_type","text":"","code":"get_outcome_type(outcomes_vec)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get outcome type. — get_outcome_type","text":"outcomes_vec Vector outcomes.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get outcome type. — get_outcome_type","text":"Outcome type (continuous, binary, multiclass).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get outcome type. — get_outcome_type","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get outcome type. — get_outcome_type","text":"","code":"get_outcome_type(c(1, 2, 1)) #> [1] \"continuous\" get_outcome_type(c(\"a\", \"b\", \"b\")) #> [1] \"binary\" get_outcome_type(c(\"a\", \"b\", \"c\")) #> [1] \"multiclass\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":null,"dir":"Reference","previous_headings":"","what":"Select indices to partition the data into training & testing sets. 
— get_partition_indices","text":"Use function get row indices training set.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"","code":"get_partition_indices( outcomes, training_frac = 0.8, groups = NULL, group_partitions = NULL )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"outcomes vector outcomes training_frac Fraction data training set (default: 0.8). Rows dataset randomly selected training set, remaining rows used testing set. Alternatively, provide vector integers, used row indices training set. remaining rows used testing set. groups Vector groups keep together splitting data train test sets. number groups training set larger kfold, groups also kept together cross-validation. Length matches number rows dataset (default: NULL). group_partitions Specify assign groups training testing partitions (default: NULL). groups specifies samples belong group \"\" belong group \"B\", setting group_partitions = list(train = c(\"\", \"B\"), test = c(\"B\")) result samples group \"\" placed training set, samples \"B\" also training set, remaining samples \"B\" testing set. partition sizes close training_frac possible. number groups training set larger kfold, groups also kept together cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Vector row indices training set.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"groups NULL, uses createDataPartition. Otherwise, uses create_grouped_data_partition(). Set seed prior calling function like data partitions reproducible (recommended).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Select indices to partition the data into training & testing sets. 
— get_partition_indices","text":"","code":"training_inds <- get_partition_indices(otu_mini_bin$dx) train_data <- otu_mini_bin[training_inds, ] test_data <- otu_mini_bin[-training_inds, ]"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":null,"dir":"Reference","previous_headings":"","what":"Get default performance metric function — get_perf_metric_fn","title":"Get default performance metric function — get_perf_metric_fn","text":"Get default performance metric function","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get default performance metric function — get_perf_metric_fn","text":"","code":"get_perf_metric_fn(outcome_type)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get default performance metric function — get_perf_metric_fn","text":"outcome_type Type outcome (one : \"continuous\",\"binary\",\"multiclass\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get default performance metric function — get_perf_metric_fn","text":"Performance metric function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get default performance metric function — get_perf_metric_fn","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get default performance metric function — get_perf_metric_fn","text":"","code":"get_perf_metric_fn(\"continuous\") #> function (data, lev = NULL, model = NULL) #> { #> if (is.character(data$obs)) #> data$obs <- factor(data$obs, levels = lev) #> postResample(data[, \"pred\"], data[, \"obs\"]) #> } #> #> get_perf_metric_fn(\"binary\") #> function (data, lev = NULL, model = NULL) #> { #> if (!all(levels(data[, \"pred\"]) == levels(data[, \"obs\"]))) #> stop(\"levels of observed and predicted data do not match\") #> has_class_probs <- all(lev %in% colnames(data)) #> if (has_class_probs) { #> lloss <- mnLogLoss(data = data, lev = lev, model = model) #> requireNamespaceQuietStop(\"pROC\") #> requireNamespaceQuietStop(\"MLmetrics\") #> prob_stats <- lapply(levels(data[, \"pred\"]), function(x) { #> obs <- ifelse(data[, \"obs\"] == x, 1, 0) #> prob <- data[, x] #> roc_auc <- try(pROC::roc(obs, data[, x], direction = \"<\", #> quiet = TRUE), silent = TRUE) #> roc_auc <- if (inherits(roc_auc, \"try-error\")) #> NA #> else roc_auc$auc #> pr_auc <- try(MLmetrics::PRAUC(y_pred = data[, x], #> y_true = obs), silent = TRUE) #> if (inherits(pr_auc, \"try-error\")) #> pr_auc <- NA #> res <- c(ROC = roc_auc, AUC = pr_auc) #> return(res) #> }) #> prob_stats <- do.call(\"rbind\", prob_stats) #> prob_stats <- colMeans(prob_stats, na.rm = TRUE) #> } #> CM <- confusionMatrix(data[, \"pred\"], data[, \"obs\"], mode = \"everything\") #> if (length(levels(data[, \"pred\"])) == 2) { #> class_stats <- CM$byClass #> } #> else { #> class_stats <- colMeans(CM$byClass) #> names(class_stats) <- paste(\"Mean\", names(class_stats)) #> } #> overall_stats <- if (has_class_probs) #> c(CM$overall, logLoss = 
as.numeric(lloss), AUC = unname(prob_stats[\"ROC\"]), #> prAUC = unname(prob_stats[\"AUC\"])) #> else CM$overall #> stats <- c(overall_stats, class_stats) #> stats <- stats[!names(stats) %in% c(\"AccuracyNull\", \"AccuracyLower\", #> \"AccuracyUpper\", \"AccuracyPValue\", \"McnemarPValue\", \"Mean Prevalence\", #> \"Mean Detection Prevalence\")] #> names(stats) <- gsub(\"[[:blank:]]+\", \"_\", names(stats)) #> stat_list <- c(\"Accuracy\", \"Kappa\", \"Mean_F1\", \"Mean_Sensitivity\", #> \"Mean_Specificity\", \"Mean_Pos_Pred_Value\", \"Mean_Neg_Pred_Value\", #> \"Mean_Precision\", \"Mean_Recall\", \"Mean_Detection_Rate\", #> \"Mean_Balanced_Accuracy\") #> if (has_class_probs) #> stat_list <- c(\"logLoss\", \"AUC\", \"prAUC\", stat_list) #> if (length(levels(data[, \"pred\"])) == 2) #> stat_list <- gsub(\"^Mean_\", \"\", stat_list) #> stats <- stats[c(stat_list)] #> return(stats) #> } #> #> get_perf_metric_fn(\"multiclass\") #> function (data, lev = NULL, model = NULL) #> { #> if (!all(levels(data[, \"pred\"]) == levels(data[, \"obs\"]))) #> stop(\"levels of observed and predicted data do not match\") #> has_class_probs <- all(lev %in% colnames(data)) #> if (has_class_probs) { #> lloss <- mnLogLoss(data = data, lev = lev, model = model) #> requireNamespaceQuietStop(\"pROC\") #> requireNamespaceQuietStop(\"MLmetrics\") #> prob_stats <- lapply(levels(data[, \"pred\"]), function(x) { #> obs <- ifelse(data[, \"obs\"] == x, 1, 0) #> prob <- data[, x] #> roc_auc <- try(pROC::roc(obs, data[, x], direction = \"<\", #> quiet = TRUE), silent = TRUE) #> roc_auc <- if (inherits(roc_auc, \"try-error\")) #> NA #> else roc_auc$auc #> pr_auc <- try(MLmetrics::PRAUC(y_pred = data[, x], #> y_true = obs), silent = TRUE) #> if (inherits(pr_auc, \"try-error\")) #> pr_auc <- NA #> res <- c(ROC = roc_auc, AUC = pr_auc) #> return(res) #> }) #> prob_stats <- do.call(\"rbind\", prob_stats) #> prob_stats <- colMeans(prob_stats, na.rm = TRUE) #> } #> CM <- confusionMatrix(data[, \"pred\"], data[, \"obs\"], mode = \"everything\") #> if (length(levels(data[, \"pred\"])) == 2) { #> class_stats <- CM$byClass #> } #> else { #> class_stats <- colMeans(CM$byClass) #> names(class_stats) <- paste(\"Mean\", names(class_stats)) #> } #> overall_stats <- if (has_class_probs) #> c(CM$overall, logLoss = as.numeric(lloss), AUC = unname(prob_stats[\"ROC\"]), #> prAUC = unname(prob_stats[\"AUC\"])) #> else CM$overall #> stats <- c(overall_stats, class_stats) #> stats <- stats[!names(stats) %in% c(\"AccuracyNull\", \"AccuracyLower\", #> \"AccuracyUpper\", \"AccuracyPValue\", \"McnemarPValue\", \"Mean Prevalence\", #> \"Mean Detection Prevalence\")] #> names(stats) <- gsub(\"[[:blank:]]+\", \"_\", names(stats)) #> stat_list <- c(\"Accuracy\", \"Kappa\", \"Mean_F1\", \"Mean_Sensitivity\", #> \"Mean_Specificity\", \"Mean_Pos_Pred_Value\", \"Mean_Neg_Pred_Value\", #> \"Mean_Precision\", \"Mean_Recall\", \"Mean_Detection_Rate\", #> \"Mean_Balanced_Accuracy\") #> if (has_class_probs) #> stat_list <- c(\"logLoss\", \"AUC\", \"prAUC\", stat_list) #> if (length(levels(data[, \"pred\"])) == 2) #> stat_list <- gsub(\"^Mean_\", \"\", stat_list) #> stats <- stats[c(stat_list)] #> return(stats) #> } #> #> "},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":null,"dir":"Reference","previous_headings":"","what":"Get default performance metric name — get_perf_metric_name","title":"Get default performance metric name — get_perf_metric_name","text":"Get default performance metric name 
cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get default performance metric name — get_perf_metric_name","text":"","code":"get_perf_metric_name(outcome_type)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get default performance metric name — get_perf_metric_name","text":"outcome_type Type outcome (one : \"continuous\",\"binary\",\"multiclass\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get default performance metric name — get_perf_metric_name","text":"Performance metric name.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get default performance metric name — get_perf_metric_name","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get default performance metric name — get_perf_metric_name","text":"","code":"get_perf_metric_name(\"continuous\") #> [1] \"RMSE\" get_perf_metric_name(\"binary\") #> [1] \"AUC\" get_perf_metric_name(\"multiclass\") #> [1] \"logLoss\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":null,"dir":"Reference","previous_headings":"","what":"Get model performance metrics as a one-row tibble — get_performance_tbl","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"Get model performance metrics one-row tibble","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"","code":"get_performance_tbl( trained_model, test_data, outcome_colname, perf_metric_function, perf_metric_name, class_probs, method, seed = NA )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"trained_model Trained model caret::train(). test_data Held test data: dataframe outcome features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes). method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). 
glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost seed Random seed (default: NA). results reproducible set seed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"one-row tibble columns cv_auroc, column performance metrics test data method, seed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"Kelly Sovacool, sovacool@umich.edu Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"","code":"if (FALSE) { results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) names(results$trained_model$trainingData)[1] <- \"dx\" get_performance_tbl(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":null,"dir":"Reference","previous_headings":"","what":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"Generate tuning grid tuning hyperparameters","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"","code":"get_tuning_grid(hyperparams_list, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"hyperparams_list Named list lists hyperparameters. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). 
glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"tuning grid.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"","code":"ml_method <- \"glmnet\" hparams_list <- get_hyperparams_list(otu_small, ml_method) get_tuning_grid(hparams_list, ml_method) #> lambda alpha #> 1 1e-04 0 #> 2 1e-03 0 #> 3 1e-02 0 #> 4 1e-01 0 #> 5 1e+00 0 #> 6 1e+01 0"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":null,"dir":"Reference","previous_headings":"","what":"Group correlated features — group_correlated_features","title":"Group correlated features — group_correlated_features","text":"Group correlated features","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Group correlated features — group_correlated_features","text":"","code":"group_correlated_features( features, corr_thresh = 1, group_neg_corr = TRUE, corr_method = \"spearman\" )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Group correlated features — group_correlated_features","text":"features dataframe column feature ML corr_thresh feature importance, group correlations equal corr_thresh (range 0 1; default: 1). group_neg_corr Whether group negatively correlated features together (e.g. c(0,1) c(1,0)). corr_method correlation method. options supported stats::cor: spearman, pearson, kendall. 
(default: spearman)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Group correlated features — group_correlated_features","text":"vector element group correlated features separated pipes (|)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Group correlated features — group_correlated_features","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Group correlated features — group_correlated_features","text":"","code":"features <- data.frame( a = 1:3, b = 2:4, c = c(1, 0, 1), d = (5:7), e = c(5, 1, 4), f = c(-1, 0, -1) ) group_correlated_features(features) #> [1] \"a|b|d\" \"c|f\" \"e\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":null,"dir":"Reference","previous_headings":"","what":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"mikropml implements supervised machine learning pipelines using regression, support vector machines, decision trees, random forest, gradient-boosted trees. main functions preprocess_data() process data prior running machine learning, run_ml() run machine learning.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":"authors","dir":"Reference","previous_headings":"","what":"Authors","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"Begüm D. Topçuoğlu (ORCID) Zena Lapp (ORCID) Kelly L. Sovacool (ORCID) Evan Snitkin (ORCID) Jenna Wiens (ORCID) Patrick D. Schloss (ORCID)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":"see-vignettes","dir":"Reference","previous_headings":"","what":"See vignettes","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"Introduction Preprocessing data Hyperparameter tuning Parallel processing mikropml paper","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":null,"dir":"Reference","previous_headings":"","what":"Mini OTU abundance dataset — otu_mini_bin","title":"Mini OTU abundance dataset — otu_mini_bin","text":"dataset containing relative abundances OTUs human stool samples binary outcome, dx. subset otu_small.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Mini OTU abundance dataset — otu_mini_bin","text":"","code":"otu_mini_bin"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Mini OTU abundance dataset — otu_mini_bin","text":"data frame dx column diagnosis: healthy cancerous (colorectal). 
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"Results running pipeline L2 logistic regression otu_mini_bin feature importance grouping","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"","code":"otu_mini_bin_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"Results running pipeline random forest otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"","code":"otu_mini_bin_results_rf"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"Results running pipeline rpart2 otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"","code":"otu_mini_bin_results_rpart2"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"object class list length 
4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"Results running pipeline svmRadial otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"","code":"otu_mini_bin_results_svmRadial"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"Results running pipeline xbgTree otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"","code":"otu_mini_bin_results_xgbTree"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"Results running pipeline glmnet otu_mini_bin Otu00001 outcome","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"","code":"otu_mini_cont_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_bin with 
Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"Results running pipeline glmnet otu_mini_bin Otu00001 outcome column, using custom train control scheme perform cross-validation","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"","code":"otu_mini_cont_results_nocv"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":null,"dir":"Reference","previous_headings":"","what":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"Cross validation train_data_mini grouped features.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"","code":"otu_mini_cv"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"object class list length 27.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":null,"dir":"Reference","previous_headings":"","what":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"dataset containing relative abundances OTUs human stool samples","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"","code":"otu_mini_multi"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"data frame dx column colorectal cancer diagnosis: adenoma, carcinoma, normal. 
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":null,"dir":"Reference","previous_headings":"","what":"Groups for otu_mini_multi — otu_mini_multi_group","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"Groups otu_mini_multi","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"","code":"otu_mini_multi_group"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"object class character length 490.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"Results running pipeline glmnet otu_mini_multi multiclass outcomes","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"","code":"otu_mini_multi_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":null,"dir":"Reference","previous_headings":"","what":"Small OTU abundance dataset — otu_small","title":"Small OTU abundance dataset — otu_small","text":"dataset containing relative abundances 60 OTUs 60 human stool samples. subset data provided extdata/otu_large.csv, used Topçuoğlu et al. 2020.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Small OTU abundance dataset — otu_small","text":"","code":"otu_small"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Small OTU abundance dataset — otu_small","text":"data frame 60 rows 61 variables. dx column diagnosis: healthy cancerous (colorectal). 
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":null,"dir":"Reference","previous_headings":"","what":"Calculate a permuted p-value comparing two models — permute_p_value","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"Calculate permuted p-value comparing two models","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"","code":"permute_p_value( merged_data, metric, group_name, group_1, group_2, nperm = 10000 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"merged_data concatenated performance data run_ml metric metric compare, must numeric group_name column group variables compare group_1 name one group compare group_2 name group compare nperm number permutations, default=10000","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"numeric p-value comparing two models","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Courtney R Armour, armourc@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Calculate a permuted p-value comparing two models — permute_p_value","text":"","code":"df <- dplyr::tibble( model = c(\"rf\", \"rf\", \"glmnet\", \"glmnet\", \"svmRadial\", \"svmRadial\"), AUC = c(.2, 0.3, 0.8, 0.9, 0.85, 0.95) ) set.seed(123) permute_p_value(df, \"AUC\", \"model\", \"rf\", \"glmnet\", nperm = 100) #> [1] 0.3663366"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot hyperparameter performance metrics — plot_hp_performance","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"Plot hyperparameter performance metrics","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"","code":"plot_hp_performance(dat, param_col, metric_col)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"dat dataframe hyperparameters performance metric (e.g. get_hp_performance() combine_hp_performance()) param_col hyperparameter plotted. must column dat. metric_col performance metric. 
must column dat.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"ggplot hyperparameter performance.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"","code":"# plot for a single `run_ml()` call hp_metrics <- get_hp_performance(otu_mini_bin_results_glmnet$trained_model) hp_metrics #> $dat #> alpha lambda AUC #> 1 0 1e-04 0.6082552 #> 2 0 1e-03 0.6082552 #> 3 0 1e-02 0.6086458 #> 4 0 1e-01 0.6166789 #> 5 0 1e+00 0.6221737 #> 6 0 1e+01 0.6187408 #> #> $params #> [1] \"lambda\" #> #> $metric #> [1] \"AUC\" #> plot_hp_performance(hp_metrics$dat, lambda, AUC) if (FALSE) { # plot for multiple `run_ml()` calls results <- lapply(seq(100, 102), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) models <- lapply(results, function(x) x$trained_model) hp_metrics <- combine_hp_performance(models) plot_hp_performance(hp_metrics$dat, lambda, AUC) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"ggplot2 required use function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"","code":"plot_model_performance(performance_df)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"performance_df dataframe performance results multiple calls run_ml()","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"ggplot2 plot performance.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"Begüm Topçuoglu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot performance metrics for multiple ML runs 
with different parameters — plot_model_performance","text":"","code":"if (FALSE) { # call `run_ml()` multiple times with different seeds results_lst <- lapply(seq(100, 104), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) # extract and combine the performance results perf_df <- lapply(results_lst, function(result) { result[[\"performance\"]] }) %>% dplyr::bind_rows() # plot the performance results p <- plot_model_performance(perf_df) # call `run_ml()` with different ML methods param_grid <- expand.grid( seeds = seq(100, 104), methods = c(\"glmnet\", \"rf\") ) results_mtx <- mapply( function(seed, method) { run_ml(otu_mini_bin, method, seed = seed, kfold = 2) }, param_grid$seeds, param_grid$methods ) # extract and combine the performance results perf_df2 <- dplyr::bind_rows(results_mtx[\"performance\", ]) # plot the performance results p <- plot_model_performance(perf_df2) # you can continue adding layers to customize the plot p + theme_classic() + scale_color_brewer(palette = \"Dark2\") + coord_flip() }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Preprocess data prior to running machine learning — preprocess_data","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Function preprocess data input run_ml().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Preprocess data prior to running machine learning — preprocess_data","text":"","code":"preprocess_data( dataset, outcome_colname, method = c(\"center\", \"scale\"), remove_var = \"nzv\", collapse_corr_feats = TRUE, to_numeric = TRUE, group_neg_corr = TRUE, prefilter_threshold = 1 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Preprocess data prior to running machine learning — preprocess_data","text":"dataset Dataframe outcome variable columns features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). method Methods preprocess data, described caret::preProcess() (default: c(\"center\",\"scale\"), use NULL normalization). remove_var Whether remove variables near-zero variance ('nzv'; default), zero variance ('zv'), none (NULL). collapse_corr_feats Whether keep one perfectly correlated features. to_numeric Whether change features numeric possible. group_neg_corr Whether group negatively correlated features together (e.g. c(0,1) c(1,0)). prefilter_threshold Remove features non-zero & non-NA values N rows fewer (default: 1). Set -1 keep columns step. step also skipped to_numeric set FALSE.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Named list including: dat_transformed: Preprocessed data. grp_feats: features grouped together, named list features corresponding group. removed_feats: features removed preprocessing (e.g. zero variance near-zero variance features). 
progressr package installed, progress bar time elapsed estimated time completion can displayed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"more-details","dir":"Reference","previous_headings":"","what":"More details","title":"Preprocess data prior to running machine learning — preprocess_data","text":"See preprocessing vignette details. Note values outcome_colname contain spaces, converted underscores compatibility caret.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Preprocess data prior to running machine learning — preprocess_data","text":"","code":"preprocess_data(mikropml::otu_small, \"dx\") #> Using 'dx' as the outcome column. #> $dat_transformed #> # A tibble: 200 × 61 #> dx Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 Otu00006 Otu00007 Otu00008 #> #> 1 norm… -0.420 -0.219 -0.174 -0.591 -0.0488 -0.167 -0.569 -0.0624 #> 2 norm… -0.105 1.75 -0.718 0.0381 1.54 -0.573 -0.643 -0.132 #> 3 norm… -0.708 0.696 1.43 0.604 -0.265 -0.0364 -0.612 -0.207 #> 4 norm… -0.494 -0.665 2.02 -0.593 -0.676 -0.586 -0.552 -0.470 #> 5 norm… 1.11 -0.395 -0.754 -0.586 -0.754 2.73 0.191 -0.676 #> 6 norm… -0.685 0.614 -0.174 -0.584 0.376 0.804 -0.337 -0.00608 #> 7 canc… -0.770 -0.496 -0.318 0.159 -0.658 2.20 -0.717 0.0636 #> 8 norm… -0.424 -0.478 -0.397 -0.556 -0.391 -0.0620 0.376 -0.0222 #> 9 norm… -0.556 1.14 1.62 -0.352 -0.275 -0.465 -0.804 0.294 #> 10 canc… 1.46 -0.451 -0.694 -0.0567 -0.706 0.689 -0.370 1.59 #> # … with 190 more rows, and 52 more variables: Otu00009 , Otu00010 , #> # Otu00011 , Otu00012 , Otu00013 , Otu00014 , #> # Otu00015 , Otu00016 , Otu00017 , Otu00018 , #> # Otu00019 , Otu00020 , Otu00021 , Otu00022 , #> # Otu00023 , Otu00024 , Otu00025 , Otu00026 , #> # Otu00027 , Otu00028 , Otu00029 , Otu00030 , #> # Otu00031 , Otu00032 , Otu00033 , Otu00034 , … #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0) #> # the function can show a progress bar if you have the progressr package installed ## optionally, specify the progress bar format progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0 )) ## tell progressor to always report progress if (FALSE) { progressr::handlers(global = TRUE) ## run the function and watch the live progress udpates dat_preproc <- preprocess_data(mikropml::otu_small, \"dx\") }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":null,"dir":"Reference","previous_headings":"","what":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Randomize feature order eliminate position-dependent effects","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Randomize feature order to eliminate any position-dependent effects — 
randomize_feature_order","text":"","code":"randomize_feature_order(dataset, outcome_colname)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"dataset Dataframe outcome variable columns features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Dataset feature order randomized.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Nick Lesniak, nlesniak@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"","code":"dat <- data.frame( outcome = c(\"1\", \"2\", \"3\"), a = 4:6, b = 7:9, c = 10:12, d = 13:15 ) randomize_feature_order(dat, \"outcome\") #> outcome c b a d #> 1 1 10 7 4 13 #> 2 2 11 8 5 14 #> 3 3 12 9 6 15"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/reexports.html","id":null,"dir":"Reference","previous_headings":"","what":"dplyr pipe — reexports","title":"dplyr pipe — reexports","text":"objects imported packages. Follow links see documentation. caret contr.ltfr dplyr %>% rlang :=, !!, .data","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":null,"dir":"Reference","previous_headings":"","what":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"Removes columns non-zero & non-NA values threshold row(s) fewer.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"","code":"remove_singleton_columns(dat, threshold = 1)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"dat dataframe threshold Number rows. column non-zero & non-NA values threshold row(s) fewer, removed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Remove columns appearing in only threshold row(s) or fewer. 
— remove_singleton_columns","text":"dataframe without singleton columns","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"Kelly Sovacool, sovacool@umich.edu Courtney Armour","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"","code":"remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, 0), c = 4:6)) #> $dat #> a c #> 1 1 4 #> 2 2 5 #> 3 3 6 #> #> $removed_feats #> [1] \"b\" #> remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, 0), c = 4:6), threshold = 0) #> $dat #> a b c #> 1 1 0 4 #> 2 2 1 5 #> 3 3 0 6 #> #> $removed_feats #> character(0) #> remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, NA), c = 4:6)) #> $dat #> a c #> 1 1 4 #> 2 2 5 #> 3 3 6 #> #> $removed_feats #> [1] \"b\" #> remove_singleton_columns(data.frame(a = 1:3, b = c(1, 1, 1), c = 4:6)) #> $dat #> a b c #> 1 1 1 4 #> 2 2 1 5 #> 3 3 1 6 #> #> $removed_feats #> character(0) #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":null,"dir":"Reference","previous_headings":"","what":"Replace spaces in all elements of a character vector with underscores — replace_spaces","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"Replace spaces elements character vector underscores","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"","code":"replace_spaces(x, new_char = \"_\")"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"x character vector new_char character replace spaces (default: _)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"character vector spaces replaced new_char","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"","code":"dat <- data.frame( dx = c(\"outcome 1\", \"outcome 2\", \"outcome 1\"), a = 1:3, b = c(5, 7, 1) ) dat$dx <- replace_spaces(dat$dx) dat #> dx a b #> 1 outcome_1 1 5 #> 2 outcome_2 2 7 #> 3 outcome_1 3 
1"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":null,"dir":"Reference","previous_headings":"","what":"Run the machine learning pipeline — run_ml","title":"Run the machine learning pipeline — run_ml","text":"function runs machine learning (ML), evaluates best model, optionally calculates feature importance using framework outlined Topçuoğlu et al. 2020 (doi:10.1128/mBio.00434-20 ). Required inputs dataframe outcome variable columns features, well ML method. See vignette('introduction') details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Run the machine learning pipeline — run_ml","text":"","code":"run_ml( dataset, method, outcome_colname = NULL, hyperparameters = NULL, find_feature_importance = FALSE, calculate_performance = TRUE, kfold = 5, cv_times = 100, cross_val = NULL, training_frac = 0.8, perf_metric_function = NULL, perf_metric_name = NULL, groups = NULL, group_partitions = NULL, corr_thresh = 1, ntree = 1000, seed = NA )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Run the machine learning pipeline — run_ml","text":"dataset Dataframe outcome variable columns features. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). hyperparameters Dataframe hyperparameters (default NULL; sensible defaults chosen automatically). find_feature_importance Run permutation importance (default: FALSE). TRUE recommended like identify features important predicting outcome, resource-intensive. calculate_performance Whether calculate performance metrics (default: TRUE). might choose skip perform cross-validation model training. kfold Fold number k-fold cross-validation (default: 5). cv_times Number cross-validation partitions create (default: 100). cross_val custom cross-validation scheme caret::trainControl() (default: NULL, uses kfold cross validation repeated cv_times). kfold cv_times ignored user provides custom cross-validation scheme. See caret::trainControl() docs information use . training_frac Fraction data training set (default: 0.8). Rows dataset randomly selected training set, remaining rows used testing set. Alternatively, provide vector integers, used row indices training set. remaining rows used testing set. perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". groups Vector groups keep together splitting data train test sets. number groups training set larger kfold, groups also kept together cross-validation. Length matches number rows dataset (default: NULL). group_partitions Specify assign groups training testing partitions (default: NULL). 
groups specifies samples belong group \"A\" belong group \"B\", setting group_partitions = list(train = c(\"A\", \"B\"), test = c(\"B\")) result samples group \"A\" placed training set, samples \"B\" also training set, remaining samples \"B\" testing set. partition sizes close training_frac possible. number groups training set larger kfold, groups also kept together cross-validation. corr_thresh feature importance, group correlations equal corr_thresh (range 0 1; default: 1). ntree random forest, many trees use (default: 1000). Note caret allow parameter tuned. seed Random seed (default: NA). results reproducible set seed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Run the machine learning pipeline — run_ml","text":"Named list results: trained_model: Output caret::train(), including best model. test_data: Part data used testing. performance: Dataframe performance metrics. first column cross-validation performance metric, last two columns ML method used seed (one set), respectively. columns performance metrics calculated test data. contains one row, can easily combine performance dataframes multiple calls run_ml() (see vignette(\"parallel\")). feature_importance: feature importances calculated, dataframe row feature correlated group. columns performance metric permuted data, difference true performance metric performance metric permuted data (true - permuted), feature name, ML method, performance metric name, seed (provided). AUC RMSE, higher perf_metric_diff , important feature predicting outcome. log loss, lower perf_metric_diff , important feature predicting outcome.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"more-details","dir":"Reference","previous_headings":"","what":"More details","title":"Run the machine learning pipeline — run_ml","text":"details, please see vignettes.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Run the machine learning pipeline — run_ml","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Run the machine learning pipeline — run_ml","text":"","code":"if (FALSE) { # regression run_ml(otu_small, \"glmnet\", seed = 2019 ) # random forest w/ feature importance run_ml(otu_small, \"rf\", outcome_colname = \"dx\", find_feature_importance = TRUE ) # custom cross validation & hyperparameters run_ml(otu_mini_bin[, 2:11], \"glmnet\", outcome_colname = \"Otu00001\", seed = 2019, hyperparameters = list(lambda = c(1e-04), alpha = 0), cross_val = caret::trainControl(method = \"none\"), calculate_performance = FALSE ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Tidy the performance dataframe — tidy_perf_data","title":"Tidy the performance dataframe — tidy_perf_data","text":"Used plot_model_performance().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Tidy the performance dataframe — 
tidy_perf_data","text":"","code":"tidy_perf_data(performance_df)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Tidy the performance dataframe — tidy_perf_data","text":"performance_df dataframe performance results multiple calls run_ml()","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Tidy the performance dataframe — tidy_perf_data","text":"Tidy dataframe model performance metrics.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Tidy the performance dataframe — tidy_perf_data","text":"Begüm Topçuoglu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Tidy the performance dataframe — tidy_perf_data","text":"","code":"if (FALSE) { # call `run_ml()` multiple times with different seeds results_lst <- lapply(seq(100, 104), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) # extract and combine the performance results perf_df <- lapply(results_lst, function(result) { result[[\"performance\"]] }) %>% dplyr::bind_rows() # make it pretty! tidy_perf_data(perf_df) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":null,"dir":"Reference","previous_headings":"","what":"Train model using caret::train(). — train_model","title":"Train model using caret::train(). — train_model","text":"Train model using caret::train().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Train model using caret::train(). — train_model","text":"","code":"train_model( model_formula, train_data, method, cv, perf_metric_name, tune_grid, ntree )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Train model using caret::train(). — train_model","text":"model_formula Model formula, typically created stats::.formula(). train_data Training data. Expected subset full dataset. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost cv Cross-validation caret scheme define_cv(). perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". tune_grid Tuning grid get_tuning_grid(). ntree random forest, many trees use (default: 1000). Note caret allow parameter tuned.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Train model using caret::train(). 
— train_model","text":"Trained model caret::train().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Train model using caret::train(). — train_model","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Train model using caret::train(). — train_model","text":"","code":"if (FALSE) { training_data <- otu_mini_bin_results_glmnet$trained_model$trainingData %>% dplyr::rename(dx = .outcome) method <- \"rf\" hyperparameters <- get_hyperparams_list(otu_mini_bin, method) cross_val <- define_cv(training_data, \"dx\", hyperparameters, perf_metric_function = caret::multiClassSummary, class_probs = TRUE, cv_times = 2 ) tune_grid <- get_tuning_grid(hyperparameters, method) rf_model <- train_model( stats::as.formula(paste(\"dx\", \"~ .\")), training_data, method, cross_val, \"AUC\", tune_grid, 1000 ) rf_model$results %>% dplyr::select(mtry, AUC, prAUC) }"},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-130","dir":"Changelog","previous_headings":"","what":"mikropml 1.3.0","title":"mikropml 1.3.0","text":"CRAN release: 2022-05-20 mikropml now requires R version 4.1.0 greater due update randomForest package (#292). New function compare_models() compares performance two models permutation test (#295, @courtneyarmour). Fixed bug cv_times affect reported repeats cross-validation (#291, @kelly-sovacool). Made minor documentation improvements (#293, @kelly-sovacool)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-122","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.2","title":"mikropml 1.2.2","text":"CRAN release: 2022-02-03 minor patch fixes test failure platforms long doubles. actual package code remains unchanged.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-121","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.1","title":"mikropml 1.2.1","text":"CRAN release: 2022-01-30 using groups parameter, groups kept together cross-validation partitions kfold <= number groups training set. Previously, error thrown condition met. Now, enough groups training set groups kept together CV, groups allowed split across CV partitions. Report p-values permutation feature importance (#288, @kelly-sovacool).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-120","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.0","title":"mikropml 1.2.0","text":"CRAN release: 2021-11-10 Also added new parameter calculate_performance, controls whether performance metrics calculated (default: TRUE). Users may wish skip performance calculations training models cross-validation. New parameter group_partitions added run_ml() allows users control groups go partition train/test split (#281, @kelly-sovacool). default, training_frac fraction 0 1 specifies much dataset used training fraction train/test split. Users can instead give training_frac vector indices correspond rows dataset go training fraction train/test split. 
gives users direct control exactly observations training fraction desired.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-111","dir":"Changelog","previous_headings":"","what":"mikropml 1.1.1","title":"mikropml 1.1.1","text":"CRAN release: 2021-09-14 Also, group_correlated_features() now user-facing function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-110","dir":"Changelog","previous_headings":"","what":"mikropml 1.1.0","title":"mikropml 1.1.0","text":"CRAN release: 2021-08-10 default still “spearman”, now can use methods supported stats::cor corr_method parameter: get_feature_importance(corr_method = \"pearson\") now video tutorials covering mikropml skills related machine learning, created @pschloss (#270). Fixed bug preprocess_data() converted outcome column character vector (#273, @kelly-sovacool, @ecmaggioncalda).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-100","dir":"Changelog","previous_headings":"","what":"mikropml 1.0.0","title":"mikropml 1.0.0","text":"CRAN release: 2021-05-13 mikropml now logo created @NLesniak! Made documentation improvements (#238, #231 @kelly-sovacool; #256 @BTopcuoglu). Remove features appear N=prefilter_threshold fewer rows data. Created function remove_singleton_columns() called preprocess_data() carry . Provide custom groups features permute together permutation importance. groups NULL default; case, correlated features corr_thresh grouped together. preprocess_data() now replaces spaces outcome column underscores (#247, @kelly-sovacool, @JonnyTran). Clarify intro vignette support multi-label outcomes. (#254, @zenalapp) Optional progress bar preprocess_data() get_feature_importance() using progressr package (#257, @kelly-sovacool, @JonnyTran, @FedericoComoglio). mikropml paper soon published JOSS!","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-002","dir":"Changelog","previous_headings":"","what":"mikropml 0.0.2","title":"mikropml 0.0.2","text":"CRAN release: 2020-12-03 Fixed test failure Solaris. Fixed multiple test failures R 3.6.2 due stringsAsFactors behavior. Made minor documentation improvements. Moved rpart Suggests Imports consistency packages used model training.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-001","dir":"Changelog","previous_headings":"","what":"mikropml 0.0.1","title":"mikropml 0.0.1","text":"CRAN release: 2020-11-23 first release version mikropml! 🎉 Added NEWS.md file track changes package. 
run_ml() preprocess_data() plot_model_performance() plot_hp_performance() glmnet: logistic linear regression rf: random forest rpart2: decision trees svmRadial: support vector machines xgbTree: gradient-boosted trees Introduction Preprocess data Hyperparameter tuning Parallel processing mikropml paper","code":""}] +[{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":null,"dir":"","previous_headings":"","what":"Contributor Covenant Code of Conduct","title":"Contributor Covenant Code of Conduct","text":"document adapted Tidyverse Code Conduct.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"our-pledge","dir":"","previous_headings":"","what":"Our Pledge","title":"Contributor Covenant Code of Conduct","text":"members, contributors, leaders pledge make participation community harassment-free experience everyone, regardless age, body size, visible invisible disability, ethnicity, sex characteristics, gender identity expression, level experience, education, socio-economic status, nationality, personal appearance, race, religion, sexual identity orientation. pledge act interact ways contribute open, welcoming, diverse, inclusive, healthy community.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"our-standards","dir":"","previous_headings":"","what":"Our Standards","title":"Contributor Covenant Code of Conduct","text":"Examples behavior contributes positive environment community include: Demonstrating empathy kindness toward people respectful differing opinions, viewpoints, experiences Giving gracefully accepting constructive feedback Accepting responsibility apologizing affected mistakes, learning experience Focusing best just us individuals, overall community Examples unacceptable behavior include: use sexualized language imagery, sexual attention advances kind Trolling, insulting derogatory comments, personal political attacks Public private harassment Publishing others’ private information, physical email address, without explicit permission conduct reasonably considered inappropriate professional setting","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement-responsibilities","dir":"","previous_headings":"","what":"Enforcement Responsibilities","title":"Contributor Covenant Code of Conduct","text":"Community leaders responsible clarifying enforcing standards acceptable behavior take appropriate fair corrective action response behavior deem inappropriate, threatening, offensive, harmful. Community leaders right responsibility remove, edit, reject comments, commits, code, wiki edits, issues, contributions aligned Code Conduct, communicate reasons moderation decisions appropriate.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"scope","dir":"","previous_headings":"","what":"Scope","title":"Contributor Covenant Code of Conduct","text":"Code Conduct applies within community spaces, also applies individual officially representing community public spaces. 
Examples representing community include using official e-mail address, posting via official social media account, acting appointed representative online offline event.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement","dir":"","previous_headings":"","what":"Enforcement","title":"Contributor Covenant Code of Conduct","text":"Instances abusive, harassing, otherwise unacceptable behavior may reported community leaders responsible enforcement [INSERT CONTACT METHOD]. complaints reviewed investigated promptly fairly. community leaders obligated respect privacy security reporter incident.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"enforcement-guidelines","dir":"","previous_headings":"","what":"Enforcement Guidelines","title":"Contributor Covenant Code of Conduct","text":"Community leaders follow Community Impact Guidelines determining consequences action deem violation Code Conduct:","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"1-correction","dir":"","previous_headings":"Enforcement Guidelines","what":"1. Correction","title":"Contributor Covenant Code of Conduct","text":"Community Impact: Use inappropriate language behavior deemed unprofessional unwelcome community. Consequence: private, written warning community leaders, providing clarity around nature violation explanation behavior inappropriate. public apology may requested.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"2-warning","dir":"","previous_headings":"Enforcement Guidelines","what":"2. Warning","title":"Contributor Covenant Code of Conduct","text":"Community Impact: violation single incident series actions. Consequence: warning consequences continued behavior. interaction people involved, including unsolicited interaction enforcing Code Conduct, specified period time. includes avoiding interactions community spaces well external channels like social media. Violating terms may lead temporary permanent ban.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"3-temporary-ban","dir":"","previous_headings":"Enforcement Guidelines","what":"3. Temporary Ban","title":"Contributor Covenant Code of Conduct","text":"Community Impact: serious violation community standards, including sustained inappropriate behavior. Consequence: temporary ban sort interaction public communication community specified period time. public private interaction people involved, including unsolicited interaction enforcing Code Conduct, allowed period. Violating terms may lead permanent ban.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"4-permanent-ban","dir":"","previous_headings":"Enforcement Guidelines","what":"4. Permanent Ban","title":"Contributor Covenant Code of Conduct","text":"Community Impact: Demonstrating pattern violation community standards, including sustained inappropriate behavior, harassment individual, aggression toward disparagement classes individuals. Consequence: permanent ban sort public interaction within community.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CODE_OF_CONDUCT.html","id":"attribution","dir":"","previous_headings":"","what":"Attribution","title":"Contributor Covenant Code of Conduct","text":"Code Conduct adapted Contributor Covenant, version 2.0, available https://www.contributor-covenant.org/version/2/0/ code_of_conduct.html. 
Community Impact Guidelines inspired Mozilla’s code conduct enforcement ladder. answers common questions code conduct, see FAQ https://www.contributor-covenant.org/faq. Translations available https:// www.contributor-covenant.org/translations.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":null,"dir":"","previous_headings":"","what":"Contributing to mikropml","title":"Contributing to mikropml","text":"document adapted Tidyverse Contributing guide.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"fixing-typos","dir":"","previous_headings":"","what":"Fixing typos","title":"Contributing to mikropml","text":"can fix typos, spelling mistakes, grammatical errors documentation directly using GitHub web interface, long changes made source file. generally means ’ll need edit roxygen2 comments .R, .Rd file. can find .R file generates .Rd reading comment first line.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"bigger-changes","dir":"","previous_headings":"","what":"Bigger changes","title":"Contributing to mikropml","text":"want make bigger change, ’s good idea first file issue make sure someone team agrees ’s needed. ’ve found bug, please file issue illustrates bug minimal reprex (also help write unit test, needed).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"pull-request-process","dir":"","previous_headings":"Bigger changes","what":"Pull request process","title":"Contributing to mikropml","text":"Fork package clone onto computer. haven’t done , recommend using usethis::create_from_github(\"SchlossLab/mikropml\", fork = TRUE). Install development dependences devtools::install_dev_deps(), make sure package passes R CMD check running devtools::check(). R CMD check doesn’t pass cleanly, ’s good idea ask help continuing. Create Git branch pull request (PR). recommend using usethis::pr_init(\"brief-description--change\"). Make changes, commit git, create PR running usethis::pr_push(), following prompts browser. title PR briefly describe change. body PR contain Fixes #issue-number. user-facing changes, add bullet top NEWS.md (.e. just first header). Follow style described https://style.tidyverse.org/news.html.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"code-style","dir":"","previous_headings":"Bigger changes","what":"Code style","title":"Contributing to mikropml","text":"New code follow tidyverse style guide. can use styler package apply styles, please don’t restyle code nothing PR. use roxygen2, Markdown syntax, documentation. use testthat unit tests. Contributions test cases included easier accept.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/CONTRIBUTING.html","id":"code-of-conduct","dir":"","previous_headings":"","what":"Code of Conduct","title":"Contributing to mikropml","text":"Please note mikropml project released Contributor Code Conduct. contributing project agree abide terms.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/LICENSE.html","id":null,"dir":"","previous_headings":"","what":"MIT License","title":"MIT License","text":"Copyright (c) 2019-2021 Begüm D. Topçuoğlu, Zena Lapp, Kelly L. Sovacool, Evan Snitkin, Jenna Wiens, Patrick D. 
Schloss Permission hereby granted, free charge, person obtaining copy software associated documentation files (“Software”), deal Software without restriction, including without limitation rights use, copy, modify, merge, publish, distribute, sublicense, /sell copies Software, permit persons Software furnished , subject following conditions: copyright notice permission notice shall included copies substantial portions Software. SOFTWARE PROVIDED “”, WITHOUT WARRANTY KIND, EXPRESS IMPLIED, INCLUDING LIMITED WARRANTIES MERCHANTABILITY, FITNESS PARTICULAR PURPOSE NONINFRINGEMENT. EVENT SHALL AUTHORS COPYRIGHT HOLDERS LIABLE CLAIM, DAMAGES LIABILITY, WHETHER ACTION CONTRACT, TORT OTHERWISE, ARISING , CONNECTION SOFTWARE USE DEALINGS SOFTWARE.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":null,"dir":"","previous_headings":"","what":"Getting help with mikropml","title":"Getting help with mikropml","text":"Thanks using mikropml! filing issue, places explore pieces put together make process smooth possible.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"make-a-reprex","dir":"","previous_headings":"","what":"Make a reprex","title":"Getting help with mikropml","text":"Start making minimal reproducible example using reprex package. haven’t heard used reprex , ’re treat! Seriously, reprex make R-question-asking endeavors easier (pretty insane ROI five ten minutes ’ll take learn ’s ). additional reprex pointers, check Get help! section tidyverse site.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"where-to-ask","dir":"","previous_headings":"","what":"Where to ask?","title":"Getting help with mikropml","text":"Armed reprex, next step figure ask. ’s question: start community.rstudio.com, /StackOverflow. people answer questions. ’s bug: ’re right place, file issue. ’re sure: let community help figure ! problem bug feature request, can easily return report . opening new issue, sure search issues pull requests make sure bug hasn’t reported /already fixed development version. default, search pre-populated :issue :open. can edit qualifiers (e.g. :pr, :closed) needed. example, ’d simply remove :open search issues repo, open closed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/SUPPORT.html","id":"what-happens-next","dir":"","previous_headings":"","what":"What happens next?","title":"Getting help with mikropml","text":"efficient possible, development tidyverse packages tends bursty, shouldn’t worry don’t get immediate response. Typically don’t look repo sufficient quantity issues accumulates, ’s burst intense activity focus efforts. makes development efficient avoids expensive context switching problems, cost taking longer get back . process makes good reprex particularly important might multiple months initial report start working . can’t reproduce bug, can’t fix !","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"its-running-so-slow","dir":"Articles","previous_headings":"","what":"It’s running so slow!","title":"Introduction to mikropml","text":"Since assume lot won’t read entire vignette, ’m going say beginning. run_ml() function running super slow, consider parallelizing. 
See vignette(\"parallel\") examples.","code":""},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-input-data","dir":"Articles","previous_headings":"Understanding the inputs","what":"The input data","title":"Introduction to mikropml","text":"input data run_ml() dataframe row sample observation. One column (assumed first) outcome interest, columns features. package otu_mini_bin small example dataset mikropml. , dx outcome column (normal cancer), 10 features (Otu00001 Otu00010). 2 outcomes, performing binary classification majority examples . bottom, also briefly provide examples multi-class continuous outcomes. ’ll see, run way binary classification! feature columns amount Operational Taxonomic Unit (OTU) microbiome samples patients cancer without cancer. goal predict dx, stands diagnosis. diagnosis can cancer based individual’s microbiome. need understand exactly means, ’re interested can read original paper (Topçuoğlu et al. 2020). real machine learning applications ’ll need use features, purposes vignette ’ll stick example dataset everything runs faster.","code":"#install.packages(\"devtools\") #devtools::install_github(\"SchlossLab/mikropml\") library(mikropml) head(otu_mini_bin) #> dx Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 Otu00006 Otu00007 #> 1 normal 350 268 213 1 208 230 70 #> 2 normal 568 1320 13 293 671 103 48 #> 3 normal 151 756 802 556 145 271 57 #> 4 normal 299 30 1018 0 25 99 75 #> 5 normal 1409 174 0 3 2 1136 296 #> 6 normal 167 712 213 4 332 534 139 #> Otu00008 Otu00009 Otu00010 #> 1 230 235 64 #> 2 204 119 115 #> 3 176 37 710 #> 4 78 255 197 #> 5 1 537 533 #> 6 251 155 122"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-methods-we-support","dir":"Articles","previous_headings":"Understanding the inputs","what":"The methods we support","title":"Introduction to mikropml","text":"methods use supported great ML wrapper package caret, use train machine learning models. methods tested (backend packages) : Logistic/multiclass/linear regression (\"glmnet\") Random forest (\"rf\") Decision tree (\"rpart2\") Support vector machine radial basis kernel (\"svmRadial\") xgboost (\"xgbTree\") documentation methods, well many others, can look available models (see list tag). vetted models used caret, function general enough others might work. can’t promise can help models, feel free open issue GitHub questions models might able help. first focus glmnet, default implementation L2-regularized logistic regression. cover examples towards end.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"before-running-ml","dir":"Articles","previous_headings":"","what":"Before running ML","title":"Introduction to mikropml","text":"execute run_ml(), consider preprocessing data, either preprocess_data() function. can learn preprocessing vignette: vignette(\"preprocess\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"the-simplest-way-to-run_ml","dir":"Articles","previous_headings":"","what":"The simplest way to run_ml()","title":"Introduction to mikropml","text":"mentioned , minimal input dataset (dataset) machine learning model want use (method). may also want provide: outcome column name. default run_ml() pick first column, ’s best practice specify column name explicitly. seed results reproducible, get results see (.e train/test split). Say want use logistic regression, method use glmnet. 
, run ML pipeline : ’ll notice things: takes little run. parameters use. message stating ‘dx’ used outcome column. want, ’s nice sanity check! warning. Don’t worry warning right now - just means hyperparameters aren’t good fit - ’re interested learning , see vignette(\"tuning\"). Now, let’s dig output bit. results list 4 things: trained_model trained model caret. bunch info won’t get , can learn caret::train() documentation. test_data partition dataset used testing. machine learning, ’s always important held-test dataset used training stage. pipeline using run_ml() split data training testing sets. training data used build model (e.g. tune hyperparameters, learn data) test data used evaluate well model performs. performance dataframe (mainly) performance metrics (1 column cross-validation performance metric, several test performance metrics, 2 columns end ML method seed): using logistic regression binary classification, area receiver-operator characteristic curve (AUC) useful metric evaluate model performance. , ’s default use mikropml. However, crucial evaluate model performance using multiple metrics. can find information performance metrics use package. cv_metric_AUC AUC cross-validation folds training data. gives us sense well model performs training data. columns performance metrics test data — data wasn’t used build model. , can see AUC test data much 0.5, suggesting model predict much better chance, model overfit cross-validation AUC (cv_metric_AUC, measured training) much higher testing AUC. isn’t surprising since ’re using features example dataset, don’t discouraged. default option also provides number performance metrics might interested , including area precision-recall curve (prAUC). last columns results$performance method seed (set one) help combining results multiple runs (see vignette(\"parallel\")). feature_importance information feature importance values find_feature_importance = TRUE (default FALSE). 
Since used defaults, ’s nothing :","code":"results <- run_ml(otu_mini_bin, 'glmnet', outcome_colname = 'dx', seed = 2019) names(results) #> [1] \"trained_model\" \"test_data\" \"performance\" #> [4] \"feature_importance\" names(results$trained_model) #> [1] \"method\" \"modelInfo\" \"modelType\" \"results\" \"pred\" #> [6] \"bestTune\" \"call\" \"dots\" \"metric\" \"control\" #> [11] \"finalModel\" \"preProcess\" \"trainingData\" \"ptype\" \"resample\" #> [16] \"resampledCM\" \"perfNames\" \"maximize\" \"yLimits\" \"times\" #> [21] \"levels\" \"terms\" \"coefnames\" \"xlevels\" head(results$test_data) #> dx Otu00009 Otu00005 Otu00010 Otu00001 Otu00008 Otu00004 Otu00003 #> 9 normal 119 142 248 256 363 112 871 #> 14 normal 60 209 70 86 96 1 123 #> 16 cancer 205 5 180 1668 95 22 3 #> 17 normal 188 356 107 381 1035 915 315 #> 27 normal 4 21 161 7 1 27 8 #> 30 normal 13 166 5 31 33 5 58 #> Otu00002 Otu00007 Otu00006 #> 9 995 0 137 #> 14 426 54 40 #> 16 20 590 570 #> 17 357 253 341 #> 27 25 322 5 #> 30 179 6 30 results$performance #> # A tibble: 1 × 17 #> cv_metric_AUC logLoss AUC prAUC Accuracy Kappa F1 Sensi…¹ Speci…² Pos_P…³ #> #> 1 0.622 0.684 0.647 0.606 0.590 0.179 0.6 0.6 0.579 0.6 #> # … with 7 more variables: Neg_Pred_Value , Precision , Recall , #> # Detection_Rate , Balanced_Accuracy , method , seed , #> # and abbreviated variable names ¹​Sensitivity, ²​Specificity, ³​Pos_Pred_Value results$feature_importance #> [1] \"Skipped feature importance\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"customizing-parameters","dir":"Articles","previous_headings":"","what":"Customizing parameters","title":"Introduction to mikropml","text":"arguments allow change execute run_ml(). ’ve chosen reasonable defaults , encourage change think something else better data.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"changing-kfold-cv_times-and-training_frac","dir":"Articles","previous_headings":"Customizing parameters","what":"Changing kfold, cv_times, and training_frac","title":"Introduction to mikropml","text":"kfold: number folds run cross-validation (default: 5). cv_times: number times run repeated cross-validation (default: 100). training_frac: fraction data training set (default: 0.8). rest data used testing. ’s example change default parameters: might noticed one ran faster — ’s reduced kfold cv_times. okay testing things may even necessary smaller datasets. general may better larger numbers parameters; think defaults good starting point (Topçuoğlu et al. 2020).","code":"results_custom <- run_ml(otu_mini_bin, 'glmnet', kfold = 2, cv_times = 5, training_frac = 0.5, seed = 2019) #> Using 'dx' as the outcome column. #> Training the model... #> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. 
#> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"custom-training-indices","dir":"Articles","previous_headings":"Customizing parameters > Changing kfold, cv_times, and training_frac","what":"Custom training indices","title":"Introduction to mikropml","text":"training_frac fraction 0 1, random sample observations dataset chosen training set satisfy training_frac. However, cases might wish control exactly observations training set. can instead assign training_frac vector indices correspond rows dataset go training set (remaining sequences go testing set).","code":"n_obs <- otu_mini_bin %>% nrow() training_size <- 0.8 * n_obs training_rows <- sample(n_obs, training_size) results_custom_train <- run_ml(otu_mini_bin, 'glmnet', kfold = 2, cv_times = 5, training_frac = training_rows, seed = 2019 ) #> Using 'dx' as the outcome column. #> Using the custom training set indices provided by `training_frac`. #> The fraction of data in the training set will be 0.8 #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"changing-the-performance-metric","dir":"Articles","previous_headings":"Customizing parameters","what":"Changing the performance metric","title":"Introduction to mikropml","text":"two arguments allow change performance metric use model evaluation, performance metrics calculate using test data. perf_metric_function function used calculate performance metrics. default classification caret::multiClassSummary() default regression caret::defaultSummary(). ’d suggest changing unless really know ’re . perf_metric_name column name output perf_metric_function. chose reasonable defaults (AUC binary, logLoss multiclass, RMSE continuous), default functions calculate bunch different performance metrics, can choose different one ’d like. default performance metrics available classification : default performance metrics available regression : ’s example using prAUC instead AUC: ’ll see cross-validation metric prAUC, instead default AUC:","code":"#> [1] \"logLoss\" \"AUC\" \"prAUC\" #> [4] \"Accuracy\" \"Kappa\" \"Mean_F1\" #> [7] \"Mean_Sensitivity\" \"Mean_Specificity\" \"Mean_Pos_Pred_Value\" #> [10] \"Mean_Neg_Pred_Value\" \"Mean_Precision\" \"Mean_Recall\" #> [13] \"Mean_Detection_Rate\" \"Mean_Balanced_Accuracy\" #> [1] \"RMSE\" \"Rsquared\" \"MAE\" results_pr <- run_ml(otu_mini_bin, 'glmnet', cv_times = 5, perf_metric_name = 'prAUC', seed = 2019) #> Using 'dx' as the outcome column. #> Training the model... #> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. #> Training complete. 
results_pr$performance #> # A tibble: 1 × 17 #> cv_metric_p…¹ logLoss AUC prAUC Accur…² Kappa F1 Sensi…³ Speci…⁴ Pos_P…⁵ #> #> 1 0.577 0.691 0.663 0.605 0.538 0.0539 0.690 1 0.0526 0.526 #> # … with 7 more variables: Neg_Pred_Value , Precision , Recall , #> # Detection_Rate , Balanced_Accuracy , method , seed , #> # and abbreviated variable names ¹​cv_metric_prAUC, ²​Accuracy, ³​Sensitivity, #> # ⁴​Specificity, ⁵​Pos_Pred_Value"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"using-groups","dir":"Articles","previous_headings":"Customizing parameters","what":"Using groups","title":"Introduction to mikropml","text":"optional groups vector groups keep together splitting data train test sets cross-validation. Sometimes ’s important split data based grouping instead just randomly. allows control similarities within groups don’t want skew predictions (.e. batch effects). example, biological data may samples collected multiple hospitals, might like keep observations hospital partition. ’s example split data train/test sets based groups: one difference run_ml() report much data training set run code chunk. can little finicky depending many samples groups . won’t exactly specify training_frac, since include one group either training set test set.","code":"# make random groups set.seed(2019) grps <- sample(LETTERS[1:8], nrow(otu_mini_bin), replace=TRUE) results_grp <- run_ml(otu_mini_bin, 'glmnet', cv_times = 2, training_frac = 0.8, groups = grps, seed = 2019) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.795 #> Groups in the training set: A B D F G H #> Groups in the testing set: C E #> Groups will be kept together in CV partitions #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"controlling-how-groups-are-assigned-to-partitions","dir":"Articles","previous_headings":"Customizing parameters > Using groups","what":"Controlling how groups are assigned to partitions","title":"Introduction to mikropml","text":"use groups parameter , default run_ml() assume want observations group placed partition train/test split. makes sense want use groups control batch effects. However, cases might prefer control exactly groups end partition, might even okay observations group assigned different partitions. example, say want groups B used training, C D testing, don’t preference happens groups. can give group_partitions parameter named list specify groups go training set go testing set. case, observations & B used training, C & D used testing, remaining groups randomly assigned one satisfy training_frac closely possible. another scenario, maybe want groups F used training, also want allow observations selected training F used testing: need even control , take look setting custom training indices. might also prefer provide train control scheme cross_val parameter run_ml().","code":"results_grp_part <- run_ml(otu_mini_bin, 'glmnet', cv_times = 2, training_frac = 0.8, groups = grps, group_partitions = list(train = c('A', 'B'), test = c('C', 'D') ), seed = 2019) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.785 #> Groups in the training set: A B E F G H #> Groups in the testing set: C D #> Groups will not be kept together in CV partitions because the number of groups in the training set is not larger than `kfold` #> Training the model... #> Training complete. 
results_grp_trainA <- run_ml(otu_mini_bin, 'glmnet', cv_times = 2, kfold = 2, training_frac = 0.5, groups = grps, group_partitions = list(train = c(\"A\", \"B\", \"C\", \"D\", \"E\", \"F\"), test = c(\"A\", \"B\", \"C\", \"D\", \"E\", \"F\", \"G\", \"H\") ), seed = 2019) #> Using 'dx' as the outcome column. #> Fraction of data in the training set: 0.5 #> Groups in the training set: A B C D E F #> Groups in the testing set: A B C D E F G H #> Groups will be kept together in CV partitions #> Training the model... #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"finding-feature-importance","dir":"Articles","previous_headings":"","what":"Finding feature importance","title":"Introduction to mikropml","text":"find features contributing predictive power, can use find_feature_importance = TRUE. use permutation importance determine feature importance described (Topçuoğlu et al. 2020). Briefly, permutes features individually (correlated ones together) evaluates much performance metric decreases. performance decreases feature randomly shuffled, important feature . default FALSE takes run useful want know features important predicting outcome. Let’s look feature importance results: Now, can check feature importances: several columns: perf_metric: performance value permuted feature. perf_metric_diff: difference performance actual permuted data (.e. test performance minus permuted performance). Features larger perf_metric_diff important. pvalue: probability obtaining actual performance value null hypothesis. names: feature permuted. method: ML method used. perf_metric_name: performance metric used. seed: seed (set). can see , differences negligible (close zero), makes sense since model isn’t great. ’re interested feature importance, ’s especially useful run multiple different train/test splits, shown example snakemake workflow. can also choose permute correlated features together using corr_thresh (default: 1). features correlation threshold permuted together; .e. perfectly correlated features permuted together using default value. can see features permuted together names column. 3 features permuted together (doesn’t really make sense, ’s just example). previously executed run_ml() without feature importance now wish find feature importance fact, see example code get_feature_importance() documentation. get_feature_importance() can show live progress bar, see vignette(\"parallel\") examples.","code":"results_imp <- run_ml(otu_mini_bin, \"rf\", outcome_colname = \"dx\", find_feature_importance = TRUE, seed = 2019 ) results_imp$feature_importance #> perf_metric perf_metric_diff pvalue names method perf_metric_name #> 1 0.5542375 0.0082625 0.40594059 Otu00001 rf AUC #> 2 0.5731750 -0.0106750 0.62376238 Otu00002 rf AUC #> 3 0.5548750 0.0076250 0.43564356 Otu00003 rf AUC #> 4 0.6414750 -0.0789750 0.99009901 Otu00004 rf AUC #> 5 0.5049625 0.0575375 0.05940594 Otu00005 rf AUC #> 6 0.5444500 0.0180500 0.19801980 Otu00006 rf AUC #> 7 0.5417125 0.0207875 0.23762376 Otu00007 rf AUC #> 8 0.5257750 0.0367250 0.08910891 Otu00008 rf AUC #> 9 0.5395750 0.0229250 0.05940594 Otu00009 rf AUC #> 10 0.4977625 0.0647375 0.05940594 Otu00010 rf AUC #> seed #> 1 2019 #> 2 2019 #> 3 2019 #> 4 2019 #> 5 2019 #> 6 2019 #> 7 2019 #> 8 2019 #> 9 2019 #> 10 2019 results_imp_corr <- run_ml(otu_mini_bin, 'glmnet', cv_times = 5, find_feature_importance = TRUE, corr_thresh = 0.2, seed = 2019) #> Using 'dx' as the outcome column. #> Training the model... 
#> Warning in (function (w) : `caret::train()` issued the following warning: #> #> simpleWarning in nominalTrainWorkflow(x = x, y = y, wts = weights, info = trainInfo, : There were missing values in resampled performance measures. #> #> This warning usually means that the model didn't converge in some cross-validation folds because it is predicting something close to a constant. As a result, certain performance metrics can't be calculated. This suggests that some of the hyperparameters chosen are doing very poorly. #> Training complete. #> Finding feature importance... #> Feature importance complete. results_imp_corr$feature_importance #> perf_metric perf_metric_diff pvalue #> 1 0.5502105 0.09715789 0.08910891 #> 2 0.6369474 0.01042105 0.44554455 #> 3 0.5951316 0.05223684 0.11881188 #> names #> 1 Otu00001|Otu00002|Otu00003|Otu00005|Otu00006|Otu00007|Otu00009|Otu00010 #> 2 Otu00004 #> 3 Otu00008 #> method perf_metric_name seed #> 1 glmnet AUC 2019 #> 2 glmnet AUC 2019 #> 3 glmnet AUC 2019"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"tuning-hyperparameters-using-the-hyperparameter-argument","dir":"Articles","previous_headings":"","what":"Tuning hyperparameters (using the hyperparameter argument)","title":"Introduction to mikropml","text":"important, whole vignette . bottom line provide default hyperparameters can start , ’s important tune hyperparameters. information default hyperparameters , tune hyperparameters, see vignette(\"tuning\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"other-models","dir":"Articles","previous_headings":"","what":"Other models","title":"Introduction to mikropml","text":"examples train evaluate models. output similar, won’t go details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"random-forest","dir":"Articles","previous_headings":"Other models","what":"Random forest","title":"Introduction to mikropml","text":"can also change number trees use random forest (ntree; default: 1000). can’t tuned using rf package implementation random forest. 
Please refer caret documentation interested packages random forest implementations.","code":"results_rf <- run_ml(otu_mini_bin, 'rf', cv_times = 5, seed = 2019) results_rf_nt <- run_ml(otu_mini_bin, 'rf', cv_times = 5, ntree = 10, seed = 2019)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"decision-tree","dir":"Articles","previous_headings":"Other models","what":"Decision tree","title":"Introduction to mikropml","text":"","code":"results_dt <- run_ml(otu_mini_bin, 'rpart2', cv_times = 5, seed = 2019)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"svm","dir":"Articles","previous_headings":"Other models","what":"SVM","title":"Introduction to mikropml","text":"get message “maximum number iterations reached”, see issue caret.","code":"results_svm <- run_ml(otu_mini_bin, 'svmRadial', cv_times = 5, seed = 2019)"},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"multiclass-data","dir":"Articles","previous_headings":"Other data","what":"Multiclass data","title":"Introduction to mikropml","text":"provide otu_mini_multi multiclass outcome (three outcomes): ’s example running multiclass data: performance metrics slightly different, format everything else :","code":"otu_mini_multi %>% dplyr::pull('dx') %>% unique() #> [1] \"adenoma\" \"carcinoma\" \"normal\" results_multi <- run_ml(otu_mini_multi, outcome_colname = \"dx\", seed = 2019 ) results_multi$performance #> # A tibble: 1 × 17 #> cv_metric…¹ logLoss AUC prAUC Accur…² Kappa Mean_F1 Mean_…³ Mean_…⁴ Mean_…⁵ #> #> 1 1.07 1.11 0.506 0.353 0.382 0.0449 NA 0.360 0.682 NaN #> # … with 7 more variables: Mean_Neg_Pred_Value , Mean_Precision , #> # Mean_Recall , Mean_Detection_Rate , Mean_Balanced_Accuracy , #> # method , seed , and abbreviated variable names #> # ¹​cv_metric_logLoss, ²​Accuracy, ³​Mean_Sensitivity, ⁴​Mean_Specificity, #> # ⁵​Mean_Pos_Pred_Value"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"continuous-data","dir":"Articles","previous_headings":"Other data","what":"Continuous data","title":"Introduction to mikropml","text":"’s example running continuous data, outcome column numerical: , performance metrics slightly different, format rest :","code":"results_cont <- run_ml(otu_mini_bin[, 2:11], 'glmnet', outcome_colname = 'Otu00001', seed = 2019) results_cont$performance #> # A tibble: 1 × 6 #> cv_metric_RMSE RMSE Rsquared MAE method seed #> #> 1 622. 731. 0.0893 472. glmnet 2019"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/introduction.html","id":"references","dir":"Articles","previous_headings":"","what":"References","title":"Introduction to mikropml","text":"Tang, Shengpu, Parmida Davarmanesh, Yanmeng Song, Danai Koutra, Michael W. Sjoding, Jenna Wiens. 2020. “Democratizing EHR Analyses FIDDLE: Flexible Data-Driven Preprocessing Pipeline Structured Clinical Data.” J Med Inform Assoc, October. https://doi.org/10.1093/jamia/ocaa139. Topçuoğlu, Begüm D., Nicholas . Lesniak, Mack T. Ruffin, Jenna Wiens, Patrick D. Schloss. 2020. “Framework Effective Application Machine Learning Microbiome-Based Classification Problems.” mBio 11 (3). 
https://doi.org/10.1128/mBio.00434-20.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"summary","dir":"Articles","previous_headings":"","what":"Summary","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Machine learning (ML) classification prediction based set features used make decisions healthcare, economics, criminal justice . However, implementing ML pipeline including preprocessing, model selection, evaluation can time-consuming, confusing, difficult. , present mikropml (pronounced “meek-ROPE em el”), easy--use R package implements ML pipelines using regression, support vector machines, decision trees, random forest, gradient-boosted trees. package available GitHub, CRAN, conda.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"statement-of-need","dir":"Articles","previous_headings":"","what":"Statement of need","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"applications machine learning (ML) require reproducible steps data pre-processing, cross-validation, testing, model evaluation, often interpretation model makes particular predictions. Performing steps important, failure implement can result incorrect misleading results (Teschendorff 2019; Wiens et al. 2019). Supervised ML widely used recognize patterns large datasets make predictions outcomes interest. Several packages including caret (Kuhn 2008) tidymodels (Kuhn, Wickham, RStudio 2020) R, scikitlearn (Pedregosa et al. 2011) Python, H2O autoML platform (H2O.ai 2020) allow scientists train ML models variety algorithms. packages provide tools necessary ML step, implement complete ML pipeline according good practices literature. makes difficult practitioners new ML easily begin perform ML analyses. enable broader range researchers apply ML problem domains, created mikropml, easy--use R package (R Core Team 2020) implements ML pipeline created Topçuoğlu et al. (Topçuoğlu et al. 2020) single function returns trained model, model performance metrics feature importance. mikropml leverages caret package support several ML algorithms: linear regression, logistic regression, support vector machines radial basis kernel, decision trees, random forest, gradient boosted trees. incorporates good practices ML training, testing, model evaluation (Topçuoğlu et al. 2020; Teschendorff 2019). Furthermore, provides data preprocessing steps based FIDDLE (FlexIble Data-Driven pipeLinE) framework outlined Tang et al. (Tang et al. 2020) post-training permutation importance steps estimate importance feature models trained (Breiman 2001; Fisher, Rudin, Dominici 2018). mikropml can used starting point application ML datasets many different fields. already applied microbiome data categorize patients colorectal cancer (Topçuoğlu et al. 2020), identify differences genomic clinical features associated bacterial infections (Lapp et al. 2020), predict gender-based biases academic publishing (Hagan et al. 2020).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"mikropml-package","dir":"Articles","previous_headings":"","what":"mikropml package","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml package includes functionality preprocess data, train ML models, evaluate model performance, quantify feature importance (Figure 1). 
also provide vignettes example Snakemake workflow (Köster Rahmann 2012) showcase run ideal ML pipeline multiple different train/test data splits. results can visualized using helper functions use ggplot2 (Wickham 2016). mikropml allows users get started quickly facilitates reproducibility, replacement understanding ML workflow still necessary interpreting results (Pollard et al. 2019). facilitate understanding enable one tailor code application, heavily commented code provided supporting documentation can read online.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"preprocessing-data","dir":"Articles","previous_headings":"mikropml package","what":"Preprocessing data","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"provide function preprocess_data() preprocess features using several different functions caret package. preprocess_data() takes continuous categorical data, re-factors categorical data binary features, provides options normalize continuous data, remove features near-zero variance, keep one instance perfectly correlated features. set default options based implemented FIDDLE (Tang et al. 2020). details use preprocess_data() can found accompanying vignette.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"running-ml","dir":"Articles","previous_headings":"mikropml package","what":"Running ML","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"main function mikropml, run_ml(), minimally takes model choice data frame outcome column feature columns. model choice, mikropml currently supports logistic linear regression (glmnet: Friedman, Hastie, Tibshirani 2010), support vector machines radial basis kernel (kernlab: Karatzoglou et al. 2004), decision trees (rpart: Therneau et al. 2019), random forest (randomForest: Liaw Wiener 2002), gradient-boosted trees (xgboost: Chen et al. 2020). run_ml() randomly splits data train test sets maintaining distribution outcomes found full dataset. also provides option split data train test sets based categorical variables (e.g. batch, geographic location, etc.). mikropml uses caret package (Kuhn 2008) train evaluate models, optionally quantifies feature importance. output includes best model built based tuning hyperparameters internal repeated cross-validation step, model evaluation metrics, optional feature importances. Feature importances calculated using permutation test, breaks relationship feature true outcome test data, measures change model performance. provides intuitive metric individual features influence model performance comparable across model types, particularly useful model interpretation (Topçuoğlu et al. 2020). introductory vignette contains comprehensive tutorial use run_ml(). mikropml pipeline","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"ideal-workflow-for-running-mikropml-with-many-different-traintest-splits","dir":"Articles","previous_headings":"mikropml package","what":"Ideal workflow for running mikropml with many different train/test splits","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"investigate variation model performance depending train test set used (Topçuoğlu et al. 2020; Lapp et al. 
2020), provide examples run_ml() many times different train/test splits get summary information model performance local computer high-performance computing cluster using Snakemake workflow.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"tuning-visualization","dir":"Articles","previous_headings":"mikropml package","what":"Tuning & visualization","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"One particularly important aspect ML hyperparameter tuning. provide reasonable range default hyperparameters model type. However practitioners explore whether range appropriate data, customize hyperparameter range. Therefore, provide function plot_hp_performance() plot cross-validation performance metric single model models built using different train/test splits. helps evaluate hyperparameter range searched exhaustively allows user pick ideal set. also provide summary plots test performance metrics many train/test splits different models using plot_model_performance(). Examples described accompanying vignette hyperparameter tuning.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"dependencies","dir":"Articles","previous_headings":"mikropml package","what":"Dependencies","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml written R (R Core Team 2020) depends several packages: dplyr (Wickham et al. 2020), rlang (Henry, Wickham, RStudio 2020) caret (Kuhn 2008). ML algorithms supported mikropml require: glmnet (Friedman, Hastie, Tibshirani 2010), e1071 (Meyer et al. 2020), MLmetrics (Yan 2016) logistic regression, rpart2 (Therneau et al. 2019) decision trees, randomForest (Liaw Wiener 2002) random forest, xgboost (Chen et al. 2020) xgboost, kernlab (Karatzoglou et al. 2004) support vector machines. also allow parallelization cross-validation steps using foreach, doFuture, future.apply, future packages (Bengtsson Team 2020). Finally, use ggplot2 plotting (Wickham 2016).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"acknowledgments","dir":"Articles","previous_headings":"","what":"Acknowledgments","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"thank members Schloss Lab participated code clubs related initial development pipeline, made documentation improvements, provided general feedback. also thank Nick Lesniak designing mikropml logo. thank US Research Software Sustainability Institute (NSF #1743188) providing training KLS Winter School Research Software Engineering.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"funding","dir":"Articles","previous_headings":"","what":"Funding","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Salary support PDS came NIH grant 1R01CA215574. KLS received support NIH Training Program Bioinformatics (T32 GM070449). ZL received support National Science Foundation Graduate Research Fellowship Program Grant . DGE 1256260. 
opinions, findings, conclusions recommendations expressed material authors necessarily reflect views National Science Foundation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"author-contributions","dir":"Articles","previous_headings":"","what":"Author contributions","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"BDT, ZL, KLS contributed equally. Author order among co-first authors determined time since joining project. BDT, ZL, KLS conceptualized study wrote code. KLS structured code R package form. BDT, ZL, JW, PDS developed methodology. PDS, ES, JW supervised project. BDT, ZL, KLS wrote original draft. authors reviewed edited manuscript.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"conflicts-of-interest","dir":"Articles","previous_headings":"","what":"Conflicts of interest","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"None.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/paper.html","id":"references","dir":"Articles","previous_headings":"","what":"References","title":"mikropml: User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Bengtsson, Henrik, R Core Team. 2020. “Future.Apply: Apply Function Elements Parallel Using Futures,” July. Breiman, Leo. 2001. “Random Forests.” Machine Learning 45 (1): 5–32. https://doi.org/10.1023/:1010933404324. Chen, Tianqi, Tong , Michael Benesty, Vadim Khotilovich, Yuan Tang, Hyunsu Cho, Kailong Chen, et al. 2020. “Xgboost: Extreme Gradient Boosting,” June. Fisher, Aaron, Cynthia Rudin, Francesca Dominici. 2018. “Models Wrong, Many Useful: Learning Variable’s Importance Studying Entire Class Prediction Models Simultaneously.” Friedman, Jerome H., Trevor Hastie, Rob Tibshirani. 2010. “Regularization Paths Generalized Linear Models via Coordinate Descent.” Journal Statistical Software 33 (1): 1–22. https://doi.org/10.18637/jss.v033.i01. H2O.ai. 2020. H2O: Scalable Machine Learning Platform. Manual. Hagan, Ada K., Begüm D. Topçuoğlu, Mia E. Gregory, Hazel . Barton, Patrick D. Schloss. 2020. “Women Underrepresented Receive Differential Outcomes ASM Journals: Six-Year Retrospective Analysis.” mBio 11 (6). https://doi.org/10.1128/mBio.01680-20. Henry, Lionel, Hadley Wickham, RStudio. 2020. “Rlang: Functions Base Types Core R ’Tidyverse’ Features,” July. Karatzoglou, Alexandros, Alexandros Smola, Kurt Hornik, Achim Zeileis. 2004. “Kernlab - S4 Package Kernel Methods R.” Journal Statistical Software 11 (1): 1–20. https://doi.org/10.18637/jss.v011.i09. Köster, Johannes, Sven Rahmann. 2012. “Snakemakea Scalable Bioinformatics Workflow Engine.” Bioinformatics 28 (19): 2520–2. https://doi.org/10.1093/bioinformatics/bts480. Kuhn, Max. 2008. “Building Predictive Models R Using Caret Package.” Journal Statistical Software 28 (1): 1–26. https://doi.org/10.18637/jss.v028.i05. Kuhn, Max, Hadley Wickham, RStudio. 2020. “Tidymodels: Easily Install Load ’Tidymodels’ Packages,” July. Lapp, Zena, Jennifer Han, Jenna Wiens, Ellie JC Goldstein, Ebbing Lautenbach, Evan Snitkin. 2020. “Machine Learning Models Identify Patient Microbial Genetic Factors Associated Carbapenem-Resistant Klebsiella Pneumoniae Infection.” medRxiv, July, 2020.07.06.20147306. https://doi.org/10.1101/2020.07.06.20147306. Liaw, Andy, Matthew Wiener. 2002. “Classification Regression randomForest” 2: 5. 
Meyer, David, Evgenia Dimitriadou, Kurt Hornik, Andreas Weingessel, Friedrich Leisch, Chih-Chung Chang (libsvm C++-code), Chih-Chen Lin (libsvm C++-code). 2020. “E1071: Misc Functions Department Statistics, Probability Theory Group (Formerly: E1071), TU Wien.” Pedregosa, Fabian, Gaël Varoquaux, Alexandre Gramfort, Vincent Michel, Bertrand Thirion, Olivier Grisel, Mathieu Blondel, et al. 2011. “Scikit-Learn: Machine Learning Python.” Journal Machine Learning Research 12 (85): 2825–30. Pollard, Tom J., Irene Chen, Jenna Wiens, Steven Horng, Danny Wong, Marzyeh Ghassemi, Heather Mattie, Emily Lindemer, Trishan Panch. 2019. “Turning Crank Machine Learning: Ease, Expense?” Lancet Digital Health 1 (5): e198–e199. https://doi.org/10.1016/S2589-7500(19)30112-8. R Core Team. 2020. “R: Language Environment Statistical Computing.” Tang, Shengpu, Parmida Davarmanesh, Yanmeng Song, Danai Koutra, Michael W. Sjoding, Jenna Wiens. 2020. “Democratizing EHR Analyses FIDDLE: Flexible Data-Driven Preprocessing Pipeline Structured Clinical Data.” J Med Inform Assoc, October. https://doi.org/10.1093/jamia/ocaa139. Teschendorff, Andrew E. 2019. “Avoiding Common Pitfalls Machine Learning Omic Data Science.” Nature Materials 18 (5): 422–27. https://doi.org/10.1038/s41563-018-0241-z. Therneau, Terry, Beth Atkinson, Brian Ripley (producer initial R. port, maintainer 1999-2017). 2019. “Rpart: Recursive Partitioning Regression Trees,” April. Topçuoğlu, Begüm D., Nicholas . Lesniak, Mack T. Ruffin, Jenna Wiens, Patrick D. Schloss. 2020. “Framework Effective Application Machine Learning Microbiome-Based Classification Problems.” mBio 11 (3). https://doi.org/10.1128/mBio.00434-20. Wickham, Hadley. 2016. Ggplot2: Elegant Graphics Data Analysis. Use R! Cham: Springer International Publishing. https://doi.org/10.1007/978-3-319-24277-4. Wickham, Hadley, Romain François, Lionel Henry, Kirill Müller, RStudio. 2020. “Dplyr: Grammar Data Manipulation,” August. Wiens, Jenna, Suchi Saria, Mark Sendak, Marzyeh Ghassemi, Vincent X. Liu, Finale Doshi-Velez, Kenneth Jung, et al. 2019. “Harm: Roadmap Responsible Machine Learning Health Care.” Nat. Med. 25 (9): 1337–40. https://doi.org/10.1038/s41591-019-0548-6. Yan, Yachen. 2016. “MLmetrics: Machine Learning Evaluation Metrics.”","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"speed-up-single-runs","dir":"Articles","previous_headings":"","what":"Speed up single runs","title":"Parallel processing","text":"default, preprocess_data(), run_ml(), compare_models() use one process series. ’d like parallelize various steps pipeline make run faster, install foreach, future, future.apply, doFuture. , register future plan prior calling functions: , used multicore plan split work across 2 cores. See future documentation picking best plan use case. Notably, multicore work inside RStudio Windows; need use multisession instead cases. registering future plan, can call preprocess_data() run_ml() usual, run certain tasks parallel.","code":"doFuture::registerDoFuture() future::plan(future::multicore, workers = 2) otu_data_preproc <- preprocess_data(otu_mini_bin, 'dx')$dat_transformed #> Using 'dx' as the outcome column. result1 <- run_ml(otu_data_preproc, 'glmnet') #> Using 'dx' as the outcome column. #> Training the model... 
#> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Training complete."},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"call-run_ml-multiple-times-in-parallel-in-r","dir":"Articles","previous_headings":"","what":"Call run_ml() multiple times in parallel in R","title":"Parallel processing","text":"can use functions future.apply package call run_ml() multiple times parallel different parameters. first need run future::plan() haven’t already. , call run_ml() multiple seeds using future_lapply(): call run_ml() different seed uses different random split data training testing sets. Since using seeds, must set future.seed TRUE (see future.apply documentation blog post details parallel-safe random seeds). example uses seeds speed simplicity, real data recommend using many seeds get better estimate model performance. examples, used functions future.apply package run_ml() parallel, can accomplish thing parallel versions purrr::map() functions using furrr package (e.g. furrr::future_map_dfr()). Extract performance results combine one dataframe seeds:","code":"# NOTE: use more seeds for real-world data results_multi <- future.apply::future_lapply(seq(100, 102), function(seed) { run_ml(otu_data_preproc, 'glmnet', seed = seed) }, future.seed = TRUE) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. perf_df <- future.apply::future_lapply(results_multi, function(result) { result[['performance']] %>% select(cv_metric_AUC, AUC, method) }, future.seed = TRUE) %>% dplyr::bind_rows() perf_df #> # A tibble: 3 × 3 #> cv_metric_AUC AUC method #> #> 1 0.630 0.634 glmnet #> 2 0.591 0.608 glmnet #> 3 0.671 0.471 glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"multiple-ml-methods","dir":"Articles","previous_headings":"Call run_ml() multiple times in parallel in R","what":"Multiple ML methods","title":"Parallel processing","text":"may also wish compare performance different ML methods. mapply() can iterate multiple lists vectors, future_mapply() works way: Extract combine performance results seeds methods: Visualize performance results (ggplot2 required): plot_model_performance() returns ggplot2 object. can add layers customize plot: can also create plots however like using performance results.","code":"# NOTE: use more seeds for real-world data param_grid <- expand.grid(seeds = seq(100, 102), methods = c('glmnet', 'rf')) results_mtx <- future.apply::future_mapply( function(seed, method) { run_ml(otu_data_preproc, method, seed = seed) }, param_grid$seeds, param_grid$methods %>% as.character(), future.seed = TRUE ) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. 
perf_df2 <- lapply(results_mtx['performance',], function(x) { x %>% select(cv_metric_AUC, AUC, method) }) %>% dplyr::bind_rows() perf_df2 #> # A tibble: 6 × 3 #> cv_metric_AUC AUC method #> #> 1 0.630 0.634 glmnet #> 2 0.591 0.608 glmnet #> 3 0.671 0.471 glmnet #> 4 0.665 0.708 rf #> 5 0.651 0.697 rf #> 6 0.701 0.592 rf perf_boxplot <- plot_model_performance(perf_df2) perf_boxplot perf_boxplot + theme_classic() + scale_color_brewer(palette = \"Dark2\") + coord_flip()"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"live-progress-updates","dir":"Articles","previous_headings":"","what":"Live progress updates","title":"Parallel processing","text":"preprocess_data() get_feature_importance() support reporting live progress updates using progressr package. format , recommend using progress bar like : Note future backends support “near-live” progress updates, meaning progress may reported immediately parallel processing futures. Read progressr vignette. progressr customize format progress updates, see progressr docs.","code":"# optionally, specify the progress bar format with the `progress` package. progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0)) # tell progressr to always report progress in any functions that use it. # set this to FALSE to turn it back off again. progressr::handlers(global = TRUE) # run your code and watch the live progress updates. dat <- preprocess_data(otu_mini_bin, 'dx')$dat_transformed #> Using 'dx' as the outcome column. #> preprocessing ========================>------- 78% | elapsed: 1s | eta: 0s results <- run_ml(dat, \"glmnet\", kfold = 2, cv_times = 2, find_feature_importance = TRUE) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Feature importance =========================== 100% | elapsed: 37s | eta: 0s"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/parallel.html","id":"parallelizing-with-snakemake","dir":"Articles","previous_headings":"","what":"Parallelizing with Snakemake","title":"Parallel processing","text":"parallelizing multiple calls run_ml() R examples , results objects held memory. isn’t big deal small dataset run seeds. However, large datasets run parallel , say, 100 seeds (recommended), may run problems trying store objects memory . One solution write results files run_ml() call, concatenate end. show one way accomplish Snakemake example Snakemake workflow .","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"its-running-so-slow","dir":"Articles","previous_headings":"","what":"It’s running so slow!","title":"Preprocessing data","text":"Since assume lot won’t read entire vignette, ’m going say beginning. preprocess_data() function running super slow, consider parallelizing goes faster! preprocess_data() also can report live progress updates. See vignette(\"parallel\") details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"examples","dir":"Articles","previous_headings":"","what":"Examples","title":"Preprocessing data","text":"’re going start simple get complicated, want whole shebang , just scroll bottom. 
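For the "Parallelizing with Snakemake" note above, which suggests writing each run_ml() result to a file and concatenating the results at the end instead of holding every object in memory, here is a minimal R sketch of that idea. The results/ directory, file naming, and three-seed range are illustrative assumptions; for real analyses the linked Snakemake workflow is the recommended route.

```
# Hedged sketch of "write each run_ml() result to a file, then combine".
# The results/ directory, file names, and seed range are illustrative.
library(mikropml)

dir.create("results", showWarnings = FALSE)

for (seed in seq(100, 102)) {
  result <- run_ml(otu_mini_bin, "glmnet", outcome_colname = "dx", seed = seed)
  saveRDS(result, file.path("results", paste0("glmnet_", seed, ".rds")))
}

# Later (or in a separate Snakemake rule), read the files back in and
# concatenate the performance data frames across seeds.
perf_files <- list.files("results", pattern = "\\.rds$", full.names = TRUE)
perf_all <- dplyr::bind_rows(
  lapply(perf_files, function(f) readRDS(f)$performance)
)
```

Because each .rds file holds one complete run_ml() result, only one model needs to sit in memory at a time during the loop.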
First, load mikropml:","code":"library(mikropml)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"binary-data","dir":"Articles","previous_headings":"Examples","what":"Binary data","title":"Preprocessing data","text":"Let’s start binary variables: addition dataframe , provide name outcome column preprocess_data(). ’s preprocessed data looks like: output list: dat_transformed transformed data, grp_feats list grouped features, removed_feats list features removed. , grp_feats NULL perfectly correlated features (e.g. c(0,1,0) c(0,1,0), c(0,1,0) c(1,0,1) - see details). first column (var1) dat_transformed character changed var1_yes zeros () ones (yes). values second column (var2) stay ’s already binary, name changes var2_1. third column (var3) factor also changed binary b 1 0, denoted new column name var3_b.","code":"# raw binary dataset bin_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 1), var3 = factor(c(\"a\",\"a\",\"b\")) ) bin_df #> outcome var1 var2 var3 #> 1 normal no 0 a #> 2 normal yes 1 a #> 3 cancer no 1 b # preprocess raw binary data preprocess_data(dataset = bin_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_yes var2_1 var3_b #> #> 1 normal 0 0 0 #> 2 normal 1 1 0 #> 3 cancer 0 1 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"categorical-data","dir":"Articles","previous_headings":"Examples","what":"Categorical data","title":"Preprocessing data","text":"non-binary categorical data: can see, variable split 3 different columns - one type (, b, c). , grp_feats NULL.","code":"# raw categorical dataset cat_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c('a','b','c') ) cat_df #> outcome var1 #> 1 normal a #> 2 normal b #> 3 cancer c # preprocess raw categorical data preprocess_data(dataset = cat_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_a var1_b var1_c #> #> 1 normal 1 0 0 #> 2 normal 0 1 0 #> 3 cancer 0 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"continuous-data","dir":"Articles","previous_headings":"Examples","what":"Continuous data","title":"Preprocessing data","text":"Now, looking continuous variables: Wow! numbers change? default normalize data using \"center\" \"scale\". often best practice, may want normalize data, may want normalize data different way. don’t want normalize data, can use method=NULL: can also normalize data different ways. can choose method supported method argument caret::preProcess() (see caret::preProcess() docs details). Note methods applied continuous variables. Another feature preprocess_data() provide continuous variables characters, converted numeric: don’t want happen, want character data remain character data even can converted numeric, can use to_numeric=FALSE kept categorical: can see output, case features treated groups rather numbers (e.g. 
normalized).","code":"# raw continuous dataset cont_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(1,2,3) ) cont_df #> outcome var1 #> 1 normal 1 #> 2 normal 2 #> 3 cancer 3 # preprocess raw continuous data preprocess_data(dataset = cont_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome var1 #> #> 1 normal -1 #> 2 normal 0 #> 3 cancer 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0) # preprocess raw continuous data, no normalization preprocess_data(dataset = cont_df, outcome_colname = \"outcome\", method = NULL) # raw continuous dataset as characters cont_char_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"1\",\"2\",\"3\") ) cont_char_df #> outcome var1 #> 1 normal 1 #> 2 normal 2 #> 3 cancer 3 # preprocess raw continuous character data as numeric preprocess_data(dataset = cont_char_df, outcome_colname = \"outcome\") # preprocess raw continuous character data as characters preprocess_data(dataset = cont_char_df, outcome_colname = \"outcome\", to_numeric = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 4 #> outcome var1_1 var1_2 var1_3 #> #> 1 normal 1 0 0 #> 2 normal 0 1 0 #> 3 cancer 0 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"collapse-perfectly-correlated-features","dir":"Articles","previous_headings":"Examples","what":"Collapse perfectly correlated features","title":"Preprocessing data","text":"default, preprocess_data() collapses features perfectly positively negatively correlated. multiple copies features add information machine learning, makes run_ml faster. can see, end one variable, 3 grouped together. Also, second element list longer NULL. Instead, tells grp1 contains var1, var2, var3. want group positively correlated features, negatively correlated features (e.g. interpretability, another downstream application), can using group_neg_corr=FALSE: , var3 kept ’s ’s negatively correlated var1 var2. can also choose keep features separate, even perfectly correlated, using collapse_corr_feats=FALSE: case, grp_feats always NULL.","code":"# raw correlated dataset corr_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 0), var3 = c(1,0,1) ) corr_df #> outcome var1 var2 var3 #> 1 normal no 0 1 #> 2 normal yes 1 0 #> 3 cancer no 0 1 # preprocess raw correlated dataset preprocess_data(dataset = corr_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome grp1 #> #> 1 normal 0 #> 2 normal 1 #> 3 cancer 0 #> #> $grp_feats #> $grp_feats$grp1 #> [1] \"var1_yes\" \"var3_1\" #> #> #> $removed_feats #> [1] \"var2\" # preprocess raw correlated dataset; don't group negatively correlated features preprocess_data(dataset = corr_df, outcome_colname = \"outcome\", group_neg_corr = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var3_1 #> #> 1 normal 0 1 #> 2 normal 1 0 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\" # preprocess raw correlated dataset; don't group negatively correlated features preprocess_data(dataset = corr_df, outcome_colname = \"outcome\", collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. 
#> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var3_1 #> #> 1 normal 0 1 #> 2 normal 1 0 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"data-with-near-zero-variance","dir":"Articles","previous_headings":"Examples","what":"Data with near-zero variance","title":"Preprocessing data","text":"variables zero, “”? ones won’t contribute information, remove : , var3, var4, var5 variability, variables removed preprocessing: can read caret::preProcess() documentation information. default, remove features “near-zero variance” (remove_var='nzv'). uses default arguments caret::nearZeroVar(). However, particularly smaller datasets, might want remove features near-zero variance. want remove features zero variance, can use remove_var='zv': want include features, can use argument remove_zv=NULL. work, collapse correlated features (otherwise errors underlying caret function use). want nuanced remove near-zero variance features (e.g. change default 10% cutoff percentage distinct values total number samples), can use caret::preProcess() function running preprocess_data remove_var=NULL (see caret::nearZeroVar() function information).","code":"# raw dataset with non-variable features nonvar_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(0, 1, 1), var3 = c(\"no\",\"no\",\"no\"), var4 = c(0,0,0), var5 = c(12,12,12) ) nonvar_df #> outcome var1 var2 var3 var4 var5 #> 1 normal no 0 no 0 12 #> 2 normal yes 1 no 0 12 #> 3 cancer no 1 no 0 12 # remove features with near-zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\" \"var3\" \"var5\" # remove features with zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\", remove_var = 'zv') #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\" \"var3\" \"var5\" # don't remove features with near-zero or zero variance preprocess_data(dataset = nonvar_df, outcome_colname = \"outcome\", remove_var = NULL, collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> $dat_transformed #> # A tibble: 3 × 5 #> outcome var1_yes var2_1 var3 var5 #> #> 1 normal 0 0 0 12 #> 2 normal 1 1 0 12 #> 3 cancer 0 1 0 12 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var4\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"missing-data","dir":"Articles","previous_headings":"Examples","what":"Missing data","title":"Preprocessing data","text":"preprocess_data() also deals missing data. : Removes missing outcome variables. Maintains zero variability feature already variability (.e. feature removed removing features near-zero variance). Replaces missing binary categorical variables zero (splitting multiple columns). Replaces missing continuous data median value feature. 
’d like deal missing data different way, please prior inputting data preprocess_data().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"remove-missing-outcome-variables","dir":"Articles","previous_headings":"Examples > Missing data","what":"Remove missing outcome variables","title":"Preprocessing data","text":"","code":"# raw dataset with missing outcome value miss_oc_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\",NA), var1 = c(\"no\", \"yes\", \"no\",\"no\"), var2 = c(0, 1, 1,1) ) miss_oc_df #> outcome var1 var2 #> 1 normal no 0 #> 2 normal yes 1 #> 3 cancer no 1 #> 4 no 1 # preprocess raw dataset with missing outcome value preprocess_data(dataset = miss_oc_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> Removed 1/4 (25%) of samples because of missing outcome value (NA). #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2_1 #> #> 1 normal 0 0 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"maintain-zero-variability-in-a-feature-if-it-already-has-no-variability","dir":"Articles","previous_headings":"Examples > Missing data","what":"Maintain zero variability in a feature if it already has no variability","title":"Preprocessing data","text":", non-variable feature missing data removed removed features near-zero variance. maintained feature, ’d ones:","code":"# raw dataset with missing value in non-variable feature miss_nonvar_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", \"no\"), var2 = c(NA, 1, 1) ) miss_nonvar_df #> outcome var1 var2 #> 1 normal no NA #> 2 normal yes 1 #> 3 cancer no 1 # preprocess raw dataset with missing value in non-variable feature preprocess_data(dataset = miss_nonvar_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. #> $dat_transformed #> # A tibble: 3 × 2 #> outcome var1_yes #> #> 1 normal 0 #> 2 normal 1 #> 3 cancer 0 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\" # preprocess raw dataset with missing value in non-variable feature preprocess_data(dataset = miss_nonvar_df, outcome_colname = \"outcome\", remove_var = NULL, collapse_corr_feats = FALSE) #> Using 'outcome' as the outcome column. #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. 
#> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_yes var2 #> #> 1 normal 0 1 #> 2 normal 1 1 #> 3 cancer 0 1 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"replace-missing-binary-and-categorical-variables-with-zero","dir":"Articles","previous_headings":"Examples > Missing data","what":"Replace missing binary and categorical variables with zero","title":"Preprocessing data","text":"binary variable split two, missing value considered zero .","code":"# raw dataset with missing value in categorical feature miss_cat_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\"), var1 = c(\"no\", \"yes\", NA), var2 = c(NA, 1, 0) ) miss_cat_df #> outcome var1 var2 #> 1 normal no NA #> 2 normal yes 1 #> 3 cancer 0 # preprocess raw dataset with missing value in non-variable feature preprocess_data(dataset = miss_cat_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> 2 categorical missing value(s) (NA) were replaced with 0. Note that the matrix is not full rank so missing values may be duplicated in separate columns. #> $dat_transformed #> # A tibble: 3 × 3 #> outcome var1_no var1_yes #> #> 1 normal 1 0 #> 2 normal 0 1 #> 3 cancer 0 0 #> #> $grp_feats #> NULL #> #> $removed_feats #> [1] \"var2\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"replace-missing-continuous-data-with-the-median-value-of-that-feature","dir":"Articles","previous_headings":"Examples > Missing data","what":"Replace missing continuous data with the median value of that feature","title":"Preprocessing data","text":"’re normalizing continuous features ’s easier see ’s going (.e. median value used):","code":"# raw dataset with missing value in continuous feature miss_cont_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\",\"normal\"), var1 = c(1,2,2,NA), var2 = c(1,2,3,NA) ) miss_cont_df #> outcome var1 var2 #> 1 normal 1 1 #> 2 normal 2 2 #> 3 cancer 2 3 #> 4 normal NA NA # preprocess raw dataset with missing value in continuous feature preprocess_data(dataset = miss_cont_df, outcome_colname = \"outcome\", method = NULL) #> Using 'outcome' as the outcome column. #> 2 missing continuous value(s) were imputed using the median value of the feature. #> $dat_transformed #> # A tibble: 4 × 3 #> outcome var1 var2 #> #> 1 normal 1 1 #> 2 normal 2 2 #> 3 cancer 2 3 #> 4 normal 2 2 #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"putting-it-all-together","dir":"Articles","previous_headings":"Examples","what":"Putting it all together","title":"Preprocessing data","text":"’s complicated example raw data puts everything discussed together: Let’s throw preprocessing function default values: can see, got several messages: One samples (row 4) removed outcome value missing. One variables feature variation missing value replaced non-varying value (var11). Four categorical missing values replaced zero (var9). 4 missing rather just 1 (like raw data) split categorical variable 4 different columns first. One missing continuous value imputed using median value feature (var8). Additionally, can see continuous variables normalized, categorical variables changed binary, several features grouped together. 
variables group can found grp_feats.","code":"test_df <- data.frame( outcome = c(\"normal\", \"normal\", \"cancer\", NA), var1 = 1:4, var2 = c(\"a\", \"b\", \"c\", \"d\"), var3 = c(\"no\", \"yes\", \"no\", \"no\"), var4 = c(0, 1, 0, 0), var5 = c(0, 0, 0, 0), var6 = c(\"no\", \"no\", \"no\", \"no\"), var7 = c(1, 1, 0, 0), var8 = c(5, 6, NA, 7), var9 = c(NA, \"x\", \"y\", \"z\"), var10 = c(1, 0, NA, NA), var11 = c(1, 1, NA, NA), var12 = c(\"1\", \"2\", \"3\", \"4\") ) test_df #> outcome var1 var2 var3 var4 var5 var6 var7 var8 var9 var10 var11 var12 #> 1 normal 1 a no 0 0 no 1 5 1 1 1 #> 2 normal 2 b yes 1 0 no 1 6 x 0 1 2 #> 3 cancer 3 c no 0 0 no 0 NA y NA NA 3 #> 4 4 d no 0 0 no 0 7 z NA NA 4 preprocess_data(dataset = test_df, outcome_colname = \"outcome\") #> Using 'outcome' as the outcome column. #> Removed 1/4 (25%) of samples because of missing outcome value (NA). #> There are 1 missing value(s) in features with no variation. Missing values were replaced with the non-varying value. #> 2 categorical missing value(s) (NA) were replaced with 0. Note that the matrix is not full rank so missing values may be duplicated in separate columns. #> 1 missing continuous value(s) were imputed using the median value of the feature. #> $dat_transformed #> # A tibble: 3 × 6 #> outcome grp1 var2_a grp2 grp3 var8 #> #> 1 normal -1 1 0 0 -0.707 #> 2 normal 0 0 1 0 0.707 #> 3 cancer 1 0 0 1 0 #> #> $grp_feats #> $grp_feats$grp1 #> [1] \"var1\" \"var12\" #> #> $grp_feats$var2_a #> [1] \"var2_a\" #> #> $grp_feats$grp2 #> [1] \"var2_b\" \"var3_yes\" \"var9_x\" #> #> $grp_feats$grp3 #> [1] \"var2_c\" \"var7_1\" \"var9_y\" #> #> $grp_feats$var8 #> [1] \"var8\" #> #> #> $removed_feats #> [1] \"var4\" \"var5\" \"var10\" \"var6\" \"var11\""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/preprocess.html","id":"next-step-train-and-evaluate-your-model","dir":"Articles","previous_headings":"Examples","what":"Next step: train and evaluate your model!","title":"Preprocessing data","text":"preprocess data (either using preprocess_data() preprocessing data ), ’re ready train evaluate machine learning models! Please see run_ml() information training models. Tang, Shengpu, Parmida Davarmanesh, Yanmeng Song, Danai Koutra, Michael W. Sjoding, Jenna Wiens. 2020. “Democratizing EHR Analyses FIDDLE: Flexible Data-Driven Preprocessing Pipeline Structured Clinical Data.” J Med Inform Assoc, October. https://doi.org/10.1093/jamia/ocaa139.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"the-simplest-way-to-run_ml","dir":"Articles","previous_headings":"","what":"The simplest way to run_ml()","title":"Hyperparameter tuning","text":"mentioned , minimal input dataset (dataset) machine learning model want use (method). run_ml(), default 100 times repeated, 5-fold cross-validation, evaluate hyperparameters 500 total iterations. Say want run L2 regularized logistic regression. : ’ll probably get warning run dataset small. want learn , check introductory vignette training evaluating ML model: vignette(\"introduction\"). default, run_ml() selects hyperparameters depending dataset method used. can see, alpha hyperparameter set 0, specifies L2 regularization. glmnet gives us option run L1 L2 regularization. change alpha 1, run L1-regularized logistic regression. can also tune alpha specifying variety values 0 1. use value 0 1, running elastic net. default hyperparameter lambda adjusts L2 regularization penalty range values 10^-4 10. 
look 100 repeated cross-validation performance metrics AUC, Accuracy, prAUC tested lambda value, see appropriate dataset better others.","code":"results <- run_ml(dat, 'glmnet', outcome_colname = 'dx', cv_times = 100, seed = 2019) #> Using 'dx' as the outcome column. #> Training the model... #> Loading required package: ggplot2 #> Loading required package: lattice #> #> Attaching package: 'caret' #> The following object is masked from 'package:mikropml': #> #> compare_models #> Training complete. results$trained_model #> glmnet #> #> 161 samples #> 10 predictor #> 2 classes: 'cancer', 'normal' #> #> No pre-processing #> Resampling: Cross-Validated (5 fold, repeated 100 times) #> Summary of sample sizes: 128, 129, 129, 129, 129, 130, ... #> Resampling results across tuning parameters: #> #> lambda logLoss AUC prAUC Accuracy Kappa F1 #> 1e-04 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 1e-03 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 1e-02 0.7112738 0.6123883 0.5726478 0.5854514 0.17092470 0.5731635 #> 1e-01 0.6819806 0.6210744 0.5793961 0.5918756 0.18369829 0.5779616 #> 1e+00 0.6803749 0.6278273 0.5827655 0.5896356 0.17756961 0.5408139 #> 1e+01 0.6909820 0.6271894 0.5814202 0.5218000 0.02920942 0.1875293 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision #> 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 #> 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 #> 0.5789667 0.5921250 0.5797769 0.5977182 0.5797769 #> 0.5805917 0.6032353 0.5880165 0.6026963 0.5880165 #> 0.5057833 0.6715588 0.6005149 0.5887829 0.6005149 #> 0.0607250 0.9678676 0.7265246 0.5171323 0.7265246 #> Recall Detection_Rate Balanced_Accuracy #> 0.5789667 0.2839655 0.5854870 #> 0.5789667 0.2839655 0.5854870 #> 0.5789667 0.2839636 0.5855458 #> 0.5805917 0.2847195 0.5919135 #> 0.5057833 0.2478291 0.5886711 #> 0.0607250 0.0292613 0.5142963 #> #> Tuning parameter 'alpha' was held constant at a value of 0 #> AUC was used to select the optimal model using the largest value. #> The final values used for the model were alpha = 0 and lambda = 1. 
results$trained_model$results #> alpha lambda logLoss AUC prAUC Accuracy Kappa F1 #> 1 0 1e-04 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 2 0 1e-03 0.7113272 0.6123301 0.5725828 0.5853927 0.17080523 0.5730989 #> 3 0 1e-02 0.7112738 0.6123883 0.5726478 0.5854514 0.17092470 0.5731635 #> 4 0 1e-01 0.6819806 0.6210744 0.5793961 0.5918756 0.18369829 0.5779616 #> 5 0 1e+00 0.6803749 0.6278273 0.5827655 0.5896356 0.17756961 0.5408139 #> 6 0 1e+01 0.6909820 0.6271894 0.5814202 0.5218000 0.02920942 0.1875293 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision Recall #> 1 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 0.5789667 #> 2 0.5789667 0.5920074 0.5796685 0.5977166 0.5796685 0.5789667 #> 3 0.5789667 0.5921250 0.5797769 0.5977182 0.5797769 0.5789667 #> 4 0.5805917 0.6032353 0.5880165 0.6026963 0.5880165 0.5805917 #> 5 0.5057833 0.6715588 0.6005149 0.5887829 0.6005149 0.5057833 #> 6 0.0607250 0.9678676 0.7265246 0.5171323 0.7265246 0.0607250 #> Detection_Rate Balanced_Accuracy logLossSD AUCSD prAUCSD AccuracySD #> 1 0.2839655 0.5854870 0.085315967 0.09115229 0.07296554 0.07628572 #> 2 0.2839655 0.5854870 0.085315967 0.09115229 0.07296554 0.07628572 #> 3 0.2839636 0.5855458 0.085276565 0.09122242 0.07301412 0.07637123 #> 4 0.2847195 0.5919135 0.048120032 0.09025695 0.07329214 0.07747312 #> 5 0.2478291 0.5886711 0.012189172 0.09111917 0.07505095 0.07771171 #> 6 0.0292613 0.5142963 0.001610008 0.09266875 0.07640896 0.03421597 #> KappaSD F1SD SensitivitySD SpecificitySD Pos_Pred_ValueSD #> 1 0.15265728 0.09353786 0.13091452 0.11988406 0.08316345 #> 2 0.15265728 0.09353786 0.13091452 0.11988406 0.08316345 #> 3 0.15281903 0.09350099 0.13073501 0.12002481 0.08329024 #> 4 0.15485134 0.09308733 0.12870031 0.12037225 0.08554483 #> 5 0.15563046 0.10525917 0.13381009 0.11639614 0.09957685 #> 6 0.06527242 0.09664720 0.08010494 0.06371495 0.31899811 #> Neg_Pred_ValueSD PrecisionSD RecallSD Detection_RateSD Balanced_AccuracySD #> 1 0.08384956 0.08316345 0.13091452 0.06394409 0.07640308 #> 2 0.08384956 0.08316345 0.13091452 0.06394409 0.07640308 #> 3 0.08385838 0.08329024 0.13073501 0.06384692 0.07648207 #> 4 0.08427362 0.08554483 0.12870031 0.06272897 0.07748791 #> 5 0.07597766 0.09957685 0.13381009 0.06453637 0.07773039 #> 6 0.02292294 0.31899811 0.08010494 0.03803159 0.03184136"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"customizing-hyperparameters","dir":"Articles","previous_headings":"","what":"Customizing hyperparameters","title":"Hyperparameter tuning","text":"example, want change lambda values provide better range test cross-validation step. don’t want use defaults provide named list new values. example: Now let’s run L2 logistic regression new lambda values: time, cover larger different range lambda settings cross-validation. know lambda value best one? answer , need run ML pipeline multiple data splits look mean cross-validation performance lambda across modeling experiments. describe run pipeline multiple data splits vignette(\"parallel\"). train model new lambda range defined . run 3 times different seed, result different splits data training testing sets. can use plot_hp_performance see lambda gives us largest mean AUC value across modeling experiments. can see, get mean maxima 0.03 best lambda value dataset run 3 data splits. fact seeing maxima middle range edges, shows providing large enough range exhaust lambda search build model. recommend user use plot make sure best hyperparameter edges provided list. 
better understanding global maxima, better run data splits using seeds. picked 3 seeds keep runtime vignette, real-world data recommend using many seeds.","code":"new_hp <- list(alpha = 1, lambda = c(0.00001, 0.0001, 0.001, 0.01, 0.015, 0.02, 0.025, 0.03, 0.04, 0.05, 0.06, 0.1)) new_hp #> $alpha #> [1] 1 #> #> $lambda #> [1] 0.00001 0.00010 0.00100 0.01000 0.01500 0.02000 0.02500 0.03000 0.04000 #> [10] 0.05000 0.06000 0.10000 results <- run_ml(dat, 'glmnet', outcome_colname = 'dx', cv_times = 100, hyperparameters = new_hp, seed = 2019 ) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. results$trained_model #> glmnet #> #> 161 samples #> 10 predictor #> 2 classes: 'cancer', 'normal' #> #> No pre-processing #> Resampling: Cross-Validated (5 fold, repeated 100 times) #> Summary of sample sizes: 128, 129, 129, 129, 129, 130, ... #> Resampling results across tuning parameters: #> #> lambda logLoss AUC prAUC Accuracy Kappa F1 #> 0.00001 0.7215038 0.6112253 0.5720005 0.5842184 0.1684871 0.5726974 #> 0.00010 0.7215038 0.6112253 0.5720005 0.5842184 0.1684871 0.5726974 #> 0.00100 0.7209099 0.6112771 0.5719601 0.5845329 0.1691285 0.5730414 #> 0.01000 0.6984432 0.6156112 0.5758977 0.5830960 0.1665062 0.5759265 #> 0.01500 0.6913332 0.6169396 0.5770496 0.5839720 0.1683912 0.5786347 #> 0.02000 0.6870103 0.6177313 0.5779563 0.5833645 0.1673234 0.5796891 #> 0.02500 0.6846387 0.6169757 0.5769305 0.5831907 0.1669901 0.5792840 #> 0.03000 0.6834369 0.6154763 0.5754118 0.5821394 0.1649081 0.5786336 #> 0.04000 0.6833322 0.6124776 0.5724802 0.5786224 0.1578750 0.5735757 #> 0.05000 0.6850454 0.6069059 0.5668928 0.5732197 0.1468699 0.5624480 #> 0.06000 0.6880861 0.5974311 0.5596714 0.5620224 0.1240112 0.5375824 #> 0.10000 0.6944846 0.5123565 0.3034983 0.5120114 0.0110144 0.3852423 #> Sensitivity Specificity Pos_Pred_Value Neg_Pred_Value Precision #> 0.5798500 0.5888162 0.5780748 0.5971698 0.5780748 #> 0.5798500 0.5888162 0.5780748 0.5971698 0.5780748 #> 0.5801167 0.5891912 0.5784544 0.5974307 0.5784544 #> 0.5883667 0.5783456 0.5755460 0.5977390 0.5755460 #> 0.5929750 0.5756471 0.5763123 0.5987220 0.5763123 #> 0.5967167 0.5708824 0.5748385 0.5990649 0.5748385 #> 0.5970250 0.5702721 0.5743474 0.5997928 0.5743474 #> 0.5964500 0.5687721 0.5734044 0.5982451 0.5734044 #> 0.5904500 0.5677353 0.5699817 0.5943308 0.5699817 #> 0.5734833 0.5736176 0.5668523 0.5864448 0.5668523 #> 0.5360333 0.5881250 0.5595918 0.5722851 0.5595918 #> 0.1145917 0.8963456 0.5255752 0.5132665 0.5255752 #> Recall Detection_Rate Balanced_Accuracy #> 0.5798500 0.28441068 0.5843331 #> 0.5798500 0.28441068 0.5843331 #> 0.5801167 0.28453770 0.5846539 #> 0.5883667 0.28860521 0.5833561 #> 0.5929750 0.29084305 0.5843110 #> 0.5967167 0.29264681 0.5837995 #> 0.5970250 0.29278708 0.5836485 #> 0.5964500 0.29248583 0.5826110 #> 0.5904500 0.28951992 0.5790926 #> 0.5734833 0.28119862 0.5735505 #> 0.5360333 0.26270204 0.5620792 #> 0.1145917 0.05585777 0.5054686 #> #> Tuning parameter 'alpha' was held constant at a value of 1 #> AUC was used to select the optimal model using the largest value. #> The final values used for the model were alpha = 1 and lambda = 0.02. results <- lapply(seq(100, 102), function(seed) { run_ml(dat, \"glmnet\", seed = seed, hyperparameters = new_hp) }) #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. #> Using 'dx' as the outcome column. #> Training the model... #> Training complete. 
models <- lapply(results, function(x) x$trained_model) hp_metrics <- combine_hp_performance(models) plot_hp_performance(hp_metrics$dat, lambda, AUC)"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"hyperparameter-options","dir":"Articles","previous_headings":"","what":"Hyperparameter options","title":"Hyperparameter tuning","text":"can see default hyperparameters used dataset get_hyperparams_list(). examples built-datasets provide: hyperparameters tuned modeling methods. output similar, won’t go details.","code":"get_hyperparams_list(otu_mini_bin, 'glmnet') #> $lambda #> [1] 1e-04 1e-03 1e-02 1e-01 1e+00 1e+01 #> #> $alpha #> [1] 0 get_hyperparams_list(otu_mini_bin, 'rf') #> $mtry #> [1] 2 3 6 get_hyperparams_list(otu_small, 'rf') #> $mtry #> [1] 4 8 16"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"regression","dir":"Articles","previous_headings":"Hyperparameter options","what":"Regression","title":"Hyperparameter tuning","text":"mentioned , glmnet uses alpha parameter lambda hyperparameter. alpha 0 L2 regularization (ridge). alpha 1 L1 regularization (lasso). alpha elastic net. can also tune alpha like hyperparameter. Please refer original glmnet documentation information: https://web.stanford.edu/~hastie/glmnet/glmnet_alpha.html default hyperparameters chosen run_ml() fixed glmnet.","code":"#> $lambda #> [1] 1e-04 1e-03 1e-02 1e-01 1e+00 1e+01 #> #> $alpha #> [1] 0"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"random-forest","dir":"Articles","previous_headings":"Hyperparameter options","what":"Random forest","title":"Hyperparameter tuning","text":"run rf using randomForest package implementation. tuning mtry hyperparameter. number features randomly collected sampled tree node. number needs less number features dataset. Please refer original documentation information: https://cran.r-project.org/web/packages/randomForest/randomForest.pdf default, take square root number features dataset provide range [sqrt_features / 2, sqrt_features, sqrt_features * 2]. example number features 1000: Similar glmnet method, can provide mtry range.","code":"#> $mtry #> [1] 16 32 64"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"decision-tree","dir":"Articles","previous_headings":"Hyperparameter options","what":"Decision tree","title":"Hyperparameter tuning","text":"run rpart2, running rpart package implementation decision tree. tuning maxdepth hyperparameter. maximum depth node final tree. Please refer original documentation information maxdepth: https://cran.r-project.org/web/packages/rpart/rpart.pdf default, provide range less number features dataset. example 1000 features: 10 features:","code":"#> $maxdepth #> [1] 1 2 4 8 16 30 #> $maxdepth #> [1] 1 2 4 8"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"svm-with-radial-basis-kernel","dir":"Articles","previous_headings":"Hyperparameter options","what":"SVM with radial basis kernel","title":"Hyperparameter tuning","text":"run svmRadial method, tuning C sigma hyperparameters. sigma defines far influence single training example reaches C behaves regularization parameter. 
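The random forest section above notes that, as with glmnet, you can provide your own mtry range rather than the sqrt-based defaults. A hedged sketch of doing so, mirroring the documented hyperparameters argument of run_ml(); the candidate mtry values and seed are illustrative assumptions:

```
# Hedged sketch: supplying a custom mtry range for random forest.
# The candidate values and seed are illustrative, not package defaults.
library(mikropml)

rf_hp <- list(mtry = c(2, 4, 8))
results_rf <- run_ml(otu_mini_bin, "rf",
  outcome_colname = "dx",
  hyperparameters = rf_hp,
  seed = 2019
)
results_rf$trained_model$results # cross-validation performance for each mtry
```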
Please refer great sklearn resource information hyperparameters: https://scikit-learn.org/stable/auto_examples/svm/plot_rbf_parameters.html default, provide 2 separate range values two hyperparameters.","code":"#> $C #> [1] 1e-03 1e-02 1e-01 1e+00 1e+01 1e+02 #> #> $sigma #> [1] 1e-06 1e-05 1e-04 1e-03 1e-02 1e-01"},{"path":"http://www.schlosslab.org/mikropml/dev/articles/tuning.html","id":"xgboost","dir":"Articles","previous_headings":"Hyperparameter options","what":"XGBoost","title":"Hyperparameter tuning","text":"run xgbTree method, tuning nrounds, gamma, eta max_depth, colsample_bytree, min_child_weight subsample hyperparameters. can read hyperparameters : https://xgboost.readthedocs.io/en/latest/parameter.html default, set nrounds, gamma, colsample_bytree min_child_weight fixed values provide range values eta, max_depth subsample. can changed optimized user supplying custom named list hyperparameters run_ml().","code":"#> $nrounds #> [1] 100 #> #> $gamma #> [1] 0 #> #> $eta #> [1] 0.001 0.010 0.100 1.000 #> #> $max_depth #> [1] 1 2 4 8 16 30 #> #> $colsample_bytree #> [1] 0.8 #> #> $min_child_weight #> [1] 1 #> #> $subsample #> [1] 0.4 0.5 0.6 0.7"},{"path":"http://www.schlosslab.org/mikropml/dev/authors.html","id":null,"dir":"","previous_headings":"","what":"Authors","title":"Authors and Citation","text":"Begüm Topçuoğlu. Author. Zena Lapp. Author. Kelly Sovacool. Author, maintainer. Evan Snitkin. Author. Jenna Wiens. Author. Patrick Schloss. Author. Nick Lesniak. Contributor. Courtney Armour. Contributor. Sarah Lucas. Contributor.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/authors.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"Authors and Citation","text":"Topçuoğlu et al., (2021). mikropml: User-Friendly R Package Supervised Machine Learning Pipelines. Journal Open Source Software, 6(61), 3073, https://doi.org/10.21105/joss.03073","code":"@Article{, title = {{mikropml}: User-Friendly R Package for Supervised Machine Learning Pipelines}, author = {Begüm D. Topçuoğlu and Zena Lapp and Kelly L. Sovacool and Evan Snitkin and Jenna Wiens and Patrick D. Schloss}, journal = {Journal of Open Source Software}, year = {2021}, month = {May}, volume = {6}, number = {61}, pages = {3073}, doi = {10.21105/joss.03073}, url = {https://joss.theoj.org/papers/10.21105/joss.03073}, }"},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"mikropml-","dir":"","previous_headings":"","what":"User-Friendly R Package for Supervised Machine Learning Pipelines","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"meek-ROPE em el User-Friendly R Package Supervised Machine Learning Pipelines interface build machine learning models classification regression problems. mikropml implements ML pipeline described Topçuoğlu et al. (2020) reasonable default options data preprocessing, hyperparameter tuning, cross-validation, testing, model evaluation, interpretation steps. 
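The XGBoost section above says the fixed values (nrounds, gamma, colsample_bytree, min_child_weight) and the tuned ranges (eta, max_depth, subsample) can all be overridden by supplying a custom named list of hyperparameters to run_ml(). A hedged sketch with illustrative values; the tuning grid grows with every extra candidate value, so keep the ranges small:

```
# Hedged sketch: overriding the xgbTree hyperparameter grid in run_ml().
# Every candidate value below is an illustrative assumption.
library(mikropml)

xgb_hp <- list(
  nrounds = 100,
  gamma = 0,
  colsample_bytree = 0.8,
  min_child_weight = 1,
  eta = c(0.01, 0.1),
  max_depth = c(2, 4),
  subsample = c(0.5, 0.7)
)
results_xgb <- run_ml(otu_mini_bin, "xgbTree",
  outcome_colname = "dx",
  hyperparameters = xgb_hp,
  seed = 2019
)
```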
See website information, documentation, examples.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"installation","dir":"","previous_headings":"","what":"Installation","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"can install latest release CRAN: development version GitHub: install terminal using conda mamba:","code":"install.packages('mikropml') # install.packages(\"devtools\") devtools::install_github(\"SchlossLab/mikropml\") mamba install -c conda-forge r-mikropml"},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"dependencies","dir":"","previous_headings":"Installation","what":"Dependencies","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Imports: caret, dplyr, e1071, glmnet, kernlab, MLmetrics, randomForest, rlang, rpart, stats, utils, xgboost Suggests: doFuture, foreach, future, future.apply, ggplot2, knitr, progress, progressr, purrr, rmarkdown, testthat, tidyr","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"usage","dir":"","previous_headings":"","what":"Usage","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Check introductory vignette quick start tutorial. -depth discussion, read vignettes /take look reference documentation. can watch Riffomonas Project series video tutorials covering mikropml skills related machine learning. also provide example Snakemake workflow running mikropml HPC.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"help--contributing","dir":"","previous_headings":"","what":"Help & Contributing","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"come across bug, open issue include minimal reproducible example. questions, create new post Discussions. ’d like contribute, see guidelines .","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"code-of-conduct","dir":"","previous_headings":"","what":"Code of Conduct","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"Please note mikropml project released Contributor Code Conduct. contributing project, agree abide terms.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"license","dir":"","previous_headings":"","what":"License","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"mikropml package licensed MIT license. Text images included repository, including mikropml logo, licensed CC 4.0 license.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"citation","dir":"","previous_headings":"","what":"Citation","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"cite mikropml publications, use: Topçuoğlu BD, Lapp Z, Sovacool KL, Snitkin E, Wiens J, Schloss PD (2021). “mikropml: User-Friendly R Package Supervised Machine Learning Pipelines.” Journal Open Source Software, 6(61), 3073. doi:10.21105/joss.03073, https://joss.theoj.org/papers/10.21105/joss.03073. BibTeX entry LaTeX users :","code":"@Article{, title = {{mikropml}: User-Friendly R Package for Supervised Machine Learning Pipelines}, author = {Begüm D. Topçuoğlu and Zena Lapp and Kelly L. Sovacool and Evan Snitkin and Jenna Wiens and Patrick D. 
Schloss}, journal = {Journal of Open Source Software}, year = {2021}, month = {May}, volume = {6}, number = {61}, pages = {3073}, doi = {10.21105/joss.03073}, url = {https://joss.theoj.org/papers/10.21105/joss.03073}, }"},{"path":"http://www.schlosslab.org/mikropml/dev/index.html","id":"why-the-name","dir":"","previous_headings":"","what":"Why the name?","title":"User-Friendly R Package for Supervised Machine Learning Pipelines","text":"word “mikrop” (pronounced “meek-ROPE”) Turkish “microbe”. package originally implemented machine learning pipeline microbiome-based classification problems (see Topçuoğlu et al. 2020). realized methods applicable many fields , stuck name like !","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/pull_request_template.html","id":"issues","dir":"","previous_headings":"","what":"Issues","title":"NA","text":"Resolves # .","code":""},{"path":[]},{"path":"http://www.schlosslab.org/mikropml/dev/pull_request_template.html","id":"checklist","dir":"","previous_headings":"","what":"Checklist","title":"NA","text":"(Strikethrough points applicable.) Write unit tests new functionality bug fixes. roxygen comments vignettes Update NEWS.md includes user-facing changes. check workflow succeeds recent commit. always required PR can merged.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":null,"dir":"Reference","previous_headings":"","what":"Get performance metrics for test data — calc_perf_metrics","title":"Get performance metrics for test data — calc_perf_metrics","text":"Get performance metrics test data","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get performance metrics for test data — calc_perf_metrics","text":"","code":"calc_perf_metrics( test_data, trained_model, outcome_colname, perf_metric_function, class_probs )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get performance metrics for test data — calc_perf_metrics","text":"test_data Held test data: dataframe outcome features. trained_model Trained model caret::train(). outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. 
class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get performance metrics for test data — calc_perf_metrics","text":"Dataframe performance metrics.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get performance metrics for test data — calc_perf_metrics","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/calc_perf_metrics.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get performance metrics for test data — calc_perf_metrics","text":"","code":"if (FALSE) { results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) calc_perf_metrics(results$test_data, results$trained_model, \"dx\", multiClassSummary, class_probs = TRUE ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Combine hyperparameter performance metrics multiple train/test splits generated , instance, looping R using snakemake workflow high-performance computer.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"","code":"combine_hp_performance(trained_model_lst)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"trained_model_lst List trained models.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Named list: dat: Dataframe performance metric group hyperparameters params: Hyperparameters tuned. 
Metric: Performance metric used.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/combine_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Combine hyperparameter performance metrics for multiple train/test splits — combine_hp_performance","text":"","code":"if (FALSE) { results <- lapply(seq(100, 102), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed, cv_times = 2, kfold = 2) }) models <- lapply(results, function(x) x$trained_model) combine_hp_performance(models) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":null,"dir":"Reference","previous_headings":"","what":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"wrapper permute_p_value().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"","code":"compare_models(merged_data, metric, group_name, nperm = 10000)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"merged_data concatenated performance data run_ml metric metric compare, must numeric group_name column group variables compare nperm number permutations, default=10000","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"table p-values pairs group variable","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. — compare_models","text":"Courtney R Armour, armourc@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/compare_models.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Perform permutation tests to compare the performance metric\nacross all pairs of a group variable. 
— compare_models","text":"","code":"df <- dplyr::tibble( model = c(\"rf\", \"rf\", \"glmnet\", \"glmnet\", \"svmRadial\", \"svmRadial\"), AUC = c(.2, 0.3, 0.8, 0.9, 0.85, 0.95) ) set.seed(123) compare_models(df, \"AUC\", \"model\", nperm = 10) #> group1 group2 p_value #> 1 glmnet svmRadial 0.7272727 #> 2 rf glmnet 0.2727273 #> 3 rf svmRadial 0.5454545"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":null,"dir":"Reference","previous_headings":"","what":"Define cross-validation scheme and training parameters — define_cv","title":"Define cross-validation scheme and training parameters — define_cv","text":"Define cross-validation scheme training parameters","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Define cross-validation scheme and training parameters — define_cv","text":"","code":"define_cv( train_data, outcome_colname, hyperparams_list, perf_metric_function, class_probs, kfold = 5, cv_times = 100, groups = NULL, group_partitions = NULL )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Define cross-validation scheme and training parameters — define_cv","text":"train_data Dataframe training model. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). hyperparams_list Named list lists hyperparameters. perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes). kfold Fold number k-fold cross-validation (default: 5). cv_times Number cross-validation partitions create (default: 100). groups Vector groups keep together splitting data train test sets. number groups training set larger kfold, groups also kept together cross-validation. Length matches number rows dataset (default: NULL). group_partitions Specify assign groups training testing partitions (default: NULL). groups specifies samples belong group \"\" belong group \"B\", setting group_partitions = list(train = c(\"\", \"B\"), test = c(\"B\")) result samples group \"\" placed training set, samples \"B\" also training set, remaining samples \"B\" testing set. partition sizes close training_frac possible. 
number groups training set larger kfold, groups also kept together cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Define cross-validation scheme and training parameters — define_cv","text":"Caret object trainControl controls cross-validation","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Define cross-validation scheme and training parameters — define_cv","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/define_cv.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Define cross-validation scheme and training parameters — define_cv","text":"","code":"training_inds <- get_partition_indices(otu_small %>% dplyr::pull(\"dx\"), training_frac = 0.8, groups = NULL ) train_data <- otu_small[training_inds, ] test_data <- otu_small[-training_inds, ] cv <- define_cv(train_data, outcome_colname = \"dx\", hyperparams_list = get_hyperparams_list(otu_small, \"glmnet\"), perf_metric_function = caret::multiClassSummary, class_probs = TRUE, kfold = 5 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":null,"dir":"Reference","previous_headings":"","what":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Get preprocessed dataframe continuous variables","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"","code":"get_caret_processed_df(features, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"features Dataframe features machine learning method Methods preprocess data, described caret::preProcess() (default: c(\"center\",\"scale\"), use NULL normalization).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Named list: processed: Dataframe processed features. 
removed: Names features removed preprocessing.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_caret_processed_df.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get preprocessed dataframe for continuous variables — get_caret_processed_df","text":"","code":"get_caret_processed_df(mikropml::otu_small[, 2:ncol(otu_small)], c(\"center\", \"scale\")) #> $processed #> Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 #> 1 -0.4198476322 -0.218855527 -0.174296240 -0.59073845 -0.048774220 #> 2 -0.1045750483 1.754032339 -0.718419364 0.03805034 1.537072974 #> 3 -0.7076423302 0.696324396 1.428146361 0.60439092 -0.264559044 #> 4 -0.4936040623 -0.665193276 2.015799335 -0.59289184 -0.675577755 #> 5 1.1116829471 -0.395140184 -0.753787367 -0.58643168 -0.754356341 #> 6 -0.6845030580 0.613808173 -0.174296240 -0.58427829 0.375945115 #> 7 -0.7698291243 -0.496410093 -0.318488868 0.15863997 -0.658451975 #> 8 -0.4241862457 -0.477656406 -0.397386721 -0.55628427 -0.391289813 #> 9 -0.5557908564 1.144537514 1.615868839 -0.35171258 -0.274834512 #> 10 1.4573258257 -0.451401245 -0.693933823 -0.05669866 -0.706404158 #> 11 0.2931311927 -0.721454336 -0.753787367 3.03341063 -0.449517464 #> 12 1.1044519245 0.002437979 -0.473563958 -0.41846755 0.413621830 #> 13 -0.5933921737 -0.297621012 -0.340253793 -0.59289184 -0.655026820 #> 14 -0.8016456236 0.077452727 -0.419151646 -0.59073845 -0.045349064 #> 15 -0.7915221920 0.291244758 -0.269517787 -0.59289184 -0.220032017 #> 16 1.4862499159 -0.683946963 -0.745625520 -0.54551734 -0.744080874 #> 17 -0.3750152923 -0.051947713 0.103206554 1.37745659 0.458148857 #> 18 0.2135899445 0.325001395 0.478651509 -0.34309903 0.560903535 #> 19 -0.5181895390 -0.100707299 -0.073633462 -0.40770062 -0.237157796 #> 20 0.8745054069 -0.676445488 -0.560623658 -0.58212491 -0.154954054 #> 21 2.0184531767 -0.682071594 -0.740184289 -0.58643168 -0.720104782 #> 22 0.5867107089 -0.646439589 -0.560623658 0.09188499 -0.593374013 #> 23 -0.4603413585 -0.397015552 0.386150578 -0.42062094 -0.463218088 #> 24 -0.7553670792 1.401463025 0.829610924 -0.58858507 -0.295385447 #> 25 1.9316809059 0.334378238 -0.228708552 -0.42923448 -0.535146362 #> 26 1.2201482855 -0.108208774 -0.302165174 -0.58858507 0.358819335 #> 27 -0.9158957801 -0.674570119 -0.732022442 -0.53475041 -0.689278379 #> 28 -0.7597056927 -0.595804634 -0.375621796 -0.57566475 -0.730380250 #> 29 -0.5109585165 -0.558297260 -0.432754724 3.84093048 -0.672152599 #> 30 -0.8811868718 -0.385763340 -0.595991661 -0.58212491 -0.192630769 #> 31 0.3437483507 0.902614952 1.376454664 -0.59289184 1.396641581 #> 32 -0.5109585165 0.535042688 -0.484446421 -0.59289184 0.550628067 #> 33 1.2302717171 -0.582677053 0.007985007 -0.40554723 -0.672152599 #> 34 -0.0770971626 0.244360541 -0.313047636 -0.28711099 2.273481498 #> 35 -0.2275024319 2.211622300 1.515206061 -0.57781814 1.269910812 #> 36 0.0284757669 -0.663317907 -0.634080280 -0.57781814 -0.730380250 #> 37 -0.3157209072 -0.290119537 -0.231429168 -0.58643168 -0.233732640 #> 38 -0.1653156379 1.476477772 1.836238704 1.65309003 4.393653017 #> 39 -0.6859492625 -0.134463935 -0.258635324 0.68191283 0.399921206 #> 40 -0.3967083600 -0.126962461 -0.269517787 -0.57135798 0.304016840 #> 41 
0.09801170 -0.314814643 #> 38 -0.9068438359 -0.721651045 0.34143264 -0.59817838 -0.36122932 -0.307394436 #> 39 -0.0069254588 -0.661156975 -0.26845001 -0.43614099 0.49984759 -0.287607218 #> 40 -0.6407810114 0.038845835 -0.25320295 -0.21623311 -0.37935725 -0.314814643 #> 41 1.1825318744 -0.609304915 -0.42092068 -0.61553953 0.26418444 -0.317288045 #> 42 -0.4529719588 0.073413875 -0.42092068 -0.37248345 -0.37935725 5.443265880 #> 43 3.1388761724 -0.721651045 -0.37517948 -0.62132658 -0.34914403 -0.297500827 #> 44 0.4391210411 0.090697895 -0.34976770 -0.59817838 -0.31288816 -0.295027425 #> 45 0.5252001902 -0.410538685 1.46971554 -0.61553953 -0.09535294 -0.317288045 #> 46 1.3077379094 -0.436464715 -0.24303824 0.16571217 -0.37633593 -0.210931747 #> 47 0.5173748130 0.393168245 0.04665602 -0.60396543 0.54818875 -0.317288045 #> 48 1.4877215849 -0.661156975 -0.33960299 -0.62132658 -0.41561312 -0.314814643 #> 49 -0.8442408184 0.151191965 -0.24812059 -0.60396543 -0.41863444 -0.290080620 #> 50 -0.6720825201 0.747490655 -0.18204997 -0.58660428 -0.38842122 -0.267820000 #> 51 -0.3590674325 -0.574736875 -0.44125010 1.11478830 -0.42467709 1.305263855 #> 52 -0.6407810114 0.427736285 -0.21762646 -0.60975248 -0.35518667 -0.302447632 #> 53 1.7459590322 -0.704367025 6.00825892 -0.60975248 0.58746594 -0.223298758 #> 54 1.4877215849 -0.522884815 1.16985657 -0.41877984 -0.36425064 -0.262873195 #> 55 -0.7425109149 0.254896085 -0.17188526 0.50714809 -0.10441691 -0.314814643 #> 56 0.8225645235 -0.713009035 0.03649131 -0.61553953 -0.36727196 -0.314814643 #> 57 -0.3590674325 -0.557452855 -0.45141481 1.07427895 0.25209915 -0.109522253 #> 58 -0.8911930815 -0.669798985 1.25117426 -0.62132658 -0.42467709 0.738854731 #> 59 -0.1008299851 0.445020305 -0.45141481 -0.38984460 0.56027404 -0.312341241 #> 60 0.0165506728 -0.254982505 0.61587983 0.62867613 0.19167270 -0.277713609 #> 61 -0.4294958272 -0.488316775 -0.45649716 -0.28567770 -0.37331461 -0.317288045 #> 62 -0.2338613974 -0.427822705 0.39733855 -0.40720575 -0.17390732 2.002763299 #> 63 1.9259427076 -0.592020895 -0.44633245 0.99904731 -0.42165577 -0.230718965 #> 64 -0.3981943184 -0.713009035 0.88524467 0.14256397 0.11613964 -0.317288045 #> 65 -0.6564317657 -0.531526825 -0.47174423 -0.55188199 8.52145880 0.006727654 #> 66 -0.6955586517 -0.177204415 -0.47174423 -0.62132658 -0.23433377 -0.322234850 #> 67 -0.5625272394 -0.687083005 -0.47174423 2.85669023 0.33367486 -0.322234850 #> 68 -0.3121151693 0.393168245 -0.45649716 0.17728626 -0.39748519 -0.319761448 #> 69 1.1590557428 -0.721651045 0.02124425 1.73400261 0.03758525 -0.309867838 #> 70 0.1808835938 1.940088035 -0.43616774 -0.54030789 -0.38539990 -0.319761448 #> 71 1.0181989533 -0.358686625 1.11395066 -0.61553953 -0.31893080 -0.304921034 #> 72 -0.3355913009 -0.721651045 -0.30910886 1.01640846 -0.16182203 -0.275240206 #> 73 -0.5860033710 -0.038932255 -0.42092068 -0.23359426 -0.26756832 -0.314814643 #> 74 -0.5781779938 -0.177204415 -0.36501477 0.14256397 0.83521439 0.006727654 #> 75 -0.4686227131 0.894404825 0.01107953 -0.30882590 -0.35216535 -0.304921034 #> 76 -0.6486063886 0.531440405 -0.44125010 -0.52294674 -0.36727196 -0.307394436 #> 77 -0.4842734675 0.721564625 -0.47174423 2.76409744 -0.37029328 -0.309867838 #> 78 -0.9068438359 1.015392965 0.94115058 -0.23938131 -0.39446386 -0.292554022 #> 79 -0.4451465816 -0.237698485 -0.26336766 -0.08313097 -0.28569625 -0.314814643 #> 80 0.0791536903 -0.721651045 0.36176206 -0.61553953 -0.42467709 -0.248032781 #> 81 -0.7190347833 -0.687083005 -0.29894415 0.60552794 -0.30986683 -0.322234850 
#> 82 0.0087252956 1.145023115 -0.39042654 -0.23938131 -0.11045955 -0.270293402 #> 83 1.9885457251 -0.315476575 -0.33452063 -0.60396543 -0.40654915 -0.257926390 #> 84 0.2747881201 -0.721651045 -0.32943828 2.66571759 2.25221464 -0.314814643 #> 85 -0.8833677043 -0.229056475 -0.46157952 1.49673357 0.05269186 0.911992891 #> 86 -0.9068438359 -0.626588935 -0.45141481 1.59511342 1.12224003 -0.322234850 #> 87 -0.2495121518 5.517880175 -0.38534419 -0.61553953 -0.40352783 -0.309867838 #> 88 -0.2886390377 0.721564625 -0.08040286 -0.22780721 -0.21922716 -0.275240206 #> 89 -0.5234003535 0.133907945 -0.30910886 -0.19308491 -0.41561312 -0.173830713 #> 90 0.0008999184 0.082055885 -0.41075596 0.40876825 -0.42165577 -0.302447632 #> 91 -0.7659870464 -0.393254665 -0.44633245 0.45506465 -0.33705874 -0.302447632 #> 92 -0.7738124236 0.954898895 0.85983289 -0.30882590 -0.41561312 1.837045346 #> 93 0.1417567078 -0.721651045 6.81127108 -0.62132658 -0.14369410 -0.302447632 #> 94 -0.6016541254 -0.341402605 -0.46157952 1.02798256 -0.10743823 -0.149096690 #> 95 0.7286599972 0.254896085 -0.07532051 -0.53452084 -0.30080287 -0.319761448 #> 96 -0.9068438359 0.194402015 -0.46157952 -0.34354820 -0.42467709 -0.322234850 #> 97 1.9181173304 -0.704367025 -0.27353237 -0.62132658 0.98325919 -0.248032781 #> 98 -0.4529719588 0.142549955 0.31093850 0.24094381 -0.35820799 -0.277713609 #> 99 0.7286599972 -0.713009035 -0.07023815 -0.59239133 0.11311831 -0.280187011 #> 100 -0.5234003535 -0.704367025 -0.46666187 -0.60396543 0.06175583 3.006964628 #> 101 0.0243760500 0.514156385 -0.28369708 -0.61553953 3.79913175 -0.322234850 #> 102 5.4160609352 -0.609304915 -0.43108539 -0.61553953 5.83248179 -0.275240206 #> 103 1.1512303656 -0.609304915 -0.44125010 -0.54609494 0.83823571 -0.205984942 #> 104 -0.9068438359 -0.574736875 -0.28369708 0.40298120 -0.42467709 -0.319761448 #> 105 0.1495820850 0.254896085 -0.11597935 -0.59817838 -0.22526980 -0.282660413 #> 106 -0.7972885552 -0.056216275 -0.21254410 -0.59239133 0.43942114 -0.312341241 #> 107 -0.2260360202 -0.229056475 -0.34468534 0.61710203 -0.30080287 0.169972205 #> 108 -0.5468764851 1.335147335 -0.45141481 1.46779833 -0.12254484 -0.309867838 #> 109 1.1121034796 -0.678440995 -0.39550890 -0.59817838 -0.32195212 -0.312341241 #> 110 0.7599615060 -0.479674765 -0.45141481 0.94696386 -0.05305442 -0.309867838 #> 111 -0.6407810114 -0.289550545 1.47479789 0.06154527 -0.40957048 0.058669102 #> 112 -0.5468764851 -0.721651045 -0.25320295 -0.40141870 -0.07722500 -0.314814643 #> 113 -0.8990184587 -0.721651045 -0.24303824 -0.61553953 -0.42165577 -0.314814643 #> 114 -0.6486063886 -0.082142305 -0.30910886 -0.20465901 -0.22829113 -0.319761448 #> 115 -0.4842734675 0.073413875 -0.41583832 -0.62132658 0.20980063 -0.277713609 #> 116 0.1261059534 0.583292465 -0.43108539 -0.60396543 -0.40352783 -0.025426576 #> 117 0.0243760500 -0.514242805 -0.45141481 -0.62132658 -0.39748519 0.763588754 #> 118 -0.0304015904 -0.721651045 -0.27861472 -0.15257556 0.01945732 -0.319761448 #> 119 -0.7033840289 2.389472555 -0.45141481 -0.62132658 -0.38237857 -0.317288045 #> 120 1.8320381813 -0.652514965 -0.20237939 -0.61553953 0.10103302 -0.309867838 #> 121 -0.5547018623 -0.548810845 -0.47174423 -0.44771509 0.03154261 -0.272766804 #> 122 -0.1869091342 -0.254982505 3.03508101 -0.53452084 -0.31893080 -0.250506184 #> 123 -0.2260360202 -0.462390745 -0.46157952 2.06965148 -0.42467709 6.323797094 #> 124 0.1652328394 1.170949145 -0.44125010 -0.60975248 -0.42467709 3.514012096 #> 125 -0.9068438359 -0.531526825 -0.33960299 4.84743529 -0.38842122 
-0.299974229 #> 126 -0.6329556342 3.564785915 -0.24812059 -0.52294674 -0.39748519 -0.245559379 #> 127 -0.9068438359 -0.367328635 -0.40059125 0.37983300 -0.36727196 -0.314814643 #> 128 1.6677052603 0.185760005 3.05032807 0.39140710 0.28533370 -0.314814643 #> 129 -0.0851792307 -0.522884815 -0.16680290 5.25252877 0.85032100 -0.280187011 #> 130 -0.6251302570 -0.695725015 0.10764429 -0.60975248 -0.27663229 -0.322234850 #> 131 -0.9068438359 -0.419180695 -0.42600303 -0.51715969 -0.02586252 -0.317288045 #> 132 1.4407693217 -0.592020895 -0.44125010 -0.55188199 1.61169427 -0.285133816 #> 133 0.4547717955 -0.488316775 0.03649131 -0.17572376 -0.21318451 -0.248032781 #> 134 -0.2808136605 0.427736285 0.24486788 -0.45928919 -0.29476022 -0.314814643 #> 135 -0.0695284764 -0.678440995 -0.33452063 -0.59239133 0.91679010 -0.317288045 #> 136 0.3217403832 -0.280908535 -0.39550890 -0.54030789 0.65997768 0.031461677 #> 137 0.4547717955 0.868478795 -0.44125010 0.07890642 -0.36727196 -0.136729678 #> 138 -0.5312257307 0.453662315 -0.47174423 -0.44192804 -0.40957048 1.082657649 #> 139 0.0400268043 -0.133994365 -0.41583832 1.91918820 0.06477715 -0.322234850 #> 140 -0.9068438359 2.795647025 -0.44125010 -0.55188199 -0.41561312 -0.317288045 #> 141 -0.4920988447 -0.583378885 -0.47174423 2.26062412 0.17656609 -0.116942460 #> 142 -0.7894631780 -0.237698485 -0.21762646 -0.42456689 -0.42467709 -0.099628644 #> 143 -0.5155749763 0.038845835 -0.24812059 0.23515676 -0.42467709 -0.015532966 #> 144 0.1417567078 0.142549955 0.09239722 1.66455801 -0.27663229 0.320849745 #> 145 -0.8833677043 -0.315476575 -0.15155584 -0.61553953 -0.40050651 5.809329418 #> 146 -0.3668928096 -0.609304915 -0.44633245 0.68075958 -0.42467709 -0.292554022 #> 147 -0.8990184587 -0.713009035 -0.44125010 -0.60975248 -0.31893080 -0.314814643 #> 148 -0.1869091342 -0.073500295 -0.41075596 1.02798256 0.45452776 -0.223298758 #> 149 -0.1008299851 -0.626588935 -0.39042654 -0.11785327 -0.39748519 -0.299974229 #> 150 0.0322014271 2.372188535 -0.39042654 0.42612940 -0.40352783 -0.322234850 #> 151 -0.2495121518 1.231443215 -0.46157952 -0.60396543 -0.42467709 -0.304921034 #> 152 0.3921687780 1.352431355 -0.20746175 -0.46507624 -0.41259180 -0.280187011 #> 153 -0.8442408184 0.548724425 -0.43108539 0.60552794 -0.34008006 -0.307394436 #> 154 1.2060080059 -0.617946925 -0.36501477 -0.62132658 0.43639982 -0.245559379 #> 155 0.9086436726 -0.531526825 -0.22779117 -0.56924313 0.30648295 0.706700501 #> 156 -0.4686227131 -0.522884815 -0.42092068 -0.61553953 -0.42165577 -0.314814643 #> 157 -0.8911930815 -0.687083005 0.98180942 -0.62132658 -0.33705874 -0.210931747 #> 158 0.9947228218 -0.220414465 0.74293871 0.07311937 -0.41561312 -0.295027425 #> 159 -0.6564317657 -0.125352355 -0.40567361 2.60784710 -0.41561312 -0.277713609 #> 160 -0.6877332745 -0.713009035 -0.34468534 -0.59239133 0.64184975 -0.139203081 #> 161 0.4078195324 -0.669798985 -0.47174423 3.04187582 -0.41561312 -0.314814643 #> 162 -0.8990184587 -0.721651045 -0.14647348 -0.62132658 -0.37633593 -0.285133816 #> 163 1.1121034796 -0.721651045 -0.35993241 0.74441713 -0.29173890 -0.290080620 #> 164 0.9712466902 -0.168562405 -0.32435592 -0.59817838 0.79895852 -0.272766804 #> 165 0.2356612341 -0.566094865 -0.33960299 -0.49979854 5.67839434 -0.297500827 #> 166 -0.3434166781 1.369715375 -0.46157952 -0.60975248 -0.41561312 4.716085608 #> 167 -0.5468764851 0.419094275 -0.46666187 3.73053472 -0.40654915 -0.307394436 #> 168 -0.5155749763 -0.721651045 -0.40567361 -0.59817838 -0.34008006 -0.287607218 #> 169 3.5849226723 -0.704367025 0.95639764 
-0.53452084 0.37597337 -0.304921034 #> 170 -0.9068438359 -0.687083005 -0.39042654 -0.62132658 -0.41863444 -0.312341241 #> 171 -0.5390511079 0.617860505 -0.07532051 -0.37827050 -0.37633593 -0.314814643 #> 172 -0.4529719588 -0.626588935 -0.46157952 -0.26252951 2.99243865 -0.077368024 #> 173 -0.8207646868 -0.687083005 -0.40567361 -0.62132658 0.99836580 0.019094666 #> 174 0.4312956639 1.741321805 -0.39042654 -0.51137264 -0.15275807 -0.290080620 #> 175 -0.0695284764 0.107981915 -0.45649716 -0.50558559 -0.29778154 -0.295027425 #> 176 0.4547717955 4.307998775 1.64759798 -0.58660428 -0.37029328 -0.304921034 #> 177 -0.1321314939 -0.220414465 -0.24812059 0.70969483 -0.38842122 -0.319761448 #> 178 -0.9068438359 -0.410538685 -0.45649716 -0.62132658 -0.42165577 -0.299974229 #> 179 0.2982642517 -0.574736875 -0.16680290 -0.06576982 0.68414826 -0.319761448 #> 180 -0.5077495991 0.280822115 -0.44633245 -0.33776115 -0.37029328 0.244174274 #> 181 -0.6877332745 -0.522884815 0.01616189 0.77335237 -0.08931029 -0.302447632 #> 182 -0.5938287482 0.436378295 -0.46157952 1.04534371 -0.20109922 -0.196091333 #> 183 -0.4451465816 -0.367328635 -0.22779117 -0.19308491 -0.30684551 0.273855101 #> 184 -0.7738124236 0.151191965 0.03649131 -0.51137264 -0.36727196 1.483348819 #> 185 3.0997492864 -0.617946925 -0.42092068 -0.56924313 0.18260873 -0.314814643 #> 186 -0.8677169499 0.393168245 -0.47174423 0.21200856 -0.39144254 -0.069947817 #> 187 -0.9068438359 -0.609304915 -0.46157952 -0.61553953 -0.42165577 -0.309867838 #> 188 2.7710834443 -0.721651045 -0.34468534 -0.60396543 -0.08628897 0.773482363 #> 189 -0.8755423271 -0.047574265 -0.43108539 -0.43614099 -0.41863444 0.187286021 #> 190 -0.3355913009 -0.246340495 -0.40567361 1.58353932 -0.11650220 -0.302447632 #> 191 -0.6094795026 -0.479674765 -0.42092068 -0.45350214 -0.41259180 -0.245559379 #> 192 0.1104551991 -0.721651045 0.80900933 -0.59239133 -0.40957048 -0.307394436 #> 193 -0.5077495991 0.609218495 0.12289135 -0.56924313 -0.14671542 -0.297500827 #> 194 3.4518912600 -0.687083005 -0.40567361 1.55460407 0.06175583 -0.260399793 #> 195 -0.4842734675 0.315390155 2.58783373 -0.52873379 0.17958741 -0.282660413 #> 196 2.4658937338 -0.721651045 1.35282136 -0.16414966 -0.42467709 -0.322234850 #> 197 -0.0382269676 -0.669798985 -0.39550890 -0.58660428 -0.40352783 -0.161463701 #> 198 -0.9068438359 -0.721651045 0.15338549 -0.62132658 -0.41561312 -0.297500827 #> 199 -0.8598915727 0.107981915 0.40750326 -0.60396543 -0.27058964 -0.299974229 #> 200 -0.0304015904 0.004277795 -0.14647348 -0.55766903 -0.23131245 -0.317288045 #> Otu00029 Otu00030 Otu00031 Otu00032 Otu00033 #> 1 0.695821495 0.39193166 0.2730666130 1.850227727 -0.352365855 #> 2 -0.252260766 0.44720466 -0.1402887916 -0.493938512 0.152851091 #> 3 0.066720182 -0.59377025 -0.4629076438 -0.357825634 -0.288065517 #> 4 -0.473775313 -0.71352842 1.5937875395 -0.501500339 -0.435037719 #> 5 -0.571241714 0.33665866 -0.5637260352 -0.577118604 0.952012441 #> 6 -0.216818439 -0.52928508 -0.2411071829 0.337862411 0.079364989 #> 7 3.079318020 0.19847615 -0.3520074134 -0.395634767 -0.618752972 #> 8 0.031277854 -0.17001055 -0.3822529308 -0.357825634 -0.444223482 #> 9 -0.730732188 -0.11473754 0.3335576478 -0.070476224 -0.168650602 #> 10 0.137604837 -0.76880143 -0.4830713221 -0.516623992 0.740739900 #> 11 -0.305424257 0.16162748 -0.5939715526 -0.577118604 -0.600381447 #> 12 -0.730732188 -0.54770941 -0.5233986787 0.148816747 0.465167021 #> 13 -0.269981930 -0.62140675 -0.2209435046 0.103445788 -0.453409245 #> 14 -0.526938804 0.54853851 0.1420027042 
0.572279035 -0.646310260 #> 15 -0.535799386 -0.33582956 -0.2411071829 0.436166157 -0.655496023 #> 16 -0.340866585 -0.38189040 -0.4729894830 -0.569556778 1.071427356 #> 17 -0.181376111 1.20260239 -0.4427439656 1.071359589 -0.582009922 #> 18 0.279374147 0.65908451 0.0109387955 -0.100723530 0.106922277 #> 19 0.270513565 0.72356969 -0.0797977567 0.466413463 -0.232950941 #> 20 1.431249791 0.85254003 0.4646215565 -0.546871298 0.446795495 #> 21 -0.730732188 -0.76880143 -0.5939715526 -0.569556778 1.787916843 #> 22 2.937548710 -0.28055656 -0.5536441961 -0.456129379 -0.159464840 #> 23 -0.004164473 0.04186930 -0.3217618960 0.141254920 -0.673867548 #> 24 0.146465418 1.07363205 -0.5838897135 0.504222596 0.116108040 #> 25 -0.730732188 0.79726702 -0.1806161481 -0.577118604 -0.021678400 #> 26 -0.730732188 -0.70431626 -0.5637260352 -0.138532663 4.424230724 #> 27 -0.686429278 -0.76880143 -0.5838897135 -0.531747645 1.705244979 #> 28 0.562912767 -0.76880143 -0.5939715526 -0.577118604 -0.490152295 #> 29 0.279374147 -0.52928508 -0.1402887916 -0.357825634 1.098984644 #> 30 -0.721871606 7.25499635 -0.5637260352 0.020265695 -0.692239074 #> 31 -0.128212620 1.34078490 1.6643604135 -0.569556778 -0.012492637 #> 32 1.378086300 -0.06867671 -0.5838897135 2.530792119 -0.627938735 #> 33 0.075580763 -0.43716340 -0.5939715526 -0.577118604 0.428423970 #> 34 -0.243400184 -0.76880143 -0.5838897135 -0.577118604 -0.223765178 #> 35 0.199628910 0.76041836 0.3033121304 -0.441005726 -0.407480431 #> 36 2.388192634 3.49643206 -0.5939715526 -0.509062165 -0.407480431 #> 37 -0.695289860 -0.67667975 -0.4830713221 0.821819312 -0.701424836 #> 38 -0.721871606 -0.03182804 -0.5939715526 -0.577118604 -0.012492637 #> 39 -0.234539602 2.08697046 0.5251125913 -0.350263807 -0.591195684 #> 40 -0.323145421 0.04186930 -0.1402887916 0.065636655 -0.609567210 #> 41 1.316062227 -0.34504173 -0.5233986787 -0.448567553 0.290637530 #> 42 -0.367448331 -0.06867671 -0.2713527003 -0.123409010 -0.692239074 #> 43 -0.721871606 -0.76880143 -0.5738078743 -0.577118604 -0.609567210 #> 44 0.748984986 0.39193166 1.3316597220 -0.478814859 -0.379923143 #> 45 1.989466449 -0.75037709 -0.4931531613 -0.289769194 2.936137175 #> 46 -0.057327965 -0.76880143 -0.4729894830 -0.569556778 2.467663279 #> 47 -0.730732188 -0.73195276 -0.3217618960 -0.297331021 -0.141093314 #> 48 3.495765369 -0.20685922 -0.5435623569 -0.524185818 -0.058421450 #> 49 -0.385169494 -0.72274059 -0.2108616655 -0.229274582 0.492724309 #> 50 -0.624405205 -0.63983108 -0.4124984482 0.489098943 0.042621939 #> 51 -0.588962878 2.18830430 -0.4830713221 -0.561994951 3.110666665 #> 52 -0.137073202 0.12477881 0.6662583392 1.056235936 -0.232950941 #> 53 -0.730732188 -0.76880143 -0.5939715526 -0.561994951 -0.692239074 #> 54 -0.305424257 -0.75037709 -0.5738078743 -0.577118604 -0.398294669 #> 55 -0.535799386 -0.63983108 -0.4225802873 0.050513002 -0.591195684 #> 56 -0.730732188 0.92623737 -0.5536441961 -0.478814859 0.446795495 #> 57 -0.367448331 2.16066779 -0.2511890220 5.563084576 -0.600381447 #> 58 -0.721871606 -0.75037709 -0.5838897135 -0.546871298 0.042621939 #> 59 -0.721871606 -0.23449572 2.7128716834 -0.577118604 1.622573115 #> 60 0.376840547 0.43799250 -0.4024166090 -0.115847183 -0.122721789 #> 61 0.111023091 0.09714230 4.3360477841 -0.055352571 -0.582009922 #> 62 -0.562381132 0.13399097 -0.2209435046 -0.577118604 -0.021678400 #> 63 1.750230739 0.22611265 -0.5133168395 -0.463691206 -0.554452634 #> 64 -0.314284839 0.36429516 2.6422988095 0.254682319 0.079364989 #> 65 -0.721871606 -0.75958926 -0.3923347699 -0.577118604 
-0.085978738 #> 66 0.252792401 -0.54770941 -0.5939715526 -0.569556778 -0.333994330 #> 67 -0.358587749 -0.54770941 -0.4024166090 -0.554433125 -0.471780770 #> 68 -0.677568696 0.15241531 0.6965038566 0.012703869 -0.315622805 #> 69 0.642658004 -0.19764705 -0.0596340785 0.156378574 -0.517709583 #> 70 0.155326000 0.24453698 2.8741811096 -0.577118604 -0.499338058 #> 71 0.935057206 -0.48322424 -0.5939715526 0.942808538 -0.389108906 #> 72 -0.491496477 0.21690048 0.1117571868 -0.577118604 -0.343180093 #> 73 -0.730732188 -0.02261587 -0.4729894830 0.186625880 -0.673867548 #> 74 0.048999018 -0.46479990 -0.4225802873 -0.191465449 -0.425851957 #> 75 -0.145933784 1.34078490 -0.3217618960 0.436166157 -0.232950941 #> 76 -0.730732188 1.31314840 4.7393213494 0.141254920 -0.453409245 #> 77 -0.730732188 -0.05025237 4.3864569797 1.404079959 0.079364989 #> 78 -0.730732188 -0.76880143 -0.1302069524 -0.289769194 2.081861248 #> 79 -0.243400184 0.63144801 -0.3520074134 -0.168779969 -0.673867548 #> 80 6.614690190 0.31823432 -0.5939715526 -0.577118604 -0.389108906 #> 81 -0.394030076 -0.05025237 -0.5334805178 -0.342701980 -0.664681786 #> 82 1.759091320 -0.76880143 -0.5939715526 -0.577118604 0.162036853 #> 83 2.007187613 -0.28055656 -0.5334805178 -0.350263807 0.520281597 #> 84 -0.730732188 0.35508299 -0.5939715526 -0.478814859 -0.205393653 #> 85 -0.633265787 -0.08710104 -0.1201251133 -0.577118604 -0.710610599 #> 86 -0.101630874 0.08793014 -0.3419255742 -0.577118604 -0.269693992 #> 87 1.218595826 0.21690048 0.2125755781 1.094045069 -0.131907552 #> 88 -0.721871606 -0.40031473 -0.1906979872 -0.577118604 0.125293803 #> 89 -0.207957857 -0.45558774 -0.5939715526 -0.509062165 -0.425851957 #> 90 -0.730732188 -0.30819306 0.8376496045 -0.577118604 0.667253799 #> 91 -0.730732188 -0.76880143 1.7450151266 -0.093161703 -0.067607213 #> 92 -0.544659968 -0.17001055 -0.1503706307 -0.078038050 -0.582009922 #> 93 0.881893714 -0.76880143 -0.3520074134 -0.577118604 -0.398294669 #> 94 -0.137073202 -0.73195276 -0.1402887916 -0.577118604 -0.554452634 #> 95 -0.624405205 -0.29898089 -0.2612708612 0.383233371 -0.333994330 #> 96 -0.730732188 -0.76880143 -0.5939715526 2.349308281 -0.591195684 #> 97 0.243931819 -0.59377025 -0.5939715526 -0.577118604 2.807536497 #> 98 -0.482635895 0.42878033 1.4223962743 2.530792119 -0.159464840 #> 99 -0.730732188 -0.69510409 -0.5939715526 -0.561994951 -0.600381447 #> 100 -0.730732188 0.40114383 0.1420027042 -0.569556778 -0.600381447 #> 101 -0.704150442 0.91702520 -0.5637260352 -0.561994951 -0.389108906 #> 102 -0.491496477 2.38175981 -0.5939715526 -0.577118604 -0.683053311 #> 103 -0.243400184 -0.30819306 -0.4326621264 -0.569556778 -0.370737381 #> 104 1.316062227 -0.76880143 -0.5939715526 -0.009981611 -0.343180093 #> 105 0.040138436 0.56696284 -0.1201251133 0.156378574 -0.232950941 #> 106 -0.668708114 -0.23449572 -0.4528258047 0.020265695 -0.710610599 #> 107 0.261652983 1.19339022 0.4444578782 -0.138532663 -0.600381447 #> 108 -0.730732188 0.74199402 -0.5838897135 0.564717209 -0.582009922 #> 109 -0.704150442 -0.55692158 -0.4931531613 -0.561994951 -0.040049925 #> 110 -0.261121348 1.46975524 0.3133939695 -0.183903622 -0.288065517 #> 111 -0.367448331 -0.22528355 3.8823650230 -0.055352571 -0.572824159 #> 112 -0.721871606 -0.75958926 -0.5939715526 -0.531747645 -0.710610599 #> 113 -0.128212620 0.83411569 3.5496643316 0.678144607 -0.315622805 #> 114 -0.650986951 -0.10552538 -0.4830713221 -0.546871298 -0.664681786 #> 115 -0.500357059 0.99072254 3.0052450183 0.715953740 0.033436176 #> 116 -0.243400184 -0.56613375 -0.3419255742 
-0.259521888 -0.361551618 #> 117 0.917336042 -0.76880143 -0.4427439656 -0.365387460 2.100232773 #> 118 0.616076258 0.43799250 0.7569948914 3.377716696 -0.563638396 #> 119 -0.225679020 -0.76880143 1.0090408698 2.939130754 0.703996850 #> 120 2.512240780 0.53932634 -0.5838897135 -0.546871298 -0.131907552 #> 121 -0.394030076 0.44720466 -0.4830713221 -0.531747645 -0.683053311 #> 122 0.111023091 -0.41873907 1.2409231698 0.950370364 -0.333994330 #> 123 -0.721871606 -0.75037709 -0.2915163786 -0.448567553 -0.683053311 #> 124 0.261652983 0.06029364 -0.3520074134 -0.161218143 -0.609567210 #> 125 -0.721871606 0.94466170 -0.3822529308 0.247120493 -0.012492637 #> 126 0.137604837 -0.75958926 -0.4225802873 -0.569556778 -0.058421450 #> 127 -0.713011024 -0.56613375 0.1117571868 -0.554433125 -0.232950941 #> 128 0.075580763 -0.51086074 -0.5233986787 -0.168779969 3.955756829 #> 129 -0.500357059 -0.56613375 -0.4427439656 -0.463691206 -0.471780770 #> 130 -0.642126369 -0.05946454 -0.5939715526 -0.456129379 -0.333994330 #> 131 2.972991038 -0.66746759 -0.5233986787 0.050513002 1.493972438 #> 132 -0.730732188 0.35508299 -0.4024166090 -0.040228917 0.823411764 #> 133 2.078072268 -0.70431626 0.0109387955 -0.463691206 -0.040049925 #> 134 -0.473775313 -0.54770941 -0.1402887916 0.315176932 -0.517709583 #> 135 2.645149508 -0.53849724 -0.5838897135 -0.561994951 1.319442948 #> 136 0.350258802 -0.45558774 1.1804321350 1.313338040 -0.049235688 #> 137 -0.269981930 -0.20685922 3.0254086966 1.857789554 -0.591195684 #> 138 0.093301927 -0.54770941 -0.4528258047 2.583724905 -0.683053311 #> 139 0.607215676 -0.66746759 -0.2209435046 7.158629984 -0.517709583 #> 140 -0.730732188 0.83411569 2.2087797267 -0.577118604 3.312753443 #> 141 -0.110491456 1.50660391 0.2125755781 0.368109718 -0.600381447 #> 142 -0.305424257 -0.75037709 -0.1705343090 -0.569556778 -0.710610599 #> 143 -0.278842512 -0.06867671 -0.3217618960 0.179064053 -0.683053311 #> 144 -0.571241714 0.50247767 -0.0293885611 2.349308281 -0.582009922 #> 145 1.271759317 -0.29898089 -0.4427439656 -0.365387460 -0.710610599 #> 146 -0.110491456 0.47484117 0.0008569563 0.549593556 0.051807701 #> 147 -0.730732188 -0.76880143 -0.5838897135 -0.577118604 -0.673867548 #> 148 -0.367448331 0.19847615 1.9164063918 0.632773648 -0.710610599 #> 149 -0.642126369 -0.74116493 -0.4326621264 -0.569556778 -0.701424836 #> 150 -0.730732188 4.27025412 -0.5939715526 -0.577118604 -0.701424836 #> 151 -0.402890658 -0.38189040 -0.4629076438 -0.577118604 0.805040239 #> 152 0.740124404 -0.36346606 -0.2511890220 0.050513002 -0.609567210 #> 153 -0.580102296 -0.65825542 0.0109387955 1.162101508 1.025498543 #> 154 -0.704150442 -0.74116493 -0.2209435046 2.825703355 -0.655496023 #> 155 0.004696108 0.90781303 -0.5133168395 -0.448567553 0.005878888 #> 156 0.846451387 -0.07788888 -0.2612708612 -0.561994951 -0.664681786 #> 157 -0.713011024 -0.76880143 -0.5838897135 -0.561994951 -0.710610599 #> 158 -0.367448331 -0.76880143 -0.0797977567 0.156378574 -0.637124498 #> 159 -0.163654947 -0.40031473 2.0676339788 -0.569556778 -0.646310260 #> 160 0.004696108 -0.48322424 -0.5738078743 -0.539309471 -0.370737381 #> 161 1.094547680 -0.48322424 -0.3923347699 -0.433443899 -0.591195684 #> 162 -0.730732188 0.41956816 -0.5939715526 -0.577118604 1.319442948 #> 163 0.181907746 -0.61219458 -0.5637260352 -0.569556778 -0.444223482 #> 164 -0.721871606 -0.25292005 -0.4830713221 -0.501500339 0.465167021 #> 165 -0.030746219 0.01423280 -0.5838897135 -0.554433125 -0.223765178 #> 166 -0.713011024 -0.76880143 0.6662583392 -0.577118604 -0.710610599 #> 167 
-0.713011024 4.09522294 1.1602684568 -0.577118604 2.302319551 #> 168 2.388192634 -0.70431626 -0.5939715526 -0.577118604 1.007127017 #> 169 0.270513565 -0.76880143 -0.5738078743 -0.539309471 0.593767698 #> 170 -0.730732188 -0.76880143 0.1016753477 -0.569556778 -0.710610599 #> 171 -0.571241714 -0.61219458 -0.1100432742 0.534469902 -0.600381447 #> 172 -0.287703094 -0.48322424 -0.4225802873 -0.524185818 -0.407480431 #> 173 1.422389209 -0.61219458 -0.5738078743 -0.577118604 2.752421921 #> 174 0.456585784 0.14320314 -0.1705343090 -0.546871298 1.806288368 #> 175 -0.296563675 -0.39110257 -0.0697159176 -0.493938512 -0.627938735 #> 176 0.562912767 1.38684574 -0.5939715526 0.587402689 -0.012492637 #> 177 0.952778369 -0.48322424 -0.1604524698 -0.244398235 -0.683053311 #> 178 -0.721871606 -0.75037709 -0.5838897135 -0.214150929 1.705244979 #> 179 0.217350073 -0.52928508 -0.5435623569 -0.577118604 5.278506651 #> 180 -0.261121348 0.88017653 -0.1604524698 0.557155382 -0.673867548 #> 181 -0.039606801 -0.54770941 -0.1604524698 0.111007614 -0.627938735 #> 182 -0.083909710 -0.64904325 -0.2612708612 -0.577118604 -0.306437042 #> 183 -0.199097275 1.20260239 -0.2108616655 -0.123409010 -0.554452634 #> 184 -0.668708114 -0.30819306 -0.3116800568 1.600687450 -0.572824159 #> 185 0.297095310 2.55679099 -0.5939715526 -0.554433125 -0.627938735 #> 186 -0.713011024 -0.62140675 -0.0293885611 -0.380511113 -0.701424836 #> 187 -0.721871606 -0.75958926 -0.4225802873 -0.085599877 -0.609567210 #> 188 2.990712202 -0.41873907 -0.5939715526 -0.554433125 1.392929049 #> 189 -0.730732188 -0.56613375 -0.4326621264 -0.380511113 -0.710610599 #> 190 0.102162509 -0.25292005 0.0815116694 -0.304892848 -0.609567210 #> 191 -0.668708114 -0.25292005 -0.5133168395 -0.554433125 -0.343180093 #> 192 -0.730732188 -0.32661739 0.6158491435 -0.577118604 -0.205393653 #> 193 0.057859600 -0.63061892 -0.3822529308 0.413480677 -0.278879754 #> 194 -0.509217641 0.14320314 -0.4528258047 -0.577118604 0.162036853 #> 195 -0.668708114 0.11556664 -0.3721710916 0.526908076 -0.692239074 #> 196 -0.730732188 -0.76880143 -0.5838897135 -0.577118604 0.906083628 #> 197 -0.154794365 -0.47401207 2.1079613354 -0.093161703 -0.572824159 #> 198 -0.721871606 -0.67667975 -0.5939715526 -0.577118604 -0.627938735 #> 199 -0.713011024 -0.74116493 -0.4225802873 -0.161218143 -0.232950941 #> 200 -0.730732188 -0.47401207 -0.3217618960 0.511784423 -0.278879754 #> Otu00034 Otu00035 Otu00036 Otu00037 Otu00038 #> 1 -0.1482914828 -0.28857253 -0.337797955 -0.28026882 -0.269009738 #> 2 -0.1507314908 1.32771762 -0.337797955 -0.40104181 -0.269009738 #> 3 -0.1360914431 -0.09645535 -0.309626997 5.43380328 -0.251964926 #> 4 -0.1507314908 -0.24263146 -0.337797955 -0.28781713 -0.254805728 #> 5 0.0469091527 -0.38463111 -0.332163763 -0.55200805 -0.269009738 #> 6 -0.1507314908 -0.31363129 -0.337797955 -0.02362622 -0.269009738 #> 7 -0.1507314908 -0.38880757 3.099058896 -0.19723739 -0.269009738 #> 8 -0.1507314908 -0.25098438 -0.337797955 -0.13685089 -0.266168936 #> 9 -0.0775312524 -0.38880757 -0.337797955 0.32359613 -0.084357613 #> 10 -0.0604511968 -0.30110191 0.811577123 -0.51426649 -0.254805728 #> 11 -0.1507314908 1.31518824 -0.337797955 0.52740055 -0.269009738 #> 12 0.6935112580 -0.25098438 -0.337797955 -0.54445974 -0.266168936 #> 13 -0.1458514749 5.21182571 -0.337797955 -0.55200805 -0.257646530 #> 14 -0.1507314908 -0.31780775 -0.337797955 -0.43878337 -0.269009738 #> 15 -0.1507314908 -0.20921978 0.158010902 -0.40859012 -0.269009738 #> 16 -0.0824112683 -0.36792527 -0.337797955 1.16145875 -0.269009738 #> 17 
-0.1507314908 -0.38880757 0.963700295 -0.29536544 0.049160077 #> 18 -0.1507314908 -0.17580810 -0.337797955 0.01411534 -0.200830492 #> 19 -0.1458514749 0.28360254 -0.337797955 -0.43123506 -0.269009738 #> 20 -0.1482914828 -0.36792527 -0.337797955 1.87100007 -0.269009738 #> 21 0.3616701775 -0.38880757 -0.337797955 7.21520489 -0.251964926 #> 22 -0.1214513954 -0.38463111 -0.337797955 0.18772652 -0.232079313 #> 23 -0.1507314908 0.35460236 -0.337797955 -0.25007557 -0.269009738 #> 24 -0.1507314908 -0.38880757 -0.337797955 0.06695353 -0.260487332 #> 25 -0.1360914431 -0.23010208 1.746852922 -0.54445974 0.270742627 #> 26 0.9887522192 -0.38463111 -0.337797955 -0.51426649 -0.260487332 #> 27 13.8524741014 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 28 -0.1507314908 -0.38880757 -0.337797955 -0.55200805 -0.101402425 #> 29 -0.1507314908 0.05807368 -0.337797955 -0.31801038 -0.266168936 #> 30 -0.1458514749 -0.38880757 -0.337797955 -0.46897662 -0.260487332 #> 31 -0.1141313716 1.80383409 -0.320895380 0.42927250 0.301991448 #> 32 -0.1482914828 -0.38045465 -0.332163763 -0.33310700 -0.269009738 #> 33 -0.1507314908 -0.30945483 0.929895146 1.22184525 -0.269009738 #> 34 0.3836302490 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 35 -0.1434114669 -0.38880757 -0.337797955 0.05940521 -0.266168936 #> 36 0.0542291766 -0.38880757 -0.337797955 -0.55200805 -0.254805728 #> 37 -0.1068113478 -0.38880757 -0.337797955 -0.52936311 2.219532746 #> 38 0.0883892878 -0.38463111 -0.337797955 -0.55200805 0.196881777 #> 39 -0.1507314908 -0.31780775 -0.337797955 -0.20478570 -0.226397709 #> 40 -0.1507314908 -0.27604314 -0.337797955 -0.14439921 0.114498521 #> 41 -0.1385314510 -0.38463111 -0.332163763 0.98029927 -0.269009738 #> 42 -0.0848512763 -0.30945483 -0.072990952 -0.01607790 -0.146855255 #> 43 -0.0360511174 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 44 -0.1434114669 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 45 -0.1019313319 -0.38880757 -0.337797955 -0.46142831 -0.266168936 #> 46 -0.1409714590 -0.38880757 3.262450451 0.53494886 -0.266168936 #> 47 -0.0214110697 -0.38880757 -0.337797955 0.82933303 -0.269009738 #> 48 -0.1312114272 -0.35121943 -0.337797955 2.98060192 -0.266168936 #> 49 -0.1287714193 -0.38880757 2.969472490 -0.52936311 -0.192308086 #> 50 -0.0946113080 -0.38880757 -0.337797955 -0.49162155 -0.269009738 #> 51 -0.1458514749 -0.18833748 -0.337797955 -0.44633168 -0.135492048 #> 52 -0.1458514749 3.57047681 -0.337797955 -0.54445974 0.392897110 #> 53 0.0493491607 -0.38880757 -0.337797955 1.64455071 -0.229238511 #> 54 0.1249894069 -0.38880757 -0.337797955 -0.54445974 -0.149696057 #> 55 -0.1482914828 -0.19251394 -0.337797955 -0.41613843 -0.269009738 #> 56 -0.0311711015 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 57 -0.1507314908 -0.07139659 -0.337797955 -0.43123506 -0.254805728 #> 58 -0.0287310935 -0.37210173 -0.326529572 -0.54445974 -0.269009738 #> 59 -0.1092513557 -0.38880757 -0.337797955 -0.48407324 0.017911256 #> 60 -0.1507314908 -0.11733765 -0.337797955 -0.41613843 -0.269009738 #> 61 -0.1409714590 -0.38880757 -0.337797955 -0.32555869 0.071886493 #> 62 -0.1287714193 -0.28439607 -0.005380653 0.23301639 1.310476131 #> 63 -0.0458111492 -0.38880757 -0.332163763 -0.04627115 -0.007655961 #> 64 -0.1507314908 0.63442520 -0.281456039 0.48965899 -0.226397709 #> 65 -0.1507314908 -0.38880757 -0.337797955 -0.55200805 -0.220716105 #> 66 -0.1409714590 1.92912790 -0.337797955 -0.55200805 -0.090039217 #> 67 -0.1482914828 -0.32198421 -0.337797955 -0.09910934 -0.269009738 #> 68 -0.1507314908 0.04972076 
2.293369503 -0.53691142 -0.269009738 #> 69 -0.1507314908 -0.05469075 -0.337797955 -0.42368675 -0.266168936 #> 70 -0.0653312127 0.55507246 -0.337797955 -0.18968908 1.685461984 #> 71 -0.1068113478 -0.38880757 -0.332163763 0.24056470 -0.260487332 #> 72 -0.1482914828 0.44230803 -0.337797955 -0.40104181 -0.226397709 #> 73 -0.1482914828 -0.38880757 -0.337797955 -0.29536544 -0.217875303 #> 74 -0.1482914828 -0.38880757 -0.337797955 -0.25762388 -0.269009738 #> 75 -0.1458514749 -0.34704297 0.011521922 -0.48407324 -0.257646530 #> 76 -0.0897312922 -0.17998456 -0.337797955 -0.55200805 -0.232079313 #> 77 -0.1409714590 -0.25933730 -0.326529572 -0.46897662 0.032115266 #> 78 -0.1482914828 0.07895598 -0.337797955 -0.55200805 -0.246283323 #> 79 -0.1507314908 -0.29692545 -0.337797955 -0.50671818 -0.269009738 #> 80 0.1591495182 -0.38463111 -0.337797955 -0.55200805 -0.269009738 #> 81 -0.1507314908 -0.01292614 0.203084435 -0.53691142 -0.266168936 #> 82 -0.0287310935 -0.36374881 7.662754058 -0.55200805 -0.269009738 #> 83 -0.1190113875 -0.38045465 -0.337797955 2.54279983 -0.195148888 #> 84 -0.1434114669 0.12489705 -0.337797955 2.80699074 -0.266168936 #> 85 0.9009119332 1.03536539 -0.337797955 -0.52936311 -0.269009738 #> 86 -0.1507314908 -0.19669040 -0.337797955 -0.55200805 -0.269009738 #> 87 -0.1507314908 0.47989617 -0.337797955 0.46701406 -0.240601719 #> 88 -0.1141313716 0.53419016 2.304637886 -0.34820363 -0.192308086 #> 89 -0.1507314908 -0.38880757 -0.337797955 -0.29536544 0.398578714 #> 90 -0.0214110697 -0.38880757 -0.337797955 -0.07646440 -0.266168936 #> 91 -0.1434114669 -0.38880757 -0.332163763 -0.46897662 -0.246283323 #> 92 -0.1482914828 1.78712825 -0.337797955 -0.55200805 -0.169581671 #> 93 -0.1507314908 -0.38880757 -0.337797955 -0.39349350 -0.240601719 #> 94 -0.1482914828 -0.32616067 1.284849214 -0.29536544 -0.158218463 #> 95 -0.0824112683 -0.35121943 -0.337797955 -0.25007557 -0.269009738 #> 96 -0.0580111889 -0.38880757 -0.337797955 -0.55200805 -0.266168936 #> 97 0.3909502729 -0.38880757 -0.337797955 -0.52936311 -0.266168936 #> 98 -0.1482914828 1.37365868 -0.337797955 -0.03117453 -0.266168936 #> 99 0.0005490018 -0.35539589 -0.337797955 -0.55200805 -0.269009738 #> 100 0.1786695817 -0.38463111 -0.337797955 -0.55200805 8.500545795 #> 101 -0.0946113080 -0.37210173 -0.247650890 -0.01607790 -0.266168936 #> 102 -0.1434114669 -0.38880757 -0.332163763 -0.42368675 -0.263328134 #> 103 -0.1019313319 -0.38880757 -0.337797955 0.73875328 -0.237760917 #> 104 -0.1482914828 0.41724927 1.160897000 -0.55200805 -0.251964926 #> 105 -0.1263314113 -0.38880757 -0.337797955 -0.52936311 -0.118447236 #> 106 0.5324707336 -0.38463111 0.496062396 -0.55200805 -0.269009738 #> 107 -0.1507314908 1.03954186 -0.337797955 0.11224340 -0.172422473 #> 108 -0.1385314510 -0.38880757 -0.337797955 -0.34820363 -0.095720821 #> 109 -0.1214513954 -0.38045465 -0.337797955 0.74630160 -0.269009738 #> 110 -0.1458514749 -0.38463111 -0.337797955 -0.47652493 -0.266168936 #> 111 -0.1507314908 -0.38463111 -0.337797955 -0.03872284 -0.269009738 #> 112 -0.0165310538 -0.17163164 -0.337797955 0.17262989 -0.263328134 #> 113 0.0200690653 -0.38880757 -0.337797955 -0.45387999 -0.200830492 #> 114 -0.1507314908 -0.32198421 -0.337797955 -0.42368675 -0.075835207 #> 115 -0.1507314908 -0.09645535 -0.337797955 -0.38594519 0.120180125 #> 116 0.1323094308 -0.35539589 -0.332163763 0.55759380 -0.206512096 #> 117 -0.1507314908 -0.30945483 1.476411727 -0.49162155 -0.260487332 #> 118 -0.1434114669 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 119 -0.1507314908 -0.38880757 
-0.337797955 0.57269042 -0.269009738 #> 120 -0.1409714590 -0.38045465 -0.332163763 0.88971952 -0.269009738 #> 121 -0.1507314908 -0.38880757 -0.332163763 -0.48407324 -0.269009738 #> 122 -0.1507314908 3.68741770 -0.337797955 -0.55200805 -0.030382377 #> 123 -0.1458514749 -0.38880757 -0.337797955 -0.55200805 -0.269009738 #> 124 -0.1019313319 -0.10063181 -0.337797955 0.85952627 -0.215034501 #> 125 -0.1287714193 -0.29692545 -0.337797955 0.49720730 -0.217875303 #> 126 -0.1092513557 0.78477778 -0.337797955 -0.10665765 0.228130598 #> 127 -0.1434114669 -0.38880757 -0.337797955 0.17262989 0.151428946 #> 128 -0.1360914431 -0.38045465 -0.332163763 -0.37839688 0.012229652 #> 129 -0.1507314908 -0.38880757 -0.337797955 -0.53691142 0.179836966 #> 130 -0.1482914828 0.61354290 -0.337797955 -0.35575194 1.557625898 #> 131 -0.1409714590 -0.38880757 -0.337797955 1.72003383 -0.234920115 #> 132 -0.1190113875 -0.34286651 -0.332163763 0.27830626 -0.269009738 #> 133 -0.1385314510 0.68454273 6.113351379 0.40662756 -0.146855255 #> 134 -0.1507314908 -0.38880757 -0.337797955 -0.43878337 -0.269009738 #> 135 -0.1336514351 -0.37210173 -0.332163763 -0.53691142 -0.260487332 #> 136 -0.1507314908 0.21260271 -0.337797955 -0.35575194 -0.254805728 #> 137 -0.1360914431 -0.38880757 -0.281456039 -0.55200805 -0.269009738 #> 138 -0.1409714590 1.77042241 -0.332163763 0.11224340 -0.124128840 #> 139 -0.1507314908 0.57595476 0.056595454 -0.52181480 -0.254805728 #> 140 -0.0458111492 0.54254308 -0.337797955 -0.55200805 -0.237760917 #> 141 -0.1507314908 0.12489705 -0.337797955 -0.40104181 -0.192308086 #> 142 -0.1482914828 0.18336749 -0.315261189 -0.55200805 -0.183785680 #> 143 -0.1238914034 -0.36374881 -0.337797955 -0.45387999 -0.243442521 #> 144 -0.1482914828 -0.38880757 1.955318009 -0.24252726 0.441190742 #> 145 -0.1312114272 -0.35957235 -0.337797955 -0.55200805 -0.260487332 #> 146 -0.1507314908 -0.10898473 -0.270187656 -0.55200805 0.784927775 #> 147 -0.0580111889 -0.38880757 -0.332163763 -0.55200805 -0.269009738 #> 148 -0.1507314908 -0.36792527 1.521485259 -0.51426649 -0.001974357 #> 149 0.2201497168 -0.33869005 -0.337797955 0.32359613 -0.269009738 #> 150 -0.0677712207 -0.38880757 -0.337797955 0.21791976 0.509369989 #> 151 -0.1507314908 -0.23845500 -0.337797955 -0.49162155 0.023592860 #> 152 -0.1482914828 -0.38463111 -0.337797955 0.77649484 -0.263328134 #> 153 -0.1482914828 -0.38880757 -0.292724422 -0.06136778 0.162792154 #> 154 -0.1385314510 -0.36374881 -0.337797955 -0.55200805 4.418313433 #> 155 0.2665098677 -0.32198421 -0.337797955 1.95403150 0.091772106 #> 156 -0.1482914828 -0.16745518 -0.337797955 0.35378938 -0.254805728 #> 157 0.4812305668 -0.37210173 -0.332163763 -0.55200805 -0.223556907 #> 158 -0.0824112683 2.04606879 -0.337797955 -0.51426649 0.052000879 #> 159 -0.1263314113 -0.10063181 -0.337797955 -0.53691142 -0.263328134 #> 160 -0.1482914828 -0.38880757 0.203084435 4.20342844 -0.260487332 #> 161 -0.1507314908 -0.38880757 0.974968678 0.32359613 -0.269009738 #> 162 -0.0994913239 -0.38880757 -0.337797955 -0.55200805 -0.263328134 #> 163 -0.1507314908 -0.18416102 -0.337797955 0.35378938 -0.269009738 #> 164 0.1079093513 -0.37627819 -0.163138017 0.90481615 -0.266168936 #> 165 -0.1287714193 -0.37627819 -0.337797955 -0.50671818 -0.237760917 #> 166 0.0347091130 0.50495493 -0.337797955 -0.54445974 5.517703777 #> 167 -0.1507314908 0.04136784 -0.337797955 -0.55200805 -0.269009738 #> 168 -0.1482914828 -0.38463111 -0.337797955 -0.55200805 -0.266168936 #> 169 -0.1482914828 -0.38880757 2.535639740 -0.55200805 -0.240601719 #> 170 0.5861509084 
-0.38463111 -0.337797955 -0.55200805 0.941171881 #> 171 -0.1507314908 -0.29274899 -0.337797955 -0.50671818 -0.260487332 #> 172 -0.0799712604 -0.22592562 0.005887730 -0.35575194 -0.144014453 #> 173 0.0127490415 -0.33869005 -0.264553465 -0.12175427 -0.257646530 #> 174 -0.1507314908 -0.38463111 -0.208211549 -0.15949583 -0.001974357 #> 175 -0.1458514749 0.56342538 -0.298358614 0.11224340 -0.260487332 #> 176 -0.1312114272 1.81218701 -0.337797955 0.33869275 -0.266168936 #> 177 -0.1507314908 -0.31363129 1.279215022 -0.28781713 -0.269009738 #> 178 -0.0775312524 -0.38463111 -0.337797955 -0.55200805 -0.215034501 #> 179 0.1298694228 -0.33451359 -0.337797955 2.56544476 -0.269009738 #> 180 0.3445901219 -0.33033713 0.890455805 -0.37084856 0.091772106 #> 181 -0.1507314908 2.17136260 0.777771974 -0.43878337 -0.269009738 #> 182 -0.1507314908 5.69629511 -0.337797955 -0.50671818 -0.115606434 #> 183 -0.0994913239 -0.38045465 -0.337797955 -0.53691142 -0.269009738 #> 184 0.0371491210 -0.20086686 -0.095527718 -0.25762388 -0.223556907 #> 185 -0.1507314908 -0.38880757 2.259564353 0.05940521 -0.234920115 #> 186 -0.1385314510 -0.35957235 -0.089893526 -0.54445974 0.375852298 #> 187 -0.1360914431 -0.38880757 -0.337797955 -0.55200805 -0.246283323 #> 188 -0.1092513557 -0.38880757 -0.337797955 1.79551695 -0.266168936 #> 189 -0.1165713795 -0.36792527 0.417183714 -0.52936311 -0.246283323 #> 190 -0.1507314908 -0.35957235 -0.337797955 -0.34065532 -0.269009738 #> 191 -0.0628912048 -0.29692545 -0.337797955 0.72365666 -0.266168936 #> 192 -0.0189710618 -0.38463111 2.693397103 0.36888600 7.210821722 #> 193 -0.1360914431 -0.38880757 -0.337797955 0.26320964 -0.186626482 #> 194 0.0298290971 -0.38880757 -0.337797955 2.06725618 0.515051592 #> 195 -0.1458514749 -0.38880757 -0.337797955 -0.44633168 -0.269009738 #> 196 -0.1312114272 -0.38880757 -0.337797955 2.57299307 -0.269009738 #> 197 -0.1190113875 -0.34704297 2.225759204 -0.52936311 -0.257646530 #> 198 0.4446304476 -0.38880757 -0.332163763 0.83688134 -0.269009738 #> 199 0.0200690653 -0.38880757 -0.337797955 -0.54445974 0.128702531 #> 200 -0.1092513557 7.49217304 -0.337797955 -0.15194752 -0.269009738 #> Otu00039 Otu00040 Otu00041 Otu00042 Otu00043 #> 1 -0.369691676 -0.20704023 0.122728281 0.690525991 0.719828577 #> 2 0.504524822 -0.32139200 -0.630775883 -0.301679743 -0.243967502 #> 3 -0.439414464 0.35201286 0.855588495 -0.293479696 -0.461086399 #> 4 0.064734927 -0.33409775 -0.620453908 0.641325706 -0.127464679 #> 5 0.252450126 -0.85503359 4.860514738 2.211634782 -0.461086399 #> 6 -0.214156225 0.05978056 0.277557904 -0.301679743 0.545074343 #> 7 -0.385781550 -0.81691633 -0.424336386 -0.301679743 0.126723298 #> 8 -0.278515722 0.30118985 -0.661741808 -0.301679743 -0.381652656 #> 9 -0.133706855 -0.33409775 3.467048133 -0.297579720 -0.455790816 #> 10 -0.412598007 -0.46115527 0.071118407 -0.301679743 -0.461086399 #> 11 0.102277967 0.50448189 -0.661741808 -0.301679743 -0.461086399 #> 12 -0.417961299 -0.63903580 0.081440382 -0.301679743 0.312068697 #> 13 0.080824801 0.37742437 0.205304080 -0.010578061 -0.461086399 #> 14 -0.396508133 -0.55009554 0.298201853 4.581448478 -0.095691182 #> 15 -0.289242305 -0.37221501 1.712312408 3.257140824 -0.026848605 #> 16 -0.439414464 0.75859693 -0.651419833 -0.301679743 0.539778760 #> 17 -0.289242305 -0.33409775 0.659470973 -0.301679743 0.269704035 #> 18 -0.251699265 0.17413233 -0.155965040 -0.277079601 -0.005666274 #> 19 -0.058620775 -0.60091855 0.628505049 -0.256579483 -0.164533759 #> 20 1.362651445 1.52094206 -0.372726512 -0.297579720 -0.461086399 #> 21 
-0.439414464 4.04938672 -0.661741808 -0.301679743 -0.455790816 #> 22 -0.310695471 -0.85503359 -0.661741808 -0.256579483 -0.249263085 #> 23 -0.407234716 0.79671419 -0.021779367 -0.297579720 0.132018880 #> 24 -0.305332179 1.34306153 1.640058584 -0.236079364 -0.365765907 #> 25 -0.439414464 0.25036685 -0.651419833 -0.301679743 -0.461086399 #> 26 -0.434051173 -0.74068182 0.721402822 -0.289379672 0.010220475 #> 27 -0.439414464 -0.85503359 -0.641097858 -0.231979341 -0.424017319 #> 28 -0.230246100 -0.57550704 -0.558522059 -0.002378014 -0.418721736 #> 29 0.466981782 -0.72797607 -0.290150713 -0.301679743 -0.392243822 #> 30 8.093582148 -0.74068182 -0.455302311 -0.268879554 3.399393499 #> 31 -0.310695471 0.14872083 -0.661741808 -0.297579720 -0.455790816 #> 32 -0.439414464 -0.30868625 -0.661741808 -0.281179625 -0.424017319 #> 33 -0.192703060 1.16518100 -0.630775883 -0.301679743 1.180544285 #> 34 0.139821007 0.84753719 0.174338155 -0.289379672 -0.413426153 #> 35 -0.273152431 -0.10539421 -0.475946260 -0.301679743 -0.085100016 #> 36 -0.332148636 1.02541772 -0.661741808 -0.297579720 -0.413426153 #> 37 0.542067861 -0.63903580 -0.269506763 -0.301679743 -0.053326519 #> 38 -0.439414464 -0.85503359 -0.651419833 -0.301679743 -0.461086399 #> 39 -0.417961299 -0.14351147 1.412975137 -0.301679743 -0.249263085 #> 40 0.247086835 -0.29598050 -0.114677141 -0.297579720 0.184974709 #> 41 0.043281762 0.31389561 -0.434658361 -0.301679743 -0.238671919 #> 42 -0.412598007 0.14872083 -0.279828738 -0.260679507 -0.392243822 #> 43 -0.439414464 -0.85503359 -0.641097858 -0.301679743 -0.429312902 #> 44 -0.203429643 -0.85503359 0.287879879 -0.289379672 -0.344583576 #> 45 -0.428687881 -0.82962208 -0.475946260 -0.301679743 -0.339287993 #> 46 0.129094424 0.37742437 -0.506912185 -0.252479459 -0.461086399 #> 47 -0.428687881 -0.80421058 -0.032101342 -0.297579720 0.290886366 #> 48 0.123731133 -0.05457121 -0.166287015 -0.301679743 -0.461086399 #> 49 -0.230246100 -0.62633005 -0.424336386 -0.301679743 0.820444651 #> 50 -0.417961299 0.16142658 0.019508532 -0.297579720 0.449753851 #> 51 0.450891908 -0.43574377 -0.455302311 -0.297579720 -0.461086399 #> 52 0.214907086 -0.74068182 -0.465624286 4.749549449 -0.302218913 #> 53 -0.434051173 0.17413233 -0.620453908 0.973427626 -0.461086399 #> 54 -0.439414464 1.10165224 -0.661741808 -0.297579720 -0.450495233 #> 55 -0.037167609 -0.37221501 0.225948029 -0.301679743 0.412684771 #> 56 -0.439414464 -0.85503359 -0.661741808 1.563831038 -0.461086399 #> 57 -0.235609391 -0.51197828 -0.434658361 1.157928692 -0.386948239 #> 58 -0.369691676 -0.84232784 -0.641097858 -0.293479696 -0.445199650 #> 59 -0.026441027 1.69882259 2.032293628 -0.293479696 -0.445199650 #> 60 -0.305332179 0.13601508 -0.228218864 -0.277079601 -0.010961856 #> 61 -0.412598007 -0.48656678 2.352274849 -0.293479696 -0.445199650 #> 62 -0.026441027 0.19954384 -0.290150713 -0.289379672 -0.439904067 #> 63 0.096914676 2.25787568 -0.073389241 -0.293479696 -0.445199650 #> 64 1.389467902 -0.32139200 -0.651419833 -0.289379672 0.052585138 #> 65 -0.439414464 -0.85503359 -0.424336386 -0.301679743 5.326985656 #> 66 -0.010351152 1.20329825 0.143372231 -0.301679743 -0.461086399 #> 67 -0.407234716 -0.81691633 -0.506912185 3.232540682 2.599760488 #> 68 -0.396508133 -0.55009554 1.784566232 -0.301679743 -0.455790816 #> 69 -0.316058762 0.40283587 -0.661741808 -0.301679743 0.063176303 #> 70 -0.273152431 -0.20704023 -0.661741808 -0.297579720 -0.455790816 #> 71 1.603999558 0.40283587 -0.114677141 -0.301679743 -0.381652656 #> 72 -0.273152431 0.05978056 -0.661741808 
[output truncated: the remaining rows of the transformed Otu feature table (columns through Otu00060, 200 rows) are omitted here for brevity] #> #> $removed #> character(0) #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":null,"dir":"Reference","previous_headings":"","what":"Get feature importance using the permutation method — get_feature_importance","title":"Get feature importance using the permutation method — get_feature_importance","text":"Calculates feature importance using trained model test data. Requires future.apply package.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get feature importance using the permutation method — get_feature_importance","text":"","code":"get_feature_importance( trained_model, train_data, test_data, outcome_colname, perf_metric_function, perf_metric_name, class_probs, method, seed = NA, corr_thresh = 1, groups = NULL, nperms = 100, corr_method = \"spearman\" )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get feature importance using the permutation method — get_feature_importance","text":"trained_model Trained model caret::train(). train_data Training data: dataframe outcome features. test_data Held test data: dataframe outcome features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = "ROC", multi-class classification = "logLoss", regression = "RMSE".
class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes). method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost seed Random seed (default: NA). results reproducible set seed. corr_thresh feature importance, group correlations equal corr_thresh (range 0 1; default: 1). groups Vector feature names group together permutation. element string feature names separated pipe character (|). NULL (default), correlated features grouped together based corr_thresh. nperms number permutations perform (default: 100). corr_method correlation method. options supported stats::cor: spearman, pearson, kendall. (default: spearman)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get feature importance using the permutation method — get_feature_importance","text":"Data frame performance metrics feature (group correlated features; names) permuted (perf_metric), differences actual test performance metric permuted performance metric (perf_metric_diff; test minus permuted performance), p-value (pvalue: probability obtaining actual performance value null hypothesis). Features larger perf_metric_diff important. performance metric name (perf_metric_name) seed (seed) also returned.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Get feature importance using the permutation method — get_feature_importance","text":"permutation tests, p-value number permutation statistics greater test statistic, divided number permutations. case, permutation statistic model performance (e.g. AUROC) randomizing order observations one feature, test statistic actual performance test data. default perform 100 permutations per feature; increasing increase precision estimating null distribution, also increases runtime. p-value represents probability obtaining actual performance event null hypothesis true, null hypothesis feature important model performance. strongly recommend providing multiple cores speed computation time. See vignette parallel processing details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get feature importance using the permutation method — get_feature_importance","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_feature_importance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get feature importance using the permutation method — get_feature_importance","text":"","code":"if (FALSE) { # If you called `run_ml()` with `feature_importance = FALSE` (the default), # you can use `get_feature_importance()` later as long as you have the # trained model and test data. 
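# A minimal sketch of the permutation p-value described in the Details section
# above: the p-value is the fraction of permuted performance values that are
# greater than the actual test performance. The helper name below is
# hypothetical (not a mikropml function), and mikropml's internal calculation
# may differ in small details.
perm_pvalue_sketch <- function(actual_perf, permuted_perfs) {
  # permuted_perfs: performance (e.g. AUROC) after shuffling one feature, one value per permutation
  sum(permuted_perfs > actual_perf) / length(permuted_perfs)
}
# perm_pvalue_sketch(0.80, c(0.62, 0.65, 0.79, 0.81)) returns 0.25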
results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) names(results$trained_model$trainingData)[1] <- \"dx\" feat_imp <- get_feature_importance(results$trained_model, results$trained_model$trainingData, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) # We strongly recommend providing multiple cores to speed up computation time. # Do this before calling `get_feature_importance()`. doFuture::registerDoFuture() future::plan(future::multicore, workers = 2) # Optionally, you can group features together with a custom grouping feat_imp <- get_feature_importance(results$trained_model, results$trained_model$trainingData, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\", groups = c( \"Otu00007\", \"Otu00008\", \"Otu00009\", \"Otu00011\", \"Otu00012\", \"Otu00015\", \"Otu00016\", \"Otu00018\", \"Otu00019\", \"Otu00020\", \"Otu00022\", \"Otu00023\", \"Otu00025\", \"Otu00028\", \"Otu00029\", \"Otu00030\", \"Otu00035\", \"Otu00036\", \"Otu00037\", \"Otu00038\", \"Otu00039\", \"Otu00040\", \"Otu00047\", \"Otu00050\", \"Otu00052\", \"Otu00054\", \"Otu00055\", \"Otu00056\", \"Otu00060\", \"Otu00003|Otu00002|Otu00005|Otu00024|Otu00032|Otu00041|Otu00053\", \"Otu00014|Otu00021|Otu00017|Otu00031|Otu00057\", \"Otu00013|Otu00006\", \"Otu00026|Otu00001|Otu00034|Otu00048\", \"Otu00033|Otu00010\", \"Otu00042|Otu00004\", \"Otu00043|Otu00027|Otu00049\", \"Otu00051|Otu00045\", \"Otu00058|Otu00044\", \"Otu00059|Otu00046\" ) ) # the function can show a progress bar if you have the `progressr` package installed. ## optionally, specify the progress bar format: progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0 )) ## tell progressr to always report progress progressr::handlers(global = TRUE) ## run the function and watch the live progress udpates feat_imp <- get_feature_importance(results$trained_model, results$trained_model$trainingData, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) # You can specify any correlation method supported by `stats::cor`: feat_imp <- get_feature_importance(results$trained_model, results$trained_model$trainingData, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\", corr_method = \"pearson\" ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Get hyperparameter performance metrics — get_hp_performance","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Get hyperparameter performance metrics","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get hyperparameter performance metrics — get_hp_performance","text":"","code":"get_hp_performance(trained_model)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get hyperparameter performance metrics — get_hp_performance","text":"trained_model trained model (e.g. 
run_ml())","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Named list: dat: Dataframe performance metric group hyperparameters. params: Hyperparameters tuned. metric: Performance metric used.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get hyperparameter performance metrics — get_hp_performance","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get hyperparameter performance metrics — get_hp_performance","text":"","code":"get_hp_performance(otu_mini_bin_results_glmnet$trained_model) #> $dat #> alpha lambda AUC #> 1 0 1e-04 0.6082552 #> 2 0 1e-03 0.6082552 #> 3 0 1e-02 0.6086458 #> 4 0 1e-01 0.6166789 #> 5 0 1e+00 0.6221737 #> 6 0 1e+01 0.6187408 #> #> $params #> [1] \"lambda\" #> #> $metric #> [1] \"AUC\" #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":null,"dir":"Reference","previous_headings":"","what":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"details see vignette hyperparameter tuning.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"","code":"get_hyperparams_list(dataset, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"dataset Dataframe outcome variable columns features. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). 
glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"Named list hyperparameters.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_hyperparams_list.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Set hyperparameters based on ML method and dataset characteristics — get_hyperparams_list","text":"","code":"get_hyperparams_list(otu_mini_bin, \"rf\") #> $mtry #> [1] 2 3 6 #> get_hyperparams_list(otu_small, \"rf\") #> $mtry #> [1] 4 8 16 #> get_hyperparams_list(otu_mini_bin, \"rpart2\") #> $maxdepth #> [1] 1 2 4 8 16 30 #> get_hyperparams_list(otu_small, \"rpart2\") #> $maxdepth #> [1] 1 2 4 8 16 30 #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":null,"dir":"Reference","previous_headings":"","what":"Get outcome type. — get_outcome_type","title":"Get outcome type. — get_outcome_type","text":"outcome numeric, type continuous. Otherwise, outcome type binary two outcomes multiclass two outcomes.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get outcome type. — get_outcome_type","text":"","code":"get_outcome_type(outcomes_vec)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get outcome type. — get_outcome_type","text":"outcomes_vec Vector outcomes.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get outcome type. — get_outcome_type","text":"Outcome type (continuous, binary, multiclass).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get outcome type. — get_outcome_type","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_outcome_type.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get outcome type. — get_outcome_type","text":"","code":"get_outcome_type(c(1, 2, 1)) #> [1] \"continuous\" get_outcome_type(c(\"a\", \"b\", \"b\")) #> [1] \"binary\" get_outcome_type(c(\"a\", \"b\", \"c\")) #> [1] \"multiclass\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":null,"dir":"Reference","previous_headings":"","what":"Select indices to partition the data into training & testing sets. — get_partition_indices","title":"Select indices to partition the data into training & testing sets. 
— get_partition_indices","text":"Use function get row indices training set.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"","code":"get_partition_indices( outcomes, training_frac = 0.8, groups = NULL, group_partitions = NULL )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"outcomes vector outcomes training_frac Fraction data training set (default: 0.8). Rows dataset randomly selected training set, remaining rows used testing set. Alternatively, provide vector integers, used row indices training set. remaining rows used testing set. groups Vector groups keep together splitting data train test sets. number groups training set larger kfold, groups also kept together cross-validation. Length matches number rows dataset (default: NULL). group_partitions Specify assign groups training testing partitions (default: NULL). groups specifies samples belong group \"\" belong group \"B\", setting group_partitions = list(train = c(\"\", \"B\"), test = c(\"B\")) result samples group \"\" placed training set, samples \"B\" also training set, remaining samples \"B\" testing set. partition sizes close training_frac possible. number groups training set larger kfold, groups also kept together cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Vector row indices training set.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"details","dir":"Reference","previous_headings":"","what":"Details","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"groups NULL, uses createDataPartition. Otherwise, uses create_grouped_data_partition(). Set seed prior calling function like data partitions reproducible (recommended).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Select indices to partition the data into training & testing sets. — get_partition_indices","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_partition_indices.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Select indices to partition the data into training & testing sets. 
— get_partition_indices","text":"","code":"training_inds <- get_partition_indices(otu_mini_bin$dx) train_data <- otu_mini_bin[training_inds, ] test_data <- otu_mini_bin[-training_inds, ]"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":null,"dir":"Reference","previous_headings":"","what":"Get default performance metric function — get_perf_metric_fn","title":"Get default performance metric function — get_perf_metric_fn","text":"Get default performance metric function","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get default performance metric function — get_perf_metric_fn","text":"","code":"get_perf_metric_fn(outcome_type)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get default performance metric function — get_perf_metric_fn","text":"outcome_type Type outcome (one : \"continuous\",\"binary\",\"multiclass\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get default performance metric function — get_perf_metric_fn","text":"Performance metric function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get default performance metric function — get_perf_metric_fn","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_fn.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get default performance metric function — get_perf_metric_fn","text":"","code":"get_perf_metric_fn(\"continuous\") #> function (data, lev = NULL, model = NULL) #> { #> if (is.character(data$obs)) #> data$obs <- factor(data$obs, levels = lev) #> postResample(data[, \"pred\"], data[, \"obs\"]) #> } #> #> get_perf_metric_fn(\"binary\") #> function (data, lev = NULL, model = NULL) #> { #> if (!all(levels(data[, \"pred\"]) == levels(data[, \"obs\"]))) #> stop(\"levels of observed and predicted data do not match\") #> has_class_probs <- all(lev %in% colnames(data)) #> if (has_class_probs) { #> lloss <- mnLogLoss(data = data, lev = lev, model = model) #> requireNamespaceQuietStop(\"pROC\") #> requireNamespaceQuietStop(\"MLmetrics\") #> prob_stats <- lapply(levels(data[, \"pred\"]), function(x) { #> obs <- ifelse(data[, \"obs\"] == x, 1, 0) #> prob <- data[, x] #> roc_auc <- try(pROC::roc(obs, data[, x], direction = \"<\", #> quiet = TRUE), silent = TRUE) #> roc_auc <- if (inherits(roc_auc, \"try-error\")) #> NA #> else roc_auc$auc #> pr_auc <- try(MLmetrics::PRAUC(y_pred = data[, x], #> y_true = obs), silent = TRUE) #> if (inherits(pr_auc, \"try-error\")) #> pr_auc <- NA #> res <- c(ROC = roc_auc, AUC = pr_auc) #> return(res) #> }) #> prob_stats <- do.call(\"rbind\", prob_stats) #> prob_stats <- colMeans(prob_stats, na.rm = TRUE) #> } #> CM <- confusionMatrix(data[, \"pred\"], data[, \"obs\"], mode = \"everything\") #> if (length(levels(data[, \"pred\"])) == 2) { #> class_stats <- CM$byClass #> } #> else { #> class_stats <- colMeans(CM$byClass) #> names(class_stats) <- paste(\"Mean\", names(class_stats)) #> } #> overall_stats <- if (has_class_probs) #> c(CM$overall, logLoss = 
as.numeric(lloss), AUC = unname(prob_stats[\"ROC\"]), #> prAUC = unname(prob_stats[\"AUC\"])) #> else CM$overall #> stats <- c(overall_stats, class_stats) #> stats <- stats[!names(stats) %in% c(\"AccuracyNull\", \"AccuracyLower\", #> \"AccuracyUpper\", \"AccuracyPValue\", \"McnemarPValue\", \"Mean Prevalence\", #> \"Mean Detection Prevalence\")] #> names(stats) <- gsub(\"[[:blank:]]+\", \"_\", names(stats)) #> stat_list <- c(\"Accuracy\", \"Kappa\", \"Mean_F1\", \"Mean_Sensitivity\", #> \"Mean_Specificity\", \"Mean_Pos_Pred_Value\", \"Mean_Neg_Pred_Value\", #> \"Mean_Precision\", \"Mean_Recall\", \"Mean_Detection_Rate\", #> \"Mean_Balanced_Accuracy\") #> if (has_class_probs) #> stat_list <- c(\"logLoss\", \"AUC\", \"prAUC\", stat_list) #> if (length(levels(data[, \"pred\"])) == 2) #> stat_list <- gsub(\"^Mean_\", \"\", stat_list) #> stats <- stats[c(stat_list)] #> return(stats) #> } #> #> get_perf_metric_fn(\"multiclass\") #> function (data, lev = NULL, model = NULL) #> { #> if (!all(levels(data[, \"pred\"]) == levels(data[, \"obs\"]))) #> stop(\"levels of observed and predicted data do not match\") #> has_class_probs <- all(lev %in% colnames(data)) #> if (has_class_probs) { #> lloss <- mnLogLoss(data = data, lev = lev, model = model) #> requireNamespaceQuietStop(\"pROC\") #> requireNamespaceQuietStop(\"MLmetrics\") #> prob_stats <- lapply(levels(data[, \"pred\"]), function(x) { #> obs <- ifelse(data[, \"obs\"] == x, 1, 0) #> prob <- data[, x] #> roc_auc <- try(pROC::roc(obs, data[, x], direction = \"<\", #> quiet = TRUE), silent = TRUE) #> roc_auc <- if (inherits(roc_auc, \"try-error\")) #> NA #> else roc_auc$auc #> pr_auc <- try(MLmetrics::PRAUC(y_pred = data[, x], #> y_true = obs), silent = TRUE) #> if (inherits(pr_auc, \"try-error\")) #> pr_auc <- NA #> res <- c(ROC = roc_auc, AUC = pr_auc) #> return(res) #> }) #> prob_stats <- do.call(\"rbind\", prob_stats) #> prob_stats <- colMeans(prob_stats, na.rm = TRUE) #> } #> CM <- confusionMatrix(data[, \"pred\"], data[, \"obs\"], mode = \"everything\") #> if (length(levels(data[, \"pred\"])) == 2) { #> class_stats <- CM$byClass #> } #> else { #> class_stats <- colMeans(CM$byClass) #> names(class_stats) <- paste(\"Mean\", names(class_stats)) #> } #> overall_stats <- if (has_class_probs) #> c(CM$overall, logLoss = as.numeric(lloss), AUC = unname(prob_stats[\"ROC\"]), #> prAUC = unname(prob_stats[\"AUC\"])) #> else CM$overall #> stats <- c(overall_stats, class_stats) #> stats <- stats[!names(stats) %in% c(\"AccuracyNull\", \"AccuracyLower\", #> \"AccuracyUpper\", \"AccuracyPValue\", \"McnemarPValue\", \"Mean Prevalence\", #> \"Mean Detection Prevalence\")] #> names(stats) <- gsub(\"[[:blank:]]+\", \"_\", names(stats)) #> stat_list <- c(\"Accuracy\", \"Kappa\", \"Mean_F1\", \"Mean_Sensitivity\", #> \"Mean_Specificity\", \"Mean_Pos_Pred_Value\", \"Mean_Neg_Pred_Value\", #> \"Mean_Precision\", \"Mean_Recall\", \"Mean_Detection_Rate\", #> \"Mean_Balanced_Accuracy\") #> if (has_class_probs) #> stat_list <- c(\"logLoss\", \"AUC\", \"prAUC\", stat_list) #> if (length(levels(data[, \"pred\"])) == 2) #> stat_list <- gsub(\"^Mean_\", \"\", stat_list) #> stats <- stats[c(stat_list)] #> return(stats) #> } #> #> "},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":null,"dir":"Reference","previous_headings":"","what":"Get default performance metric name — get_perf_metric_name","title":"Get default performance metric name — get_perf_metric_name","text":"Get default performance metric name 
cross-validation.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get default performance metric name — get_perf_metric_name","text":"","code":"get_perf_metric_name(outcome_type)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get default performance metric name — get_perf_metric_name","text":"outcome_type Type outcome (one : \"continuous\",\"binary\",\"multiclass\").","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get default performance metric name — get_perf_metric_name","text":"Performance metric name.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get default performance metric name — get_perf_metric_name","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_perf_metric_name.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get default performance metric name — get_perf_metric_name","text":"","code":"get_perf_metric_name(\"continuous\") #> [1] \"RMSE\" get_perf_metric_name(\"binary\") #> [1] \"AUC\" get_perf_metric_name(\"multiclass\") #> [1] \"logLoss\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":null,"dir":"Reference","previous_headings":"","what":"Get model performance metrics as a one-row tibble — get_performance_tbl","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"Get model performance metrics one-row tibble","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"","code":"get_performance_tbl( trained_model, test_data, outcome_colname, perf_metric_function, perf_metric_name, class_probs, method, seed = NA )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"trained_model Trained model caret::train(). test_data Held test data: dataframe outcome features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". class_probs Whether use class probabilities (TRUE categorical outcomes, FALSE numeric outcomes). method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). 
glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost seed Random seed (default: NA). results reproducible set seed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"one-row tibble columns cv_auroc, column performance metrics test data method, seed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"Kelly Sovacool, sovacool@umich.edu Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_performance_tbl.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Get model performance metrics as a one-row tibble — get_performance_tbl","text":"","code":"if (FALSE) { results <- run_ml(otu_small, \"glmnet\", kfold = 2, cv_times = 2) names(results$trained_model$trainingData)[1] <- \"dx\" get_performance_tbl(results$trained_model, results$test_data, \"dx\", multiClassSummary, \"AUC\", class_probs = TRUE, method = \"glmnet\" ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":null,"dir":"Reference","previous_headings":"","what":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"Generate tuning grid tuning hyperparameters","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"","code":"get_tuning_grid(hyperparams_list, method)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"hyperparams_list Named list lists hyperparameters. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). 
glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"tuning grid.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/get_tuning_grid.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Generate the tuning grid for tuning hyperparameters — get_tuning_grid","text":"","code":"ml_method <- \"glmnet\" hparams_list <- get_hyperparams_list(otu_small, ml_method) get_tuning_grid(hparams_list, ml_method) #> lambda alpha #> 1 1e-04 0 #> 2 1e-03 0 #> 3 1e-02 0 #> 4 1e-01 0 #> 5 1e+00 0 #> 6 1e+01 0"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":null,"dir":"Reference","previous_headings":"","what":"Group correlated features — group_correlated_features","title":"Group correlated features — group_correlated_features","text":"Group correlated features","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Group correlated features — group_correlated_features","text":"","code":"group_correlated_features( features, corr_thresh = 1, group_neg_corr = TRUE, corr_method = \"spearman\" )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Group correlated features — group_correlated_features","text":"features dataframe column feature ML corr_thresh feature importance, group correlations equal corr_thresh (range 0 1; default: 1). group_neg_corr Whether group negatively correlated features together (e.g. c(0,1) c(1,0)). corr_method correlation method. options supported stats::cor: spearman, pearson, kendall. 
(default: spearman)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Group correlated features — group_correlated_features","text":"vector element group correlated features separated pipes (|)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Group correlated features — group_correlated_features","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/group_correlated_features.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Group correlated features — group_correlated_features","text":"","code":"features <- data.frame( a = 1:3, b = 2:4, c = c(1, 0, 1), d = (5:7), e = c(5, 1, 4), f = c(-1, 0, -1) ) group_correlated_features(features) #> [1] \"a|b|d\" \"c|f\" \"e\""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":null,"dir":"Reference","previous_headings":"","what":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"mikropml implements supervised machine learning pipelines using regression, support vector machines, decision trees, random forest, gradient-boosted trees. main functions preprocess_data() process data prior running machine learning, run_ml() run machine learning.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":"authors","dir":"Reference","previous_headings":"","what":"Authors","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"Begüm D. Topçuoğlu (ORCID) Zena Lapp (ORCID) Kelly L. Sovacool (ORCID) Evan Snitkin (ORCID) Jenna Wiens (ORCID) Patrick D. Schloss (ORCID)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/mikropml.html","id":"see-vignettes","dir":"Reference","previous_headings":"","what":"See vignettes","title":"mikropml: User-Friendly R Package for Robust Machine Learning Pipelines — mikropml","text":"Introduction Preprocessing data Hyperparameter tuning Parallel processing mikropml paper","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":null,"dir":"Reference","previous_headings":"","what":"Mini OTU abundance dataset — otu_mini_bin","title":"Mini OTU abundance dataset — otu_mini_bin","text":"dataset containing relatives abundances OTUs human stool samples binary outcome, dx. subset otu_small.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Mini OTU abundance dataset — otu_mini_bin","text":"","code":"otu_mini_bin"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Mini OTU abundance dataset — otu_mini_bin","text":"data frame dx column diagnosis: healthy cancerous (colorectal). 
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"Results running pipeline L2 logistic regression otu_mini_bin feature importance grouping","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"","code":"otu_mini_bin_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with L2 logistic regression on otu_mini_bin with feature importance and grouping — otu_mini_bin_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"Results running pipeline random forest otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"","code":"otu_mini_bin_results_rf"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rf.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with random forest on otu_mini_bin — otu_mini_bin_results_rf","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"Results running pipeline rpart2 otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"","code":"otu_mini_bin_results_rpart2"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_rpart2.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with rpart2 on otu_mini_bin — otu_mini_bin_results_rpart2","text":"object class list length 
4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"Results running pipeline svmRadial otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"","code":"otu_mini_bin_results_svmRadial"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_svmRadial.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with svmRadial on otu_mini_bin — otu_mini_bin_results_svmRadial","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"Results running pipeline xbgTree otu_mini_bin","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"","code":"otu_mini_bin_results_xgbTree"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_bin_results_xgbTree.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with xbgTree on otu_mini_bin — otu_mini_bin_results_xgbTree","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"Results running pipeline glmnet otu_mini_bin Otu00001 outcome","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"","code":"otu_mini_cont_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome — otu_mini_cont_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_bin with 
Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"Results running pipeline glmnet otu_mini_bin Otu00001 outcome column, using custom train control scheme perform cross-validation","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"","code":"otu_mini_cont_results_nocv"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cont_results_nocv.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_bin with Otu00001\nas the outcome column,\nusing a custom train control scheme that does not perform cross-validation — otu_mini_cont_results_nocv","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":null,"dir":"Reference","previous_headings":"","what":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"Cross validation train_data_mini grouped features.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"","code":"otu_mini_cv"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_cv.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Cross validation on train_data_mini with grouped features. — otu_mini_cv","text":"object class list length 27.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":null,"dir":"Reference","previous_headings":"","what":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"dataset containing relatives abundances OTUs human stool samples","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"","code":"otu_mini_multi"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Mini OTU abundance dataset with 3 categorical variables — otu_mini_multi","text":"data frame dx column colorectal cancer diagnosis: adenoma, carcinoma, normal. 
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":null,"dir":"Reference","previous_headings":"","what":"Groups for otu_mini_multi — otu_mini_multi_group","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"Groups otu_mini_multi","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"","code":"otu_mini_multi_group"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_group.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Groups for otu_mini_multi — otu_mini_multi_group","text":"object class character length 490.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":null,"dir":"Reference","previous_headings":"","what":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"Results running pipeline glmnet otu_mini_multi multiclass outcomes","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"","code":"otu_mini_multi_results_glmnet"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_mini_multi_results_glmnet.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Results from running the pipeline with glmnet on otu_mini_multi for\nmulticlass outcomes — otu_mini_multi_results_glmnet","text":"object class list length 4.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":null,"dir":"Reference","previous_headings":"","what":"Small OTU abundance dataset — otu_small","title":"Small OTU abundance dataset — otu_small","text":"dataset containing relatives abundances 60 OTUs 60 human stool samples. subset data provided extdata/otu_large.csv, used Topçuoğlu et al. 2020.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Small OTU abundance dataset — otu_small","text":"","code":"otu_small"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/otu_small.html","id":"format","dir":"Reference","previous_headings":"","what":"Format","title":"Small OTU abundance dataset — otu_small","text":"data frame 60 rows 61 variables. dx column diagnosis: healthy cancerous (colorectal). 
columns OTU relative abundances.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":null,"dir":"Reference","previous_headings":"","what":"Calculated a permuted p-value comparing two models — permute_p_value","title":"Calculated a permuted p-value comparing two models — permute_p_value","text":"Calculated permuted p-value comparing two models","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Calculated a permuted p-value comparing two models — permute_p_value","text":"","code":"permute_p_value( merged_data, metric, group_name, group_1, group_2, nperm = 10000 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Calculated a permuted p-value comparing two models — permute_p_value","text":"merged_data concatenated performance data run_ml metric metric compare, must numeric group_name column group variables compare group_1 name one group compare group_2 name group compare nperm number permutations, default=10000","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Calculated a permuted p-value comparing two models — permute_p_value","text":"numeric p-value comparing two models","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Calculated a permuted p-value comparing two models — permute_p_value","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Courtney R Armour, armourc@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/permute_p_value.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Calculated a permuted p-value comparing two models — permute_p_value","text":"","code":"df <- dplyr::tibble( model = c(\"rf\", \"rf\", \"glmnet\", \"glmnet\", \"svmRadial\", \"svmRadial\"), AUC = c(.2, 0.3, 0.8, 0.9, 0.85, 0.95) ) set.seed(123) permute_p_value(df, \"AUC\", \"model\", \"rf\", \"glmnet\", nperm = 100) #> [1] 0.3663366"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot hyperparameter performance metrics — plot_hp_performance","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"Plot hyperparameter performance metrics","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"","code":"plot_hp_performance(dat, param_col, metric_col)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"dat dataframe hyperparameters performance metric (e.g. get_hp_performance() combine_hp_performance()) param_col hyperparameter plotted. must column dat. metric_col performance metric. 
must column dat.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"ggplot hyperparameter performance.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_hp_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot hyperparameter performance metrics — plot_hp_performance","text":"","code":"# plot for a single `run_ml()` call hp_metrics <- get_hp_performance(otu_mini_bin_results_glmnet$trained_model) hp_metrics #> $dat #> alpha lambda AUC #> 1 0 1e-04 0.6082552 #> 2 0 1e-03 0.6082552 #> 3 0 1e-02 0.6086458 #> 4 0 1e-01 0.6166789 #> 5 0 1e+00 0.6221737 #> 6 0 1e+01 0.6187408 #> #> $params #> [1] \"lambda\" #> #> $metric #> [1] \"AUC\" #> plot_hp_performance(hp_metrics$dat, lambda, AUC) if (FALSE) { # plot for multiple `run_ml()` calls results <- lapply(seq(100, 102), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) models <- lapply(results, function(x) x$trained_model) hp_metrics <- combine_hp_performance(models) plot_hp_performance(hp_metrics$dat, lambda, AUC) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":null,"dir":"Reference","previous_headings":"","what":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"ggplot2 required use function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"","code":"plot_model_performance(performance_df)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"performance_df dataframe performance results multiple calls run_ml()","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"ggplot2 plot performance.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Plot performance metrics for multiple ML runs with different parameters — plot_model_performance","text":"Begüm Topçuoglu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/plot_model_performance.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Plot performance metrics for multiple ML runs 
with different parameters — plot_model_performance","text":"","code":"if (FALSE) { # call `run_ml()` multiple times with different seeds results_lst <- lapply(seq(100, 104), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) # extract and combine the performance results perf_df <- lapply(results_lst, function(result) { result[[\"performance\"]] }) %>% dplyr::bind_rows() # plot the performance results p <- plot_model_performance(perf_df) # call `run_ml()` with different ML methods param_grid <- expand.grid( seeds = seq(100, 104), methods = c(\"glmnet\", \"rf\") ) results_mtx <- mapply( function(seed, method) { run_ml(otu_mini_bin, method, seed = seed, kfold = 2) }, param_grid$seeds, param_grid$methods ) # extract and combine the performance results perf_df2 <- dplyr::bind_rows(results_mtx[\"performance\", ]) # plot the performance results p <- plot_model_performance(perf_df2) # you can continue adding layers to customize the plot p + theme_classic() + scale_color_brewer(palette = \"Dark2\") + coord_flip() }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Preprocess data prior to running machine learning — preprocess_data","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Function preprocess data input run_ml().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Preprocess data prior to running machine learning — preprocess_data","text":"","code":"preprocess_data( dataset, outcome_colname, method = c(\"center\", \"scale\"), remove_var = \"nzv\", collapse_corr_feats = TRUE, to_numeric = TRUE, group_neg_corr = TRUE, prefilter_threshold = 1 )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Preprocess data prior to running machine learning — preprocess_data","text":"dataset Dataframe outcome variable columns features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). method Methods preprocess data, described caret::preProcess() (default: c(\"center\",\"scale\"), use NULL normalization). remove_var Whether remove variables near-zero variance ('nzv'; default), zero variance ('zv'), none (NULL). collapse_corr_feats Whether keep one perfectly correlated features. to_numeric Whether change features numeric possible. group_neg_corr Whether group negatively correlated features together (e.g. c(0,1) c(1,0)). prefilter_threshold Remove features non-zero & non-NA values N rows fewer (default: 1). Set -1 keep columns step. step also skipped to_numeric set FALSE.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Named list including: dat_transformed: Preprocessed data. grp_feats: features grouped together, named list features corresponding group. removed_feats: features removed preprocessing (e.g. zero variance near-zero variance features). 
progressr package installed, progress bar time elapsed estimated time completion can displayed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"more-details","dir":"Reference","previous_headings":"","what":"More details","title":"Preprocess data prior to running machine learning — preprocess_data","text":"See preprocessing vignette details. Note values outcome_colname contain spaces, converted underscores compatibility caret.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Preprocess data prior to running machine learning — preprocess_data","text":"Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/preprocess_data.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Preprocess data prior to running machine learning — preprocess_data","text":"","code":"preprocess_data(mikropml::otu_small, \"dx\") #> Using 'dx' as the outcome column. #> $dat_transformed #> # A tibble: 200 × 61 #> dx Otu00001 Otu00002 Otu00003 Otu00004 Otu00005 Otu00006 Otu00…¹ Otu00008 #> #> 1 normal -0.420 -0.219 -0.174 -0.591 -0.0488 -0.167 -0.569 -0.0624 #> 2 normal -0.105 1.75 -0.718 0.0381 1.54 -0.573 -0.643 -0.132 #> 3 normal -0.708 0.696 1.43 0.604 -0.265 -0.0364 -0.612 -0.207 #> 4 normal -0.494 -0.665 2.02 -0.593 -0.676 -0.586 -0.552 -0.470 #> 5 normal 1.11 -0.395 -0.754 -0.586 -0.754 2.73 0.191 -0.676 #> 6 normal -0.685 0.614 -0.174 -0.584 0.376 0.804 -0.337 -0.00608 #> 7 cancer -0.770 -0.496 -0.318 0.159 -0.658 2.20 -0.717 0.0636 #> 8 normal -0.424 -0.478 -0.397 -0.556 -0.391 -0.0620 0.376 -0.0222 #> 9 normal -0.556 1.14 1.62 -0.352 -0.275 -0.465 -0.804 0.294 #> 10 cancer 1.46 -0.451 -0.694 -0.0567 -0.706 0.689 -0.370 1.59 #> # … with 190 more rows, 52 more variables: Otu00009 , Otu00010 , #> # Otu00011 , Otu00012 , Otu00013 , Otu00014 , #> # Otu00015 , Otu00016 , Otu00017 , Otu00018 , #> # Otu00019 , Otu00020 , Otu00021 , Otu00022 , #> # Otu00023 , Otu00024 , Otu00025 , Otu00026 , #> # Otu00027 , Otu00028 , Otu00029 , Otu00030 , #> # Otu00031 , Otu00032 , Otu00033 , Otu00034 , … #> #> $grp_feats #> NULL #> #> $removed_feats #> character(0) #> # the function can show a progress bar if you have the progressr package installed ## optionally, specify the progress bar format progressr::handlers(progressr::handler_progress( format = \":message :bar :percent | elapsed: :elapsed | eta: :eta\", clear = FALSE, show_after = 0 )) ## tell progressor to always report progress if (FALSE) { progressr::handlers(global = TRUE) ## run the function and watch the live progress udpates dat_preproc <- preprocess_data(mikropml::otu_small, \"dx\") }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":null,"dir":"Reference","previous_headings":"","what":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Randomize feature order eliminate position-dependent effects","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Randomize feature order to eliminate any position-dependent effects — 
randomize_feature_order","text":"","code":"randomize_feature_order(dataset, outcome_colname)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"dataset Dataframe outcome variable columns features. outcome_colname Column name string outcome variable (default NULL; first column chosen automatically).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Dataset feature order randomized.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"Nick Lesniak, nlesniak@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/randomize_feature_order.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Randomize feature order to eliminate any position-dependent effects — randomize_feature_order","text":"","code":"dat <- data.frame( outcome = c(\"1\", \"2\", \"3\"), a = 4:6, b = 7:9, c = 10:12, d = 13:15 ) randomize_feature_order(dat, \"outcome\") #> outcome c b a d #> 1 1 10 7 4 13 #> 2 2 11 8 5 14 #> 3 3 12 9 6 15"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/reexports.html","id":null,"dir":"Reference","previous_headings":"","what":"dplyr pipe — reexports","title":"dplyr pipe — reexports","text":"objects imported packages. Follow links see documentation. caret contr.ltfr dplyr %>% rlang :=, !!, .data","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":null,"dir":"Reference","previous_headings":"","what":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"Removes columns non-zero & non-NA values threshold row(s) fewer.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"","code":"remove_singleton_columns(dat, threshold = 1)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"dat dataframe threshold Number rows. column non-zero & non-NA values threshold row(s) fewer, removed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Remove columns appearing in only threshold row(s) or fewer. 
— remove_singleton_columns","text":"dataframe without singleton columns","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"Kelly Sovacool, sovacool@umich.edu Courtney Armour","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/remove_singleton_columns.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Remove columns appearing in only threshold row(s) or fewer. — remove_singleton_columns","text":"","code":"remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, 0), c = 4:6)) #> $dat #> a c #> 1 1 4 #> 2 2 5 #> 3 3 6 #> #> $removed_feats #> [1] \"b\" #> remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, 0), c = 4:6), threshold = 0) #> $dat #> a b c #> 1 1 0 4 #> 2 2 1 5 #> 3 3 0 6 #> #> $removed_feats #> character(0) #> remove_singleton_columns(data.frame(a = 1:3, b = c(0, 1, NA), c = 4:6)) #> $dat #> a c #> 1 1 4 #> 2 2 5 #> 3 3 6 #> #> $removed_feats #> [1] \"b\" #> remove_singleton_columns(data.frame(a = 1:3, b = c(1, 1, 1), c = 4:6)) #> $dat #> a b c #> 1 1 1 4 #> 2 2 1 5 #> 3 3 1 6 #> #> $removed_feats #> character(0) #>"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":null,"dir":"Reference","previous_headings":"","what":"Replace spaces in all elements of a character vector with underscores — replace_spaces","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"Replace spaces elements character vector underscores","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"","code":"replace_spaces(x, new_char = \"_\")"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"x character vector new_char character replace spaces (default: _)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"character vector spaces replaced new_char","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/replace_spaces.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Replace spaces in all elements of a character vector with underscores — replace_spaces","text":"","code":"dat <- data.frame( dx = c(\"outcome 1\", \"outcome 2\", \"outcome 1\"), a = 1:3, b = c(5, 7, 1) ) dat$dx <- replace_spaces(dat$dx) dat #> dx a b #> 1 outcome_1 1 5 #> 2 outcome_2 2 7 #> 3 outcome_1 3 
1"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":null,"dir":"Reference","previous_headings":"","what":"Run the machine learning pipeline — run_ml","title":"Run the machine learning pipeline — run_ml","text":"function runs machine learning (ML), evaluates best model, optionally calculates feature importance using framework outlined Topçuoğlu et al. 2020 (doi:10.1128/mBio.00434-20 ). Required inputs dataframe outcome variable columns features, well ML method. See vignette('introduction') details.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Run the machine learning pipeline — run_ml","text":"","code":"run_ml( dataset, method, outcome_colname = NULL, hyperparameters = NULL, find_feature_importance = FALSE, calculate_performance = TRUE, kfold = 5, cv_times = 100, cross_val = NULL, training_frac = 0.8, perf_metric_function = NULL, perf_metric_name = NULL, groups = NULL, group_partitions = NULL, corr_thresh = 1, ntree = 1000, seed = NA )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Run the machine learning pipeline — run_ml","text":"dataset Dataframe outcome variable columns features. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost outcome_colname Column name string outcome variable (default NULL; first column chosen automatically). hyperparameters Dataframe hyperparameters (default NULL; sensible defaults chosen automatically). find_feature_importance Run permutation importance (default: FALSE). TRUE recommended like identify features important predicting outcome, resource-intensive. calculate_performance Whether calculate performance metrics (default: TRUE). might choose skip perform cross-validation model training. kfold Fold number k-fold cross-validation (default: 5). cv_times Number cross-validation partitions create (default: 100). cross_val custom cross-validation scheme caret::trainControl() (default: NULL, uses kfold cross validation repeated cv_times). kfold cv_times ignored user provides custom cross-validation scheme. See caret::trainControl() docs information use . training_frac Fraction data training set (default: 0.8). Rows dataset randomly selected training set, remaining rows used testing set. Alternatively, provide vector integers, used row indices training set. remaining rows used testing set. perf_metric_function Function calculate performance metric used cross-validation test performance. functions provided caret (see caret::defaultSummary()). Defaults: binary classification = twoClassSummary, multi-class classification = multiClassSummary, regression = defaultSummary. perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". groups Vector groups keep together splitting data train test sets. number groups training set larger kfold, groups also kept together cross-validation. Length matches number rows dataset (default: NULL). group_partitions Specify assign groups training testing partitions (default: NULL). 
groups specifies samples belong group \"\" belong group \"B\", setting group_partitions = list(train = c(\"\", \"B\"), test = c(\"B\")) result samples group \"\" placed training set, samples \"B\" also training set, remaining samples \"B\" testing set. partition sizes close training_frac possible. number groups training set larger kfold, groups also kept together cross-validation. corr_thresh feature importance, group correlations equal corr_thresh (range 0 1; default: 1). ntree random forest, many trees use (default: 1000). Note caret allow parameter tuned. seed Random seed (default: NA). results reproducible set seed.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Run the machine learning pipeline — run_ml","text":"Named list results: trained_model: Output caret::train(), including best model. test_data: Part data used testing. performance: Dataframe performance metrics. first column cross-validation performance metric, last two columns ML method used seed (one set), respectively. columns performance metrics calculated test data. contains one row, can easily combine performance dataframes multiple calls run_ml() (see vignette(\"parallel\")). feature_importance: feature importances calculated, dataframe row feature correlated group. columns performance metric permuted data, difference true performance metric performance metric permuted data (true - permuted), feature name, ML method, performance metric name, seed (provided). AUC RMSE, higher perf_metric_diff , important feature predicting outcome. log loss, lower perf_metric_diff , important feature predicting outcome.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"more-details","dir":"Reference","previous_headings":"","what":"More details","title":"Run the machine learning pipeline — run_ml","text":"details, please see vignettes.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Run the machine learning pipeline — run_ml","text":"Begüm Topçuoğlu, topcuoglu.begum@gmail.com Zena Lapp, zenalapp@umich.edu Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/run_ml.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Run the machine learning pipeline — run_ml","text":"","code":"if (FALSE) { # regression run_ml(otu_small, \"glmnet\", seed = 2019 ) # random forest w/ feature importance run_ml(otu_small, \"rf\", outcome_colname = \"dx\", find_feature_importance = TRUE ) # custom cross validation & hyperparameters run_ml(otu_mini_bin[, 2:11], \"glmnet\", outcome_colname = \"Otu00001\", seed = 2019, hyperparameters = list(lambda = c(1e-04), alpha = 0), cross_val = caret::trainControl(method = \"none\"), calculate_performance = FALSE ) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":null,"dir":"Reference","previous_headings":"","what":"Tidy the performance dataframe — tidy_perf_data","title":"Tidy the performance dataframe — tidy_perf_data","text":"Used plot_model_performance().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Tidy the performance dataframe — 
tidy_perf_data","text":"","code":"tidy_perf_data(performance_df)"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Tidy the performance dataframe — tidy_perf_data","text":"performance_df dataframe performance results multiple calls run_ml()","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Tidy the performance dataframe — tidy_perf_data","text":"Tidy dataframe model performance metrics.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Tidy the performance dataframe — tidy_perf_data","text":"Begüm Topçuoglu, topcuoglu.begum@gmail.com Kelly Sovacool, sovacool@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/tidy_perf_data.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Tidy the performance dataframe — tidy_perf_data","text":"","code":"if (FALSE) { # call `run_ml()` multiple times with different seeds results_lst <- lapply(seq(100, 104), function(seed) { run_ml(otu_small, \"glmnet\", seed = seed) }) # extract and combine the performance results perf_df <- lapply(results_lst, function(result) { result[[\"performance\"]] }) %>% dplyr::bind_rows() # make it pretty! tidy_perf_data(perf_df) }"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":null,"dir":"Reference","previous_headings":"","what":"Train model using caret::train(). — train_model","title":"Train model using caret::train(). — train_model","text":"Train model using caret::train().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"ref-usage","dir":"Reference","previous_headings":"","what":"Usage","title":"Train model using caret::train(). — train_model","text":"","code":"train_model( model_formula, train_data, method, cv, perf_metric_name, tune_grid, ntree )"},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"arguments","dir":"Reference","previous_headings":"","what":"Arguments","title":"Train model using caret::train(). — train_model","text":"model_formula Model formula, typically created stats::.formula(). train_data Training data. Expected subset full dataset. method ML method. Options: c(\"glmnet\", \"rf\", \"rpart2\", \"svmRadial\", \"xgbTree\"). glmnet: linear, logistic, multiclass regression rf: random forest rpart2: decision tree svmRadial: support vector machine xgbTree: xgboost cv Cross-validation caret scheme define_cv(). perf_metric_name column name output function provided perf_metric_function used performance metric. Defaults: binary classification = \"ROC\", multi-class classification = \"logLoss\", regression = \"RMSE\". tune_grid Tuning grid get_tuning_grid(). ntree random forest, many trees use (default: 1000). Note caret allow parameter tuned.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"value","dir":"Reference","previous_headings":"","what":"Value","title":"Train model using caret::train(). 
— train_model","text":"Trained model caret::train().","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"author","dir":"Reference","previous_headings":"","what":"Author","title":"Train model using caret::train(). — train_model","text":"Zena Lapp, zenalapp@umich.edu","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/reference/train_model.html","id":"ref-examples","dir":"Reference","previous_headings":"","what":"Examples","title":"Train model using caret::train(). — train_model","text":"","code":"if (FALSE) { training_data <- otu_mini_bin_results_glmnet$trained_model$trainingData %>% dplyr::rename(dx = .outcome) method <- \"rf\" hyperparameters <- get_hyperparams_list(otu_mini_bin, method) cross_val <- define_cv(training_data, \"dx\", hyperparameters, perf_metric_function = caret::multiClassSummary, class_probs = TRUE, cv_times = 2 ) tune_grid <- get_tuning_grid(hyperparameters, method) rf_model <- train_model( stats::as.formula(paste(\"dx\", \"~ .\")), training_data, method, cross_val, \"AUC\", tune_grid, 1000 ) rf_model$results %>% dplyr::select(mtry, AUC, prAUC) }"},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-130","dir":"Changelog","previous_headings":"","what":"mikropml 1.3.0","title":"mikropml 1.3.0","text":"CRAN release: 2022-05-20 mikropml now requires R version 4.1.0 greater due update randomForest package (#292). New function compare_models() compares performance two models permutation test (#295, @courtneyarmour). Fixed bug cv_times affect reported repeats cross-validation (#291, @kelly-sovacool). Made minor documentation improvements (#293, @kelly-sovacool)","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-122","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.2","title":"mikropml 1.2.2","text":"CRAN release: 2022-02-03 minor patch fixes test failure platforms long doubles. actual package code remains unchanged.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-121","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.1","title":"mikropml 1.2.1","text":"CRAN release: 2022-01-30 using groups parameter, groups kept together cross-validation partitions kfold <= number groups training set. Previously, error thrown condition met. Now, enough groups training set groups kept together CV, groups allowed split across CV partitions. Report p-values permutation feature importance (#288, @kelly-sovacool).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-120","dir":"Changelog","previous_headings":"","what":"mikropml 1.2.0","title":"mikropml 1.2.0","text":"CRAN release: 2021-11-10 Also added new parameter calculate_performance, controls whether performance metrics calculated (default: TRUE). Users may wish skip performance calculations training models cross-validation. New parameter group_partitions added run_ml() allows users control groups go partition train/test split (#281, @kelly-sovacool). default, training_frac fraction 0 1 specifies much dataset used training fraction train/test split. Users can instead give training_frac vector indices correspond rows dataset go training fraction train/test split. 
gives users direct control exactly observations training fraction desired.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-111","dir":"Changelog","previous_headings":"","what":"mikropml 1.1.1","title":"mikropml 1.1.1","text":"CRAN release: 2021-09-14 Also, group_correlated_features() now user-facing function.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-110","dir":"Changelog","previous_headings":"","what":"mikropml 1.1.0","title":"mikropml 1.1.0","text":"CRAN release: 2021-08-10 default still “spearman”, now can use methods supported stats::cor corr_method parameter: get_feature_importance(corr_method = \"pearson\") now video tutorials covering mikropml skills related machine learning, created @pschloss (#270). Fixed bug preprocess_data() converted outcome column character vector (#273, @kelly-sovacool, @ecmaggioncalda).","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-100","dir":"Changelog","previous_headings":"","what":"mikropml 1.0.0","title":"mikropml 1.0.0","text":"CRAN release: 2021-05-13 mikropml now logo created @NLesniak! Made documentation improvements (#238, #231 @kelly-sovacool; #256 @BTopcuoglu). Remove features appear N=prefilter_threshold fewer rows data. Created function remove_singleton_columns() called preprocess_data() carry . Provide custom groups features permute together permutation importance. groups NULL default; case, correlated features corr_thresh grouped together. preprocess_data() now replaces spaces outcome column underscores (#247, @kelly-sovacool, @JonnyTran). Clarify intro vignette support multi-label outcomes. (#254, @zenalapp) Optional progress bar preprocess_data() get_feature_importance() using progressr package (#257, @kelly-sovacool, @JonnyTran, @FedericoComoglio). mikropml paper soon published JOSS!","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-002","dir":"Changelog","previous_headings":"","what":"mikropml 0.0.2","title":"mikropml 0.0.2","text":"CRAN release: 2020-12-03 Fixed test failure Solaris. Fixed multiple test failures R 3.6.2 due stringsAsFactors behavior. Made minor documentation improvements. Moved rpart Suggests Imports consistency packages used model training.","code":""},{"path":"http://www.schlosslab.org/mikropml/dev/news/index.html","id":"mikropml-001","dir":"Changelog","previous_headings":"","what":"mikropml 0.0.1","title":"mikropml 0.0.1","text":"CRAN release: 2020-11-23 first release version mikropml! 🎉 Added NEWS.md file track changes package. 
run_ml() preprocess_data() plot_model_performance() plot_hp_performance() glmnet: logistic linear regression rf: random forest rpart2: decision trees svmRadial: support vector machines xgbTree: gradient-boosted trees Introduction Preprocess data Hyperparameter tuning Parallel processing mikropml paper","code":""}] diff --git a/tests/testthat/test-plot.R b/tests/testthat/test-plot.R index ee021607..090c6dc5 100644 --- a/tests/testthat/test-plot.R +++ b/tests/testthat/test-plot.R @@ -69,6 +69,15 @@ test_that("tidy_perf_data works", { expect_equal(tidy_perf_data(perf_df_untidy), perf_df_tidy) }) +test_that("plot_model_performance creates a boxplot from tidied data", { + p <- perf_df_untidy %>% plot_model_performance() + expect_equal(p$data, perf_df_untidy %>% tidy_perf_data()) + expect_equal( + p$layers[[1]]$geom %>% class() %>% as.vector(), + c("GeomBoxplot", "Geom", "ggproto", "gg") + ) +}) + test_that("get_hp_performance works", { expect_equal( get_hp_performance(otu_mini_bin_results_glmnet$trained_model),