From 595b64867da2f87c81fdd810fd15d96a2bb8adcc Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Fri, 25 Jun 2021 17:14:27 +0100 Subject: [PATCH 01/19] add article with minimal examples for fitting and predicting --- _pkgdown.yml | 2 + vignettes/articles/Examples.Rmd | 519 ++++++++++++++++++ vignettes/articles/template-cls-multi-class.R | 12 + vignettes/articles/template-cls-two-class.R | 11 + vignettes/articles/template-reg-chicago.R | 18 + 5 files changed, 562 insertions(+) create mode 100644 vignettes/articles/Examples.Rmd create mode 100644 vignettes/articles/template-cls-multi-class.R create mode 100644 vignettes/articles/template-cls-two-class.R create mode 100644 vignettes/articles/template-reg-chicago.R diff --git a/_pkgdown.yml b/_pkgdown.yml index c11cdc617..17f82cfd4 100644 --- a/_pkgdown.yml +++ b/_pkgdown.yml @@ -90,6 +90,8 @@ navbar: href: https://www.tidymodels.org/learn/develop/models/ - text: Evaluating submodels with the same model object href: articles/articles/Submodels.html + - text: Minimal Examples for Fitting and Predicting + href: articles/articles/Examples.html - text: News href: news/index.html - text: Reference diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd new file mode 100644 index 000000000..ab40da7eb --- /dev/null +++ b/vignettes/articles/Examples.Rmd @@ -0,0 +1,519 @@ +--- +title: "Minimal Examples for Fitting and Predicting" +vignette: > + %\VignetteEngine{knitr::rmarkdown} + %\VignetteIndexEntry{Minimal Examples for Fitting and Predicting} +output: + knitr:::html_vignette +--- + +```{r startup, include = FALSE} +library(utils) +library(ggplot2) +theme_set(theme_bw()) +``` + +This is a collection of minimal examples for fitting and predicting with various models and engines. + +
+ +`nearest_neighbor()` with the `"kknn"` engine + +```{r echo=FALSE} +knitr::spin_child("template-reg-chicago.R") +``` + +We can define the model with specific parameters: + + ```{r} +knn_reg_spec <- + nearest_neighbor(neighbors = 5, weight_func = "triangular") %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("kknn") +knn_reg_spec +``` + +Now we create the model fit object: + + ```{r} +knn_reg_fit <- knn_reg_spec %>% fit(ridership ~ ., data = Chicago_train) +knn_reg_fit +``` + +The holdout data can be predicted: + + ```{r} +predict(knn_reg_fit, Chicago_test) +``` + + +```{r echo=FALSE} +knitr::spin_child("template-cls-two-class.R") +``` + +Since there are two classes, we'll use an odd number of neighbors to avoid ties: + +```{r} +knn_cls_spec <- + nearest_neighbor(neighbors = 11, weight_func = "triangular") %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("kknn") +knn_cls_spec +``` + +Now we create the model fit object: + +```{r} +knn_cls_fit <- knn_cls_spec %>% fit(Class ~ ., data = data_train) +knn_cls_fit +``` + +The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} +bind_cols( + predict(knn_cls_fit, data_test), + predict(knn_cls_fit, data_test, type = "prob") +) +``` + +
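As a follow-on to the kknn examples above (a sketch that is not part of the original article), the holdout predictions could be scored with yardstick, which tidymodels attaches. This assumes the `knn_reg_fit` and `Chicago_test` objects created above and yardstick's default `.pred` column name.

```r
library(tidymodels)

# Combine the observed ridership with the kknn holdout predictions and
# compute the default regression metrics (RMSE, R squared, MAE).
knn_reg_results <- bind_cols(
  Chicago_test %>% select(ridership),
  predict(knn_reg_fit, Chicago_test)
)
metrics(knn_reg_results, truth = ridership, estimate = .pred)
```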
+ + +
+ +`multinom_reg()` with the `"glmnet"` engine + +```{r echo=FALSE} +knitr::spin_child("template-cls-multi-class.R") +``` + +We can define the model with specific parameters: + +```{r} +mr_cls_spec <- + multinom_reg(penalty = 0.1) %>% + set_engine("glmnet") +mr_cls_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +mr_cls_fit <- mr_cls_spec %>% fit(island ~ ., data = penguins_train) +mr_cls_fit +``` + +The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + +```{r} +bind_cols( + predict(mr_cls_fit, penguins_test), + predict(mr_cls_fit, penguins_test, type = "prob") +) +``` + +
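A possible next step (not in the original article) is to score the multinomial holdout predictions with yardstick. The `.pred_*` probability column names below assume the usual parsnip naming and the `island` levels Biscoe, Dream, and Torgersen in the penguins data.

```r
library(tidymodels)

# Bind truth, hard class predictions, and class probabilities, then compute
# accuracy and multiclass log loss on the holdout penguins.
mr_cls_results <- bind_cols(
  penguins_test %>% select(island),
  predict(mr_cls_fit, penguins_test),
  predict(mr_cls_fit, penguins_test, type = "prob")
)
accuracy(mr_cls_results, truth = island, estimate = .pred_class)
mn_log_loss(mr_cls_results, truth = island, .pred_Biscoe, .pred_Dream, .pred_Torgersen)
```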
+ + +
`rand_forest()` with the `"ranger"` engine + +```{r echo=FALSE} +knitr::spin_child("template-reg-chicago.R") +``` + +We can define the model with specific parameters: + +```{r} +rf_reg_spec <- + rand_forest(trees = 200, min_n = 5) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("ranger") +rf_reg_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +rf_reg_fit <- rf_reg_spec %>% fit(ridership ~ ., data = Chicago_train) +rf_reg_fit +``` + +The holdout data can be predicted: + +```{r} +predict(rf_reg_fit, Chicago_test) +``` + + +```{r echo=FALSE} +knitr::spin_child("template-cls-two-class.R") +``` + +We can define the model with specific parameters: + +```{r} +rf_cls_spec <- + rand_forest(trees = 200, min_n = 5) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("ranger") +rf_cls_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +rf_cls_fit <- rf_cls_spec %>% fit(Class ~ ., data = data_train) +rf_cls_fit +``` + +The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + +```{r} +bind_cols( + predict(rf_cls_fit, data_test), + predict(rf_cls_fit, data_test, type = "prob") +) +``` + +
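One way to extend the ranger classification example (a sketch, not from the original article) is to compute the ROC AUC from the probability predictions. This assumes `rf_cls_fit` and `data_test` from above and that `Class1` is the first factor level in `two_class_dat`.

```r
library(tidymodels)

# Use the probability column for the event level (Class1) to get the ROC AUC
# of the ranger fit on the holdout rows.
rf_cls_results <- bind_cols(
  data_test %>% select(Class),
  predict(rf_cls_fit, data_test, type = "prob")
)
roc_auc(rf_cls_results, truth = Class, .pred_Class1)
```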
+ + +
`rand_forest()` with the `"randomForest"` engine + +```{r echo=FALSE} +knitr::spin_child("template-reg-chicago.R") +``` + +We can define the model with specific parameters: + +```{r} +rf_reg_spec <- + rand_forest(trees = 200, min_n = 5) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("randomForest") +rf_reg_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +rf_reg_fit <- rf_reg_spec %>% fit(ridership ~ ., data = Chicago_train) +rf_reg_fit +``` + +The holdout data can be predicted: + +```{r} +predict(rf_reg_fit, Chicago_test) +``` + + +```{r echo=FALSE} +knitr::spin_child("template-cls-two-class.R") +``` + +We can define the model with specific parameters: + +```{r} +rf_cls_spec <- + rand_forest(trees = 200, min_n = 5) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("randomForest") +rf_cls_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +rf_cls_fit <- rf_cls_spec %>% fit(Class ~ ., data = data_train) +rf_cls_fit +``` + +The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + +```{r} +bind_cols( + predict(rf_cls_fit, data_test), + predict(rf_cls_fit, data_test, type = "prob") +) +``` + +
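Engine-specific options can also be shown here. As a hedged sketch (not in the original article), randomForest's `importance` argument can be passed through `set_engine()` so that importance scores are stored in the underlying fit, which parsnip keeps in the `fit` element of the model object.

```r
library(tidymodels)

# Pass an engine-specific argument through set_engine() and then inspect the
# importance scores from the underlying randomForest object.
rf_imp_spec <-
  rand_forest(trees = 200, min_n = 5) %>%
  set_mode("classification") %>%
  set_engine("randomForest", importance = TRUE)

set.seed(1)
rf_imp_fit <- rf_imp_spec %>% fit(Class ~ ., data = data_train)
randomForest::importance(rf_imp_fit$fit)
```

Other arguments to `randomForest::randomForest()` can be passed through `set_engine()` in the same way.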
+ + +
`svm_linear()` with the `"LiblineaR"` engine + +```{r echo=FALSE} +knitr::spin_child("template-reg-chicago.R") +``` + +We can define the model with specific parameters: + +```{r} +svm_reg_spec <- + svm_linear(cost = 1, margin = 0.1) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("LiblineaR") +svm_reg_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) +svm_reg_fit +``` + +The holdout data can be predicted: + +```{r} +predict(svm_reg_fit, Chicago_test) +``` + + +```{r echo=FALSE} +knitr::spin_child("template-cls-two-class.R") +``` + +We can define the model with specific parameters: + +```{r} +svm_cls_spec <- + svm_linear(cost = 1) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("LiblineaR") +svm_cls_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) +svm_cls_fit +``` + +The holdout data can be predicted for hard class predictions. + +```{r} +predict(svm_cls_fit, data_test) +``` + +
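Since the LiblineaR engine only produces hard class predictions, a confusion matrix is a natural summary. This sketch (not part of the original article) assumes the `svm_cls_fit` and `data_test` objects from above.

```r
library(tidymodels)

# Summarize the hard class predictions on the holdout rows with a confusion
# matrix and accuracy; probability-based metrics are not available here.
svm_cls_results <- bind_cols(
  data_test %>% select(Class),
  predict(svm_cls_fit, data_test)
)
conf_mat(svm_cls_results, truth = Class, estimate = .pred_class)
accuracy(svm_cls_results, truth = Class, estimate = .pred_class)
```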
+ + +
`svm_linear()` with the `"kernlab"` engine + +```{r echo=FALSE} +knitr::spin_child("template-reg-chicago.R") +``` + +We can define the model with specific parameters: + +```{r} +svm_reg_spec <- + svm_linear(cost = 1, margin = 0.1) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("kernlab") +svm_reg_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) +svm_reg_fit +``` + +The holdout data can be predicted: + +```{r} +predict(svm_reg_fit, Chicago_test) +``` + + +```{r echo=FALSE} +knitr::spin_child("template-cls-two-class.R") +``` + +We can define the model with specific parameters: + +```{r} +svm_cls_spec <- + svm_linear(cost = 1) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("kernlab") +svm_cls_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) +svm_cls_fit +``` + +The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + +```{r} +bind_cols( + predict(svm_cls_fit, data_test), + predict(svm_cls_fit, data_test, type = "prob") +) +``` + +
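Because the article's startup chunk already loads ggplot2 and sets `theme_bw()`, a quick plot of observed versus predicted ridership is another way to look at the kernlab fit. This is a sketch (not from the original article) that assumes `svm_reg_fit` and `Chicago_test` from above.

```r
library(tidymodels)

# Plot the holdout week's observed ridership against the SVM predictions;
# points near the dashed identity line indicate accurate predictions.
bind_cols(Chicago_test, predict(svm_reg_fit, Chicago_test)) %>%
  ggplot(aes(x = ridership, y = .pred)) +
  geom_abline(lty = 2) +
  geom_point()
```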
+ + +
`svm_poly()` with the `"kernlab"` engine + +```{r echo=FALSE} +knitr::spin_child("template-reg-chicago.R") +``` + +We can define the model with specific parameters: + +```{r} +svm_reg_spec <- + svm_poly(cost = 1, margin = 0.1) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("kernlab") +svm_reg_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) +svm_reg_fit +``` + +The holdout data can be predicted: + +```{r} +predict(svm_reg_fit, Chicago_test) +``` + + +```{r echo=FALSE} +knitr::spin_child("template-cls-two-class.R") +``` + +We can define the model with specific parameters: + +```{r} +svm_cls_spec <- + svm_poly(cost = 1) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("kernlab") +svm_cls_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) +svm_cls_fit +``` + +The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + +```{r} +bind_cols( + predict(svm_cls_fit, data_test), + predict(svm_cls_fit, data_test, type = "prob") +) +``` + +
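The polynomial kernel's other main arguments can be set in the same way. This is a sketch (not in the original article) showing parsnip's `degree` and `scale_factor` arguments with illustrative values, not values tuned for this data.

```r
library(tidymodels)

# Specify the polynomial degree and scale factor explicitly; these are
# illustrative values rather than tuned ones.
svm_poly_quad_spec <-
  svm_poly(cost = 1, degree = 2, scale_factor = 0.1) %>%
  set_mode("classification") %>%
  set_engine("kernlab")
svm_poly_quad_spec
```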
+ + +
`svm_rbf()` with the `"kernlab"` engine + +```{r echo=FALSE} +knitr::spin_child("template-reg-chicago.R") +``` + +We can define the model with specific parameters: + +```{r} +svm_reg_spec <- + svm_rbf(cost = 1, margin = 0.1) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("kernlab") +svm_reg_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) +svm_reg_fit +``` + +The holdout data can be predicted: + +```{r} +predict(svm_reg_fit, Chicago_test) +``` + + +```{r echo=FALSE} +knitr::spin_child("template-cls-two-class.R") +``` + +We can define the model with specific parameters: + +```{r} +svm_cls_spec <- + svm_rbf(cost = 1) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("kernlab") +svm_cls_spec +``` + +Now we create the model fit object: + +```{r} +set.seed(1) +svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) +svm_cls_fit +``` + +The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + +```{r} +bind_cols( + predict(svm_cls_fit, data_test), + predict(svm_cls_fit, data_test, type = "prob") +) +``` + +
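For the RBF kernel, kernlab estimates `sigma` from the data unless it is set explicitly. As a sketch (not part of the original article), the underlying kernlab object can be pulled from the `fit` element to see what was estimated, and parsnip's `rbf_sigma` argument can fix it instead; the value below is only illustrative.

```r
library(tidymodels)

# The engine fit is stored in the `fit` element; printing it shows the sigma
# value kernlab estimated. The rbf_sigma argument fixes it explicitly.
svm_reg_fit$fit

svm_rbf_fixed_spec <-
  svm_rbf(cost = 1, rbf_sigma = 0.01, margin = 0.1) %>%
  set_mode("regression") %>%
  set_engine("kernlab")
svm_rbf_fixed_spec
```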
+ diff --git a/vignettes/articles/template-cls-multi-class.R b/vignettes/articles/template-cls-multi-class.R new file mode 100644 index 000000000..bc3ba138c --- /dev/null +++ b/vignettes/articles/template-cls-multi-class.R @@ -0,0 +1,12 @@ +#' ### Classification Example + +#' We'll model the island of the penguins with two predictors in the same unit (mm): bill length and bill depth. + +#+ results = "hide", messages = FALSE +library(tidymodels) +tidymodels_prefer() +data(penguins) + +penguins <- penguins %>% select(island, starts_with("bill_")) +penguins_train <- penguins[-c(21, 153, 31, 277, 1), ] +penguins_test <- penguins[ c(21, 153, 31, 277, 1), ] diff --git a/vignettes/articles/template-cls-two-class.R b/vignettes/articles/template-cls-two-class.R new file mode 100644 index 000000000..01cc35d20 --- /dev/null +++ b/vignettes/articles/template-cls-two-class.R @@ -0,0 +1,11 @@ +#' ### Classification Example + +#' The example data has two predictors and an outcome with two classes. Both predictors are in the same units + +#+ results = "hide", messages = FALSE +library(tidymodels) +tidymodels_prefer() +data(two_class_dat) + +data_train <- two_class_dat[-(1:10), ] +data_test <- two_class_dat[ 1:10 , ] diff --git a/vignettes/articles/template-reg-chicago.R b/vignettes/articles/template-reg-chicago.R new file mode 100644 index 000000000..228deda3c --- /dev/null +++ b/vignettes/articles/template-reg-chicago.R @@ -0,0 +1,18 @@ +#' ### Regression Example + +#' We'll model the ridership on the Chicago elevated trains as a function of the 14 day lagged ridership at two stations. The two predictors are in the same units (rides per day/1000) and do not need to be normalized. + +#' All but the last week of data are used for training. The last week will be predicted after the model is fit. + +#+ results = "hide", messages = FALSE +library(tidymodels) +tidymodels_prefer() +data(Chicago) + +n <- nrow(Chicago) +Chicago <- Chicago %>% select(ridership, Clark_Lake, Quincy_Wells) + +Chicago_train <- Chicago[1:(n - 7), ] +Chicago_test <- Chicago[(n - 6):n, ] + + From bf9bfb3ebc2ee17b890833618d34daa5ca9ea5de Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Mon, 28 Jun 2021 11:39:05 +0100 Subject: [PATCH 02/19] fix indentation --- vignettes/articles/Examples.Rmd | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index ab40da7eb..477bb285d 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -25,7 +25,7 @@ knitr::spin_child("template-reg-chicago.R") We can define the model with specific parameters: - ```{r} +```{r} knn_reg_spec <- nearest_neighbor(neighbors = 5, weight_func = "triangular") %>% # This model can be used for classification or regression, so set mode @@ -36,14 +36,14 @@ knn_reg_spec Now we create the model fit object: - ```{r} +```{r} knn_reg_fit <- knn_reg_spec %>% fit(ridership ~ ., data = Chicago_train) knn_reg_fit ``` The holdout data can be predicted: - ```{r} +```{r} predict(knn_reg_fit, Chicago_test) ``` @@ -72,7 +72,7 @@ knn_cls_fit The holdout data can be predicted for both hard class predictions and probabilities. 
We'll bind these together into one tibble: - ```{r} +```{r} bind_cols( predict(knn_cls_fit, data_test), predict(knn_cls_fit, data_test, type = "prob") From e1952ea6422f81b11bca59154cd764abaa48563e Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Mon, 28 Jun 2021 12:07:12 +0100 Subject: [PATCH 03/19] all examples indented to get collapsing right --- vignettes/articles/Examples.Rmd | 750 ++++++++++++++++---------------- 1 file changed, 376 insertions(+), 374 deletions(-) diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index 477bb285d..6d8465bd2 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -15,505 +15,507 @@ theme_set(theme_bw()) This is a collection of minimal examples for fitting and predicting with various models and engines. -
+
-`nearest_neighbor()` with the `"kknn"` engine + `nearest_neighbor()` with the `"kknn"` engine -```{r echo=FALSE} -knitr::spin_child("template-reg-chicago.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` -We can define the model with specific parameters: + We can define the model with specific parameters: -```{r} -knn_reg_spec <- - nearest_neighbor(neighbors = 5, weight_func = "triangular") %>% - # This model can be used for classification or regression, so set mode - set_mode("regression") %>% - set_engine("kknn") -knn_reg_spec -``` + ```{r} + knn_reg_spec <- + nearest_neighbor(neighbors = 5, weight_func = "triangular") %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("kknn") + knn_reg_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -knn_reg_fit <- knn_reg_spec %>% fit(ridership ~ ., data = Chicago_train) -knn_reg_fit -``` + ```{r} + knn_reg_fit <- knn_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + knn_reg_fit + ``` -The holdout data can be predicted: + The holdout data can be predicted: -```{r} -predict(knn_reg_fit, Chicago_test) -``` + ```{r} + predict(knn_reg_fit, Chicago_test) + ``` -```{r echo=FALSE} -knitr::spin_child("template-cls-two-class.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` -Since there are two classes, we'll use an odd number of neighbors to avoid ties: + Since there are two classes, we'll use an odd number of neighbors to avoid ties: -```{r} -knn_cls_spec <- - nearest_neighbor(neighbors = 11, weight_func = "triangular") %>% - # This model can be used for classification or regression, so set mode - set_mode("classification") %>% - set_engine("kknn") -knn_cls_spec -``` + ```{r} + knn_cls_spec <- + nearest_neighbor(neighbors = 11, weight_func = "triangular") %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("kknn") + knn_cls_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -knn_cls_fit <- knn_cls_spec %>% fit(Class ~ ., data = data_train) -knn_cls_fit -``` + ```{r} + knn_cls_fit <- knn_cls_spec %>% fit(Class ~ ., data = data_train) + knn_cls_fit + ``` -The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: -```{r} -bind_cols( - predict(knn_cls_fit, data_test), - predict(knn_cls_fit, data_test, type = "prob") -) -``` + ```{r} + bind_cols( + predict(knn_cls_fit, data_test), + predict(knn_cls_fit, data_test, type = "prob") + ) + ``` -
+
-
+
-`multinom_reg()` with the `"glmnet"` engine + `multinom_reg()` with the `"glmnet"` engine -```{r echo=FALSE} -knitr::spin_child("template-cls-multi-class.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-cls-multi-class.R") + ``` -We can define the model with specific parameters: + We can define the model with specific parameters: -```{r} -mr_cls_spec <- - multinom_reg(penalty = 0.1) %>% - set_engine("glmnet") -mr_cls_spec -``` + ```{r} + mr_cls_spec <- + multinom_reg(penalty = 0.1) %>% + set_engine("glmnet") + mr_cls_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -set.seed(1) -mr_cls_fit <- mr_cls_spec %>% fit(island ~ ., data = penguins_train) -mr_cls_fit -``` + ```{r} + set.seed(1) + mr_cls_fit <- mr_cls_spec %>% fit(island ~ ., data = penguins_train) + mr_cls_fit + ``` -The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: -```{r} -bind_cols( - predict(mr_cls_fit, penguins_test), - predict(mr_cls_fit, penguins_test, type = "prob") -) -``` + ```{r} + bind_cols( + predict(mr_cls_fit, penguins_test), + predict(mr_cls_fit, penguins_test, type = "prob") + ) + ``` -
+
-
`rand_forest()` with the `"ranger"` engine +
`rand_forest()` with the `"ranger"` engine -```{r echo=FALSE} -knitr::spin_child("template-reg-chicago.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` -We can define the model with specific parameters: + We can define the model with specific parameters: -```{r} -rf_reg_spec <- - rand_forest(trees = 200, min_n = 5) %>% - # This model can be used for classification or regression, so set mode - set_mode("regression") %>% - set_engine("ranger") -rf_reg_spec -``` + ```{r} + rf_reg_spec <- + rand_forest(trees = 200, min_n = 5) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("ranger") + rf_reg_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -set.seed(1) -rf_reg_fit <- rf_reg_spec %>% fit(ridership ~ ., data = Chicago_train) -rf_reg_fit -``` + ```{r} + set.seed(1) + rf_reg_fit <- rf_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + rf_reg_fit + ``` -The holdout data can be predicted: + The holdout data can be predicted: -```{r} -predict(rf_reg_fit, Chicago_test) -``` + ```{r} + predict(rf_reg_fit, Chicago_test) + ``` -```{r echo=FALSE} -knitr::spin_child("template-cls-two-class.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` -We can define the model with specific parameters: + We can define the model with specific parameters: -```{r} -rf_cls_spec <- - rand_forest(trees = 200, min_n = 5) %>% - # This model can be used for classification or regression, so set mode - set_mode("classification") %>% - set_engine("ranger") -rf_cls_spec -``` + ```{r} + rf_cls_spec <- + rand_forest(trees = 200, min_n = 5) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("ranger") + rf_cls_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -set.seed(1) -rf_cls_fit <- rf_cls_spec %>% fit(Class ~ ., data = data_train) -rf_cls_fit -``` + ```{r} + set.seed(1) + rf_cls_fit <- rf_cls_spec %>% fit(Class ~ ., data = data_train) + rf_cls_fit + ``` -The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: -```{r} -bind_cols( - predict(rf_cls_fit, data_test), - predict(rf_cls_fit, data_test, type = "prob") -) -``` - -
+ ```{r} + bind_cols( + predict(rf_cls_fit, data_test), + predict(rf_cls_fit, data_test, type = "prob") + ) + ``` + +
-
`rand_forest()` with the `"randomForest"` engine +
`rand_forest()` with the `"randomForest"` engine -```{r echo=FALSE} -knitr::spin_child("template-reg-chicago.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` -We can define the model with specific parameters: + We can define the model with specific parameters: -```{r} -rf_reg_spec <- - rand_forest(trees = 200, min_n = 5) %>% - # This model can be used for classification or regression, so set mode - set_mode("regression") %>% - set_engine("randomForest") -rf_reg_spec -``` + ```{r} + rf_reg_spec <- + rand_forest(trees = 200, min_n = 5) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("randomForest") + rf_reg_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -set.seed(1) -rf_reg_fit <- rf_reg_spec %>% fit(ridership ~ ., data = Chicago_train) -rf_reg_fit -``` + ```{r} + set.seed(1) + rf_reg_fit <- rf_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + rf_reg_fit + ``` -The holdout data can be predicted: + The holdout data can be predicted: -```{r} -predict(rf_reg_fit, Chicago_test) -``` + ```{r} + predict(rf_reg_fit, Chicago_test) + ``` -```{r echo=FALSE} -knitr::spin_child("template-cls-two-class.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` -We can define the model with specific parameters: + We can define the model with specific parameters: -```{r} -rf_cls_spec <- - rand_forest(trees = 200, min_n = 5) %>% - # This model can be used for classification or regression, so set mode - set_mode("classification") %>% - set_engine("randomForest") -rf_cls_spec -``` + ```{r} + rf_cls_spec <- + rand_forest(trees = 200, min_n = 5) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("randomForest") + rf_cls_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -set.seed(1) -rf_cls_fit <- rf_cls_spec %>% fit(Class ~ ., data = data_train) -rf_cls_fit -``` + ```{r} + set.seed(1) + rf_cls_fit <- rf_cls_spec %>% fit(Class ~ ., data = data_train) + rf_cls_fit + ``` -The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: -```{r} -bind_cols( - predict(rf_cls_fit, data_test), - predict(rf_cls_fit, data_test, type = "prob") -) -``` - -
+ ```{r} + bind_cols( + predict(rf_cls_fit, data_test), + predict(rf_cls_fit, data_test, type = "prob") + ) + ``` + +
-
`svm_linear()` with the `"LiblineaR"` engine +
`svm_linear()` with the `"LiblineaR"` engine -```{r echo=FALSE} -knitr::spin_child("template-reg-chicago.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` -We can define the model with specific parameters: + We can define the model with specific parameters: -```{r} -svm_reg_spec <- - svm_linear(cost = 1, margin = 0.1) %>% - # This model can be used for classification or regression, so set mode - set_mode("regression") %>% - set_engine("LiblineaR") -svm_reg_spec -``` + ```{r} + svm_reg_spec <- + svm_linear(cost = 1, margin = 0.1) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("LiblineaR") + svm_reg_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -set.seed(1) -svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) -svm_reg_fit -``` + ```{r} + set.seed(1) + svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + svm_reg_fit + ``` -The holdout data can be predicted: + The holdout data can be predicted: -```{r} -predict(svm_reg_fit, Chicago_test) -``` + ```{r} + predict(svm_reg_fit, Chicago_test) + ``` -```{r echo=FALSE} -knitr::spin_child("template-cls-two-class.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` -We can define the model with specific parameters: + We can define the model with specific parameters: -```{r} -svm_cls_spec <- - svm_linear(cost = 1) %>% - # This model can be used for classification or regression, so set mode - set_mode("classification") %>% - set_engine("LiblineaR") -svm_cls_spec -``` + ```{r} + svm_cls_spec <- + svm_linear(cost = 1) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("LiblineaR") + svm_cls_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -set.seed(1) -svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) -svm_cls_fit -``` + ```{r} + set.seed(1) + svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) + svm_cls_fit + ``` -The holdout data can be predicted for hard class predictions. + The holdout data can be predicted for hard class predictions. -```{r} -predict(svm_cls_fit, data_test) -``` + ```{r} + predict(svm_cls_fit, data_test) + ``` -
+
-
`svm_linear()` with the `"kernlab"` engine +
`svm_linear()` with the `"kernlab"` engine -```{r echo=FALSE} -knitr::spin_child("template-reg-chicago.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` -We can define the model with specific parameters: + We can define the model with specific parameters: -```{r} -svm_reg_spec <- - svm_linear(cost = 1, margin = 0.1) %>% - # This model can be used for classification or regression, so set mode - set_mode("regression") %>% - set_engine("kernlab") -svm_reg_spec -``` + ```{r} + svm_reg_spec <- + svm_linear(cost = 1, margin = 0.1) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("kernlab") + svm_reg_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -set.seed(1) -svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) -svm_reg_fit -``` + ```{r} + set.seed(1) + svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + svm_reg_fit + ``` -The holdout data can be predicted: + The holdout data can be predicted: -```{r} -predict(svm_reg_fit, Chicago_test) -``` + ```{r} + predict(svm_reg_fit, Chicago_test) + ``` -```{r echo=FALSE} -knitr::spin_child("template-cls-two-class.R") -``` + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` -We can define the model with specific parameters: + We can define the model with specific parameters: -```{r} -svm_cls_spec <- - svm_linear(cost = 1) %>% - # This model can be used for classification or regression, so set mode - set_mode("classification") %>% - set_engine("kernlab") -svm_cls_spec -``` + ```{r} + svm_cls_spec <- + svm_linear(cost = 1) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("kernlab") + svm_cls_spec + ``` -Now we create the model fit object: - -```{r} -set.seed(1) -svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) -svm_cls_fit -``` - -The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: - -```{r} -bind_cols( - predict(svm_cls_fit, data_test), - predict(svm_cls_fit, data_test, type = "prob") -) -``` - -
+ Now we create the model fit object: + ```{r} + set.seed(1) + svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) + svm_cls_fit + ``` -
`svm_poly()` with the `"kernlab"` engine - -```{r echo=FALSE} -knitr::spin_child("template-reg-chicago.R") -``` + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: -We can define the model with specific parameters: + ```{r} + bind_cols( + predict(svm_cls_fit, data_test), + predict(svm_cls_fit, data_test, type = "prob") + ) + ``` + +
-```{r} -svm_reg_spec <- - svm_poly(cost = 1, margin = 0.1) %>% - # This model can be used for classification or regression, so set mode - set_mode("regression") %>% - set_engine("kernlab") -svm_reg_spec -``` -Now we create the model fit object: +
`svm_poly()` with the `"kernlab"` engine -```{r} -set.seed(1) -svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) -svm_reg_fit -``` + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` -The holdout data can be predicted: + We can define the model with specific parameters: -```{r} -predict(svm_reg_fit, Chicago_test) -``` + ```{r} + svm_reg_spec <- + svm_poly(cost = 1, margin = 0.1) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("kernlab") + svm_reg_spec + ``` + Now we create the model fit object: -```{r echo=FALSE} -knitr::spin_child("template-cls-two-class.R") -``` + ```{r} + set.seed(1) + svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + svm_reg_fit + ``` -We can define the model with specific parameters: + The holdout data can be predicted: -```{r} -svm_cls_spec <- - svm_poly(cost = 1) %>% - # This model can be used for classification or regression, so set mode - set_mode("classification") %>% - set_engine("kernlab") -svm_cls_spec -``` + ```{r} + predict(svm_reg_fit, Chicago_test) + ``` -Now we create the model fit object: -```{r} -set.seed(1) -svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) -svm_cls_fit -``` + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` -The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + We can define the model with specific parameters: -```{r} -bind_cols( - predict(svm_cls_fit, data_test), - predict(svm_cls_fit, data_test, type = "prob") -) -``` - -
+ ```{r} + svm_cls_spec <- + svm_poly(cost = 1) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("kernlab") + svm_cls_spec + ``` + Now we create the model fit object: -
`svm_rbf()` with the `"kernlab"` engine + ```{r} + set.seed(1) + svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) + svm_cls_fit + ``` -```{r echo=FALSE} -knitr::spin_child("template-reg-chicago.R") -``` + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: -We can define the model with specific parameters: + ```{r} + bind_cols( + predict(svm_cls_fit, data_test), + predict(svm_cls_fit, data_test, type = "prob") + ) + ``` + +
+ + +
-```{r} -svm_reg_spec <- - svm_rbf(cost = 1, margin = 0.1) %>% - # This model can be used for classification or regression, so set mode - set_mode("regression") %>% - set_engine("kernlab") -svm_reg_spec -``` + `svm_rbf()` with the `"kernlab"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` + + We can define the model with specific parameters: -Now we create the model fit object: + ```{r} + svm_reg_spec <- + svm_rbf(cost = 1, margin = 0.1) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("kernlab") + svm_reg_spec + ``` -```{r} -set.seed(1) -svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) -svm_reg_fit -``` + Now we create the model fit object: -The holdout data can be predicted: + ```{r} + set.seed(1) + svm_reg_fit <- svm_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + svm_reg_fit + ``` -```{r} -predict(svm_reg_fit, Chicago_test) -``` + The holdout data can be predicted: + ```{r} + predict(svm_reg_fit, Chicago_test) + ``` -```{r echo=FALSE} -knitr::spin_child("template-cls-two-class.R") -``` -We can define the model with specific parameters: + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` -```{r} -svm_cls_spec <- - svm_rbf(cost = 1) %>% - # This model can be used for classification or regression, so set mode - set_mode("classification") %>% - set_engine("kernlab") -svm_cls_spec -``` + We can define the model with specific parameters: + + ```{r} + svm_cls_spec <- + svm_rbf(cost = 1) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("kernlab") + svm_cls_spec + ``` -Now we create the model fit object: + Now we create the model fit object: -```{r} -set.seed(1) -svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) -svm_cls_fit -``` + ```{r} + set.seed(1) + svm_cls_fit <- svm_cls_spec %>% fit(Class ~ ., data = data_train) + svm_cls_fit + ``` -The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: -```{r} -bind_cols( - predict(svm_cls_fit, data_test), - predict(svm_cls_fit, data_test, type = "prob") -) -``` - -
+ ```{r} + bind_cols( + predict(svm_cls_fit, data_test), + predict(svm_cls_fit, data_test, type = "prob") + ) + ``` + +
From e80be20d85c6ed5c83b52c60ee5861aaf2737f59 Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Mon, 28 Jun 2021 12:45:32 +0100 Subject: [PATCH 04/19] make headers work within the collapsable section --- vignettes/articles/template-cls-multi-class.R | 2 +- vignettes/articles/template-cls-two-class.R | 2 +- vignettes/articles/template-reg-chicago.R | 2 +- 3 files changed, 3 insertions(+), 3 deletions(-) diff --git a/vignettes/articles/template-cls-multi-class.R b/vignettes/articles/template-cls-multi-class.R index bc3ba138c..fe4aa865b 100644 --- a/vignettes/articles/template-cls-multi-class.R +++ b/vignettes/articles/template-cls-multi-class.R @@ -1,4 +1,4 @@ -#' ### Classification Example +#'

Classification Example

#' We'll model the island of the penguins with two predictors in the same unit (mm): bill length and bill depth. diff --git a/vignettes/articles/template-cls-two-class.R b/vignettes/articles/template-cls-two-class.R index 01cc35d20..4c0938426 100644 --- a/vignettes/articles/template-cls-two-class.R +++ b/vignettes/articles/template-cls-two-class.R @@ -1,4 +1,4 @@ -#' ### Classification Example +#'

Classification Example

#' The example data has two predictors and an outcome with two classes. Both predictors are in the same units diff --git a/vignettes/articles/template-reg-chicago.R b/vignettes/articles/template-reg-chicago.R index 228deda3c..321b11aed 100644 --- a/vignettes/articles/template-reg-chicago.R +++ b/vignettes/articles/template-reg-chicago.R @@ -1,4 +1,4 @@ -#' ### Regression Example +#'

Regression Example

#' We'll model the ridership on the Chicago elevated trains as a function of the 14 day lagged ridership at two stations. The two predictors are in the same units (rides per day/1000) and do not need to be normalized. From bc4405b636d3ececa83219401ad758afdfe8614e Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Wed, 30 Jun 2021 18:39:28 +0100 Subject: [PATCH 05/19] add id tags so that we can link to specific sections --- vignettes/articles/Examples.Rmd | 26 ++++++++++++++++++-------- 1 file changed, 18 insertions(+), 8 deletions(-) diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index 6d8465bd2..7aa32805d 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -15,7 +15,7 @@ theme_set(theme_bw()) This is a collection of minimal examples for fitting and predicting with various models and engines. -
+
`nearest_neighbor()` with the `"kknn"` engine @@ -82,7 +82,7 @@ This is a collection of minimal examples for fitting and predicting with various
-
+
`multinom_reg()` with the `"glmnet"` engine @@ -119,7 +119,9 @@ This is a collection of minimal examples for fitting and predicting with various
-
`rand_forest()` with the `"ranger"` engine +
+ + `rand_forest()` with the `"ranger"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -186,7 +188,9 @@ This is a collection of minimal examples for fitting and predicting with various
-
`rand_forest()` with the `"randomForest"` engine +
+ + `rand_forest()` with the `"randomForest"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -253,7 +257,9 @@ This is a collection of minimal examples for fitting and predicting with various
-
`svm_linear()` with the `"LiblineaR"` engine +
+ + `svm_linear()` with the `"LiblineaR"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -317,7 +323,9 @@ This is a collection of minimal examples for fitting and predicting with various
-
`svm_linear()` with the `"kernlab"` engine +
+ + `svm_linear()` with the `"kernlab"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -384,7 +392,9 @@ This is a collection of minimal examples for fitting and predicting with various
-
`svm_poly()` with the `"kernlab"` engine +
+ + `svm_poly()` with the `"kernlab"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -451,7 +461,7 @@ This is a collection of minimal examples for fitting and predicting with various
-
+
`svm_rbf()` with the `"kernlab"` engine From c30965584b28d9010b48ab9a49c60713ad1d8a4f Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Wed, 30 Jun 2021 18:40:42 +0100 Subject: [PATCH 06/19] (failed) attempt to use the tag to link from help page to the article --- man/rmd/nearest_neighbor_kknn.Rmd | 4 ++++ 1 file changed, 4 insertions(+) diff --git a/man/rmd/nearest_neighbor_kknn.Rmd b/man/rmd/nearest_neighbor_kknn.Rmd index 4c9927966..f96082dd4 100644 --- a/man/rmd/nearest_neighbor_kknn.Rmd +++ b/man/rmd/nearest_neighbor_kknn.Rmd @@ -69,6 +69,10 @@ nearest_neighbor( ```{r child = "template-same-scale.Rmd"} ``` +## Minimal examples + +The Minimal Examples for Fitting and Predicting article contains examples for [nearest_neighbor() with the "kknn" engine](#nearest-neighbor-kknn). + ## References - Hechenbichler K. and Schliep K.P. (2004) [Weighted k-Nearest-Neighbor Techniques and Ordinal Classification](https://epub.ub.uni-muenchen.de/1769/), Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich From 2ced7911e4140268c5ebbd415460a5d2cdcf8c16 Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Thu, 1 Jul 2021 11:29:11 +0100 Subject: [PATCH 07/19] change order --- vignettes/articles/Examples.Rmd | 75 +++++++++++++++++---------------- 1 file changed, 38 insertions(+), 37 deletions(-) diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index 7aa32805d..4e45e1abf 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -15,6 +15,44 @@ theme_set(theme_bw()) This is a collection of minimal examples for fitting and predicting with various models and engines. + +
+ + `multinom_reg()` with the `"glmnet"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-cls-multi-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + mr_cls_spec <- + multinom_reg(penalty = 0.1) %>% + set_engine("glmnet") + mr_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + mr_cls_fit <- mr_cls_spec %>% fit(island ~ ., data = penguins_train) + mr_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(mr_cls_fit, penguins_test), + predict(mr_cls_fit, penguins_test, type = "prob") + ) + ``` + +
+ +
`nearest_neighbor()` with the `"kknn"` engine @@ -82,43 +120,6 @@ This is a collection of minimal examples for fitting and predicting with various
-
- - `multinom_reg()` with the `"glmnet"` engine - - ```{r echo=FALSE} - knitr::spin_child("template-cls-multi-class.R") - ``` - - We can define the model with specific parameters: - - ```{r} - mr_cls_spec <- - multinom_reg(penalty = 0.1) %>% - set_engine("glmnet") - mr_cls_spec - ``` - - Now we create the model fit object: - - ```{r} - set.seed(1) - mr_cls_fit <- mr_cls_spec %>% fit(island ~ ., data = penguins_train) - mr_cls_fit - ``` - - The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: - - ```{r} - bind_cols( - predict(mr_cls_fit, penguins_test), - predict(mr_cls_fit, penguins_test, type = "prob") - ) - ``` - -
- -
`rand_forest()` with the `"ranger"` engine From 6b50ff8ee85d9db4216ed636e8f38a9cce44bdee Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Thu, 1 Jul 2021 11:54:12 +0100 Subject: [PATCH 08/19] added links to pkgdown site in the help pages --- man/details_multinom_reg_glmnet.Rd | 7 +++++++ man/details_nearest_neighbor_kknn.Rd | 7 +++++++ man/details_rand_forest_randomForest.Rd | 7 +++++++ man/details_rand_forest_ranger.Rd | 7 +++++++ man/details_svm_linear_LiblineaR.Rd | 7 +++++++ man/details_svm_linear_kernlab.Rd | 7 +++++++ man/details_svm_poly_kernlab.Rd | 7 +++++++ man/details_svm_rbf_kernlab.Rd | 7 +++++++ man/rmd/multinom_reg_glmnet.Rmd | 4 ++++ man/rmd/nearest_neighbor_kknn.Rmd | 2 +- man/rmd/rand_forest_randomForest.Rmd | 4 ++++ man/rmd/rand_forest_ranger.Rmd | 4 ++++ man/rmd/svm_linear_LiblineaR.Rmd | 4 ++++ man/rmd/svm_linear_kernlab.Rmd | 4 ++++ man/rmd/svm_poly_kernlab.Rmd | 4 ++++ man/rmd/svm_rbf_kernlab.Rmd | 4 ++++ 16 files changed, 85 insertions(+), 1 deletion(-) diff --git a/man/details_multinom_reg_glmnet.Rd b/man/details_multinom_reg_glmnet.Rd index b64927e87..2cbda21ba 100644 --- a/man/details_multinom_reg_glmnet.Rd +++ b/man/details_multinom_reg_glmnet.Rd @@ -57,6 +57,13 @@ variance of one. By default, \code{\link[glmnet:glmnet]{glmnet::glmnet()}} uses the argument \code{standardize = TRUE} to center and scale the data. } +\subsection{Minimal examples}{ + +The Minimal Examples for Fitting and Predicting article contains +\href{https://parsnip.tidymodels.org/articles/articles/Examples.html#multinom-reg-glmnet}{examples} +for \code{multinom_reg()} with the \code{"glmnet"} engine. +} + \subsection{References}{ \itemize{ \item Hastie, T, R Tibshirani, and M Wainwright. 2015. \emph{Statistical diff --git a/man/details_nearest_neighbor_kknn.Rd b/man/details_nearest_neighbor_kknn.Rd index fa34905e1..3dee20760 100644 --- a/man/details_nearest_neighbor_kknn.Rd +++ b/man/details_nearest_neighbor_kknn.Rd @@ -82,6 +82,13 @@ center and scale each so that each predictor has mean zero and a variance of one. } +\subsection{Minimal examples}{ + +The Minimal Examples for Fitting and Predicting article contains +\href{https://parsnip.tidymodels.org/articles/articles/Examples.html#nearest-neighbor-kknn}{examples} +for \code{nearest_neighbor()} with the \code{"kknn"} engine. +} + \subsection{References}{ \itemize{ \item Hechenbichler K. and Schliep K.P. (2004) \href{https://epub.ub.uni-muenchen.de/1769/}{Weighted k-Nearest-Neighbor Techniques and Ordinal Classification}, Discussion diff --git a/man/details_rand_forest_randomForest.Rd b/man/details_rand_forest_randomForest.Rd index 09dea5f2f..2c3f01031 100644 --- a/man/details_rand_forest_randomForest.Rd +++ b/man/details_rand_forest_randomForest.Rd @@ -88,6 +88,13 @@ Categorical predictors can be partitioned into groups of factor levels are not required for this model. } +\subsection{Minimal examples}{ + +The Minimal Examples for Fitting and Predicting article contains +\href{https://parsnip.tidymodels.org/articles/articles/Examples.html#rand-forest-randomForest}{examples} +for \code{rand_forest()} with the \code{"randomForest"} engine. +} + \subsection{References}{ \itemize{ \item Kuhn, M, and K Johnson. 2013. \emph{Applied Predictive Modeling}. 
diff --git a/man/details_rand_forest_ranger.Rd b/man/details_rand_forest_ranger.Rd index 4910b9965..75c574c9e 100644 --- a/man/details_rand_forest_ranger.Rd +++ b/man/details_rand_forest_ranger.Rd @@ -105,6 +105,13 @@ these values can fall outside of \verb{[0, 1]} and will be coerced to be in this range. } +\subsection{Minimal examples}{ + +The Minimal Examples for Fitting and Predicting article contains +\href{https://parsnip.tidymodels.org/articles/articles/Examples.html#rand-forest-ranger}{examples} +for \code{rand_forest()} with the \code{"ranger"} engine. +} + \subsection{References}{ \itemize{ \item Kuhn, M, and K Johnson. 2013. \emph{Applied Predictive Modeling}. diff --git a/man/details_svm_linear_LiblineaR.Rd b/man/details_svm_linear_LiblineaR.Rd index 6491729f8..97bad9ffb 100644 --- a/man/details_svm_linear_LiblineaR.Rd +++ b/man/details_svm_linear_LiblineaR.Rd @@ -85,6 +85,13 @@ center and scale each so that each predictor has mean zero and a variance of one. } +\subsection{Minimal examples}{ + +The Minimal Examples for Fitting and Predicting article contains +\href{https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-linear-LiblineaR}{examples} +for \code{svm_linear()} with the \code{"LiblineaR"} engine. +} + \subsection{References}{ \itemize{ \item Kuhn, M, and K Johnson. 2013. \emph{Applied Predictive Modeling}. diff --git a/man/details_svm_linear_kernlab.Rd b/man/details_svm_linear_kernlab.Rd index fede1ccff..495548014 100644 --- a/man/details_svm_linear_kernlab.Rd +++ b/man/details_svm_linear_kernlab.Rd @@ -82,6 +82,13 @@ center and scale each so that each predictor has mean zero and a variance of one. } +\subsection{Minimal examples}{ + +The Minimal Examples for Fitting and Predicting article contains +\href{https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-linear-kernlab}{examples} +for \code{svm_linear()} with the \code{"kernlab"} engine. +} + \subsection{References}{ \itemize{ \item Lin, HT, and R Weng. \href{https://www.csie.ntu.edu.tw/~cjlin/papers/plattprob.pdf}{“A Note on Platt’s Probabilistic Outputs for Support Vector Machines”} diff --git a/man/details_svm_poly_kernlab.Rd b/man/details_svm_poly_kernlab.Rd index 996e74a08..f424aa9f9 100644 --- a/man/details_svm_poly_kernlab.Rd +++ b/man/details_svm_poly_kernlab.Rd @@ -94,6 +94,13 @@ center and scale each so that each predictor has mean zero and a variance of one. } +\subsection{Minimal examples}{ + +The Minimal Examples for Fitting and Predicting article contains +\href{https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-poly-kernlab}{examples} +for \code{svm_poly()} with the \code{"kernlab"} engine. +} + \subsection{References}{ \itemize{ \item Lin, HT, and R Weng. \href{https://www.csie.ntu.edu.tw/~cjlin/papers/plattprob.pdf}{“A Note on Platt’s Probabilistic Outputs for Support Vector Machines”} diff --git a/man/details_svm_rbf_kernlab.Rd b/man/details_svm_rbf_kernlab.Rd index 8417a7d9b..6dc072841 100644 --- a/man/details_svm_rbf_kernlab.Rd +++ b/man/details_svm_rbf_kernlab.Rd @@ -94,6 +94,13 @@ center and scale each so that each predictor has mean zero and a variance of one. } +\subsection{Minimal examples}{ + +The Minimal Examples for Fitting and Predicting article contains +\href{https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-rbf-kernlab}{examples} +for \code{svm_rbf()} with the \code{"kernlab"} engine. +} + \subsection{References}{ \itemize{ \item Lin, HT, and R Weng. 
\href{https://www.csie.ntu.edu.tw/~cjlin/papers/plattprob.pdf}{“A Note on Platt’s Probabilistic Outputs for Support Vector Machines”} diff --git a/man/rmd/multinom_reg_glmnet.Rmd b/man/rmd/multinom_reg_glmnet.Rmd index 339983ad9..39b68a4de 100644 --- a/man/rmd/multinom_reg_glmnet.Rmd +++ b/man/rmd/multinom_reg_glmnet.Rmd @@ -54,6 +54,10 @@ multinom_reg(penalty = double(1), mixture = double(1)) %>% ``` By default, [glmnet::glmnet()] uses the argument `standardize = TRUE` to center and scale the data. +## Minimal examples + +The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#multinom-reg-glmnet) for `multinom_reg()` with the `"glmnet"` engine. + ## References - Hastie, T, R Tibshirani, and M Wainwright. 2015. _Statistical Learning with Sparsity_. CRC Press. diff --git a/man/rmd/nearest_neighbor_kknn.Rmd b/man/rmd/nearest_neighbor_kknn.Rmd index f96082dd4..9c6708de8 100644 --- a/man/rmd/nearest_neighbor_kknn.Rmd +++ b/man/rmd/nearest_neighbor_kknn.Rmd @@ -71,7 +71,7 @@ nearest_neighbor( ## Minimal examples -The Minimal Examples for Fitting and Predicting article contains examples for [nearest_neighbor() with the "kknn" engine](#nearest-neighbor-kknn). +The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#nearest-neighbor-kknn) for `nearest_neighbor()` with the `"kknn"` engine. ## References diff --git a/man/rmd/rand_forest_randomForest.Rmd b/man/rmd/rand_forest_randomForest.Rmd index fe0f9ef8c..754df4653 100644 --- a/man/rmd/rand_forest_randomForest.Rmd +++ b/man/rmd/rand_forest_randomForest.Rmd @@ -70,6 +70,10 @@ rand_forest( ```{r child = "template-tree-split-factors.Rmd"} ``` +## Minimal examples + +The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#rand-forest-randomForest) for `rand_forest()` with the `"randomForest"` engine. + ## References - Kuhn, M, and K Johnson. 2013. _Applied Predictive Modeling_. Springer. diff --git a/man/rmd/rand_forest_ranger.Rmd b/man/rmd/rand_forest_ranger.Rmd index f8ae3f03c..c698ac8e9 100644 --- a/man/rmd/rand_forest_ranger.Rmd +++ b/man/rmd/rand_forest_ranger.Rmd @@ -78,6 +78,10 @@ By default, parallel processing is turned off. When tuning, it is more efficient For `ranger` confidence intervals, the intervals are constructed using the form `estimate +/- z * std_error`. For classification probabilities, these values can fall outside of `[0, 1]` and will be coerced to be in this range. +## Minimal examples + +The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#rand-forest-ranger) for `rand_forest()` with the `"ranger"` engine. + ## References - Kuhn, M, and K Johnson. 2013. _Applied Predictive Modeling_. Springer. diff --git a/man/rmd/svm_linear_LiblineaR.Rmd b/man/rmd/svm_linear_LiblineaR.Rmd index f60c4b159..babf8096c 100644 --- a/man/rmd/svm_linear_LiblineaR.Rmd +++ b/man/rmd/svm_linear_LiblineaR.Rmd @@ -70,6 +70,10 @@ Note that the `LiblineaR` engine does not produce class probabilities. When opti ```{r child = "template-same-scale.Rmd"} ``` +## Minimal examples + +The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-linear-LiblineaR) for `svm_linear()` with the `"LiblineaR"` engine. 
+ ## References - Kuhn, M, and K Johnson. 2013. _Applied Predictive Modeling_. Springer. diff --git a/man/rmd/svm_linear_kernlab.Rmd b/man/rmd/svm_linear_kernlab.Rmd index 80fd9a8fb..faffd12df 100644 --- a/man/rmd/svm_linear_kernlab.Rmd +++ b/man/rmd/svm_linear_kernlab.Rmd @@ -68,6 +68,10 @@ Note that the `"kernlab"` engine does not naturally estimate class probabilities ```{r child = "template-same-scale.Rmd"} ``` +## Minimal examples + +The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-linear-kernlab) for `svm_linear()` with the `"kernlab"` engine. + ## References - Lin, HT, and R Weng. ["A Note on Platt’s Probabilistic Outputs for Support Vector Machines"](https://www.csie.ntu.edu.tw/~cjlin/papers/plattprob.pdf) diff --git a/man/rmd/svm_poly_kernlab.Rmd b/man/rmd/svm_poly_kernlab.Rmd index b543c43f5..52ade1196 100644 --- a/man/rmd/svm_poly_kernlab.Rmd +++ b/man/rmd/svm_poly_kernlab.Rmd @@ -72,6 +72,10 @@ Note that the `"kernlab"` engine does not naturally estimate class probabilities ```{r child = "template-same-scale.Rmd"} ``` +## Minimal examples + +The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-poly-kernlab) for `svm_poly()` with the `"kernlab"` engine. + ## References - Lin, HT, and R Weng. ["A Note on Platt’s Probabilistic Outputs for Support Vector Machines"](https://www.csie.ntu.edu.tw/~cjlin/papers/plattprob.pdf) diff --git a/man/rmd/svm_rbf_kernlab.Rmd b/man/rmd/svm_rbf_kernlab.Rmd index a01a4f710..d53d7a9d3 100644 --- a/man/rmd/svm_rbf_kernlab.Rmd +++ b/man/rmd/svm_rbf_kernlab.Rmd @@ -72,6 +72,10 @@ Note that the `"kernlab"` engine does not naturally estimate class probabilities ```{r child = "template-same-scale.Rmd"} ``` +## Minimal examples + +The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-rbf-kernlab) for `svm_rbf()` with the `"kernlab"` engine. + ## References - Lin, HT, and R Weng. 
["A Note on Platt’s Probabilistic Outputs for Support Vector Machines"](https://www.csie.ntu.edu.tw/~cjlin/papers/plattprob.pdf) From 937908f0077de64b64087c78c4028c9ce2a1c48d Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Thu, 8 Jul 2021 14:18:53 +0100 Subject: [PATCH 09/19] let GA build the site and deploy --- _pkgdown.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/_pkgdown.yml b/_pkgdown.yml index 17f82cfd4..1c200b0d8 100644 --- a/_pkgdown.yml +++ b/_pkgdown.yml @@ -10,7 +10,7 @@ template: # https://github.com/tidyverse/tidytemplate for css development: - mode: auto + mode: release figures: From 1d3bf190550bd58da24b2b42203f8365902f6048 Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Thu, 8 Jul 2021 17:54:48 +0100 Subject: [PATCH 10/19] don't call it minimal because it's still quite a bit to take in --- _pkgdown.yml | 2 +- man/details_multinom_reg_glmnet.Rd | 4 ++-- man/details_nearest_neighbor_kknn.Rd | 4 ++-- man/details_rand_forest_randomForest.Rd | 4 ++-- man/details_rand_forest_ranger.Rd | 4 ++-- man/details_svm_linear_LiblineaR.Rd | 4 ++-- man/details_svm_linear_kernlab.Rd | 4 ++-- man/details_svm_poly_kernlab.Rd | 4 ++-- man/details_svm_rbf_kernlab.Rd | 4 ++-- man/rmd/multinom_reg_glmnet.Rmd | 4 ++-- man/rmd/nearest_neighbor_kknn.Rmd | 4 ++-- man/rmd/rand_forest_randomForest.Rmd | 4 ++-- man/rmd/rand_forest_ranger.Rmd | 4 ++-- man/rmd/svm_linear_LiblineaR.Rmd | 4 ++-- man/rmd/svm_linear_kernlab.Rmd | 4 ++-- man/rmd/svm_poly_kernlab.Rmd | 4 ++-- man/rmd/svm_rbf_kernlab.Rmd | 4 ++-- vignettes/articles/Examples.Rmd | 6 +++--- 18 files changed, 36 insertions(+), 36 deletions(-) diff --git a/_pkgdown.yml b/_pkgdown.yml index 1c200b0d8..389881c49 100644 --- a/_pkgdown.yml +++ b/_pkgdown.yml @@ -90,7 +90,7 @@ navbar: href: https://www.tidymodels.org/learn/develop/models/ - text: Evaluating submodels with the same model object href: articles/articles/Submodels.html - - text: Minimal Examples for Fitting and Predicting + - text: Fitting and Predicting with parsnip href: articles/articles/Examples.html - text: News href: news/index.html diff --git a/man/details_multinom_reg_glmnet.Rd b/man/details_multinom_reg_glmnet.Rd index 2cbda21ba..31aec956c 100644 --- a/man/details_multinom_reg_glmnet.Rd +++ b/man/details_multinom_reg_glmnet.Rd @@ -57,9 +57,9 @@ variance of one. By default, \code{\link[glmnet:glmnet]{glmnet::glmnet()}} uses the argument \code{standardize = TRUE} to center and scale the data. } -\subsection{Minimal examples}{ +\subsection{Examples}{ -The Minimal Examples for Fitting and Predicting article contains +The “Fitting and Predicting with parsnip” article contains \href{https://parsnip.tidymodels.org/articles/articles/Examples.html#multinom-reg-glmnet}{examples} for \code{multinom_reg()} with the \code{"glmnet"} engine. } diff --git a/man/details_nearest_neighbor_kknn.Rd b/man/details_nearest_neighbor_kknn.Rd index 3dee20760..83190ac1d 100644 --- a/man/details_nearest_neighbor_kknn.Rd +++ b/man/details_nearest_neighbor_kknn.Rd @@ -82,9 +82,9 @@ center and scale each so that each predictor has mean zero and a variance of one. } -\subsection{Minimal examples}{ +\subsection{Examples}{ -The Minimal Examples for Fitting and Predicting article contains +The “Fitting and Predicting with parsnip” article contains \href{https://parsnip.tidymodels.org/articles/articles/Examples.html#nearest-neighbor-kknn}{examples} for \code{nearest_neighbor()} with the \code{"kknn"} engine. 
} diff --git a/man/details_rand_forest_randomForest.Rd b/man/details_rand_forest_randomForest.Rd index 2c3f01031..5a6fd3592 100644 --- a/man/details_rand_forest_randomForest.Rd +++ b/man/details_rand_forest_randomForest.Rd @@ -88,9 +88,9 @@ Categorical predictors can be partitioned into groups of factor levels are not required for this model. } -\subsection{Minimal examples}{ +\subsection{Examples}{ -The Minimal Examples for Fitting and Predicting article contains +The “Fitting and Predicting with parsnip” article contains \href{https://parsnip.tidymodels.org/articles/articles/Examples.html#rand-forest-randomForest}{examples} for \code{rand_forest()} with the \code{"randomForest"} engine. } diff --git a/man/details_rand_forest_ranger.Rd b/man/details_rand_forest_ranger.Rd index 75c574c9e..942892007 100644 --- a/man/details_rand_forest_ranger.Rd +++ b/man/details_rand_forest_ranger.Rd @@ -105,9 +105,9 @@ these values can fall outside of \verb{[0, 1]} and will be coerced to be in this range. } -\subsection{Minimal examples}{ +\subsection{Examples}{ -The Minimal Examples for Fitting and Predicting article contains +The “Fitting and Predicting with parsnip” article contains \href{https://parsnip.tidymodels.org/articles/articles/Examples.html#rand-forest-ranger}{examples} for \code{rand_forest()} with the \code{"ranger"} engine. } diff --git a/man/details_svm_linear_LiblineaR.Rd b/man/details_svm_linear_LiblineaR.Rd index 97bad9ffb..bbc377764 100644 --- a/man/details_svm_linear_LiblineaR.Rd +++ b/man/details_svm_linear_LiblineaR.Rd @@ -85,9 +85,9 @@ center and scale each so that each predictor has mean zero and a variance of one. } -\subsection{Minimal examples}{ +\subsection{Examples}{ -The Minimal Examples for Fitting and Predicting article contains +The “Fitting and Predicting with parsnip” article contains \href{https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-linear-LiblineaR}{examples} for \code{svm_linear()} with the \code{"LiblineaR"} engine. } diff --git a/man/details_svm_linear_kernlab.Rd b/man/details_svm_linear_kernlab.Rd index 495548014..94c6e198b 100644 --- a/man/details_svm_linear_kernlab.Rd +++ b/man/details_svm_linear_kernlab.Rd @@ -82,9 +82,9 @@ center and scale each so that each predictor has mean zero and a variance of one. } -\subsection{Minimal examples}{ +\subsection{Examples}{ -The Minimal Examples for Fitting and Predicting article contains +The “Fitting and Predicting with parsnip” article contains \href{https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-linear-kernlab}{examples} for \code{svm_linear()} with the \code{"kernlab"} engine. } diff --git a/man/details_svm_poly_kernlab.Rd b/man/details_svm_poly_kernlab.Rd index f424aa9f9..0108f8ff4 100644 --- a/man/details_svm_poly_kernlab.Rd +++ b/man/details_svm_poly_kernlab.Rd @@ -94,9 +94,9 @@ center and scale each so that each predictor has mean zero and a variance of one. } -\subsection{Minimal examples}{ +\subsection{Examples}{ -The Minimal Examples for Fitting and Predicting article contains +The “Fitting and Predicting with parsnip” article contains \href{https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-poly-kernlab}{examples} for \code{svm_poly()} with the \code{"kernlab"} engine. } diff --git a/man/details_svm_rbf_kernlab.Rd b/man/details_svm_rbf_kernlab.Rd index 6dc072841..87aa5e5e4 100644 --- a/man/details_svm_rbf_kernlab.Rd +++ b/man/details_svm_rbf_kernlab.Rd @@ -94,9 +94,9 @@ center and scale each so that each predictor has mean zero and a variance of one. 
} -\subsection{Minimal examples}{ +\subsection{Examples}{ -The Minimal Examples for Fitting and Predicting article contains +The “Fitting and Predicting with parsnip” article contains \href{https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-rbf-kernlab}{examples} for \code{svm_rbf()} with the \code{"kernlab"} engine. } diff --git a/man/rmd/multinom_reg_glmnet.Rmd b/man/rmd/multinom_reg_glmnet.Rmd index 39b68a4de..998121d07 100644 --- a/man/rmd/multinom_reg_glmnet.Rmd +++ b/man/rmd/multinom_reg_glmnet.Rmd @@ -54,9 +54,9 @@ multinom_reg(penalty = double(1), mixture = double(1)) %>% ``` By default, [glmnet::glmnet()] uses the argument `standardize = TRUE` to center and scale the data. -## Minimal examples +## Examples -The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#multinom-reg-glmnet) for `multinom_reg()` with the `"glmnet"` engine. +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#multinom-reg-glmnet) for `multinom_reg()` with the `"glmnet"` engine. ## References diff --git a/man/rmd/nearest_neighbor_kknn.Rmd b/man/rmd/nearest_neighbor_kknn.Rmd index 9c6708de8..838c92474 100644 --- a/man/rmd/nearest_neighbor_kknn.Rmd +++ b/man/rmd/nearest_neighbor_kknn.Rmd @@ -69,9 +69,9 @@ nearest_neighbor( ```{r child = "template-same-scale.Rmd"} ``` -## Minimal examples +## Examples -The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#nearest-neighbor-kknn) for `nearest_neighbor()` with the `"kknn"` engine. +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#nearest-neighbor-kknn) for `nearest_neighbor()` with the `"kknn"` engine. ## References diff --git a/man/rmd/rand_forest_randomForest.Rmd b/man/rmd/rand_forest_randomForest.Rmd index 754df4653..676395004 100644 --- a/man/rmd/rand_forest_randomForest.Rmd +++ b/man/rmd/rand_forest_randomForest.Rmd @@ -70,9 +70,9 @@ rand_forest( ```{r child = "template-tree-split-factors.Rmd"} ``` -## Minimal examples +## Examples -The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#rand-forest-randomForest) for `rand_forest()` with the `"randomForest"` engine. +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#rand-forest-randomForest) for `rand_forest()` with the `"randomForest"` engine. ## References diff --git a/man/rmd/rand_forest_ranger.Rmd b/man/rmd/rand_forest_ranger.Rmd index c698ac8e9..bea7a4835 100644 --- a/man/rmd/rand_forest_ranger.Rmd +++ b/man/rmd/rand_forest_ranger.Rmd @@ -78,9 +78,9 @@ By default, parallel processing is turned off. When tuning, it is more efficient For `ranger` confidence intervals, the intervals are constructed using the form `estimate +/- z * std_error`. For classification probabilities, these values can fall outside of `[0, 1]` and will be coerced to be in this range. -## Minimal examples +## Examples -The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#rand-forest-ranger) for `rand_forest()` with the `"ranger"` engine. 
+The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#rand-forest-ranger) for `rand_forest()` with the `"ranger"` engine. ## References diff --git a/man/rmd/svm_linear_LiblineaR.Rmd b/man/rmd/svm_linear_LiblineaR.Rmd index babf8096c..7d68e91df 100644 --- a/man/rmd/svm_linear_LiblineaR.Rmd +++ b/man/rmd/svm_linear_LiblineaR.Rmd @@ -70,9 +70,9 @@ Note that the `LiblineaR` engine does not produce class probabilities. When opti ```{r child = "template-same-scale.Rmd"} ``` -## Minimal examples +## Examples -The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-linear-LiblineaR) for `svm_linear()` with the `"LiblineaR"` engine. +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-linear-LiblineaR) for `svm_linear()` with the `"LiblineaR"` engine. ## References diff --git a/man/rmd/svm_linear_kernlab.Rmd b/man/rmd/svm_linear_kernlab.Rmd index faffd12df..9c7cec545 100644 --- a/man/rmd/svm_linear_kernlab.Rmd +++ b/man/rmd/svm_linear_kernlab.Rmd @@ -68,9 +68,9 @@ Note that the `"kernlab"` engine does not naturally estimate class probabilities ```{r child = "template-same-scale.Rmd"} ``` -## Minimal examples +## Examples -The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-linear-kernlab) for `svm_linear()` with the `"kernlab"` engine. +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-linear-kernlab) for `svm_linear()` with the `"kernlab"` engine. ## References diff --git a/man/rmd/svm_poly_kernlab.Rmd b/man/rmd/svm_poly_kernlab.Rmd index 52ade1196..179d3f157 100644 --- a/man/rmd/svm_poly_kernlab.Rmd +++ b/man/rmd/svm_poly_kernlab.Rmd @@ -72,9 +72,9 @@ Note that the `"kernlab"` engine does not naturally estimate class probabilities ```{r child = "template-same-scale.Rmd"} ``` -## Minimal examples +## Examples -The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-poly-kernlab) for `svm_poly()` with the `"kernlab"` engine. +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-poly-kernlab) for `svm_poly()` with the `"kernlab"` engine. ## References diff --git a/man/rmd/svm_rbf_kernlab.Rmd b/man/rmd/svm_rbf_kernlab.Rmd index d53d7a9d3..b62abaf5f 100644 --- a/man/rmd/svm_rbf_kernlab.Rmd +++ b/man/rmd/svm_rbf_kernlab.Rmd @@ -72,9 +72,9 @@ Note that the `"kernlab"` engine does not naturally estimate class probabilities ```{r child = "template-same-scale.Rmd"} ``` -## Minimal examples +## Examples -The Minimal Examples for Fitting and Predicting article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-rbf-kernlab) for `svm_rbf()` with the `"kernlab"` engine. +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#svm-rbf-kernlab) for `svm_rbf()` with the `"kernlab"` engine. 
## References diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index 4e45e1abf..9d35270c5 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -1,8 +1,8 @@ --- -title: "Minimal Examples for Fitting and Predicting" +title: "Fitting and Predicting with parsnip" vignette: > %\VignetteEngine{knitr::rmarkdown} - %\VignetteIndexEntry{Minimal Examples for Fitting and Predicting} + %\VignetteIndexEntry{Fitting and Predicting with parsnip} output: knitr:::html_vignette --- @@ -13,7 +13,7 @@ library(ggplot2) theme_set(theme_bw()) ``` -This is a collection of minimal examples for fitting and predicting with various models and engines. +This is a collection of examples for fitting and predicting with various models and engines.
From 81d2ef4abceff20203d7251611cac84bb50a02a6 Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Thu, 8 Jul 2021 18:11:10 +0100 Subject: [PATCH 11/19] add intro sentence --- vignettes/articles/Examples.Rmd | 1 + 1 file changed, 1 insertion(+) diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index 9d35270c5..cbe7aaae2 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -15,6 +15,7 @@ theme_set(theme_bw()) This is a collection of examples for fitting and predicting with various models and engines. +Model/engine combinations which can be used for different modes have an example each. For regression, we use the Chicago ridership data. For classification, we use an artificial dataset for a binary example and the Palmer penguins data for a multi-class example.
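The examples added in this patch series rely on pre-split objects such as `Chicago_train`, `Chicago_test`, `data_train`, `data_test`, `penguins_train`, and `penguins_test`, which are created by the `template-*.R` scripts spun into the article. As a rough sketch of the kind of setup those templates are assumed to perform (the data sets come from the modeldata package; the seeds, split proportions, and object names used here are illustrative assumptions, not the templates' actual code):

```r
# Illustrative sketch only; the real template-reg-chicago.R, template-cls-two-class.R,
# and template-cls-multi-class.R scripts may differ in their details.
library(tidymodels)  # attaches parsnip, rsample, modeldata, and friends

# Regression examples: Chicago ridership data, holding out the most recent days
data(Chicago, package = "modeldata")
n <- nrow(Chicago)
Chicago_train <- Chicago[1:(n - 14), ]
Chicago_test  <- Chicago[(n - 13):n, ]

# Binary classification examples: an artificial two-class data set
data(two_class_dat, package = "modeldata")
set.seed(27)
split_cls  <- initial_split(two_class_dat)
data_train <- training(split_cls)
data_test  <- testing(split_cls)

# Multi-class classification examples: Palmer penguins, predicting `island`
data(penguins, package = "modeldata")
penguins <- na.omit(penguins)
set.seed(27)
split_peng     <- initial_split(penguins)
penguins_train <- training(split_peng)
penguins_test  <- testing(split_peng)
```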
From 1ad128f574a0e2a2b45b44beba4d037303630e76 Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Thu, 8 Jul 2021 18:12:16 +0100 Subject: [PATCH 12/19] add tensorflow for future examples --- .github/workflows/pkgdown.yaml | 15 +++++++++++++++ 1 file changed, 15 insertions(+) diff --git a/.github/workflows/pkgdown.yaml b/.github/workflows/pkgdown.yaml index 8de8eeb9d..5f574be11 100644 --- a/.github/workflows/pkgdown.yaml +++ b/.github/workflows/pkgdown.yaml @@ -53,6 +53,21 @@ jobs: pak::pkg_install("C50") shell: Rscript {0} + - name: Install Miniconda + run: | + Rscript -e "pak::pkg_install('rstudio/reticulate')" + Rscript -e "reticulate::install_miniconda()" + + - name: Find Miniconda on macOS + if: runner.os == 'macOS' + run: echo "options(reticulate.conda_binary = reticulate:::miniconda_conda())" >> .Rprofile + + - name: Install TensorFlow + run: | + reticulate::conda_create('r-reticulate', packages = c('python==3.6.9')) + tensorflow::install_tensorflow(version='1.14.0') + shell: Rscript {0} + - name: Install package run: R CMD INSTALL . From 22f257e7000d86f3c92c9af8f9cb0fa3f02fa1ef Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Tue, 13 Jul 2021 18:53:56 +0100 Subject: [PATCH 13/19] more examples `multinom_reg()`: nnet, keras `mlp()`: nnet, keras `mars()`: earth --- man/rmd/mars_earth.Rmd | 4 + man/rmd/mlp_keras.Rmd | 4 + man/rmd/mlp_nnet.Rmd | 4 + man/rmd/multinom_reg_keras.Rmd | 3 + man/rmd/multinom_reg_nnet.Rmd | 4 + vignettes/articles/Examples.Rmd | 280 ++++++++++++++++++++++++++++++++ 6 files changed, 299 insertions(+) diff --git a/man/rmd/mars_earth.Rmd b/man/rmd/mars_earth.Rmd index 9e730d326..39dc42f5f 100644 --- a/man/rmd/mars_earth.Rmd +++ b/man/rmd/mars_earth.Rmd @@ -61,6 +61,10 @@ An alternate method for using MARs for categorical outcomes can be found in [dis ```{r child = "template-makes-dummies.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#mars-earth) for `mars()` with the `"earth"` engine. + ## References - Friedman, J. 1991. "Multivariate Adaptive Regression Splines." _The Annals of Statistics_, vol. 19, no. 1, pp. 1-67. diff --git a/man/rmd/mlp_keras.Rmd b/man/rmd/mlp_keras.Rmd index bc980e7b9..e237af08f 100644 --- a/man/rmd/mlp_keras.Rmd +++ b/man/rmd/mlp_keras.Rmd @@ -72,6 +72,10 @@ mlp( ```{r child = "template-same-scale.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#mlp-keras) for `mlp()` with the `"keras"` engine. + ## References - Kuhn, M, and K Johnson. 2013. _Applied Predictive Modeling_. Springer. diff --git a/man/rmd/mlp_nnet.Rmd b/man/rmd/mlp_nnet.Rmd index b4568dfed..3486f4921 100644 --- a/man/rmd/mlp_nnet.Rmd +++ b/man/rmd/mlp_nnet.Rmd @@ -73,6 +73,10 @@ mlp( ```{r child = "template-same-scale.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#mlp-nnet) for `mlp()` with the `"nnet"` engine. + ## References - Kuhn, M, and K Johnson. 2013. _Applied Predictive Modeling_. Springer. 
diff --git a/man/rmd/multinom_reg_keras.Rmd b/man/rmd/multinom_reg_keras.Rmd index 354b9fc36..3475f4409 100644 --- a/man/rmd/multinom_reg_keras.Rmd +++ b/man/rmd/multinom_reg_keras.Rmd @@ -53,6 +53,9 @@ multinom_reg(penalty = double(1)) %>% ```{r child = "template-same-scale.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#multinom-reg-keras) for `multinom_reg()` with the `"keras"` engine. ## References diff --git a/man/rmd/multinom_reg_nnet.Rmd b/man/rmd/multinom_reg_nnet.Rmd index 107262d88..9cd571b76 100644 --- a/man/rmd/multinom_reg_nnet.Rmd +++ b/man/rmd/multinom_reg_nnet.Rmd @@ -51,6 +51,10 @@ multinom_reg(penalty = double(1)) %>% ```{r child = "template-same-scale.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#multinom-reg-nnet) for `multinom_reg()` with the `"nnet"` engine. + ## References - Luraschi, J, K Kuo, and E Ruiz. 2019. _Mastering nnet with R_. O'Reilly Media diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index cbe7aaae2..0b5b19f66 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -17,6 +17,212 @@ This is a collection of examples for fitting and predicting with various models Model/engine combinations which can be used for different modes have an example each. For regression, we use the Chicago ridership data. For classification, we use an artificial dataset for a binary example and the Palmer penguins data for a multi-class example. +
+ + `mars()` with the `"earth"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` + + We can define the model with specific parameters: + + ```{r} + mars_reg_spec <- + mlp() %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("nnet") + mars_reg_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + mars_reg_fit <- mars_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + mars_reg_fit + ``` + + The holdout data can be predicted: + + ```{r} + predict(mars_reg_fit, Chicago_test) + ``` + + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + mars_cls_spec <- + mlp() %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("nnet") + mars_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + mars_cls_fit <- mars_cls_spec %>% fit(Class ~ ., data = data_train) + mars_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(mars_cls_fit, data_test), + predict(mars_cls_fit, data_test, type = "prob") + ) + ``` + +
+ + +
+ + `mlp()` with the `"nnet"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` + + We can define the model with specific parameters: + + ```{r} + mlp_reg_spec <- + mlp() %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("nnet") + mlp_reg_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + mlp_reg_fit <- mlp_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + mlp_reg_fit + ``` + + The holdout data can be predicted: + + ```{r} + predict(mlp_reg_fit, Chicago_test) + ``` + + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + mlp_cls_spec <- + mlp() %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("nnet") + mlp_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + mlp_cls_fit <- mlp_cls_spec %>% fit(Class ~ ., data = data_train) + mlp_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(mlp_cls_fit, data_test), + predict(mlp_cls_fit, data_test, type = "prob") + ) + ``` + +
+ +
+ + `mlp()` with the `"keras"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` + + We can define the model with specific parameters: + + ```{r} + mlp_reg_spec <- + mlp() %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("keras") + mlp_reg_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + mlp_reg_fit <- mlp_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + mlp_reg_fit + ``` + + The holdout data can be predicted: + + ```{r} + predict(mlp_reg_fit, Chicago_test) + ``` + + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + mlp_cls_spec <- + mlp() %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("keras") + mlp_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + mlp_cls_fit <- mlp_cls_spec %>% fit(Class ~ ., data = data_train) + mlp_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(mlp_cls_fit, data_test), + predict(mlp_cls_fit, data_test, type = "prob") + ) + ``` + +
+ +
`multinom_reg()` with the `"glmnet"` engine @@ -54,6 +260,80 @@ Model/engine combinations which can be used for different modes have an example
+
+ + `multinom_reg()` with the `"keras"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-cls-multi-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + mr_cls_spec <- + multinom_reg(penalty = 0.1) %>% + set_engine("keras") + mr_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + mr_cls_fit <- mr_cls_spec %>% fit(island ~ ., data = penguins_train) + mr_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(mr_cls_fit, penguins_test), + predict(mr_cls_fit, penguins_test, type = "prob") + ) + ``` + +
+ + +
+ + `multinom_reg()` with the `"nnet"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-cls-multi-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + mr_cls_spec <- + multinom_reg(penalty = 0.1) %>% + set_engine("nnet") + mr_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + mr_cls_fit <- mr_cls_spec %>% fit(island ~ ., data = penguins_train) + mr_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(mr_cls_fit, penguins_test), + predict(mr_cls_fit, penguins_test, type = "prob") + ) + ``` + +
+ +
`nearest_neighbor()` with the `"kknn"` engine From 972145101284427b05feeba3f25c57592e75908b Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Wed, 14 Jul 2021 09:39:33 +0100 Subject: [PATCH 14/19] suppress auto-linking in section headings --- vignettes/articles/Examples.Rmd | 28 +++++++++++++++------------- 1 file changed, 15 insertions(+), 13 deletions(-) diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index 0b5b19f66..25df96387 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -17,9 +17,11 @@ This is a collection of examples for fitting and predicting with various models Model/engine combinations which can be used for different modes have an example each. For regression, we use the Chicago ridership data. For classification, we use an artificial dataset for a binary example and the Palmer penguins data for a multi-class example. + +
- `mars()` with the `"earth"` engine + mars() with the `"earth"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -88,7 +90,7 @@ Model/engine combinations which can be used for different modes have an example
- `mlp()` with the `"nnet"` engine + mlp() with the `"nnet"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -156,7 +158,7 @@ Model/engine combinations which can be used for different modes have an example
- `mlp()` with the `"keras"` engine + mlp() with the `"keras"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -225,7 +227,7 @@ Model/engine combinations which can be used for different modes have an example
- `multinom_reg()` with the `"glmnet"` engine + multinom_reg() with the `"glmnet"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-multi-class.R") @@ -262,7 +264,7 @@ Model/engine combinations which can be used for different modes have an example
- `multinom_reg()` with the `"keras"` engine + multinom_reg() with the `"keras"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-multi-class.R") @@ -299,7 +301,7 @@ Model/engine combinations which can be used for different modes have an example
- `multinom_reg()` with the `"nnet"` engine + multinom_reg() with the `"nnet"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-multi-class.R") @@ -336,7 +338,7 @@ Model/engine combinations which can be used for different modes have an example
- `nearest_neighbor()` with the `"kknn"` engine + nearest_neighbor() with the `"kknn"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -403,7 +405,7 @@ Model/engine combinations which can be used for different modes have an example
- `rand_forest()` with the `"ranger"` engine + rand_forest() with the `"ranger"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -472,7 +474,7 @@ Model/engine combinations which can be used for different modes have an example
- `rand_forest()` with the `"randomForest"` engine + rand_forest() with the `"randomForest"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -541,7 +543,7 @@ Model/engine combinations which can be used for different modes have an example
- `svm_linear()` with the `"LiblineaR"` engine + svm_linear() with the `"LiblineaR"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -607,7 +609,7 @@ Model/engine combinations which can be used for different modes have an example
- `svm_linear()` with the `"kernlab"` engine + svm_linear() with the `"kernlab"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -676,7 +678,7 @@ Model/engine combinations which can be used for different modes have an example
- `svm_poly()` with the `"kernlab"` engine + svm_poly() with the `"kernlab"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -745,7 +747,7 @@ Model/engine combinations which can be used for different modes have an example
- `svm_rbf()` with the `"kernlab"` engine + svm_rbf() with the `"kernlab"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") From 4ff9fb031806c3bf5d2ea10fd3557348c68f3c80 Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Wed, 14 Jul 2021 10:23:24 +0100 Subject: [PATCH 15/19] more examples `logistic_reg()`: glm, glmnet, keras, LiblineaR, stan `linear_reg()`: lm, glmnet, keras, stan `decision_tree()`: rpart, C5.0 `boost_tree()`: xgboost, C5.0 --- man/rmd/boost_tree_C5.0.Rmd | 4 + man/rmd/boost_tree_xgboost.Rmd | 4 + man/rmd/decision_tree_C5.0.Rmd | 4 + man/rmd/decision_tree_rpart.Rmd | 4 + man/rmd/linear_reg_glmnet.Rmd | 4 + man/rmd/linear_reg_keras.Rmd | 3 + man/rmd/linear_reg_lm.Rmd | 4 + man/rmd/linear_reg_stan.Rmd | 4 + man/rmd/logistic_reg_LiblineaR.Rmd | 4 + man/rmd/logistic_reg_glm.Rmd | 4 + man/rmd/logistic_reg_glmnet.Rmd | 4 + man/rmd/logistic_reg_keras.Rmd | 3 + man/rmd/logistic_reg_stan.Rmd | 4 + vignettes/articles/Examples.Rmd | 558 ++++++++++++++++++++++++++++- 14 files changed, 600 insertions(+), 8 deletions(-) diff --git a/man/rmd/boost_tree_C5.0.Rmd b/man/rmd/boost_tree_C5.0.Rmd index cf43aa369..b7cbfa25c 100644 --- a/man/rmd/boost_tree_C5.0.Rmd +++ b/man/rmd/boost_tree_C5.0.Rmd @@ -58,6 +58,10 @@ boost_tree(trees = integer(), min_n = integer(), sample_size = numeric()) %>% By default, early stopping is used. To use the complete set of boosting iterations, pass `earlyStopping = FALSE` to [set_engine()]. Also, it is unlikely that early stopping will occur if `sample_size = 1`. +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#boost-tree-C5.0) for `boost_tree()` with the `"C5.0"` engine. + ## References - Kuhn, M, and K Johnson. 2013. *Applied Predictive Modeling*. Springer. diff --git a/man/rmd/boost_tree_xgboost.Rmd b/man/rmd/boost_tree_xgboost.Rmd index 1f56cfcc3..178c7c1ad 100644 --- a/man/rmd/boost_tree_xgboost.Rmd +++ b/man/rmd/boost_tree_xgboost.Rmd @@ -93,6 +93,10 @@ If the model specification has `early_stop >= trees`, `early_stop` is converted parsnip chooses the objective function based on the characteristics of the outcome. To use a different loss, pass the `objective` argument to [set_engine()]. +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#boost-tree-xgboost) for `boost_tree()` with the `"xgboost"` engine. + ## References - [XGBoost: A Scalable Tree Boosting System](https://arxiv.org/abs/1603.02754) diff --git a/man/rmd/decision_tree_C5.0.Rmd b/man/rmd/decision_tree_C5.0.Rmd index 7147fde64..8ca1b6edb 100644 --- a/man/rmd/decision_tree_C5.0.Rmd +++ b/man/rmd/decision_tree_C5.0.Rmd @@ -50,6 +50,10 @@ decision_tree(min_n = integer()) %>% ```{r child = "template-tree-split-factors.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#decision-tree-C5.0) for `decision_tree()` with the `"C5.0"` engine. + ## References - Kuhn, M, and K Johnson. 2013. *Applied Predictive Modeling*. Springer. 
diff --git a/man/rmd/decision_tree_rpart.Rmd b/man/rmd/decision_tree_rpart.Rmd index 94f302a3a..3de385dad 100644 --- a/man/rmd/decision_tree_rpart.Rmd +++ b/man/rmd/decision_tree_rpart.Rmd @@ -58,6 +58,10 @@ decision_tree(tree_depth = integer(1), min_n = integer(1), cost_complexity = dou ```{r child = "template-tree-split-factors.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#decision-tree-rpart) for `decision_tree()` with the `"rpart"` engine. + ## References - Kuhn, M, and K Johnson. 2013. *Applied Predictive Modeling*. Springer. diff --git a/man/rmd/linear_reg_glmnet.Rmd b/man/rmd/linear_reg_glmnet.Rmd index 4949319c3..f15d1eab5 100644 --- a/man/rmd/linear_reg_glmnet.Rmd +++ b/man/rmd/linear_reg_glmnet.Rmd @@ -54,6 +54,10 @@ linear_reg(penalty = double(1), mixture = double(1)) %>% ``` By default, [glmnet::glmnet()] uses the argument `standardize = TRUE` to center and scale the data. +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#linear-reg-glmnet) for `linear_reg()` with the `"glmnet"` engine. + ## References - Hastie, T, R Tibshirani, and M Wainwright. 2015. _Statistical Learning with Sparsity_. CRC Press. diff --git a/man/rmd/linear_reg_keras.Rmd b/man/rmd/linear_reg_keras.Rmd index b2ac2d3a6..4d7ab6742 100644 --- a/man/rmd/linear_reg_keras.Rmd +++ b/man/rmd/linear_reg_keras.Rmd @@ -53,6 +53,9 @@ linear_reg(penalty = double(1)) %>% ```{r child = "template-same-scale.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#linear-reg-keras) for `linear_reg()` with the `"keras"` engine. ## References diff --git a/man/rmd/linear_reg_lm.Rmd b/man/rmd/linear_reg_lm.Rmd index dd61b16d2..9d90608c7 100644 --- a/man/rmd/linear_reg_lm.Rmd +++ b/man/rmd/linear_reg_lm.Rmd @@ -20,6 +20,10 @@ linear_reg() %>% ```{r child = "template-makes-dummies.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#linear-reg-lm) for `linear_reg()` with the `"lm"` engine. + ## References - Kuhn, M, and K Johnson. 2013. _Applied Predictive Modeling_. Springer. diff --git a/man/rmd/linear_reg_stan.Rmd b/man/rmd/linear_reg_stan.Rmd index 3b064e303..2ad553e91 100644 --- a/man/rmd/linear_reg_stan.Rmd +++ b/man/rmd/linear_reg_stan.Rmd @@ -39,6 +39,10 @@ Note that the `refresh` default prevents logging of the estimation process. Chan For prediction, the `"stan"` engine can compute posterior intervals analogous to confidence and prediction intervals. In these instances, the units are the original outcome and when `std_error = TRUE`, the standard deviation of the posterior distribution (or posterior predictive distribution as appropriate) is returned. +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#linear-reg-stan) for `linear_reg()` with the `"stan"` engine. + ## References - McElreath, R. 2020 _Statistical Rethinking_. CRC Press. 
diff --git a/man/rmd/logistic_reg_LiblineaR.Rmd b/man/rmd/logistic_reg_LiblineaR.Rmd index fd888d517..5332ee1d4 100644 --- a/man/rmd/logistic_reg_LiblineaR.Rmd +++ b/man/rmd/logistic_reg_LiblineaR.Rmd @@ -53,6 +53,10 @@ logistic_reg(penalty = double(1), mixture = double(1)) %>% ```{r child = "template-same-scale.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#logistic-reg-LiblineaR) for `logistic_reg()` with the `"LiblineaR"` engine. + ## References - Hastie, T, R Tibshirani, and M Wainwright. 2015. _Statistical Learning with Sparsity_. CRC Press. diff --git a/man/rmd/logistic_reg_glm.Rmd b/man/rmd/logistic_reg_glm.Rmd index 7cd4ccf5f..cafeaba6f 100644 --- a/man/rmd/logistic_reg_glm.Rmd +++ b/man/rmd/logistic_reg_glm.Rmd @@ -20,6 +20,10 @@ logistic_reg() %>% ```{r child = "template-makes-dummies.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#logistic-reg-glm) for `logistic_reg()` with the `"glm"` engine. + ## References - Kuhn, M, and K Johnson. 2013. _Applied Predictive Modeling_. Springer. diff --git a/man/rmd/logistic_reg_glmnet.Rmd b/man/rmd/logistic_reg_glmnet.Rmd index 0835bed75..2f27623ee 100644 --- a/man/rmd/logistic_reg_glmnet.Rmd +++ b/man/rmd/logistic_reg_glmnet.Rmd @@ -54,6 +54,10 @@ logistic_reg(penalty = double(1), mixture = double(1)) %>% ``` By default, [glmnet::glmnet()] uses the argument `standardize = TRUE` to center and scale the data. +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#logistic-reg-glmnet) for `logistic_reg()` with the `"glmnet"` engine. + ## References - Hastie, T, R Tibshirani, and M Wainwright. 2015. _Statistical Learning with Sparsity_. CRC Press. diff --git a/man/rmd/logistic_reg_keras.Rmd b/man/rmd/logistic_reg_keras.Rmd index 4818f800e..69172d1dd 100644 --- a/man/rmd/logistic_reg_keras.Rmd +++ b/man/rmd/logistic_reg_keras.Rmd @@ -53,6 +53,9 @@ logistic_reg(penalty = double(1)) %>% ```{r child = "template-same-scale.Rmd"} ``` +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#logistic-reg-keras) for `logistic_reg()` with the `"keras"` engine. ## References diff --git a/man/rmd/logistic_reg_stan.Rmd b/man/rmd/logistic_reg_stan.Rmd index cab623bfd..7160fb896 100644 --- a/man/rmd/logistic_reg_stan.Rmd +++ b/man/rmd/logistic_reg_stan.Rmd @@ -39,6 +39,10 @@ Note that the `refresh` default prevents logging of the estimation process. Chan For prediction, the `"stan"` engine can compute posterior intervals analogous to confidence and prediction intervals. In these instances, the units are the original outcome and when `std_error = TRUE`, the standard deviation of the posterior distribution (or posterior predictive distribution as appropriate) is returned. +## Examples + +The "Fitting and Predicting with parsnip" article contains [examples](https://parsnip.tidymodels.org/articles/articles/Examples.html#logistic-reg-stan) for `logistic_reg()` with the `"stan"` engine. + ## References - McElreath, R. 2020 _Statistical Rethinking_. CRC Press. 
diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index 25df96387..5310530e8 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -19,6 +19,548 @@ Model/engine combinations which can be used for different modes have an example +
+ + boost_tree() with the `"xgboost"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` + + We can define the model with specific parameters: + + ```{r} + bt_reg_spec <- + boost_tree(trees = 15) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("xgboost") + bt_reg_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + bt_reg_fit <- bt_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + bt_reg_fit + ``` + + The holdout data can be predicted: + + ```{r} + predict(bt_reg_fit, Chicago_test) + ``` + + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + bt_cls_spec <- + boost_tree(trees = 15) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("xgboost") + bt_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + bt_cls_fit <- bt_cls_spec %>% fit(Class ~ ., data = data_train) + bt_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(bt_cls_fit, data_test), + predict(bt_cls_fit, data_test, type = "prob") + ) + ``` + +
+ + +
+ + boost_tree() with the `"C5.0"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + bt_cls_spec <- + boost_tree(trees = 15) %>% + set_mode("classification") %>% + set_engine("C5.0") + bt_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + bt_cls_fit <- bt_cls_spec %>% fit(Class ~ ., data = data_train) + bt_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(bt_cls_fit, data_test), + predict(bt_cls_fit, data_test, type = "prob") + ) + ``` + +
+ + +
+ + decision_tree() with the `"rpart"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` + + We can define the model with specific parameters: + + ```{r} + dt_reg_spec <- + decision_tree(tree_depth = 30) %>% + # This model can be used for classification or regression, so set mode + set_mode("regression") %>% + set_engine("rpart") + dt_reg_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + dt_reg_fit <- dt_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + dt_reg_fit + ``` + + The holdout data can be predicted: + + ```{r} + predict(dt_reg_fit, Chicago_test) + ``` + + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + dt_cls_spec <- + decision_tree(tree_depth = 30) %>% + # This model can be used for classification or regression, so set mode + set_mode("classification") %>% + set_engine("rpart") + dt_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + dt_cls_fit <- dt_cls_spec %>% fit(Class ~ ., data = data_train) + dt_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(dt_cls_fit, data_test), + predict(dt_cls_fit, data_test, type = "prob") + ) + ``` + +
+ + +
+ + decision_tree() with the `"C5.0"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + dt_cls_spec <- + decision_tree(min_n = 2) %>% + set_mode("classification") %>% + set_engine("C5.0") + dt_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + dt_cls_fit <- dt_cls_spec %>% fit(Class ~ ., data = data_train) + dt_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(dt_cls_fit, data_test), + predict(dt_cls_fit, data_test, type = "prob") + ) + ``` + +
+ +
+ + linear_reg() with the `"lm"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` + + We can define the model with specific parameters: + + ```{r} + linreg_reg_spec <- + linear_reg() %>% + set_mode("regression") %>% + set_engine("lm") + linreg_reg_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + linreg_reg_fit <- linreg_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + linreg_reg_fit + ``` + + The holdout data can be predicted: + + ```{r} + predict(linreg_reg_fit, Chicago_test) + ``` + +
+ + +
+ + linear_reg() with the `"glmnet"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` + + We can define the model with specific parameters: + + ```{r} + linreg_reg_spec <- + linear_reg(penalty = 0.1) %>% + set_mode("regression") %>% + set_engine("glmnet") + linreg_reg_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + linreg_reg_fit <- linreg_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + linreg_reg_fit + ``` + + The holdout data can be predicted: + + ```{r} + predict(linreg_reg_fit, Chicago_test) + ``` + +
+ + +
+ + linear_reg() with the `"keras"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` + + We can define the model with specific parameters: + + ```{r} + linreg_reg_spec <- + linear_reg(penalty = 0.1) %>% + set_mode("regression") %>% + set_engine("keras") + linreg_reg_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + linreg_reg_fit <- linreg_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + linreg_reg_fit + ``` + + The holdout data can be predicted: + + ```{r} + predict(linreg_reg_fit, Chicago_test) + ``` + +
+ + +
+ + linear_reg() with the `"stan"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-reg-chicago.R") + ``` + + We can define the model with specific parameters: + + ```{r} + linreg_reg_spec <- + linear_reg() %>% + set_mode("regression") %>% + set_engine("stan") + linreg_reg_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + linreg_reg_fit <- linreg_reg_spec %>% fit(ridership ~ ., data = Chicago_train) + linreg_reg_fit + ``` + + The holdout data can be predicted: + + ```{r} + predict(linreg_reg_fit, Chicago_test) + ``` + +
+ + +
+ + logistic_reg() with the `"glm"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + logreg_cls_spec <- + logistic_reg() %>% + set_mode("classification") %>% + set_engine("glm") + logreg_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + logreg_cls_fit <- logreg_cls_spec %>% fit(Class ~ ., data = data_train) + logreg_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(logreg_cls_fit, data_test), + predict(logreg_cls_fit, data_test, type = "prob") + ) + ``` + +
+ + +
+ + logistic_reg() with the `"glmnet"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + logreg_cls_spec <- + logistic_reg(penalty = 0.1) %>% + set_mode("classification") %>% + set_engine("glmnet") + logreg_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + logreg_cls_fit <- logreg_cls_spec %>% fit(Class ~ ., data = data_train) + logreg_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(logreg_cls_fit, data_test), + predict(logreg_cls_fit, data_test, type = "prob") + ) + ``` + +
+ + +
+ + logistic_reg() with the `"keras"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + logreg_cls_spec <- + logistic_reg(penalty = 0.1) %>% + set_mode("classification") %>% + set_engine("keras") + logreg_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + logreg_cls_fit <- logreg_cls_spec %>% fit(Class ~ ., data = data_train) + logreg_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(logreg_cls_fit, data_test), + predict(logreg_cls_fit, data_test, type = "prob") + ) + ``` + +
+ + +
+ + logistic_reg() with the `"LiblineaR"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + logreg_cls_spec <- + logistic_reg(penalty = 0.1) %>% + set_mode("classification") %>% + set_engine("LiblineaR") + logreg_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + logreg_cls_fit <- logreg_cls_spec %>% fit(Class ~ ., data = data_train) + logreg_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(logreg_cls_fit, data_test), + predict(logreg_cls_fit, data_test, type = "prob") + ) + ``` + +
+ + +
+ + logistic_reg() with the `"stan"` engine + + ```{r echo=FALSE} + knitr::spin_child("template-cls-two-class.R") + ``` + + We can define the model with specific parameters: + + ```{r} + logreg_cls_spec <- + logistic_reg() %>% + set_mode("classification") %>% + set_engine("stan") + logreg_cls_spec + ``` + + Now we create the model fit object: + + ```{r} + set.seed(1) + logreg_cls_fit <- logreg_cls_spec %>% fit(Class ~ ., data = data_train) + logreg_cls_fit + ``` + + The holdout data can be predicted for both hard class predictions and probabilities. We'll bind these together into one tibble: + + ```{r} + bind_cols( + predict(logreg_cls_fit, data_test), + predict(logreg_cls_fit, data_test, type = "prob") + ) + ``` + +
+
mars() with the `"earth"` engine @@ -31,10 +573,10 @@ Model/engine combinations which can be used for different modes have an example ```{r} mars_reg_spec <- - mlp() %>% + mars(prod_degree = 1, prune_method = "backward") %>% # This model can be used for classification or regression, so set mode set_mode("regression") %>% - set_engine("nnet") + set_engine("earth") mars_reg_spec ``` @@ -61,10 +603,10 @@ Model/engine combinations which can be used for different modes have an example ```{r} mars_cls_spec <- - mlp() %>% + mars(prod_degree = 1, prune_method = "backward") %>% # This model can be used for classification or regression, so set mode set_mode("classification") %>% - set_engine("nnet") + set_engine("earth") mars_cls_spec ``` @@ -100,7 +642,7 @@ Model/engine combinations which can be used for different modes have an example ```{r} mlp_reg_spec <- - mlp() %>% + mlp(penalty = 0, epochs = 100) %>% # This model can be used for classification or regression, so set mode set_mode("regression") %>% set_engine("nnet") @@ -130,7 +672,7 @@ Model/engine combinations which can be used for different modes have an example ```{r} mlp_cls_spec <- - mlp() %>% + mlp(penalty = 0, epochs = 100) %>% # This model can be used for classification or regression, so set mode set_mode("classification") %>% set_engine("nnet") @@ -168,7 +710,7 @@ Model/engine combinations which can be used for different modes have an example ```{r} mlp_reg_spec <- - mlp() %>% + mlp(penalty = 0, epochs = 20) %>% # This model can be used for classification or regression, so set mode set_mode("regression") %>% set_engine("keras") @@ -198,7 +740,7 @@ Model/engine combinations which can be used for different modes have an example ```{r} mlp_cls_spec <- - mlp() %>% + mlp(penalty = 0, epochs = 20) %>% # This model can be used for classification or regression, so set mode set_mode("classification") %>% set_engine("keras") From 5b44653c5760a1e01cf4fe412722ecfebd9059da Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Wed, 14 Jul 2021 12:04:10 +0100 Subject: [PATCH 16/19] remove `set_mode()` when the model only has one mode --- vignettes/articles/Examples.Rmd | 9 --------- 1 file changed, 9 deletions(-) diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index 5310530e8..fc600eace 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -245,7 +245,6 @@ Model/engine combinations which can be used for different modes have an example ```{r} linreg_reg_spec <- linear_reg() %>% - set_mode("regression") %>% set_engine("lm") linreg_reg_spec ``` @@ -280,7 +279,6 @@ Model/engine combinations which can be used for different modes have an example ```{r} linreg_reg_spec <- linear_reg(penalty = 0.1) %>% - set_mode("regression") %>% set_engine("glmnet") linreg_reg_spec ``` @@ -315,7 +313,6 @@ Model/engine combinations which can be used for different modes have an example ```{r} linreg_reg_spec <- linear_reg(penalty = 0.1) %>% - set_mode("regression") %>% set_engine("keras") linreg_reg_spec ``` @@ -350,7 +347,6 @@ Model/engine combinations which can be used for different modes have an example ```{r} linreg_reg_spec <- linear_reg() %>% - set_mode("regression") %>% set_engine("stan") linreg_reg_spec ``` @@ -385,7 +381,6 @@ Model/engine combinations which can be used for different modes have an example ```{r} logreg_cls_spec <- logistic_reg() %>% - set_mode("classification") %>% set_engine("glm") logreg_cls_spec ``` @@ -423,7 +418,6 @@ Model/engine combinations which can be used for different modes have 
an example ```{r} logreg_cls_spec <- logistic_reg(penalty = 0.1) %>% - set_mode("classification") %>% set_engine("glmnet") logreg_cls_spec ``` @@ -461,7 +455,6 @@ Model/engine combinations which can be used for different modes have an example ```{r} logreg_cls_spec <- logistic_reg(penalty = 0.1) %>% - set_mode("classification") %>% set_engine("keras") logreg_cls_spec ``` @@ -499,7 +492,6 @@ Model/engine combinations which can be used for different modes have an example ```{r} logreg_cls_spec <- logistic_reg(penalty = 0.1) %>% - set_mode("classification") %>% set_engine("LiblineaR") logreg_cls_spec ``` @@ -537,7 +529,6 @@ Model/engine combinations which can be used for different modes have an example ```{r} logreg_cls_spec <- logistic_reg() %>% - set_mode("classification") %>% set_engine("stan") logreg_cls_spec ``` From 2623f20a63a90ecff92e91bafbce5aec62ed9330 Mon Sep 17 00:00:00 2001 From: Hannah Frick Date: Thu, 15 Jul 2021 18:10:29 +0100 Subject: [PATCH 17/19] added headings --- vignettes/articles/Examples.Rmd | 77 +++++++++++++++++++++------------ 1 file changed, 50 insertions(+), 27 deletions(-) diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd index fc600eace..a46e32bb7 100644 --- a/vignettes/articles/Examples.Rmd +++ b/vignettes/articles/Examples.Rmd @@ -17,11 +17,12 @@ This is a collection of examples for fitting and predicting with various models Model/engine combinations which can be used for different modes have an example each. For regression, we use the Chicago ridership data. For classification, we use an artificial dataset for a binary example and the Palmer penguins data for a multi-class example. - + +## `boost_tree()` models
- boost_tree() with the `"xgboost"` engine + With the `"xgboost"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -90,7 +91,7 @@ Model/engine combinations which can be used for different modes have an example
- boost_tree() with the `"C5.0"` engine + With the `"C5.0"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-two-class.R") @@ -126,9 +127,11 @@ Model/engine combinations which can be used for different modes have an example
+## `decision_tree()` models +
- decision_tree() with the `"rpart"` engine + With the `"rpart"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -197,7 +200,7 @@ Model/engine combinations which can be used for different modes have an example
- decision_tree() with the `"C5.0"` engine + With the `"C5.0"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-two-class.R") @@ -232,9 +235,11 @@ Model/engine combinations which can be used for different modes have an example
+## `linear_reg()` models +
- linear_reg() with the `"lm"` engine + With the `"lm"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -268,7 +273,7 @@ Model/engine combinations which can be used for different modes have an example
- linear_reg() with the `"glmnet"` engine + With the `"glmnet"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -302,7 +307,7 @@ Model/engine combinations which can be used for different modes have an example
- linear_reg() with the `"keras"` engine + With the `"keras"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -336,7 +341,7 @@ Model/engine combinations which can be used for different modes have an example
- linear_reg() with the `"stan"` engine + With the `"stan"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -368,9 +373,11 @@ Model/engine combinations which can be used for different modes have an example
+## `logistic_reg()` models +
- logistic_reg() with the `"glm"` engine + With the `"glm"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-two-class.R") @@ -407,7 +414,7 @@ Model/engine combinations which can be used for different modes have an example
- logistic_reg() with the `"glmnet"` engine + With the `"glmnet"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-two-class.R") @@ -444,7 +451,7 @@ Model/engine combinations which can be used for different modes have an example
- logistic_reg() with the `"keras"` engine + With the `"keras"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-two-class.R") @@ -481,7 +488,7 @@ Model/engine combinations which can be used for different modes have an example
- logistic_reg() with the `"LiblineaR"` engine + With the `"LiblineaR"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-two-class.R") @@ -518,7 +525,7 @@ Model/engine combinations which can be used for different modes have an example
- logistic_reg() with the `"stan"` engine + With the `"stan"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-two-class.R") @@ -552,9 +559,11 @@ Model/engine combinations which can be used for different modes have an example
+## `mars()` models +
- mars() with the `"earth"` engine + With the `"earth"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -621,9 +630,11 @@ Model/engine combinations which can be used for different modes have an example
+## `mlp()` models +
- mlp() with the `"nnet"` engine + With the `"nnet"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -691,7 +702,7 @@ Model/engine combinations which can be used for different modes have an example
- mlp() with the `"keras"` engine + With the `"keras"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -758,9 +769,11 @@ Model/engine combinations which can be used for different modes have an example
+## `multinom_reg()` models +
- multinom_reg() with the `"glmnet"` engine + With the `"glmnet"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-multi-class.R") @@ -797,7 +810,7 @@ Model/engine combinations which can be used for different modes have an example
- multinom_reg() with the `"keras"` engine + With the `"keras"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-multi-class.R") @@ -834,7 +847,7 @@ Model/engine combinations which can be used for different modes have an example
- multinom_reg() with the `"nnet"` engine + With the `"nnet"` engine ```{r echo=FALSE} knitr::spin_child("template-cls-multi-class.R") @@ -869,9 +882,11 @@ Model/engine combinations which can be used for different modes have an example
+## `nearest_neighbor()` models +
- nearest_neighbor() with the `"kknn"` engine + With the `"kknn"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -936,9 +951,11 @@ Model/engine combinations which can be used for different modes have an example
+## `rand_forest()` models +
- rand_forest() with the `"ranger"` engine + With the `"ranger"` engine ```{r echo=FALSE} knitr::spin_child("template-reg-chicago.R") @@ -1007,7 +1024,7 @@ Model/engine combinations which can be used for different modes have an example
- rand_forest() with the `"randomForest"` engine
+ With the `"randomForest"` engine

```{r echo=FALSE}
knitr::spin_child("template-reg-chicago.R")
@@ -1074,9 +1091,11 @@ Model/engine combinations which can be used for different modes have an example
+## `svm_linear()` models
+
- svm_linear() with the `"LiblineaR"` engine
+ With the `"LiblineaR"` engine

```{r echo=FALSE}
knitr::spin_child("template-reg-chicago.R")
@@ -1142,7 +1161,7 @@ Model/engine combinations which can be used for different modes have an example
- svm_linear() with the `"kernlab"` engine
+ With the `"kernlab"` engine

```{r echo=FALSE}
knitr::spin_child("template-reg-chicago.R")
@@ -1209,9 +1228,11 @@ Model/engine combinations which can be used for different modes have an example
+## `svm_poly()` models
+
- svm_poly() with the `"kernlab"` engine
+ With the `"kernlab"` engine

```{r echo=FALSE}
knitr::spin_child("template-reg-chicago.R")
@@ -1278,9 +1299,11 @@ Model/engine combinations which can be used for different modes have an example
+## `svm_rbf()` models
+
- svm_rbf() with the `"kernlab"` engine
+ With the `"kernlab"` engine

```{r echo=FALSE}
knitr::spin_child("template-reg-chicago.R")

From 32f4c9ce58f4f9b88775b709f28e5ae617d90632 Mon Sep 17 00:00:00 2001
From: Hannah Frick
Date: Fri, 16 Jul 2021 09:36:59 +0100
Subject: [PATCH 18/19] include info on model/mode/engine

Co-authored-by: Julia Silge
---
 vignettes/articles/Examples.Rmd | 11 ++++++++---
 1 file changed, 8 insertions(+), 3 deletions(-)

diff --git a/vignettes/articles/Examples.Rmd b/vignettes/articles/Examples.Rmd
index a46e32bb7..b0dd7ca0b 100644
--- a/vignettes/articles/Examples.Rmd
+++ b/vignettes/articles/Examples.Rmd
@@ -13,9 +13,15 @@ library(ggplot2)
theme_set(theme_bw())
```

-This is a collection of examples for fitting and predicting with various models and engines.
+These examples show how to *fit* and *predict* with different combinations of model, mode, and engine. As a reminder, in parsnip,

-Model/engine combinations which can be used for different modes have an example each. For regression, we use the Chicago ridership data. For classification, we use an artificial dataset for a binary example and the Palmer penguins data for a multi-class example.
+- the **model type** differentiates basic modeling approaches, such as random forests, logistic regression, linear support vector machines, etc.,
+
+- the **mode** denotes in what kind of modeling context it will be used (most commonly, classification or regression), and
+
+- the computational **engine** indicates how the model is fit, such as with a specific R package implementation or even methods outside of R like Keras or Stan.
+
+The following examples use consistent data sets throughout. For regression, we use the Chicago ridership data. For classification, we use an artificial data set for a binary example and the Palmer penguins data for a multiclass example.

## `boost_tree()` models

@@ -1368,4 +1374,3 @@ Model/engine combinations which can be used for different modes have an example
```
-
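To make the three pieces concrete, a minimal parsnip specification might look like the sketch below; the model type (`rand_forest()`), the `trees` value, the mode, and the `"ranger"` engine are arbitrary illustrative choices rather than part of the article's examples.

```r
library(tidymodels)

# A parsnip specification combines the three pieces described above:
rf_spec <-
  rand_forest(trees = 500) %>%   # model type: a random forest
  set_mode("regression") %>%     # mode: regression
  set_engine("ranger")           # engine: the ranger package

# Printing the specification shows the model arguments, mode, and engine
rf_spec
```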
From 601675a6f1f52f652b0ff902957268a9d9d5aac2 Mon Sep 17 00:00:00 2001
From: Hannah Frick
Date: Fri, 16 Jul 2021 09:37:22 +0100
Subject: [PATCH 19/19] Update vignettes/articles/template-cls-multi-class.R

Co-authored-by: Julia Silge
---
 vignettes/articles/template-cls-multi-class.R | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/vignettes/articles/template-cls-multi-class.R b/vignettes/articles/template-cls-multi-class.R
index fe4aa865b..63d1a6ed3 100644
--- a/vignettes/articles/template-cls-multi-class.R
+++ b/vignettes/articles/template-cls-multi-class.R
@@ -1,6 +1,6 @@
#'

#' Classification Example

-#' We'll model the island of the penguins with two predictors in the same unit (mm): bill length and bill depth.
+#' We'll predict the island where the penguins were observed with two variables in the same unit (mm): bill length and bill depth.
#+ results = "hide", messages = FALSE
library(tidymodels)
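A rough sketch of the multiclass setup this template describes might look like the following; the seed, the train/test split, and the choice of `multinom_reg()` with the `"nnet"` engine are assumptions for illustration, not the template's exact code.

```r
library(tidymodels)

# Hypothetical sketch: keep the outcome (island) and the two bill measurements
penguins_df <- penguins %>%
  select(island, bill_length_mm, bill_depth_mm) %>%
  drop_na()

# An illustrative split; the template may use a different scheme
set.seed(123)
penguins_split <- initial_split(penguins_df)
penguins_train <- training(penguins_split)
penguins_test  <- testing(penguins_split)

# Fit a multinomial classification model and predict the held-out penguins
mr_nnet_spec <- multinom_reg() %>% set_engine("nnet")
mr_nnet_fit  <- fit(mr_nnet_spec, island ~ ., data = penguins_train)

predict(mr_nnet_fit, penguins_test)
```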