diff --git a/.travis.yml b/.travis.yml index 0349bff0f1..2c04ac090d 100644 --- a/.travis.yml +++ b/.travis.yml @@ -1,7 +1,5 @@ language: R cache: packages -sudo: true -dist: trusty # build matrix; turn on vdiffr only on r release matrix: @@ -35,8 +33,13 @@ env: after_success: - Rscript -e 'covr::codecov()' -before_install: - - sudo add-apt-repository ppa:ubuntugis/ubuntugis-unstable --yes - - sudo apt-get --yes --force-yes update -qq - - sudo apt-get install --yes libudunits2-dev libproj-dev libgeos-dev libgdal-dev - - Rscript -e 'update.packages(ask = FALSE)' +addons: + apt: + sources: + - sourceline: 'ppa:ubuntugis/ppa' + packages: + - libudunits2-dev + - libproj-dev + - libgeos-dev + - libgdal-dev + diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index 8796efb5d8..4301fa0fac 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -6,7 +6,7 @@ quickly as possible. The guide is divided into two main pieces: 1. Filing a bug report or feature request in an issue. 1. Suggesting a change via a pull request. -Please note that ggplot2 is released with a [Contributor Code of Conduct](.github/CODE_OF_CONDUCT.md). By contributing to this project, +Please note that ggplot2 is released with a [Contributor Code of Conduct](CODE_OF_CONDUCT.md). By contributing to this project, you agree to abide by its terms. ## Issues diff --git a/DESCRIPTION b/DESCRIPTION index 22432db827..61a8ffa925 100644 --- a/DESCRIPTION +++ b/DESCRIPTION @@ -39,6 +39,7 @@ Suggests: ggplot2movies, hexbin, Hmisc, + isoband, knitr, lattice, mapproj, @@ -54,7 +55,7 @@ Suggests: rpart, sf (>= 0.7-3), svglite (>= 1.2.0.9001), - testthat (>= 0.11.0), + testthat (>= 2.1.0), vdiffr (>= 0.3.0) Enhances: sp License: GPL-2 | file LICENSE @@ -190,6 +191,7 @@ Collate: 'scale-continuous.r' 'scale-date.r' 'scale-discrete-.r' + 'scale-expansion.r' 'scale-gradient.r' 'scale-grey.r' 'scale-hue.r' @@ -199,6 +201,7 @@ Collate: 'scale-shape.r' 'scale-size.r' 'scale-type.R' + 'scale-view.r' 'scale-viridis.r' 'scales-.r' 'stat-bin.r' diff --git a/NAMESPACE b/NAMESPACE index 41405a260e..0612aa408f 100644 --- a/NAMESPACE +++ b/NAMESPACE @@ -49,6 +49,7 @@ S3method(ggplot_add,Coord) S3method(ggplot_add,Facet) S3method(ggplot_add,Layer) S3method(ggplot_add,Scale) +S3method(ggplot_add,by) S3method(ggplot_add,data.frame) S3method(ggplot_add,default) S3method(ggplot_add,guides) @@ -204,6 +205,7 @@ export(StatBindot) export(StatBinhex) export(StatBoxplot) export(StatContour) +export(StatContourFilled) export(StatCount) export(StatDensity) export(StatDensity2d) @@ -290,6 +292,7 @@ export(ensym) export(ensyms) export(expand_limits) export(expand_scale) +export(expansion) export(expr) export(facet_grid) export(facet_null) @@ -304,6 +307,7 @@ export(geom_blank) export(geom_boxplot) export(geom_col) export(geom_contour) +export(geom_contour_filled) export(geom_count) export(geom_crossbar) export(geom_curve) @@ -518,6 +522,7 @@ export(stat_bin_hex) export(stat_binhex) export(stat_boxplot) export(stat_contour) +export(stat_contour_filled) export(stat_count) export(stat_density) export(stat_density2d) diff --git a/NEWS.md b/NEWS.md index dd342f49d4..384a50b645 100644 --- a/NEWS.md +++ b/NEWS.md @@ -1,5 +1,60 @@ # ggplot2 (development version) +* stacking text when calculating the labels and the y axis with + `stat_summary()` now works (@ikosmidis, #2709) + +* Allowed reversing of discrete scales by re-writing `get_limits()` (@AnneLyng, #3115) + +* Added `stat_contour_filled()` and `geom_contour_filled()`, which compute + and draw filled contours of gridded data 
(@paleolimbot, #3044). + +* `geom_contour()` and `stat_contour()` now use the isoband package + to compute contour lines. The `complete` parameter (which was undocumented + and has been unused for at least four years) was removed (@paleolimbot, #3044). + +* `stat_smooth()` user `REML` by default, if `method = "gam"` and + `gam`'s method is not specified (@ikosmidis, #2630). + +* Changed `theme_grey()` setting for legend key so that it creates no + border (`NA`) rather than drawing a white one. (@annennenne, #3180) + +* Added function `ggplot_add.by()` for lists created with `by()` (#2734, @Maschette) + +* `ggdep()` was deprecated (@perezp44, #3382). + +* Added weight aesthetic option to `stat_density()` and made scaling of + weights the default (@annennenne, #2902) + +* `expand_scale()` was deprecated in favour of `expansion()` for setting + the `expand` argument of `x` and `y` scales (@paleolimbot). + +* `coord_trans()` now draws second axes and accepts `xlim`, `ylim`, + and `expand` arguments to bring it up to feature parity with + `coord_cartesian()`. The `xtrans` and `ytrans` arguments that were + deprecated in version 1.0.1 in favour of `x` and `y` + were removed (@paleolimbot, #2990). + +* `coord_trans()` now calculates breaks using the expanded range + (previously these were calculated using the unexpanded range, + which resulted in differences between plots made with `coord_trans()` + and those made with `coord_cartesian()`). The expansion for discrete axes + in `coord_trans()` was also updated such that it behaves identically + to that in `coord_cartesian()` (@paleolimbot, #3338). + +* All `coord_*()` functions with `xlim` and `ylim` arguments now accept + vectors with `NA` as a placeholder for the minimum or maximum value + (e.g., `ylim = c(0, NA)` would zoom the y-axis from 0 to the + maximum value observed in the data). This mimics the behaviour + of the `limits` argument in continuous scale functions + (@paleolimbot, #2907). + +* `geom_abline()`, `geom_hline()`, and `geom_vline()` now issue + more informative warnings when supplied with set aesthetics + (i.e., `slope`, `intercept`, `yintercept`, and/or `xintercept`) + and mapped aesthetics (i.e., `data` and/or `mapping`). + +* `stat_density2d()` can now take an `adjust` parameter to scale the default bandwidth. (#2860, @haleyjeppson) + # ggplot2 3.2.1 This is a patch release fixing a few regressions introduced in 3.2.0 as well as diff --git a/R/aes.r b/R/aes.r index d5a3996049..ef6f9dd602 100644 --- a/R/aes.r +++ b/R/aes.r @@ -8,8 +8,8 @@ NULL #' [ggplot2()] and in individual layers. #' #' This function also standardises aesthetic names by converting `color` to `colour` -#' (also in substrings, e.g. `point_color` to `point_colour`) and translating old style -#' R names to ggplot names (eg. `pch` to `shape`, `cex` to `size`). +#' (also in substrings, e.g., `point_color` to `point_colour`) and translating old style +#' R names to ggplot names (e.g., `pch` to `shape` and `cex` to `size`). #' #' @section Quasiquotation: #' @@ -22,9 +22,13 @@ NULL #' programming vignette](http://dplyr.tidyverse.org/articles/programming.html) #' to learn more about these techniques. #' -#' @param x,y,... List of name value pairs giving aesthetics to map to -#' variables. The names for x and y aesthetics are typically omitted because -#' they are so common; all other aesthetics must be named. +#' @param x,y,... 
List of name-value pairs in the form `aesthetic = variable` +#' describing which variables in the layer data should be mapped to which +#' aesthetics used by the paired geom/stat. The expression `variable` is +#' evaluated within the layer data, so there is no need to refer to +#' the original dataset (i.e., use `ggplot(df, aes(variable))` +#' instead of `ggplot(df, aes(df$variable))`). The names for x and y aesthetics +#' are typically omitted because they are so common; all other aesthetics must be named. #' @seealso [vars()] for another quoting function designed for #' faceting specifications. #' @return A list with class `uneval`. Components of the list are either @@ -334,3 +338,55 @@ mapped_aesthetics <- function(x) { is_null <- vapply(x, is.null, logical(1)) names(x)[!is_null] } + + +#' Check a mapping for discouraged usage +#' +#' Checks that `$` and `[[` are not used when the target *is* the data +#' +#' @param mapping A mapping created with [aes()] +#' @param data The data to be mapped from +#' +#' @noRd +warn_for_aes_extract_usage <- function(mapping, data) { + lapply(mapping, function(quosure) { + warn_for_aes_extract_usage_expr(get_expr(quosure), data, get_env(quosure)) + }) +} + +warn_for_aes_extract_usage_expr <- function(x, data, env = emptyenv()) { + if (is_call(x, "[[") || is_call(x, "$")) { + if (extract_target_is_likely_data(x, data, env)) { + good_usage <- alternative_aes_extract_usage(x) + warning( + "Use of `", format(x), "` is discouraged. ", + "Use `", good_usage, "` instead.", + call. = FALSE + ) + } + } else if (is.call(x)) { + lapply(x, warn_for_aes_extract_usage_expr, data, env) + } +} + +alternative_aes_extract_usage <- function(x) { + if (is_call(x, "[[")) { + good_call <- call2("[[", quote(.data), x[[3]]) + format(good_call) + } else if (is_call(x, "$")) { + as.character(x[[3]]) + } else { + stop("Don't know how to get alternative usage for `", format(x), "`", call. 
= FALSE) + } +} + +extract_target_is_likely_data <- function(x, data, env) { + if (!is.name(x[[2]])) { + return(FALSE) + } + + tryCatch({ + data_eval <- eval_tidy(x[[2]], data, env) + identical(data_eval, data) + }, error = function(err) FALSE) +} diff --git a/R/axis-secondary.R b/R/axis-secondary.R index 87a6893dba..e7de5f44a6 100644 --- a/R/axis-secondary.R +++ b/R/axis-secondary.R @@ -187,21 +187,49 @@ AxisSecondary <- ggproto("AxisSecondary", NULL, # patch for date and datetime scales just to maintain functionality # works only for linear secondary transforms that respect the time or date transform - if (scale$trans$name %in% c("date", "time")){ + if (scale$trans$name %in% c("date", "time")) { temp_scale <- self$create_scale(new_range, trans = scale$trans) range_info <- temp_scale$break_info() - names(range_info) <- paste0("sec.", names(range_info)) - return(range_info) - } + old_val_trans <- rescale(range_info$major, from = c(0, 1), to = range) + old_val_minor_trans <- rescale(range_info$minor, from = c(0, 1), to = range) + } else { + temp_scale <- self$create_scale(new_range) + range_info <- temp_scale$break_info() - temp_scale <- self$create_scale(new_range) - range_info <- temp_scale$break_info() + # Map the break values back to their correct position on the primary scale + old_val <- lapply(range_info$major_source, function(x) which.min(abs(full_range - x))) + old_val <- old_range[unlist(old_val)] + old_val_trans <- scale$trans$transform(old_val) + + old_val_minor <- lapply(range_info$minor_source, function(x) which.min(abs(full_range - x))) + old_val_minor <- old_range[unlist(old_val_minor)] + old_val_minor_trans <- scale$trans$transform(old_val_minor) + + # rescale values from 0 to 1 + range_info$major[] <- round( + rescale( + scale$map(old_val_trans, range(old_val_trans)), + from = range + ), + digits = 3 + ) + + range_info$minor[] <- round( + rescale( + scale$map(old_val_minor_trans, range(old_val_minor_trans)), + from = range + ), + digits = 3 + ) + } - # Map the break values back to their correct position on the primary scale - old_val <- lapply(range_info$major_source, function(x) which.min(abs(full_range - x))) - old_val <- old_range[unlist(old_val)] - old_val_trans <- scale$trans$transform(old_val) - range_info$major[] <- round(rescale(scale$map(old_val_trans, range(old_val_trans)), from = range), digits = 3) + # The _source values should be in (primary) scale_transformed space, + # so that the coord doesn't have to know about the secondary scale transformation + # when drawing the axis. The values in user space are useful for testing. + range_info$major_source_user <- range_info$major_source + range_info$minor_source_user <- range_info$minor_source + range_info$major_source[] <- old_val_trans + range_info$minor_source[] <- old_val_minor_trans names(range_info) <- paste0("sec.", names(range_info)) range_info diff --git a/R/bin.R b/R/bin.R index a5e7ae85bc..55d898c846 100644 --- a/R/bin.R +++ b/R/bin.R @@ -102,7 +102,7 @@ bin_breaks_bins <- function(x_range, bins = 30, center = NULL, if (bins < 1) { stop("Need at least one bin.", call. 
= FALSE) } else if (zero_range(x_range)) { - # 0.1 is the same width as the expansion `expand_default()` gives for 0-width data + # 0.1 is the same width as the expansion `default_expansion()` gives for 0-width data width <- 0.1 } else if (bins == 1) { width <- diff(x_range) diff --git a/R/coord-.r b/R/coord-.r index 47e0f0175c..e2ea1a025d 100644 --- a/R/coord-.r +++ b/R/coord-.r @@ -64,51 +64,27 @@ Coord <- ggproto("Coord", render_fg = function(panel_params, theme) element_render(theme, "panel.border"), render_bg = function(panel_params, theme) { - guide_grid(theme, - panel_params$x.minor, - panel_params$x.major, - panel_params$y.minor, - panel_params$y.major) + stop("Not implemented", call. = FALSE) }, render_axis_h = function(panel_params, theme) { - arrange <- panel_params$x.arrange %||% c("secondary", "primary") - - list( - top = render_axis(panel_params, arrange[1], "x", "top", theme), - bottom = render_axis(panel_params, arrange[2], "x", "bottom", theme) - ) + stop("Not implemented", call. = FALSE) }, render_axis_v = function(panel_params, theme) { - arrange <- panel_params$y.arrange %||% c("primary", "secondary") - - list( - left = render_axis(panel_params, arrange[1], "y", "left", theme), - right = render_axis(panel_params, arrange[2], "y", "right", theme) - ) + stop("Not implemented", call. = FALSE) }, # transform range given in transformed coordinates # back into range in given in (possibly scale-transformed) # data coordinates backtransform_range = function(self, panel_params) { - warning( - "range backtransformation not implemented in this coord; results may be wrong.", - call. = FALSE - ) - # return result from range function for backwards compatibility - # before ggplot2 3.0.1 - self$range(panel_params) + stop("Not implemented", call. = FALSE) }, # return range stored in panel_params range = function(panel_params) { - warning( - "range calculation not implemented in this coord; results may be wrong.", - call. = FALSE - ) - list(x = panel_params$x.range, y = panel_params$y.range) + stop("Not implemented", call. 
= FALSE) }, setup_panel_params = function(scale_x, scale_y, params = list()) { @@ -150,17 +126,13 @@ Coord <- ggproto("Coord", #' @keywords internal is.Coord <- function(x) inherits(x, "Coord") -expand_default <- function(scale, discrete = c(0, 0.6, 0, 0.6), continuous = c(0.05, 0, 0.05, 0)) { - scale$expand %|W|% if (scale$is_discrete()) discrete else continuous -} - # Renders an axis with the correct orientation or zeroGrob if no axis should be # generated render_axis <- function(panel_params, axis, scale, position, theme) { if (axis == "primary") { - guide_axis(panel_params[[paste0(scale, ".major")]], panel_params[[paste0(scale, ".labels")]], position, theme) + draw_axis(panel_params[[paste0(scale, ".major")]], panel_params[[paste0(scale, ".labels")]], position, theme) } else if (axis == "secondary" && !is.null(panel_params[[paste0(scale, ".sec.major")]])) { - guide_axis(panel_params[[paste0(scale, ".sec.major")]], panel_params[[paste0(scale, ".sec.labels")]], position, theme) + draw_axis(panel_params[[paste0(scale, ".sec.major")]], panel_params[[paste0(scale, ".sec.labels")]], position, theme) } else { zeroGrob() } diff --git a/R/coord-cartesian-.r b/R/coord-cartesian-.r index d350801902..8222604039 100644 --- a/R/coord-cartesian-.r +++ b/R/coord-cartesian-.r @@ -79,12 +79,12 @@ CoordCartesian <- ggproto("CoordCartesian", Coord, is_free = function() TRUE, distance = function(x, y, panel_params) { - max_dist <- dist_euclidean(panel_params$x.range, panel_params$y.range) + max_dist <- dist_euclidean(panel_params$x$dimension(), panel_params$y$dimension()) dist_euclidean(x, y) / max_dist }, range = function(panel_params) { - list(x = panel_params$x.range, y = panel_params$y.range) + list(x = panel_params$x$dimension(), y = panel_params$y$dimension()) }, backtransform_range = function(self, panel_params) { @@ -92,37 +92,71 @@ CoordCartesian <- ggproto("CoordCartesian", Coord, }, transform = function(data, panel_params) { - rescale_x <- function(data) rescale(data, from = panel_params$x.range) - rescale_y <- function(data) rescale(data, from = panel_params$y.range) - - data <- transform_position(data, rescale_x, rescale_y) + data <- transform_position(data, panel_params$x$rescale, panel_params$y$rescale) transform_position(data, squish_infinite, squish_infinite) }, setup_panel_params = function(self, scale_x, scale_y, params = list()) { - train_cartesian <- function(scale, limits, name) { - range <- scale_range(scale, limits, self$expand) + c( + view_scales_from_scale(scale_x, self$limits$x, self$expand), + view_scales_from_scale(scale_y, self$limits$y, self$expand) + ) + }, - out <- scale$break_info(range) - out$arrange <- scale$axis_order() - names(out) <- paste(name, names(out), sep = ".") - out - } + render_bg = function(panel_params, theme) { + guide_grid( + theme, + panel_params$x$break_positions_minor(), + panel_params$x$break_positions(), + panel_params$y$break_positions_minor(), + panel_params$y$break_positions() + ) + }, - c( - train_cartesian(scale_x, self$limits$x, "x"), - train_cartesian(scale_y, self$limits$y, "y") + render_axis_h = function(panel_params, theme) { + arrange <- panel_params$x.arrange %||% c("secondary", "primary") + arrange_scale_keys <- c("primary" = "x", "secondary" = "x.sec")[arrange] + arrange_scales <- panel_params[arrange_scale_keys] + + list( + top = draw_view_scale_axis(arrange_scales[[1]], "top", theme), + bottom = draw_view_scale_axis(arrange_scales[[2]], "bottom", theme) + ) + }, + + render_axis_v = function(panel_params, theme) { + arrange <- 
panel_params$y.arrange %||% c("primary", "secondary") + arrange_scale_keys <- c("primary" = "y", "secondary" = "y.sec")[arrange] + arrange_scales <- panel_params[arrange_scale_keys] + + list( + left = draw_view_scale_axis(arrange_scales[[1]], "left", theme), + right = draw_view_scale_axis(arrange_scales[[2]], "right", theme) ) } ) -scale_range <- function(scale, limits = NULL, expand = TRUE) { - expansion <- if (expand) expand_default(scale) else c(0, 0) +view_scales_from_scale <- function(scale, coord_limits = NULL, expand = TRUE) { + expansion <- default_expansion(scale, expand = expand) + limits <- scale$get_limits() + continuous_range <- expand_limits_scale(scale, expansion, limits, coord_limits = coord_limits) + aesthetic <- scale$aesthetics[1] + + view_scales <- list( + view_scale_primary(scale, limits, continuous_range), + sec = view_scale_secondary(scale, limits, continuous_range), + arrange = scale$axis_order(), + range = continuous_range + ) + names(view_scales) <- c(aesthetic, paste0(aesthetic, ".", names(view_scales)[-1])) + + view_scales +} - if (is.null(limits)) { - scale$dimension(expansion) - } else { - range <- range(scale$transform(limits)) - expand_range(range, expansion[1], expansion[2]) +draw_view_scale_axis <- function(view_scale, axis_position, theme) { + if(is.null(view_scale) || view_scale$is_empty()) { + return(zeroGrob()) } + + draw_axis(view_scale$break_positions(), view_scale$get_labels(), axis_position, theme) } diff --git a/R/coord-flip.r b/R/coord-flip.r index 1d12a3b42c..71d11f26ec 100644 --- a/R/coord-flip.r +++ b/R/coord-flip.r @@ -48,10 +48,11 @@ CoordFlip <- ggproto("CoordFlip", CoordCartesian, self$range(panel_params) }, - range = function(panel_params) { + range = function(self, panel_params) { # summarise_layout() expects the original x and y ranges here, # not the ones we would get after flipping the axes - list(x = panel_params$y.range, y = panel_params$x.range) + un_flipped_range <- ggproto_parent(CoordCartesian, self)$range(panel_params) + list(x = un_flipped_range$y, y = un_flipped_range$x) }, setup_panel_params = function(self, scale_x, scale_y, params = list()) { diff --git a/R/coord-map.r b/R/coord-map.r index 008a919350..f6a56ac89f 100644 --- a/R/coord-map.r +++ b/R/coord-map.r @@ -181,12 +181,7 @@ CoordMap <- ggproto("CoordMap", Coord, for (n in c("x", "y")) { scale <- get(paste0("scale_", n)) limits <- self$limits[[n]] - - if (is.null(limits)) { - range <- scale$dimension(expand_default(scale)) - } else { - range <- range(scale$transform(limits)) - } + range <- expand_limits_scale(scale, default_expansion(scale), coord_limits = limits) ranges[[n]] <- range } @@ -289,8 +284,8 @@ CoordMap <- ggproto("CoordMap", Coord, pos <- self$transform(x_intercept, panel_params) axes <- list( - top = guide_axis(pos$x, panel_params$x.labels, "top", theme), - bottom = guide_axis(pos$x, panel_params$x.labels, "bottom", theme) + top = draw_axis(pos$x, panel_params$x.labels, "top", theme), + bottom = draw_axis(pos$x, panel_params$x.labels, "bottom", theme) ) axes[[which(arrange == "secondary")]] <- zeroGrob() axes @@ -313,8 +308,8 @@ CoordMap <- ggproto("CoordMap", Coord, pos <- self$transform(x_intercept, panel_params) axes <- list( - left = guide_axis(pos$y, panel_params$y.labels, "left", theme), - right = guide_axis(pos$y, panel_params$y.labels, "right", theme) + left = draw_axis(pos$y, panel_params$y.labels, "left", theme), + right = draw_axis(pos$y, panel_params$y.labels, "right", theme) ) axes[[which(arrange == "secondary")]] <- zeroGrob() axes diff 
--git a/R/coord-polar.r b/R/coord-polar.r index d200759886..fd5d44fa3f 100644 --- a/R/coord-polar.r +++ b/R/coord-polar.r @@ -111,16 +111,12 @@ CoordPolar <- ggproto("CoordPolar", Coord, scale <- get(paste0("scale_", n)) limits <- self$limits[[n]] - if (is.null(limits)) { - if (self$theta == n) { - expand <- expand_default(scale, c(0, 0.5), c(0, 0)) - } else { - expand <- expand_default(scale, c(0, 0), c(0, 0)) - } - range <- scale$dimension(expand) + if (self$theta == n) { + expansion <- default_expansion(scale, c(0, 0.5), c(0, 0)) } else { - range <- range(scale_transform(scale, limits)) + expansion <- default_expansion(scale, c(0, 0), c(0, 0)) } + range <- expand_limits_scale(scale, expansion, coord_limits = limits) out <- scale$break_info(range) ret[[n]]$range <- out$range @@ -128,8 +124,8 @@ CoordPolar <- ggproto("CoordPolar", Coord, ret[[n]]$minor <- out$minor_source ret[[n]]$labels <- out$labels ret[[n]]$sec.range <- out$sec.range - ret[[n]]$sec.major <- out$sec.major_source - ret[[n]]$sec.minor <- out$sec.minor_source + ret[[n]]$sec.major <- out$sec.major_source_user + ret[[n]]$sec.minor <- out$sec.minor_source_user ret[[n]]$sec.labels <- out$sec.labels } @@ -190,7 +186,7 @@ CoordPolar <- ggproto("CoordPolar", Coord, render_axis_h = function(panel_params, theme) { list( top = zeroGrob(), - bottom = guide_axis(NA, "", "bottom", theme) + bottom = draw_axis(NA, "", "bottom", theme) ) }, diff --git a/R/coord-sf.R b/R/coord-sf.R index 056510733a..b73227c7e7 100644 --- a/R/coord-sf.R +++ b/R/coord-sf.R @@ -127,8 +127,10 @@ CoordSf <- ggproto("CoordSf", CoordCartesian, setup_panel_params = function(self, scale_x, scale_y, params = list()) { # Bounding box of the data - x_range <- scale_range(scale_x, self$limits$x, self$expand) - y_range <- scale_range(scale_y, self$limits$y, self$expand) + expansion_x <- default_expansion(scale_x, expand = self$expand) + x_range <- expand_limits_scale(scale_x, expansion_x, coord_limits = self$limits$x) + expansion_y <- default_expansion(scale_y, expand = self$expand) + y_range <- expand_limits_scale(scale_y, expansion_y, coord_limits = self$limits$y) bbox <- c( x_range[1], y_range[1], x_range[2], y_range[2] @@ -243,10 +245,10 @@ CoordSf <- ggproto("CoordSf", CoordCartesian, tick_labels <- c(ticks1$degree_label, ticks2$degree_label) if (length(tick_positions) > 0) { - top <- guide_axis( + top <- draw_axis( tick_positions, tick_labels, - position = "top", + axis_position = "top", theme = theme ) } else { @@ -279,10 +281,10 @@ CoordSf <- ggproto("CoordSf", CoordCartesian, tick_labels <- c(ticks1$degree_label, ticks2$degree_label) if (length(tick_positions) > 0) { - bottom <- guide_axis( + bottom <- draw_axis( tick_positions, tick_labels, - position = "bottom", + axis_position = "bottom", theme = theme ) } else { @@ -321,10 +323,10 @@ CoordSf <- ggproto("CoordSf", CoordCartesian, tick_labels <- c(ticks1$degree_label, ticks2$degree_label) if (length(tick_positions) > 0) { - right <- guide_axis( + right <- draw_axis( tick_positions, tick_labels, - position = "right", + axis_position = "right", theme = theme ) } else { @@ -357,10 +359,10 @@ CoordSf <- ggproto("CoordSf", CoordCartesian, tick_labels <- c(ticks1$degree_label, ticks2$degree_label) if (length(tick_positions) > 0) { - left <- guide_axis( + left <- draw_axis( tick_positions, tick_labels, - position = "left", + axis_position = "left", theme = theme ) } else { diff --git a/R/coord-transform.r b/R/coord-transform.r index 237c9dbe1f..7b61c82094 100644 --- a/R/coord-transform.r +++ b/R/coord-transform.r @@ 
-8,13 +8,9 @@ #' [scales::trans_new()] for list of transformations, and instructions #' on how to create your own. #' -#' @param x,y transformers for x and y axes -#' @param xtrans,ytrans Deprecated; use `x` and `y` instead. -#' @param limx,limy limits for x and y axes. (Named so for backward -#' compatibility) -#' @param clip Should drawing be clipped to the extent of the plot panel? A -#' setting of `"on"` (the default) means yes, and a setting of `"off"` -#' means no. For details, please see [`coord_cartesian()`]. +#' @inheritParams coord_cartesian +#' @param x,y Transformers for x and y axes or their names. +#' @param limx,limy **Deprecated**: use `xlim` and `ylim` instead. #' @export #' @examples #' \donttest{ @@ -78,31 +74,25 @@ #' plot + coord_trans(x = "log10") #' plot + coord_trans(x = "sqrt") #' } -coord_trans <- function(x = "identity", y = "identity", limx = NULL, limy = NULL, clip = "on", - xtrans, ytrans) -{ - if (!missing(xtrans)) { - gg_dep("1.0.1", "`xtrans` arguments is deprecated; please use `x` instead.") - x <- xtrans +coord_trans <- function(x = "identity", y = "identity", xlim = NULL, ylim = NULL, + limx = "DEPRECATED", limy = "DEPRECATED", clip = "on", expand = TRUE) { + if (!missing(limx)) { + warning("`limx` argument is deprecated; please use `xlim` instead.", call. = FALSE) + xlim <- limx } - if (!missing(ytrans)) { - gg_dep("1.0.1", "`ytrans` arguments is deprecated; please use `y` instead.") - y <- ytrans + if (!missing(limy)) { + warning("`limy` argument is deprecated; please use `ylim` instead.", call. = FALSE) + ylim <- limy } - # @kohske - # Now limits are implemented. - # But for backward compatibility, xlim -> limx, ylim -> ylim - # Because there are many examples such as - # > coord_trans(x = "log10", y = "log10") - # Maybe this is changed. 
+ # resolve transformers if (is.character(x)) x <- as.trans(x) if (is.character(y)) y <- as.trans(y) - ggproto(NULL, CoordTrans, trans = list(x = x, y = y), - limits = list(x = limx, y = limy), + limits = list(x = xlim, y = ylim), + expand = expand, clip = clip ) } @@ -147,8 +137,36 @@ CoordTrans <- ggproto("CoordTrans", Coord, setup_panel_params = function(self, scale_x, scale_y, params = list()) { c( - train_trans(scale_x, self$limits$x, self$trans$x, "x"), - train_trans(scale_y, self$limits$y, self$trans$y, "y") + train_trans(scale_x, self$limits$x, self$trans$x, "x", self$expand), + train_trans(scale_y, self$limits$y, self$trans$y, "y", self$expand) + ) + }, + + render_bg = function(panel_params, theme) { + guide_grid( + theme, + panel_params$x.minor, + panel_params$x.major, + panel_params$y.minor, + panel_params$y.major + ) + }, + + render_axis_h = function(panel_params, theme) { + arrange <- panel_params$x.arrange %||% c("secondary", "primary") + + list( + top = render_axis(panel_params, arrange[1], "x", "top", theme), + bottom = render_axis(panel_params, arrange[2], "x", "bottom", theme) + ) + }, + + render_axis_v = function(panel_params, theme) { + arrange <- panel_params$y.arrange %||% c("primary", "secondary") + + list( + left = render_axis(panel_params, arrange[1], "y", "left", theme), + right = render_axis(panel_params, arrange[2], "y", "right", theme) ) } ) @@ -159,39 +177,51 @@ transform_value <- function(trans, value, range) { rescale(trans$transform(value), 0:1, range) } - -train_trans <- function(scale, limits, trans, name) { - # first, calculate the range that is the numerical limits in data space - - # expand defined by scale OR coord - # @kohske - # Expansion of data range sometimes go beyond domain, - # so in trans, expansion takes place at the final stage. 
- if (is.null(limits)) { - range <- scale$dimension() +train_trans <- function(scale, coord_limits, trans, name, expand = TRUE) { + expansion <- default_expansion(scale, expand = expand) + scale_trans <- scale$trans %||% identity_trans() + coord_limits <- coord_limits %||% scale_trans$inverse(c(NA, NA)) + + if (scale$is_discrete()) { + continuous_ranges <- expand_limits_discrete_trans( + scale$get_limits(), + expansion, + coord_limits, + trans, + range_continuous = scale$range_c$range + ) } else { - range <- range(scale$transform(limits)) + # transform user-specified limits to scale transformed space + coord_limits <- scale$trans$transform(coord_limits) + continuous_ranges <- expand_limits_continuous_trans( + scale$get_limits(), + expansion, + coord_limits, + trans + ) } - # breaks on data space - out <- scale$break_info(range) + # calculate break information + out <- scale$break_info(continuous_ranges$continuous_range) - # trans'd range - out$range <- trans$transform(out$range) - - # expansion if limits are not specified - if (is.null(limits)) { - expand <- expand_default(scale) - out$range <- expand_range(out$range, expand[1], expand[2]) - } + # range in coord space has already been calculated + # needs to be in increasing order for transform_value() to work + out$range <- range(continuous_ranges$continuous_range_coord) - # major and minor values in plot space + # major and minor values in coordinate data out$major_source <- transform_value(trans, out$major_source, out$range) out$minor_source <- transform_value(trans, out$minor_source, out$range) + out$sec.major_source <- transform_value(trans, out$sec.major_source, out$range) + out$sec.minor_source <- transform_value(trans, out$sec.minor_source, out$range) out <- list( - range = out$range, labels = out$labels, - major = out$major_source, minor = out$minor_source + range = out$range, + labels = out$labels, + major = out$major_source, + minor = out$minor_source, + sec.labels = out$sec.labels, + sec.major = out$sec.major_source, + sec.minor = out$sec.minor_source ) names(out) <- paste(name, names(out), sep = ".") out diff --git a/R/facet-.r b/R/facet-.r index 08ad219651..c8bfd4eb21 100644 --- a/R/facet-.r +++ b/R/facet-.r @@ -563,7 +563,7 @@ combine_vars <- function(data, env = emptyenv(), vars = NULL, drop = TRUE) { if (drop) { new <- unique_combs(new) } - base <- rbind(base, df.grid(old, new)) + base <- unique(rbind(base, df.grid(old, new))) } if (empty(base)) { diff --git a/R/facet-grid-.r b/R/facet-grid-.r index e7bf47beba..6c0d0b2224 100644 --- a/R/facet-grid-.r +++ b/R/facet-grid-.r @@ -6,6 +6,7 @@ NULL #' `facet_grid()` forms a matrix of panels defined by row and column #' faceting variables. It is most useful when you have two discrete #' variables, and all combinations of the variables exist in the data. +#' If you have only one variable with many levels, try [facet_wrap()]. #' #' @param rows,cols A set of variables or expressions quoted by #' [vars()] and defining faceting groups on the rows or columns @@ -28,12 +29,13 @@ NULL #' @param labeller A function that takes one data frame of labels and #' returns a list or data frame of character vectors. Each input #' column corresponds to one factor. Thus there will be more than -#' one with formulae of the type `~cyl + am`. Each output +#' one with `vars(cyl, am)`. Each output #' column gets displayed as one separate line in the strip #' label. This function should inherit from the "labeller" S3 class -#' for compatibility with [labeller()]. 
See -#' [label_value()] for more details and pointers to other -#' options. +#' for compatibility with [labeller()]. You can use different labeling +#' functions for different kind of labels, for example use [label_parsed()] for +#' formatting facet labels. [label_value()] is used by default, +#' check it for more details and pointers to other options. #' @param as.table If `TRUE`, the default, the facets are laid out like #' a table with highest values at the bottom-right. If `FALSE`, the #' facets are laid out like a plot with the highest value at the top-right. @@ -66,13 +68,6 @@ NULL #' p + facet_grid(cols = vars(cyl)) #' p + facet_grid(vars(drv), vars(cyl)) #' -#' # The historical formula interface is also available: -#' \donttest{ -#' p + facet_grid(. ~ cyl) -#' p + facet_grid(drv ~ .) -#' p + facet_grid(drv ~ cyl) -#' } -#' #' # To change plot order of facet grid, #' # change the order of variable levels with factor() #' @@ -91,7 +86,7 @@ NULL #' mt <- ggplot(mtcars, aes(mpg, wt, colour = factor(cyl))) + #' geom_point() #' -#' mt + facet_grid(. ~ cyl, scales = "free") +#' mt + facet_grid(vars(cyl), scales = "free") #' #' # If scales and space are free, then the mapping between position #' # and values in the data will be the same across all panels. This diff --git a/R/facet-wrap.r b/R/facet-wrap.r index 8c0b112c26..6ab5904de5 100644 --- a/R/facet-wrap.r +++ b/R/facet-wrap.r @@ -32,9 +32,6 @@ NULL #' # Use vars() to supply faceting variables: #' p + facet_wrap(vars(class)) #' -#' # The historical interface with formulas is also available: -#' p + facet_wrap(~class) -#' #' # Control the number of rows and columns with nrow and ncol #' p + facet_wrap(vars(class), nrow = 4) #' @@ -47,14 +44,14 @@ NULL #' # Use the `labeller` option to control how labels are printed: #' ggplot(mpg, aes(displ, hwy)) + #' geom_point() + -#' facet_wrap(c("cyl", "drv"), labeller = "label_both") +#' facet_wrap(vars(cyl, drv), labeller = "label_both") #' #' # To change the order in which the panels appear, change the levels #' # of the underlying factor. #' mpg$class2 <- reorder(mpg$class, mpg$displ) #' ggplot(mpg, aes(displ, hwy)) + #' geom_point() + -#' facet_wrap(~class2) +#' facet_wrap(vars(class2)) #' #' # By default, the same scales are used for all panels. You can allow #' # scales to vary across the panels with the `scales` argument. @@ -62,14 +59,14 @@ NULL #' # harder to compare across panels. #' ggplot(mpg, aes(displ, hwy)) + #' geom_point() + -#' facet_wrap(~class, scales = "free") +#' facet_wrap(vars(class), scales = "free") #' #' # To repeat the same data in every panel, simply construct a data frame #' # that does not contain the faceting variable. #' ggplot(mpg, aes(displ, hwy)) + #' geom_point(data = transform(mpg, class = NULL), colour = "grey85") + #' geom_point() + -#' facet_wrap(~class) +#' facet_wrap(vars(class)) #' #' # Use `strip.position` to display the facet labels at the side of your #' # choice. Setting it to `bottom` makes it act as a subtitle for the axis. @@ -77,7 +74,7 @@ NULL #' # strip labels. 
#' ggplot(economics_long, aes(date, value)) + #' geom_line() + -#' facet_wrap(~variable, scales = "free_y", nrow = 2, strip.position = "bottom") + +#' facet_wrap(vars(variable), scales = "free_y", nrow = 2, strip.position = "top") + #' theme(strip.background = element_blank(), strip.placement = "outside") #' } facet_wrap <- function(facets, nrow = NULL, ncol = NULL, scales = "fixed", diff --git a/R/geom-abline.r b/R/geom-abline.r index d47dfbbf50..a0ef5058b5 100644 --- a/R/geom-abline.r +++ b/R/geom-abline.r @@ -76,7 +76,7 @@ geom_abline <- function(mapping = NULL, data = NULL, show.legend = NA) { # If nothing set, default to y = x - if (missing(mapping) && missing(slope) && missing(intercept)) { + if (is.null(mapping) && missing(slope) && missing(intercept)) { slope <- 1 intercept <- 0 } @@ -84,13 +84,12 @@ geom_abline <- function(mapping = NULL, data = NULL, # Act like an annotation if (!missing(slope) || !missing(intercept)) { - # Warn if supplied mapping is going to be overwritten - if (!missing(mapping)) { - warning(paste0("Using `intercept` and/or `slope` with `mapping` may", - " not have the desired result as mapping is overwritten", - " if either of these is specified\n" - ) - ) + # Warn if supplied mapping and/or data is going to be overwritten + if (!is.null(mapping)) { + warn_overwritten_args("geom_abline()", "mapping", c("slope", "intercept")) + } + if (!is.null(data)) { + warn_overwritten_args("geom_abline()", "data", c("slope", "intercept")) } if (missing(slope)) slope <- 1 @@ -141,3 +140,34 @@ GeomAbline <- ggproto("GeomAbline", Geom, draw_key = draw_key_abline ) + +warn_overwritten_args <- function(fun_name, overwritten_arg, provided_args, plural_join = " and/or ") { + overwritten_arg_text <- paste0("`", overwritten_arg, "`") + + n_provided_args <- length(provided_args) + if (n_provided_args == 1) { + provided_arg_text <- paste0("`", provided_args, "`") + verb <- "was" + } else if (n_provided_args == 2) { + provided_arg_text <- paste0("`", provided_args, "`", collapse = plural_join) + verb <- "were" + } else { + provided_arg_text <- paste0( + paste0("`", provided_args[-n_provided_args], "`", collapse = ", "), + ",", plural_join, + "`", provided_args[n_provided_args], "`" + ) + verb <- "were" + } + + warning( + sprintf( + "%s: Ignoring %s because %s %s provided.", + fun_name, + overwritten_arg_text, + provided_arg_text, + verb + ), + call. = FALSE + ) +} diff --git a/R/geom-contour.r b/R/geom-contour.r index 51236b901f..eb9e477c44 100644 --- a/R/geom-contour.r +++ b/R/geom-contour.r @@ -12,9 +12,13 @@ #' @inheritParams layer #' @inheritParams geom_point #' @inheritParams geom_path +#' @param bins Number of contour bins. Overridden by `binwidth`. +#' @param binwidth The width of the contour bins. Overridden by `breaks`. +#' @param breaks Numeric vector to set the contour breaks. +#' Overrides `binwidth` and `bins`. By default, this is a vector of +#' length ten with [pretty()] breaks. 
#' @seealso [geom_density_2d()]: 2d density contours #' @export -#' @export #' @examples #' #' # Basic plot #' v <- ggplot(faithfuld, aes(waiting, eruptions, z = density)) @@ -25,6 +29,9 @@ #' geom_density_2d() #' #' \donttest{ +#' # use geom_contour_filled() for filled contours +#' v + geom_contour_filled() +#' #' # Setting bins creates evenly spaced contours in the range of the data #' v + geom_contour(bins = 2) #' v + geom_contour(bins = 10) @@ -43,6 +50,9 @@ geom_contour <- function(mapping = NULL, data = NULL, stat = "contour", position = "identity", ..., + bins = NULL, + binwidth = NULL, + breaks = NULL, lineend = "butt", linejoin = "round", linemitre = 10, @@ -58,6 +68,9 @@ geom_contour <- function(mapping = NULL, data = NULL, show.legend = show.legend, inherit.aes = inherit.aes, params = list( + bins = bins, + binwidth = binwidth, + breaks = breaks, lineend = lineend, linejoin = linejoin, linemitre = linemitre, @@ -67,12 +80,46 @@ geom_contour <- function(mapping = NULL, data = NULL, ) } +#' @rdname geom_contour +#' @export +geom_contour_filled <- function(mapping = NULL, data = NULL, + stat = "contour_filled", position = "identity", + ..., + bins = NULL, + binwidth = NULL, + breaks = NULL, + na.rm = FALSE, + show.legend = NA, + inherit.aes = TRUE) { + layer( + data = data, + mapping = mapping, + stat = stat, + geom = GeomPolygon, + position = position, + show.legend = show.legend, + inherit.aes = inherit.aes, + params = list( + bins = bins, + binwidth = binwidth, + breaks = breaks, + na.rm = na.rm, + ... + ) + ) +} + #' @rdname ggplot2-ggproto #' @format NULL #' @usage NULL #' @export #' @include geom-path.r GeomContour <- ggproto("GeomContour", GeomPath, - default_aes = aes(weight = 1, colour = "#3366FF", size = 0.5, linetype = 1, - alpha = NA) + default_aes = aes( + weight = 1, + colour = "#3366FF", + size = 0.5, + linetype = 1, + alpha = NA + ) ) diff --git a/R/geom-histogram.r b/R/geom-histogram.r index 13689b26fb..2bdbe74315 100644 --- a/R/geom-histogram.r +++ b/R/geom-histogram.r @@ -12,8 +12,10 @@ #' #' By default, the underlying computation (`stat_bin()`) uses 30 bins; #' this is not a good default, but the idea is to get you experimenting with -#' different bin widths. You may need to look at a few to uncover the full -#' story behind your data. +#' different number of bins. You can also experiment modifying the `binwidth` with +#' `center` or `boundary` arguments. `binwidth` overrides `bins` so you should do +#' one change at a time. You may need to look at a few options to uncover +#' the full story behind your data. 
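To make the advice above concrete, here is a minimal sketch (an illustration for review, not part of the patch) of how `bins`, `binwidth`, `boundary`, and `center` interact in `geom_histogram()`, using the built-in `diamonds` data:

```r
library(ggplot2)
p <- ggplot(diamonds, aes(carat))

p + geom_histogram()                                # default stat_bin(): bins = 30
p + geom_histogram(bins = 60)                       # change only the number of bins
p + geom_histogram(binwidth = 0.25)                 # binwidth overrides bins
p + geom_histogram(binwidth = 0.25, boundary = 0)   # put a bin edge exactly at 0
p + geom_histogram(binwidth = 0.25, center = 0.125) # or centre a bin on 0.125
```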
#' #' @section Aesthetics: #' `geom_histogram()` uses the same aesthetics as [geom_bar()]; diff --git a/R/geom-hline.r b/R/geom-hline.r index 0a689ff7f3..d242f74b08 100644 --- a/R/geom-hline.r +++ b/R/geom-hline.r @@ -11,14 +11,14 @@ geom_hline <- function(mapping = NULL, data = NULL, # Act like an annotation if (!missing(yintercept)) { - # Warn if supplied mapping is going to be overwritten - if (!missing(mapping)) { - warning(paste0("Using both `yintercept` and `mapping` may not have the", - " desired result as mapping is overwritten if", - " `yintercept` is specified\n" - ) - ) + # Warn if supplied mapping and/or data is going to be overwritten + if (!is.null(mapping)) { + warn_overwritten_args("geom_hline()", "mapping", "yintercept") } + if (!is.null(data)) { + warn_overwritten_args("geom_hline()", "data", "yintercept") + } + data <- new_data_frame(list(yintercept = yintercept)) mapping <- aes(yintercept = yintercept) show.legend <- FALSE diff --git a/R/geom-path.r b/R/geom-path.r index 379355c563..f703809708 100644 --- a/R/geom-path.r +++ b/R/geom-path.r @@ -264,8 +264,9 @@ GeomLine <- ggproto("GeomLine", GeomPath, } ) -#' @param direction direction of stairs: 'vh' for vertical then horizontal, or -#' 'hv' for horizontal then vertical. +#' @param direction direction of stairs: 'vh' for vertical then horizontal, +#' 'hv' for horizontal then vertical, or 'mid' for step half-way between +#' adjacent x-values. #' @export #' @rdname geom_path geom_step <- function(mapping = NULL, data = NULL, stat = "identity", @@ -299,12 +300,12 @@ GeomStep <- ggproto("GeomStep", GeomPath, } ) -# Calculate stairsteps -# Used by [geom_step()] -# -# @keyword internal -stairstep <- function(data, direction="hv") { - direction <- match.arg(direction, c("hv", "vh")) +#' Calculate stairsteps for `geom_step()` +#' Used by `GeomStep()` +#' +#' @noRd +stairstep <- function(data, direction = "hv") { + direction <- match.arg(direction, c("hv", "vh", "mid")) data <- as.data.frame(data)[order(data$x), ] n <- nrow(data) @@ -316,16 +317,27 @@ stairstep <- function(data, direction="hv") { if (direction == "vh") { xs <- rep(1:n, each = 2)[-2*n] ys <- c(1, rep(2:n, each = 2)) - } else { + } else if (direction == "hv") { ys <- rep(1:n, each = 2)[-2*n] xs <- c(1, rep(2:n, each = 2)) + } else if (direction == "mid") { + xs <- rep(1:(n-1), each = 2) + ys <- rep(1:n, each = 2) + } else { + stop("Parameter `direction` is invalid.") + } + + if (direction == "mid") { + gaps <- data$x[-1] - data$x[-n] + mid_x <- data$x[-n] + gaps/2 # map the mid-point between adjacent x-values + x <- c(data$x[1], mid_x[xs], data$x[n]) + y <- c(data$y[ys]) + data_attr <- data[c(1,xs,n), setdiff(names(data), c("x", "y"))] + } else { + x <- data$x[xs] + y <- data$y[ys] + data_attr <- data[xs, setdiff(names(data), c("x", "y"))] } - new_data_frame(c( - list( - x = data$x[xs], - y = data$y[ys] - ), - data[xs, setdiff(names(data), c("x", "y"))] - )) + new_data_frame(c(list(x = x, y = y), data_attr)) } diff --git a/R/geom-point.r b/R/geom-point.r index 7627c6f940..922b19170a 100644 --- a/R/geom-point.r +++ b/R/geom-point.r @@ -35,7 +35,6 @@ #' often aesthetics, used to set an aesthetic to a fixed value, like #' `colour = "red"` or `size = 3`. They may also be parameters #' to the paired geom/stat. 
-#' @inheritParams layer #' @export #' @examples #' p <- ggplot(mtcars, aes(wt, mpg)) diff --git a/R/geom-ribbon.r b/R/geom-ribbon.r index fb292cb0fa..17df0ed118 100644 --- a/R/geom-ribbon.r +++ b/R/geom-ribbon.r @@ -1,8 +1,9 @@ #' Ribbons and area plots #' -#' For each x value, `geom_ribbon` displays a y interval defined -#' by `ymin` and `ymax`. `geom_area` is a special case of -#' `geom_ribbon`, where the `ymin` is fixed to 0. +#' For each x value, `geom_ribbon()` displays a y interval defined +#' by `ymin` and `ymax`. `geom_area()` is a special case of +#' `geom_ribbon`, where the `ymin` is fixed to 0 and `y` is used instead +#' of `ymax`. #' #' An area plot is the continuous analogue of a stacked bar chart (see #' [geom_bar()]), and can be used to show how composition of the diff --git a/R/geom-vline.r b/R/geom-vline.r index 8c68a79761..3c22e0b1c9 100644 --- a/R/geom-vline.r +++ b/R/geom-vline.r @@ -11,14 +11,14 @@ geom_vline <- function(mapping = NULL, data = NULL, # Act like an annotation if (!missing(xintercept)) { - # Warn if supplied mapping is going to be overwritten - if (!missing(mapping)) { - warning(paste0("Using both `xintercept` and `mapping` may not have the", - " desired result as mapping is overwritten if", - " `xintercept` is specified\n" - ) - ) + # Warn if supplied mapping and/or data is going to be overwritten + if (!is.null(mapping)) { + warn_overwritten_args("geom_vline()", "mapping", "xintercept") } + if (!is.null(data)) { + warn_overwritten_args("geom_vline()", "data", "xintercept") + } + data <- new_data_frame(list(xintercept = xintercept)) mapping <- aes(xintercept = xintercept) show.legend <- FALSE diff --git a/R/guides-axis.r b/R/guides-axis.r index 764be5a4db..d7bc5449ed 100644 --- a/R/guides-axis.r +++ b/R/guides-axis.r @@ -1,136 +1,258 @@ -# Grob for axes -# -# @param position of ticks -# @param labels at ticks -# @param position of axis (top, bottom, left or right) -# @param range of data values -guide_axis <- function(at, labels, position = "right", theme) { - line <- switch(position, - top = element_render(theme, "axis.line.x.top", c(0, 1), c(0, 0), id.lengths = 2), - bottom = element_render(theme, "axis.line.x.bottom", c(0, 1), c(1, 1), id.lengths = 2), - right = element_render(theme, "axis.line.y.right", c(0, 0), c(0, 1), id.lengths = 2), - left = element_render(theme, "axis.line.y.left", c(1, 1), c(0, 1), id.lengths = 2) - ) - position <- match.arg(position, c("top", "bottom", "right", "left")) - - zero <- unit(0, "npc") - one <- unit(1, "npc") - - if (length(at) == 0) { - vertical <- position %in% c("left", "right") - return(absoluteGrob( - gList(line), - width = if (vertical) zero else one, - height = if (vertical) one else zero - )) + +#' Grob for axes +#' +#' @param break_position position of ticks +#' @param break_labels labels at ticks +#' @param axis_position position of axis (top, bottom, left or right) +#' @param theme A complete [theme()] object +#' @param check.overlap silently remove overlapping labels, +#' (recursively) prioritizing the first, last, and middle labels. +#' @param angle Compared to setting the angle in [theme()] / [element_text()], +#' this also uses some heuristics to automatically pick the `hjust` and `vjust` that +#' you probably want. +#' @param n_dodge The number of rows (for vertical axes) or columns (for +#' horizontal axes) that should be used to render the labels. This is +#' useful for displaying labels that would otherwise overlap. 
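A small sketch of the label-thinning order that `check.overlap` relies on (shown for review only; it just calls the internal `axis_label_priority()` helper that this patch defines further down in this file):

```r
# With check.overlap = TRUE, labels are kept in this priority order when they
# collide: first, last, middle, then recursing into each half.
ggplot2:::axis_label_priority(9)
#> [1] 1 9 5 3 2 4 7 6 8
```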
+#' +#' @noRd +#' +draw_axis <- function(break_positions, break_labels, axis_position, theme, + check.overlap = FALSE, angle = NULL, n_dodge = 1) { + + axis_position <- match.arg(axis_position, c("top", "bottom", "right", "left")) + aesthetic <- if (axis_position %in% c("top", "bottom")) "x" else "y" + + # resolve elements + line_element_name <- paste0("axis.line.", aesthetic, ".", axis_position) + tick_element_name <- paste0("axis.ticks.", aesthetic, ".", axis_position) + tick_length_element_name <- paste0("axis.ticks.length.", aesthetic, ".", axis_position) + label_element_name <- paste0("axis.text.", aesthetic, ".", axis_position) + + line_element <- calc_element(line_element_name, theme) + tick_element <- calc_element(tick_element_name, theme) + tick_length <- calc_element(tick_length_element_name, theme) + label_element <- calc_element(label_element_name, theme) + + # override label element parameters for rotation + if (inherits(label_element, "element_text")) { + label_overrides <- axis_label_element_overrides(axis_position, angle) + # label_overrides is always an element_text(), but in order for the merge to + # keep the new class, the override must also have the new class + class(label_overrides) <- class(label_element) + label_element <- merge_element(label_overrides, label_element) } - at <- unit(at, "native") - - theme$axis.ticks.length.x.bottom <- theme$axis.ticks.length.x.bottom %||% - theme$axis.ticks.length.x %||% - theme$axis.ticks.length - theme$axis.ticks.length.x.top <- theme$axis.ticks.length.x.top %||% - theme$axis.ticks.length.x %||% - theme$axis.ticks.length - theme$axis.ticks.length.y.left <- theme$axis.ticks.length.y.left %||% - theme$axis.ticks.length.y %||% - theme$axis.ticks.length - theme$axis.ticks.length.y.right <- theme$axis.ticks.length.y.right %||% - theme$axis.ticks.length.y %||% - theme$axis.ticks.length - - label_render <- switch(position, - top = "axis.text.x.top", bottom = "axis.text.x.bottom", - left = "axis.text.y.left", right = "axis.text.y.right" - ) + # conditionally set parameters that depend on axis orientation + is_vertical <- axis_position %in% c("left", "right") - label_x <- switch(position, - top = , - bottom = at, - right = theme$axis.ticks.length.y.right, - left = one - theme$axis.ticks.length.y.left - ) - label_y <- switch(position, - top = theme$axis.ticks.length.x.top, - bottom = one - theme$axis.ticks.length.x.bottom, - right = , - left = at + position_dim <- if (is_vertical) "y" else "x" + non_position_dim <- if (is_vertical) "x" else "y" + position_size <- if (is_vertical) "height" else "width" + non_position_size <- if (is_vertical) "width" else "height" + gtable_element <- if (is_vertical) gtable_row else gtable_col + measure_gtable <- if (is_vertical) gtable_width else gtable_height + measure_labels_non_pos <- if (is_vertical) grobWidth else grobHeight + + # conditionally set parameters that depend on which side of the panel + # the axis is on + is_second <- axis_position %in% c("right", "top") + + tick_direction <- if (is_second) 1 else -1 + non_position_panel <- if (is_second) unit(0, "npc") else unit(1, "npc") + tick_coordinate_order <- if (is_second) c(2, 1) else c(1, 2) + + # conditionally set the gtable ordering + labels_first_gtable <- axis_position %in% c("left", "top") # refers to position in gtable + + # set common parameters + n_breaks <- length(break_positions) + opposite_positions <- c("top" = "bottom", "bottom" = "top", "right" = "left", "left" = "right") + axis_position_opposite <- 
unname(opposite_positions[axis_position]) + + # draw elements + line_grob <- exec( + element_grob, line_element, + !!position_dim := unit(c(0, 1), "npc"), + !!non_position_dim := unit.c(non_position_panel, non_position_panel) ) - if (is.list(labels)) { - if (any(sapply(labels, is.language))) { - labels <- do.call(expression, labels) + if (n_breaks == 0) { + return( + absoluteGrob( + gList(line_grob), + width = grobWidth(line_grob), + height = grobHeight(line_grob) + ) + ) + } + + # break_labels can be a list() of language objects + if (is.list(break_labels)) { + if (any(vapply(break_labels, is.language, logical(1)))) { + break_labels <- do.call(expression, break_labels) } else { - labels <- unlist(labels) + break_labels <- unlist(break_labels) } } - labels <- switch(position, - top = , - bottom = element_render(theme, label_render, labels, x = label_x, margin_y = TRUE), - right = , - left = element_render(theme, label_render, labels, y = label_y, margin_x = TRUE)) - - - - nticks <- length(at) - - ticks <- switch(position, - top = element_render(theme, "axis.ticks.x.top", - x = rep(at, each = 2), - y = rep(unit.c(zero, theme$axis.ticks.length.x.top), nticks), - id.lengths = rep(2, nticks)), - bottom = element_render(theme, "axis.ticks.x.bottom", - x = rep(at, each = 2), - y = rep(unit.c(one - theme$axis.ticks.length.x.bottom, one), nticks), - id.lengths = rep(2, nticks)), - right = element_render(theme, "axis.ticks.y.right", - x = rep(unit.c(zero, theme$axis.ticks.length.y.right), nticks), - y = rep(at, each = 2), - id.lengths = rep(2, nticks)), - left = element_render(theme, "axis.ticks.y.left", - x = rep(unit.c(one - theme$axis.ticks.length.y.left, one), nticks), - y = rep(at, each = 2), - id.lengths = rep(2, nticks)) - ) + # calculate multiple rows/columns of labels (which is usually 1) + dodge_pos <- rep(seq_len(n_dodge), length.out = n_breaks) + dodge_indices <- split(seq_len(n_breaks), dodge_pos) - # Create the gtable for the ticks + labels - gt <- switch(position, - top = gtable_col("axis", - grobs = list(labels, ticks), - width = one, - heights = unit.c(grobHeight(labels), theme$axis.ticks.length.x.top) - ), - bottom = gtable_col("axis", - grobs = list(ticks, labels), - width = one, - heights = unit.c(theme$axis.ticks.length.x.bottom, grobHeight(labels)) - ), - right = gtable_row("axis", - grobs = list(ticks, labels), - widths = unit.c(theme$axis.ticks.length.y.right, grobWidth(labels)), - height = one - ), - left = gtable_row("axis", - grobs = list(labels, ticks), - widths = unit.c(grobWidth(labels), theme$axis.ticks.length.y.left), - height = one + label_grobs <- lapply(dodge_indices, function(indices) { + draw_axis_labels( + break_positions = break_positions[indices], + break_labels = break_labels[indices], + label_element = label_element, + is_vertical = is_vertical, + check.overlap = check.overlap ) + }) + + ticks_grob <- exec( + element_grob, tick_element, + !!position_dim := rep(unit(break_positions, "native"), each = 2), + !!non_position_dim := rep( + unit.c(non_position_panel + (tick_direction * tick_length), non_position_panel)[tick_coordinate_order], + times = n_breaks + ), + id.lengths = rep(2, times = n_breaks) ) - # Viewport for justifying the axis grob - justvp <- switch(position, - top = viewport(y = 0, just = "bottom", height = gtable_height(gt)), - bottom = viewport(y = 1, just = "top", height = gtable_height(gt)), - right = viewport(x = 0, just = "left", width = gtable_width(gt)), - left = viewport(x = 1, just = "right", width = gtable_width(gt)) + # create gtable 
+ non_position_sizes <- paste0(non_position_size, "s") + label_dims <- do.call(unit.c, lapply(label_grobs, measure_labels_non_pos)) + grobs <- c(list(ticks_grob), label_grobs) + grob_dims <- unit.c(tick_length, label_dims) + + if (labels_first_gtable) { + grobs <- rev(grobs) + grob_dims <- rev(grob_dims) + } + + gt <- exec( + gtable_element, + name = "axis", + grobs = grobs, + !!non_position_sizes := grob_dims, + !!position_size := unit(1, "npc") + ) + + # create viewport + justvp <- exec( + viewport, + !!non_position_dim := non_position_panel, + !!non_position_size := measure_gtable(gt), + just = axis_position_opposite ) absoluteGrob( - gList(line, gt), + gList(line_grob, gt), width = gtable_width(gt), height = gtable_height(gt), vp = justvp ) } + +draw_axis_labels <- function(break_positions, break_labels, label_element, is_vertical, + check.overlap = FALSE) { + + position_dim <- if (is_vertical) "y" else "x" + label_margin_name <- if (is_vertical) "margin_x" else "margin_y" + + n_breaks <- length(break_positions) + break_positions <- unit(break_positions, "native") + + if (check.overlap) { + priority <- axis_label_priority(n_breaks) + break_labels <- break_labels[priority] + break_positions <- break_positions[priority] + } + + labels_grob <- exec( + element_grob, label_element, + !!position_dim := break_positions, + !!label_margin_name := TRUE, + label = break_labels, + check.overlap = check.overlap + ) +} + +#' Determine the label priority for a given number of labels +#' +#' @param n The number of labels +#' +#' @return The vector `seq_len(n)` arranged such that the +#' first, last, and middle elements are recursively +#' placed at the beginning of the vector. +#' @noRd +#' +axis_label_priority <- function(n) { + if (n <= 0) { + return(numeric(0)) + } + + c(1, n, axis_label_priority_between(1, n)) +} + +axis_label_priority_between <- function(x, y) { + n <- y - x + 1 + if (n <= 2) { + return(numeric(0)) + } + + mid <- x - 1 + (n + 1) %/% 2 + c( + mid, + axis_label_priority_between(x, mid), + axis_label_priority_between(mid, y) + ) +} + +#' Override axis text angle and alignment +#' +#' @param axis_position One of bottom, left, top, or right +#' @param angle The text angle, or NULL to override nothing +#' +#' @return An [element_text()] that contains parameters that should be +#' overridden from the user- or theme-supplied element. +#' @noRd +#' +axis_label_element_overrides <- function(axis_position, angle = NULL) { + if (is.null(angle)) { + return(element_text(angle = NULL, hjust = NULL, vjust = NULL)) + } + + # it is not worth the effort to align upside-down labels properly + if (angle > 90 || angle < -90) { + stop("`angle` must be between 90 and -90", call. = FALSE) + } + + if (axis_position == "bottom") { + element_text( + angle = angle, + hjust = if (angle > 0) 1 else if (angle < 0) 0 else 0.5, + vjust = if (abs(angle) == 90) 0.5 else 1 + ) + } else if (axis_position == "left") { + element_text( + angle = angle, + hjust = if (abs(angle) == 90) 0.5 else 1, + vjust = if (angle > 0) 0 else if (angle < 0) 1 else 0.5, + ) + } else if (axis_position == "top") { + element_text( + angle = angle, + hjust = if (angle > 0) 0 else if (angle < 0) 1 else 0.5, + vjust = if (abs(angle) == 90) 0.5 else 0 + ) + } else if (axis_position == "right") { + element_text( + angle = angle, + hjust = if (abs(angle) == 90) 0.5 else 0, + vjust = if (angle > 0) 1 else if (angle < 0) 0 else 0.5, + ) + } else { + stop("Unrecognized position: '", axis_position, "'", call. 
= FALSE) + } +} diff --git a/R/labeller.r b/R/labeller.r index d5c86cbd73..ef5565437e 100644 --- a/R/labeller.r +++ b/R/labeller.r @@ -353,7 +353,7 @@ as_labeller <- function(x, default = label_value, multi_line = TRUE) { #' used with lookup tables or non-labeller functions. #' @family facet labeller #' @seealso [as_labeller()], \link{labellers} -#' @return A labeller function to supply to [facet_grid()] +#' @return A labeller function to supply to [facet_grid()] or [facet_wrap()] #' for the argument `labeller`. #' @export #' @examples diff --git a/R/layer.r b/R/layer.r index f0427c16eb..cf0cfd15fd 100644 --- a/R/layer.r +++ b/R/layer.r @@ -238,10 +238,14 @@ Layer <- ggproto("Layer", NULL, scales_add_defaults(plot$scales, data, aesthetics, plot$plot_env) - # Evaluate and check aesthetics + # Evaluate aesthetics evaled <- lapply(aesthetics, eval_tidy, data = data) evaled <- compact(evaled) + # Check for discouraged usage in mapping + warn_for_aes_extract_usage(aesthetics, data[setdiff(names(data), "PANEL")]) + + # Check aesthetic values nondata_cols <- check_nondata_cols(evaled) if (length(nondata_cols) > 0) { msg <- paste0( diff --git a/R/limits.r b/R/limits.r index 6fa2dae289..7d989b4e42 100644 --- a/R/limits.r +++ b/R/limits.r @@ -16,7 +16,9 @@ #' A date-time value will create a continuous date/time scale. #' @seealso For changing x or y axis limits \strong{without} dropping data #' observations, see [coord_cartesian()]. To expand the range of -#' a plot to always include certain values, see [expand_limits()]. +#' a plot to always include certain values, see [expand_limits()]. For other +#' types of data, see [scale_x_discrete()], [scale_x_continuous()], [scale_x_date()]. +#' #' @export #' @examples #' # Zoom into a specified area @@ -46,6 +48,30 @@ #' ggplot(big, aes(mpg, wt, colour = factor(cyl))) + #' geom_point() + #' lims(colour = c("4", "6", "8")) +#' +#' # There are two ways of setting the axis limits: with limits or +#' # with coordinate systems. They work in two rather different ways. +#' +#' last_month <- Sys.Date() - 0:59 +#' df <- data.frame( +#' date = last_month, +#' price = c(rnorm(30, mean = 15), runif(30) + 0.2 * (1:30)) +#' ) +#' +#' p <- ggplot(df, aes(date, price)) + +#' geom_line() + +#' stat_smooth() +#' +#' p +#' +#' # Setting the limits with the scale discards all data outside the range. +#' p + lims(x= c(Sys.Date() - 30, NA), y = c(10, 20)) +#' +#' # For changing x or y axis limits **without** dropping data +#' # observations use [coord_cartesian()]. Setting the limits on the +#' # coordinate system performs a visual zoom. +#' p + coord_cartesian(xlim =c(Sys.Date() - 30, NA), ylim = c(10, 20)) +#' lims <- function(...) { args <- list(...) 
diff --git a/R/margins.R b/R/margins.R index 6314981f6a..b85f37a5fe 100644 --- a/R/margins.R +++ b/R/margins.R @@ -37,7 +37,7 @@ margin_width <- function(grob, margins) { #' #' @noRd title_spec <- function(label, x, y, hjust, vjust, angle, gp = gpar(), - debug = FALSE) { + debug = FALSE, check.overlap = FALSE) { if (is.null(label)) return(zeroGrob()) @@ -56,7 +56,8 @@ title_spec <- function(label, x, y, hjust, vjust, angle, gp = gpar(), hjust = hjust, vjust = vjust, rot = angle, - gp = gp + gp = gp, + check.overlap = check.overlap ) # The grob dimensions don't include the text descenders, so these need to be added @@ -175,7 +176,7 @@ add_margins <- function(grob, height, width, margin = NULL, #' @noRd titleGrob <- function(label, x, y, hjust, vjust, angle = 0, gp = gpar(), margin = NULL, margin_x = FALSE, margin_y = FALSE, - debug = FALSE) { + debug = FALSE, check.overlap = FALSE) { if (is.null(label)) return(zeroGrob()) @@ -189,7 +190,8 @@ titleGrob <- function(label, x, y, hjust, vjust, angle = 0, gp = gpar(), vjust = vjust, angle = angle, gp = gp, - debug = debug + debug = debug, + check.overlap = check.overlap ) add_margins( diff --git a/R/plot-construction.r b/R/plot-construction.r index c8ebca0378..4f49d9d16b 100644 --- a/R/plot-construction.r +++ b/R/plot-construction.r @@ -151,6 +151,11 @@ ggplot_add.list <- function(object, plot, object_name) { } plot } +#' @export +ggplot_add.by <- function(object, plot, object_name) { + ggplot_add.list(object, plot, object_name) +} + #' @export ggplot_add.Layer <- function(object, plot, object_name) { plot$layers <- append(plot$layers, object) diff --git a/R/position-dodge.r b/R/position-dodge.r index 6ad71f8edf..dd9f67fe52 100644 --- a/R/position-dodge.r +++ b/R/position-dodge.r @@ -3,7 +3,8 @@ #' Dodging preserves the vertical position of an geom while adjusting the #' horizontal position. `position_dodge2` is a special case of `position_dodge` #' for arranging box plots, which can have variable widths. `position_dodge2` -#' also works with bars and rectangles. +#' also works with bars and rectangles. But unlike `position_dodge`, +#' `position_dodge2` works without a grouping variable in a layer. #' #' @inheritParams position_identity #' @param width Dodging width, when different to the width of the individual diff --git a/R/position-stack.r b/R/position-stack.r index a34162d160..7e42a8aef3 100644 --- a/R/position-stack.r +++ b/R/position-stack.r @@ -177,6 +177,8 @@ PositionStack <- ggproto("PositionStack", Position, } negative <- data$ymax < 0 + negative[is.na(negative)] <- FALSE + neg <- data[negative, , drop = FALSE] pos <- data[!negative, , drop = FALSE] diff --git a/R/save.r b/R/save.r index 41ad6dcce8..965d9c115e 100644 --- a/R/save.r +++ b/R/save.r @@ -18,7 +18,9 @@ #' @param device Device to use. Can either be a device function #' (e.g. [png()]), or one of "eps", "ps", "tex" (pictex), #' "pdf", "jpeg", "tiff", "png", "bmp", "svg" or "wmf" (windows only). -#' @param path Path to save plot to (combined with filename). +#' @param path Path of the directory to save plot to: `path` and `filename` +#' are combined to create the fully qualified file name. Defaults to the +#' working directory. #' @param scale Multiplicative scaling factor. #' @param width,height,units Plot size in `units` ("in", "cm", or "mm"). #' If not supplied, uses the size of current graphics device. 
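As a usage sketch of the clarified `ggsave()` `path` argument above (the `figures/` directory and file name are invented for illustration, and the directory is assumed to already exist):

```r
library(ggplot2)
p <- ggplot(mpg, aes(displ, hwy)) + geom_point()

# `path` names the directory and `filename` the file; the two are combined,
# so both calls below are expected to write figures/mpg-plot.png
ggsave("mpg-plot.png", p, path = "figures", width = 6, height = 4, units = "in")
ggsave("figures/mpg-plot.png", p, width = 6, height = 4, units = "in")
```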
diff --git a/R/scale-.r b/R/scale-.r
index 40c1ef33a5..9ccd77b42d 100644
--- a/R/scale-.r
+++ b/R/scale-.r
@@ -1,11 +1,282 @@
+
+#' Continuous scale constructor
+#'
+#' @export
+#' @param aesthetics The names of the aesthetics that this scale works with.
+#' @param scale_name The name of the scale that should be used for error messages
+#' associated with this scale.
+#' @param palette A palette function that when called with a numeric vector with
+#' values between 0 and 1 returns the corresponding output values
+#' (e.g., [scales::area_pal()]).
+#' @param name The name of the scale. Used as the axis or legend title. If
+#' `waiver()`, the default, the name of the scale is taken from the first
+#' mapping used for that aesthetic. If `NULL`, the legend title will be
+#' omitted.
+#' @param breaks One of:
+#' - `NULL` for no breaks
+#' - `waiver()` for the default breaks computed by the
+#' [transformation object][scales::trans_new()]
+#' - A numeric vector of positions
+#' - A function that takes the limits as input and returns breaks
+#' as output (e.g., a function returned by [scales::extended_breaks()])
+#' @param minor_breaks One of:
+#' - `NULL` for no minor breaks
+#' - `waiver()` for the default breaks (one minor break between
+#' each major break)
+#' - A numeric vector of positions
+#' - A function that given the limits returns a vector of minor breaks.
+#' @param labels One of:
+#' - `NULL` for no labels
+#' - `waiver()` for the default labels computed by the
+#' transformation object
+#' - A character vector giving labels (must be same length as `breaks`)
+#' - A function that takes the breaks as input and returns labels
+#' as output
+#' @param limits One of:
+#' - `NULL` to use the default scale range
+#' - A numeric vector of length two providing limits of the scale.
+#' Use `NA` to refer to the existing minimum or maximum
+#' - A function that accepts the existing (automatic) limits and returns
+#' new limits
+#' Note that setting limits on positional scales will **remove** data outside of the limits.
+#' If the purpose is to zoom, use the limit argument in the coordinate system
+#' (see [coord_cartesian()]).
+#' @param rescaler A function used to scale the input values to the
+#' range \[0, 1]. This is always [scales::rescale()], except for
+#' diverging and n colour gradients (i.e., [scale_colour_gradient2()],
+#' [scale_colour_gradientn()]). The `rescaler` is ignored by position
+#' scales, which always use [scales::rescale()].
+#' @param oob One of:
+#' - Function that handles limits outside of the scale limits
+#' (out of bounds).
+#' - The default ([scales::censor()]) replaces out of
+#' bounds values with `NA`.
+#' - [scales::squish()] for squishing out of bounds values into range.
+#' - [scales::squish_infinite()] for squishing infinite values into range.
+#' @param na.value Missing values will be replaced with this value.
+#' @param trans For continuous scales, the name of a transformation object
+#' or the object itself. Built-in transformations include "asn", "atanh",
+#' "boxcox", "date", "exp", "hms", "identity", "log", "log10", "log1p", "log2",
+#' "logit", "modulus", "probability", "probit", "pseudo_log", "reciprocal",
+#' "reverse", "sqrt" and "time".
+#'
+#' A transformation object bundles together a transform, its inverse,
+#' and methods for generating breaks and labels. Transformation objects
+#' are defined in the scales package, and are called `_trans` (e.g.,
+#' [scales::boxcox_trans()]).
You can create your own +#' transformation with [scales::trans_new()]. +#' @param guide A function used to create a guide or its name. See +#' [guides()] for more information. +#' @param expand For position scales, a vector of range expansion constants used to add some +#' padding around the data to ensure that they are placed some distance +#' away from the axes. Use the convenience function [expansion()] +#' to generate the values for the `expand` argument. The defaults are to +#' expand the scale by 5\% on each side for continuous variables, and by +#' 0.6 units on each side for discrete variables. +#' @param position For position scales, The position of the axis. +#' `left` or `right` for y axes, `top` or `bottom` for x axes. +#' @param super The super class to use for the constructed scale +#' @keywords internal +continuous_scale <- function(aesthetics, scale_name, palette, name = waiver(), + breaks = waiver(), minor_breaks = waiver(), labels = waiver(), limits = NULL, + rescaler = rescale, oob = censor, expand = waiver(), na.value = NA_real_, + trans = "identity", guide = "legend", position = "left", super = ScaleContinuous) { + + aesthetics <- standardise_aes_names(aesthetics) + + check_breaks_labels(breaks, labels) + + position <- match.arg(position, c("left", "right", "top", "bottom")) + + # If the scale is non-positional, break = NULL means removing the guide + if (is.null(breaks) && all(!is_position_aes(aesthetics))) { + guide <- "none" + } + + trans <- as.trans(trans) + if (!is.null(limits) && !is.function(limits)) { + limits <- trans$transform(limits) + } + + ggproto(NULL, super, + call = match.call(), + + aesthetics = aesthetics, + scale_name = scale_name, + palette = palette, + + range = continuous_range(), + limits = limits, + trans = trans, + na.value = na.value, + expand = expand, + rescaler = rescaler, + oob = oob, + + name = name, + breaks = breaks, + minor_breaks = minor_breaks, + + labels = labels, + guide = guide, + position = position + ) +} + +#' Discrete scale constructor +#' +#' @export +#' @inheritParams continuous_scale +#' @param palette A palette function that when called with a single integer +#' argument (the number of levels in the scale) returns the values that +#' they should take (e.g., [scales::hue_pal()]). +#' @param breaks One of: +#' - `NULL` for no breaks +#' - `waiver()` for the default breaks (the scale limits) +#' - A character vector of breaks +#' - A function that takes the limits as input and returns breaks +#' as output +#' @param limits A character vector that defines possible values of the scale +#' and their order. +#' @param drop Should unused factor levels be omitted from the scale? +#' The default, `TRUE`, uses the levels that appear in the data; +#' `FALSE` uses all the levels in the factor. +#' @param na.translate Unlike continuous scales, discrete scales can easily show +#' missing values, and do so by default. If you want to remove missing values +#' from a discrete scale, specify `na.translate = FALSE`. +#' @param na.value If `na.translate = TRUE`, what value aesthetic +#' value should missing be displayed as? Does not apply to position scales +#' where `NA` is always placed at the far right. 
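As a brief sketch of two of the constructor arguments documented above as they surface in user-facing scales (`trans` for continuous scales, `na.translate` for discrete scales); the data frame and the object form shown are illustrative, assuming the scales package is available:

```r
library(ggplot2)
library(scales)

# `trans`: supply the name of a transformation, or an equivalent trans object
p <- ggplot(diamonds, aes(carat, price)) + geom_point(alpha = 0.1)
p + scale_y_continuous(trans = "log10")
p + scale_y_continuous(trans = log10_trans())

# `na.translate = FALSE`: drop missing values from a discrete scale
df <- data.frame(x = 1:4, y = 1:4, g = c("a", "b", NA, "b"))
ggplot(df, aes(x, y, colour = g)) +
  geom_point(size = 3) +
  scale_colour_discrete(na.translate = FALSE)
```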
+#' @keywords internal
+discrete_scale <- function(aesthetics, scale_name, palette, name = waiver(),
+ breaks = waiver(), labels = waiver(), limits = NULL, expand = waiver(),
+ na.translate = TRUE, na.value = NA, drop = TRUE,
+ guide = "legend", position = "left", super = ScaleDiscrete) {
+
+ aesthetics <- standardise_aes_names(aesthetics)
+
+ check_breaks_labels(breaks, labels)
+
+ position <- match.arg(position, c("left", "right", "top", "bottom"))
+
+ # If the scale is non-positional, break = NULL means removing the guide
+ if (is.null(breaks) && all(!is_position_aes(aesthetics))) {
+ guide <- "none"
+ }
+
+ ggproto(NULL, super,
+ call = match.call(),
+
+ aesthetics = aesthetics,
+ scale_name = scale_name,
+ palette = palette,
+
+ range = discrete_range(),
+ limits = limits,
+ na.value = na.value,
+ na.translate = na.translate,
+ expand = expand,
+
+ name = name,
+ breaks = breaks,
+ labels = labels,
+ drop = drop,
+ guide = guide,
+ position = position
+ )
+}
+
 #' @section Scales:
 #'
-#' All `scale_*` functions (like `scale_x_continuous`) return a
-#' `Scale*` object (like `ScaleContinuous`). The `Scale*`
-#' object represents a single scale.
+#' All `scale_*` functions like [scale_x_continuous()] return a `Scale*`
+#' object like `ScaleContinuous`. Each of the `Scale*` objects is a [ggproto()]
+#' object, descended from the top-level `Scale`.
+#'
+#' Properties not documented in [continuous_scale()] or [discrete_scale()]:
+#'
+#' - `call` The call to [continuous_scale()] or [discrete_scale()] that constructed
+#' the scale.
+#'
+#' - `range` One of `continuous_range()` or `discrete_range()`.
+#'
+#'
+#' Methods:
+#'
+#' - `is_discrete()` Returns `TRUE` if the scale is a discrete scale
+#'
+#' - `is_empty()` Returns `TRUE` if the scale contains no information (i.e.,
+#' it has no information with which to calculate its `limits`).
+#'
+#' - `clone()` Returns a copy of the scale that can be trained
+#' independently without affecting the original scale.
+#'
+#' - `transform()` Transforms a vector of values using `self$trans`.
+#' This occurs before the `Stat` is calculated.
+#'
+#' - `train()` Update the `self$range` of observed (transformed) data values with
+#' a vector of (possibly) new values.
+#'
+#' - `reset()` Reset the `self$range` of observed data values. For discrete
+#' position scales, only the continuous range is reset.
+#'
+#' - `map()` Map transformed data values to some output value as
+#' determined by `self$rescale()` and `self$palette` (except for position scales,
+#' which do not use the default implementation of this method). The output corresponds
+#' to the transformed data value in aesthetic space (e.g., a color, line width, or size).
+#'
+#' - `rescale()` Rescale transformed data to the range 0, 1. This is most useful for
+#' position scales. For continuous scales, `rescale()` uses the `rescaler` that
+#' was provided to the constructor. `rescale()` does not apply `self$oob()` to
+#' its input, which means that discrete values outside `limits` will be `NA`, and
+#' values that are outside `range` will have values less than 0 or greater than 1.
+#' This allows guides more control over how out-of-bounds values are displayed.
+#'
+#' - `transform_df()`, `train_df()`, `map_df()` These `_df` variants
+#' accept a data frame, and apply the `transform`, `train`, and `map` methods
+#' (respectively) to the columns whose names are in `self$aesthetics`.
+#'
+#' - `get_limits()` Calculates the final scale limits in transformed data space
+#' based on the combination of `self$limits` and/or the range of observed values
+#' (`self$range`).
+#'
+#' - `get_breaks()` Calculates the final scale breaks in transformed data space
+#' based on the combination of `self$breaks`, `self$trans$breaks()` (for
+#' continuous scales), and `limits`. Breaks outside of `limits` are assigned
+#' a value of `NA` (continuous scales) or dropped (discrete scales).
+#'
+#' - `get_labels()` Calculates labels for a given set of (transformed) `breaks`
+#' based on the combination of `self$labels` and `breaks`.
+#'
+#' - `get_breaks_minor()` For continuous scales, calculates the final scale minor breaks
+#' in transformed data space based on the rescaled `breaks`, the value of `self$minor_breaks`,
+#' and the value of `self$trans$minor_breaks()`. Discrete scales always return `NULL`.
+#'
+#' - `make_title()` Hook to modify the title that is calculated during guide construction
+#' (for non-position scales) or when the `Layout` calculates the x and y labels
+#' (position scales).
+#'
+#' These methods are only valid for position (x and y) scales:
 #'
-#' Each of the `Scale*` objects is a [ggproto()] object,
-#' descended from the top-level `Scale`.
+#' - `dimension()` For continuous scales, the dimension is the same concept as the limits.
+#' For discrete scales, `dimension()` returns a continuous range, where the limits
+#' would be placed at integer positions. `dimension()` optionally expands
+#' this range given an expansion of length 4 (see [expansion()]).
+#'
+#' - `break_info()` Returns a `list()` with calculated values needed for the `Coord`
+#' to transform values in transformed data space. Axis and grid guides also use
+#' these values to draw guides. This is called with
+#' a (usually expanded) continuous range, such as that returned by `self$dimension()`
+#' (even for discrete scales). The list has components `major_source`
+#' (`self$get_breaks()` for continuous scales, or `seq_along(self$get_breaks())`
+#' for discrete scales), `major` (the rescaled value of `major_source`, ignoring
+#' `self$rescaler`), `minor` (the rescaled value of `minor_source`, ignoring
+#' `self$rescaler`), `range` (the range that was passed in to `break_info()`),
+#' `labels` (the label values, one for each element in `breaks`).
+#'
+#' - `axis_order()` One of `c("primary", "secondary")` or `c("secondary", "primary")`
+#'
+#' - `make_sec_title()` Hook to modify the title for the second axis that is calculated
+#' when the `Layout` calculates the x and y labels.
 #'
 #' @rdname ggplot2-ggproto
 #' @format NULL
@@ -36,10 +307,6 @@ Scale <- ggproto("Scale", NULL,
 stop("Not implemented", call. = FALSE)
 },
- # Train scale from a data frame.
- #
- # @return updated range (invisibly)
- # @seealso [scale_train()] for scale specific generic method
 train_df = function(self, df) {
 if (empty(df)) return()
@@ -50,12 +317,10 @@ Scale <- ggproto("Scale", NULL,
 invisible()
 },
- # Train an individual scale from a vector of data.
 train = function(self, x) {
 stop("Not implemented", call.
= FALSE) }, - # Reset scale, untraining ranges reset = function(self) { self$range$reset() }, @@ -64,12 +329,15 @@ Scale <- ggproto("Scale", NULL, is.null(self$range$range) && is.null(self$limits) }, - # @return list of transformed variables transform_df = function(self, df) { - if (empty(df)) return() + if (empty(df)) { + return() + } aesthetics <- intersect(self$aesthetics, names(df)) - if (length(aesthetics) == 0) return() + if (length(aesthetics) == 0) { + return() + } lapply(df[aesthetics], self$transform) }, @@ -78,13 +346,16 @@ Scale <- ggproto("Scale", NULL, stop("Not implemented", call. = FALSE) }, - # @return list of mapped variables map_df = function(self, df, i = NULL) { - if (empty(df)) return() + if (empty(df)) { + return() + } aesthetics <- intersect(self$aesthetics, names(df)) names(aesthetics) <- aesthetics - if (length(aesthetics) == 0) return() + if (length(aesthetics) == 0) { + return() + } if (is.null(i)) { lapply(aesthetics, function(j) self$map(df[[j]])) @@ -93,25 +364,18 @@ Scale <- ggproto("Scale", NULL, } }, - # @kohske - # map tentatively accept limits argument. - # map replaces oob (i.e., outside limits) values with NA. - # - # Previously limits are always scale_limits(scale). - # But if this function is called to get breaks, - # and breaks spans oob, the oob breaks is replaces by NA. - # This makes impossible to display oob breaks. - # Now coord_train calls this function with limits determined by coord (with expansion). map = function(self, x, limits = self$get_limits()) { stop("Not implemented", call. = FALSE) }, - # if scale is a function, apply it to the default (inverted) scale range - # if scale is NULL, use the default scale range - # if scale contains a NA, use the default range for that axis, otherwise - # use the user defined limit for that axis + rescale = function(self, x, limits = self$get_limits(), range = self$dimension()) { + stop("Not implemented", call. = FALSE) + }, + get_limits = function(self) { - if (self$is_empty()) return(c(0, 1)) + if (self$is_empty()) { + return(c(0, 1)) + } if (is.null(self$limits)) { self$range$range @@ -123,10 +387,7 @@ Scale <- ggproto("Scale", NULL, } }, - # The physical size of the scale. - # This always returns a numeric vector of length 4, giving the physical - # dimensions of a scale. - dimension = function(self, expand = c(0, 0, 0, 0)) { + dimension = function(self, expand = expansion(0, 0), limits = self$get_limits()) { stop("Not implemented", call. = FALSE) }, @@ -134,7 +395,6 @@ Scale <- ggproto("Scale", NULL, stop("Not implemented", call. = FALSE) }, - # The numeric position of scale breaks, used by coord/guide break_positions = function(self, range = self$get_limits()) { self$map(self$get_breaks(range)) }, @@ -147,8 +407,6 @@ Scale <- ggproto("Scale", NULL, stop("Not implemented", call. = FALSE) }, - # Each implementation of a Scale must implement a clone method that makes - # copies of reference objecsts. clone = function(self) { stop("Not implemented", call. = FALSE) }, @@ -157,7 +415,6 @@ Scale <- ggproto("Scale", NULL, stop("Not implemented", call. 
= FALSE) }, - # Only relevant for positional scales axis_order = function(self) { ord <- c("primary", "secondary") if (self$position %in% c("right", "bottom")) { @@ -166,18 +423,22 @@ Scale <- ggproto("Scale", NULL, ord }, - # Here to make it possible for scales to modify the default titles make_title = function(title) { title }, + make_sec_title = function(title) { title } ) check_breaks_labels <- function(breaks, labels) { - if (is.null(breaks)) return(TRUE) - if (is.null(labels)) return(TRUE) + if (is.null(breaks)) { + return(TRUE) + } + if (is.null(labels)) { + return(TRUE) + } bad_labels <- is.atomic(breaks) && is.atomic(labels) && length(breaks) != length(labels) @@ -196,14 +457,16 @@ check_breaks_labels <- function(breaks, labels) { ScaleContinuous <- ggproto("ScaleContinuous", Scale, range = continuous_range(), na.value = NA_real_, - rescaler = rescale, # Used by diverging and n colour gradients x + rescaler = rescale, oob = censor, minor_breaks = waiver(), is_discrete = function() FALSE, train = function(self, x) { - if (length(x) == 0) return() + if (length(x) == 0) { + return() + } self$range$train(x) }, @@ -218,7 +481,7 @@ ScaleContinuous <- ggproto("ScaleContinuous", Scale, }, map = function(self, x, limits = self$get_limits()) { - x <- self$rescaler(self$oob(x, range = limits), from = limits) + x <- self$rescale(self$oob(x, range = limits), limits) uniq <- unique(x) pal <- self$palette(uniq) @@ -227,21 +490,31 @@ ScaleContinuous <- ggproto("ScaleContinuous", Scale, ifelse(!is.na(scaled), scaled, self$na.value) }, - dimension = function(self, expand = c(0, 0, 0, 0)) { - expand_range4(self$get_limits(), expand) + rescale = function(self, x, limits = self$get_limits(), range = limits) { + self$rescaler(x, from = range) + }, + + dimension = function(self, expand = expansion(0, 0), limits = self$get_limits()) { + expand_limits_scale(self, expand, limits) }, get_breaks = function(self, limits = self$get_limits()) { - if (self$is_empty()) return(numeric()) + if (self$is_empty()) { + return(numeric()) + } # Limits in transformed space need to be converted back to data space limits <- self$trans$inverse(limits) if (is.null(self$breaks)) { return(NULL) - } else if (identical(self$breaks, NA)) { - stop("Invalid breaks specification. Use NULL, not NA") - } else if (zero_range(as.numeric(limits))) { + } + + if (identical(self$breaks, NA)) { + stop("Invalid breaks specification. Use NULL, not NA", call. = FALSE) + } + + if (zero_range(as.numeric(limits))) { breaks <- limits[1] } else if (is.waive(self$breaks)) { breaks <- self$trans$breaks(limits) @@ -252,13 +525,10 @@ ScaleContinuous <- ggproto("ScaleContinuous", Scale, } # Breaks in data space need to be converted back to transformed space - # And any breaks outside the dimensions need to be flagged as missing - # - # @kohske - # TODO: replace NA with something else for flag. - # guides cannot discriminate oob from missing value. - breaks <- censor(self$trans$transform(breaks), self$trans$transform(limits), - only.finite = FALSE) + breaks <- self$trans$transform(breaks) + # Any breaks outside the dimensions are flagged as missing + breaks <- censor(breaks, self$trans$transform(limits), only.finite = FALSE) + breaks }, @@ -269,9 +539,13 @@ ScaleContinuous <- ggproto("ScaleContinuous", Scale, if (is.null(self$minor_breaks)) { return(NULL) - } else if (identical(self$minor_breaks, NA)) { + } + + if (identical(self$minor_breaks, NA)) { stop("Invalid minor_breaks specification. Use NULL, not NA", call. 
= FALSE) - } else if (is.waive(self$minor_breaks)) { + } + + if (is.waive(self$minor_breaks)) { if (is.null(b)) { breaks <- NULL } else { @@ -290,24 +564,32 @@ ScaleContinuous <- ggproto("ScaleContinuous", Scale, }, get_labels = function(self, breaks = self$get_breaks()) { - if (is.null(breaks)) return(NULL) + if (is.null(breaks)) { + return(NULL) + } breaks <- self$trans$inverse(breaks) if (is.null(self$labels)) { return(NULL) - } else if (identical(self$labels, NA)) { + } + + if (identical(self$labels, NA)) { stop("Invalid labels specification. Use NULL, not NA", call. = FALSE) - } else if (is.waive(self$labels)) { + } + + if (is.waive(self$labels)) { labels <- self$trans$format(breaks) } else if (is.function(self$labels)) { labels <- self$labels(breaks) } else { labels <- self$labels } + if (length(labels) != length(breaks)) { - stop("Breaks and labels are different lengths") + stop("Breaks and labels are different lengths", call. = FALSE) } + labels }, @@ -339,9 +621,14 @@ ScaleContinuous <- ggproto("ScaleContinuous", Scale, major_n <- rescale(major, from = range) minor_n <- rescale(minor, from = range) - list(range = range, labels = labels, - major = major_n, minor = minor_n, - major_source = major, minor_source = minor) + list( + range = range, + labels = labels, + major = major_n, + minor = minor_n, + major_source = major, + minor_source = minor + ) }, print = function(self, ...) { @@ -367,7 +654,9 @@ ScaleDiscrete <- ggproto("ScaleDiscrete", Scale, is_discrete = function() TRUE, train = function(self, x) { - if (length(x) == 0) return() + if (length(x) == 0) { + return() + } self$range$train(x, drop = self$drop, na.rm = !self$na.translate) }, @@ -380,7 +669,9 @@ ScaleDiscrete <- ggproto("ScaleDiscrete", Scale, if (!is.null(self$n.breaks.cache) && self$n.breaks.cache == n) { pal <- self$palette.cache } else { - if (!is.null(self$n.breaks.cache)) warning("Cached palette does not match requested", call. = FALSE) + if (!is.null(self$n.breaks.cache)) { + warning("Cached palette does not match requested", call. = FALSE) + } pal <- self$palette(n) self$palette.cache <- pal self$n.breaks.cache <- n @@ -400,18 +691,28 @@ ScaleDiscrete <- ggproto("ScaleDiscrete", Scale, } }, - dimension = function(self, expand = c(0, 0, 0, 0)) { - expand_range4(length(self$get_limits()), expand) + rescale = function(self, x, limits = self$get_limits(), range = c(1, length(limits))) { + rescale(x, match(as.character(x), limits), from = range) + }, + + dimension = function(self, expand = expansion(0, 0), limits = self$get_limits()) { + expand_limits_discrete(limits, expand = expand) }, get_breaks = function(self, limits = self$get_limits()) { - if (self$is_empty()) return(numeric()) + if (self$is_empty()) { + return(numeric()) + } if (is.null(self$breaks)) { return(NULL) - } else if (identical(self$breaks, NA)) { + } + + if (identical(self$breaks, NA)) { stop("Invalid breaks specification. Use NULL, not NA", call. = FALSE) - } else if (is.waive(self$breaks)) { + } + + if (is.waive(self$breaks)) { breaks <- limits } else if (is.function(self$breaks)) { breaks <- self$breaks(limits) @@ -419,23 +720,31 @@ ScaleDiscrete <- ggproto("ScaleDiscrete", Scale, breaks <- self$breaks } - # Breaks can only occur only on values in domain - in_domain <- intersect(breaks, self$get_limits()) + # Breaks only occur only on values in domain + in_domain <- intersect(breaks, limits) structure(in_domain, pos = match(in_domain, breaks)) }, get_breaks_minor = function(...) 
NULL, get_labels = function(self, breaks = self$get_breaks()) { - if (self$is_empty()) return(character()) + if (self$is_empty()) { + return(character()) + } - if (is.null(breaks)) return(NULL) + if (is.null(breaks)) { + return(NULL) + } if (is.null(self$labels)) { return(NULL) - } else if (identical(self$labels, NA)) { + } + + if (identical(self$labels, NA)) { stop("Invalid labels specification. Use NULL, not NA", call. = FALSE) - } else if (is.waive(self$labels)) { + } + + if (is.waive(self$labels)) { breaks <- self$get_breaks() if (is.numeric(breaks)) { # Only format numbers, because on Windows, format messes up encoding @@ -490,185 +799,17 @@ ScaleDiscrete <- ggproto("ScaleDiscrete", Scale, major_n <- rescale(major, from = range) } - list(range = range, labels = labels, - major = major_n, minor = NULL, - major_source = major, minor_source = NULL) + list( + range = range, + labels = labels, + major = major_n, + minor = NULL, + major_source = major, + minor_source = NULL + ) } ) - -#' Continuous scale constructor. -#' -#' @export -#' @param aesthetics The names of the aesthetics that this scale works with -#' @param scale_name The name of the scale -#' @param palette A palette function that when called with a numeric vector with -#' values between 0 and 1 returns the corresponding values in the range the -#' scale maps to. -#' @param name The name of the scale. Used as the axis or legend title. If -#' `waiver()`, the default, the name of the scale is taken from the first -#' mapping used for that aesthetic. If `NULL`, the legend title will be -#' omitted. -#' @param breaks One of: -#' - `NULL` for no breaks -#' - `waiver()` for the default breaks computed by the -#' transformation object -#' - A numeric vector of positions -#' - A function that takes the limits as input and returns breaks -#' as output -#' @param minor_breaks One of: -#' - `NULL` for no minor breaks -#' - `waiver()` for the default breaks (one minor break between -#' each major break) -#' - A numeric vector of positions -#' - A function that given the limits returns a vector of minor breaks. -#' @param labels One of: -#' - `NULL` for no labels -#' - `waiver()` for the default labels computed by the -#' transformation object -#' - A character vector giving labels (must be same length as `breaks`) -#' - A function that takes the breaks as input and returns labels -#' as output -#' @param limits One of: -#' - `NULL` to use the default scale range -#' - A numeric vector of length two providing limits of the scale. -#' Use `NA` to refer to the existing minimum or maximum -#' - A function that accepts the existing (automatic) limits and returns -#' new limits -#' @param rescaler Used by diverging and n colour gradients -#' (i.e. [scale_colour_gradient2()], [scale_colour_gradientn()]). -#' A function used to scale the input values to the range \[0, 1]. -#' @param oob Function that handles limits outside of the scale limits -#' (out of bounds). The default replaces out of bounds values with `NA`. -#' @inheritParams scale_x_discrete -#' @param na.value Missing values will be replaced with this value. -#' @param trans Either the name of a transformation object, or the -#' object itself. Built-in transformations include "asn", "atanh", -#' "boxcox", "date", "exp", "hms", "identity", "log", "log10", "log1p", "log2", -#' "logit", "modulus", "probability", "probit", "pseudo_log", "reciprocal", -#' "reverse", "sqrt" and "time". 
-#' -#' A transformation object bundles together a transform, its inverse, -#' and methods for generating breaks and labels. Transformation objects -#' are defined in the scales package, and are called `name_trans`, e.g. -#' [scales::boxcox_trans()]. You can create your own -#' transformation with [scales::trans_new()]. -#' @param guide A function used to create a guide or its name. See -#' [guides()] for more info. -#' @param position The position of the axis. "left" or "right" for vertical -#' scales, "top" or "bottom" for horizontal scales -#' @param super The super class to use for the constructed scale -#' @keywords internal -continuous_scale <- function(aesthetics, scale_name, palette, name = waiver(), - breaks = waiver(), minor_breaks = waiver(), labels = waiver(), limits = NULL, - rescaler = rescale, oob = censor, expand = waiver(), na.value = NA_real_, - trans = "identity", guide = "legend", position = "left", super = ScaleContinuous) { - - aesthetics <- standardise_aes_names(aesthetics) - - check_breaks_labels(breaks, labels) - - position <- match.arg(position, c("left", "right", "top", "bottom")) - - # If the scale is non-positional, break = NULL means removing the guide - if (is.null(breaks) && all(!is_position_aes(aesthetics))) { - guide <- "none" - } - - trans <- as.trans(trans) - if (!is.null(limits) && !is.function(limits)) { - limits <- trans$transform(limits) - } - - ggproto(NULL, super, - call = match.call(), - - aesthetics = aesthetics, - scale_name = scale_name, - palette = palette, - - range = continuous_range(), - limits = limits, - trans = trans, - na.value = na.value, - expand = expand, - rescaler = rescaler, # Used by diverging and n colour gradients - oob = oob, - - name = name, - breaks = breaks, - minor_breaks = minor_breaks, - - labels = labels, - guide = guide, - position = position - ) -} - -#' Discrete scale constructor. -#' -#' @export -#' @inheritParams continuous_scale -#' @param palette A palette function that when called with a single integer -#' argument (the number of levels in the scale) returns the values that -#' they should take. -#' @param breaks One of: -#' - `NULL` for no breaks -#' - `waiver()` for the default breaks computed by the -#' transformation object -#' - A character vector of breaks -#' - A function that takes the limits as input and returns breaks -#' as output -#' @param limits A character vector that defines possible values of the scale -#' and their order. -#' @param drop Should unused factor levels be omitted from the scale? -#' The default, `TRUE`, uses the levels that appear in the data; -#' `FALSE` uses all the levels in the factor. -#' @param na.translate Unlike continuous scales, discrete scales can easily show -#' missing values, and do so by default. If you want to remove missing values -#' from a discrete scale, specify `na.translate = FALSE`. -#' @param na.value If `na.translate = TRUE`, what value aesthetic -#' value should missing be displayed as? Does not apply to position scales -#' where `NA` is always placed at the far right. 
-#' @keywords internal -discrete_scale <- function(aesthetics, scale_name, palette, name = waiver(), - breaks = waiver(), labels = waiver(), limits = NULL, expand = waiver(), - na.translate = TRUE, na.value = NA, drop = TRUE, - guide = "legend", position = "left", super = ScaleDiscrete) { - - aesthetics <- standardise_aes_names(aesthetics) - - check_breaks_labels(breaks, labels) - - position <- match.arg(position, c("left", "right", "top", "bottom")) - - # If the scale is non-positional, break = NULL means removing the guide - if (is.null(breaks) && all(!is_position_aes(aesthetics))) { - guide <- "none" - } - - ggproto(NULL, super, - call = match.call(), - - aesthetics = aesthetics, - scale_name = scale_name, - palette = palette, - - range = discrete_range(), - limits = limits, - na.value = na.value, - na.translate = na.translate, - expand = expand, - - name = name, - breaks = breaks, - labels = labels, - drop = drop, - guide = guide, - position = position - ) -} - # In place modification of a scale to change the primary axis scale_flip_position <- function(scale) { scale$position <- switch(scale$position, diff --git a/R/scale-brewer.r b/R/scale-brewer.r index b9c5a36eb6..ccbfc869ac 100644 --- a/R/scale-brewer.r +++ b/R/scale-brewer.r @@ -23,11 +23,15 @@ #' \item{Sequential}{Blues, BuGn, BuPu, GnBu, Greens, Greys, Oranges, #' OrRd, PuBu, PuBuGn, PuRd, Purples, RdPu, Reds, YlGn, YlGnBu, YlOrBr, YlOrRd} #' } +#' Modify the palette through the `palette` arguement. #' #' @inheritParams scales::brewer_pal #' @inheritParams scale_colour_hue #' @inheritParams scale_colour_gradient #' @inheritParams scales::gradient_n_pal +#' @param palette If a string, will use that named palette. If a number, will index into +#' the list of palettes of appropriate `type`. The list of available palettes can found +#' in the Palettes section. #' @param ... Other arguments passed on to [discrete_scale()] or, for #' `distiller` scales, [continuous_scale()] to control name, #' limits, breaks, labels and so forth. diff --git a/R/scale-colour.r b/R/scale-colour.r index d3fd460336..d50fb9e22c 100644 --- a/R/scale-colour.r +++ b/R/scale-colour.r @@ -4,7 +4,7 @@ #' `ggplot2.continuous.colour` and `ggplot2.continuous.fill` options. If these #' options are not present, `"gradient"` will be used. See [options()] for more #' information. -#' +#' #' @param ... Additional parameters passed on to the scale type #' @param type One of "gradient" (the default) or "viridis" indicating the #' colour scale to use @@ -12,14 +12,28 @@ #' [scale_fill_gradient()], and [scale_fill_viridis_c()] #' @export #' @rdname scale_colour_continuous +#' @section Color Blindness: +#' Many color palettes derived from RGB combinations (like the "rainbow" color +#' palette) are not suitable to support all viewers, especially those with +#' color vision deficiencies. Using `viridis` type, which is perceptually +#' uniform in both colour and black-and-white display is an easy option to +#' ensure good perceptive properties of your visulizations. +#' The colorspace package offers functionalities +#' - to generate color palettes with good perceptive properties, +#' - to analyse a given color palette, like emulating color blindness, +#' - and to modify a given color palette for better perceptivity. +#' +#' For more information on color vision deficiencies and suitable color choices +#' see the [paper on the colorspace package](https://arxiv.org/abs/1903.06490) +#' and references therein. 
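In the spirit of the Color Blindness note above, a small sketch of switching continuous colour scales to viridis, either per plot via `type` or session-wide via the `ggplot2.continuous.colour`/`ggplot2.continuous.fill` options referenced in this documentation (illustrative usage, not part of the diff):

```r
library(ggplot2)
v <- ggplot(faithfuld, aes(waiting, eruptions, fill = density)) + geom_tile()

# Per plot
v + scale_fill_continuous(type = "viridis")

# Session-wide defaults for continuous colour and fill scales
options(
  ggplot2.continuous.colour = "viridis",
  ggplot2.continuous.fill = "viridis"
)
v  # now picks up viridis without an explicit scale
```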
#' @examples #' v <- ggplot(faithfuld, aes(waiting, eruptions, fill = density)) + #' geom_tile() #' v -#' +#' #' v + scale_fill_continuous(type = "gradient") #' v + scale_fill_continuous(type = "viridis") -#' +#' #' # The above are equivalent to #' v + scale_fill_gradient() #' v + scale_fill_viridis_c() diff --git a/R/scale-continuous.r b/R/scale-continuous.r index f7285d4acd..662f39b913 100644 --- a/R/scale-continuous.r +++ b/R/scale-continuous.r @@ -70,9 +70,8 @@ NULL #' @rdname scale_continuous #' -#' @param sec.axis specify a secondary axis +#' @param sec.axis [sec_axis()] is used to specify a secondary axis. #' -#' @seealso [sec_axis()] for how to specify secondary axes #' @export scale_x_continuous <- function(name = waiver(), breaks = waiver(), minor_breaks = waiver(), labels = waiver(), diff --git a/R/scale-discrete-.r b/R/scale-discrete-.r index 7d1d5999b3..afcc4a0794 100644 --- a/R/scale-discrete-.r +++ b/R/scale-discrete-.r @@ -1,20 +1,17 @@ #' Position scales for discrete data #' +#' `scale_x_discrete` and `scale_y_discrete` are used to set the values for +#' discrete x and y scale aesthetics. For simple manipulation of scale labels +#' and limits, you may wish to use [labs()] and [lims()] instead. +#' #' You can use continuous positions even with a discrete position scale - #' this allows you (e.g.) to place labels between bars in a bar chart. #' Continuous positions are numeric values starting at one for the first #' level, and increasing by one for each level (i.e. the labels are placed #' at integer positions). This is what allows jittering to work. #' -#' @inheritDotParams discrete_scale -expand -position -#' @param expand Vector of range expansion constants used to add some -#' padding around the data, to ensure that they are placed some distance -#' away from the axes. Use the convenience function [expand_scale()] -#' to generate the values for the `expand` argument. The defaults are to -#' expand the scale by 5\% on each side for continuous variables, and by -#' 0.6 units on each side for discrete variables. -#' @param position The position of the axis. 
`left` or `right` for y -#' axes, `top` or `bottom` for x axes +#' @inheritDotParams discrete_scale +#' @inheritParams discrete_scale #' @rdname scale_discrete #' @family position scales #' @export @@ -86,9 +83,17 @@ ScaleDiscretePosition <- ggproto("ScaleDiscretePosition", ScaleDiscrete, }, get_limits = function(self) { - if (self$is_empty()) return(c(0, 1)) - - self$limits %||% self$range$range %||% integer() + if (self$is_empty()) { + c(0, 1) + } else if (!is.null(self$limits) & !is.function(self$limits)){ + self$limits + } else if (is.null(self$limits)) { + self$range$range + } else if (is.function(self$limits)) { + self$limits(self$range$range) + } else { + integer(0) + } }, is_empty = function(self) { @@ -96,7 +101,7 @@ ScaleDiscretePosition <- ggproto("ScaleDiscretePosition", ScaleDiscrete, }, reset = function(self) { - # Can't reset discrete scale because no way to recover values + # Can't reset discrete position scale because no way to recover values self$range_c$reset() }, @@ -108,26 +113,12 @@ ScaleDiscretePosition <- ggproto("ScaleDiscretePosition", ScaleDiscrete, } }, - dimension = function(self, expand = c(0, 0, 0, 0)) { - c_range <- self$range_c$range - d_range <- self$get_limits() - - if (self$is_empty()) { - c(0, 1) - } else if (is.null(self$range$range)) { # only continuous - expand_range4(c_range, expand) - } else if (is.null(c_range)) { # only discrete - expand_range4(c(1, length(d_range)), expand) - } else { # both - range( - c_range, - expand_range4(c(1, length(d_range)), expand) - ) - } + rescale = function(self, x, limits = self$get_limits(), range = self$dimension(limits = limits)) { + rescale(self$map(x, limits = limits), from = range) }, - get_breaks = function(self, limits = self$get_limits()) { - ggproto_parent(ScaleDiscrete, self)$get_breaks(limits) + dimension = function(self, expand = expansion(0, 0), limits = self$get_limits()) { + expand_limits_scale(self, expand, limits) }, clone = function(self) { diff --git a/R/scale-expansion.r b/R/scale-expansion.r new file mode 100644 index 0000000000..ee0e6952b4 --- /dev/null +++ b/R/scale-expansion.r @@ -0,0 +1,236 @@ + +#' Generate expansion vector for scales +#' +#' This is a convenience function for generating scale expansion vectors +#' for the `expand` argument of [scale_(x|y)_continuous][scale_x_continuous()] +#' and [scale_(x|y)_discrete][scale_x_discrete()]. The expansion vectors are used to +#' add some space between the data and the axes. +#' +#' @param mult vector of multiplicative range expansion factors. +#' If length 1, both the lower and upper limits of the scale +#' are expanded outwards by `mult`. If length 2, the lower limit +#' is expanded by `mult[1]` and the upper limit by `mult[2]`. +#' @param add vector of additive range expansion constants. +#' If length 1, both the lower and upper limits of the scale +#' are expanded outwards by `add` units. If length 2, the +#' lower limit is expanded by `add[1]` and the upper +#' limit by `add[2]`. 
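For reference, a small sketch of the arithmetic implied by the `mult` and `add` descriptions above, mirroring the four-element `c(mult[1], add[1], mult[2], add[2])` vector that `expansion()` returns (the limits below are made up):

```r
limits <- c(0, 10)                # hypothetical scale limits
expand <- c(0.05, 0, 0.05, 0)     # what expansion(mult = 0.05) returns

# Each side is pushed out by mult * range_width + add
width <- diff(limits)
lower <- limits[1] - (expand[1] * width + expand[2])   #  0 - 0.5 = -0.5
upper <- limits[2] + (expand[3] * width + expand[4])   # 10 + 0.5 = 10.5
c(lower, upper)
```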
+#' +#' @export +#' @examples +#' # No space below the bars but 10% above them +#' ggplot(mtcars) + +#' geom_bar(aes(x = factor(cyl))) + +#' scale_y_continuous(expand = expansion(mult = c(0, .1))) +#' +#' # Add 2 units of space on the left and right of the data +#' ggplot(subset(diamonds, carat > 2), aes(cut, clarity)) + +#' geom_jitter() + +#' scale_x_discrete(expand = expansion(add = 2)) +#' +#' # Reproduce the default range expansion used +#' # when the 'expand' argument is not specified +#' ggplot(subset(diamonds, carat > 2), aes(cut, price)) + +#' geom_jitter() + +#' scale_x_discrete(expand = expansion(add = .6)) + +#' scale_y_continuous(expand = expansion(mult = .05)) +#' +expansion <- function(mult = 0, add = 0) { + stopifnot( + is.numeric(mult), (length(mult) %in% 1:2), + is.numeric(add), (length(add) %in% 1:2) + ) + + mult <- rep(mult, length.out = 2) + add <- rep(add, length.out = 2) + c(mult[1], add[1], mult[2], add[2]) +} + +#' @rdname expansion +#' @export +expand_scale <- function(mult = 0, add = 0) { + .Deprecated(msg = "`expand_scale()` is deprecated; use `expansion()` instead.") + expansion(mult, add) +} + +#' Expand a numeric range +#' +#' @param limits A numeric vector of length 2 giving the +#' range to expand. +#' @param expand A numeric vector of length 2 (`c(add, mult)`) +#' or length 4 (`c(mult_left, add_left, mult_right, add_right)`), +#' as generated by [expansion()]. +#' +#' @return The expanded `limits` +#' +#' @noRd +#' +expand_range4 <- function(limits, expand) { + stopifnot( + is.numeric(expand), + length(expand) %in% c(2,4) + ) + + if (all(!is.finite(limits))) { + return(c(-Inf, Inf)) + } + + # If only two expansion constants are given (i.e. the old syntax), + # reuse them to generate a four-element expansion vector + if (length(expand) == 2) { + expand <- c(expand, expand) + } + + # Calculate separate range expansion for the lower and + # upper range limits, and then combine them into one vector + lower <- expand_range(limits, expand[1], expand[2])[1] + upper <- expand_range(limits, expand[3], expand[4])[2] + c(lower, upper) +} + +#' Calculate the default expansion for a scale +#' +#' @param scale A position scale (e.g., [scale_x_continuous()] or [scale_x_discrete()]) +#' @param discrete,continuous Default scale expansion factors for +#' discrete and continuous scales, respectively. +#' @param expand Should any expansion be applied? +#' +#' @return One of `discrete`, `continuous`, or `scale$expand` +#' @noRd +#' +default_expansion <- function(scale, discrete = expansion(add = 0.6), + continuous = expansion(mult = 0.05), expand = TRUE) { + if (!expand) { + return(expansion(0, 0)) + } + + scale$expand %|W|% if (scale$is_discrete()) discrete else continuous +} + +#' Expand limits in (possibly) transformed space +#' +#' These functions calculate the continuous range in coordinate space +#' and in scale space. Usually these can be calculated from +#' each other using the coordinate system transformation, except +#' when transforming and expanding the scale limits results in values outside +#' the domain of the transformation (e.g., a lower limit of 0 with a square root +#' transformation). +#' +#' @param scale A position scale (see [scale_x_continuous()] and [scale_x_discrete()]) +#' @param limits The initial scale limits, in scale-transformed space. +#' @param coord_limits The user-provided limits in scale-transformed space, +#' which may include one more more NA values, in which case those limits +#' will fall back to the `limits`. 
In `expand_limits_scale()`, `coord_limits` +#' are in user data space and can be `NULL` (unspecified), since the transformation +#' from user to mapped space is different for each scale. +#' @param expand An expansion generated by [expansion()] or [default_expansion()]. +#' @param trans The coordinate system transformation. +#' +#' @return A list with components `continuous_range`, which is the +#' expanded range in scale-transformed space, and `continuous_range_coord`, +#' which is the expanded range in coordinate-transformed space. +#' +#' @noRd +#' +expand_limits_scale <- function(scale, expand = expansion(0, 0), limits = waiver(), + coord_limits = NULL) { + limits <- limits %|W|% scale$get_limits() + + if (scale$is_discrete()) { + coord_limits <- coord_limits %||% c(NA_real_, NA_real_) + expand_limits_discrete( + limits, + expand, + coord_limits, + range_continuous = scale$range_c$range + ) + } else { + # using the inverse transform to resolve the NA value is needed for date/datetime/time + # scales, which refuse to transform objects of the incorrect type + coord_limits <- coord_limits %||% scale$trans$inverse(c(NA_real_, NA_real_)) + coord_limits_scale <- scale$trans$transform(coord_limits) + expand_limits_continuous(limits, expand, coord_limits_scale) + } +} + +expand_limits_continuous <- function(limits, expand = expansion(0, 0), coord_limits = c(NA, NA)) { + expand_limits_continuous_trans(limits, expand, coord_limits)$continuous_range +} + +expand_limits_discrete <- function(limits, expand = expansion(0, 0), coord_limits = c(NA, NA), + range_continuous = NULL) { + limit_info <- expand_limits_discrete_trans( + limits, + expand, + coord_limits, + range_continuous = range_continuous + ) + + limit_info$continuous_range +} + +expand_limits_continuous_trans <- function(limits, expand = expansion(0, 0), + coord_limits = c(NA, NA), trans = identity_trans()) { + + # let non-NA coord_limits override the scale limits + limits <- ifelse(is.na(coord_limits), limits, coord_limits) + + # expand limits in coordinate space + continuous_range_coord <- trans$transform(limits) + + # range expansion expects values in increasing order, which may not be true + # for reciprocal/reverse transformations + if (all(is.finite(continuous_range_coord)) && diff(continuous_range_coord) < 0) { + continuous_range_coord <- rev(expand_range4(rev(continuous_range_coord), expand)) + } else { + continuous_range_coord <- expand_range4(continuous_range_coord, expand) + } + + final_scale_limits <- trans$inverse(continuous_range_coord) + + # if any non-finite values were introduced in the transformations, + # replace them with the original scale limits for the purposes of + # calculating breaks and minor breaks from the scale + continuous_range <- ifelse(is.finite(final_scale_limits), final_scale_limits, limits) + + list( + continuous_range_coord = continuous_range_coord, + continuous_range = continuous_range + ) +} + +expand_limits_discrete_trans <- function(limits, expand = expansion(0, 0), + coord_limits = c(NA, NA), trans = identity_trans(), + range_continuous = NULL) { + + n_limits <- length(limits) + is_empty <- is.null(limits) && is.null(range_continuous) + is_only_continuous <- n_limits == 0 + is_only_discrete <- is.null(range_continuous) + + if (is_empty) { + expand_limits_continuous_trans(c(0, 1), expand, coord_limits, trans) + } else if (is_only_continuous) { + expand_limits_continuous_trans(range_continuous, expand, coord_limits, trans) + } else if (is_only_discrete) { + expand_limits_continuous_trans(c(1, 
n_limits), expand, coord_limits, trans) + } else { + # continuous and discrete + limit_info_discrete <- expand_limits_continuous_trans(c(1, n_limits), expand, coord_limits, trans) + + # don't expand continuous range if there is also a discrete range + limit_info_continuous <- expand_limits_continuous_trans( + range_continuous, expansion(0, 0), coord_limits, trans + ) + + # prefer expanded discrete range, but allow continuous range to further expand the range + list( + continuous_range_coord = range( + c(limit_info_discrete$continuous_range_coord, limit_info_continuous$continuous_range_coord) + ), + continuous_range = range( + c(limit_info_discrete$continuous_range, limit_info_continuous$continuous_range) + ) + ) + } +} diff --git a/R/scale-gradient.r b/R/scale-gradient.r index 2877e39417..9e733712e9 100644 --- a/R/scale-gradient.r +++ b/R/scale-gradient.r @@ -29,6 +29,13 @@ #' z2 = abs(rnorm(100)) #' ) #' +#' df_na <- data.frame( +#' value = seq(1, 20), +#' x = runif(20), +#' y = runif(20), +#' z1 = c(rep(NA, 10), rnorm(10)) +#' ) +#' #' # Default colour scale colours from light blue to dark blue #' ggplot(df, aes(x, y)) + #' geom_point(aes(colour = z2)) @@ -54,6 +61,16 @@ #' scale_colour_gradient(low = "white", high = "black") #' # Avoid red-green colour contrasts because ~10% of men have difficulty #' # seeing them +#' +#'# Use `na.value = NA` to hide missing values but keep the original axis range +#' ggplot(df_na, aes(x = value, y)) + +#' geom_bar(aes(fill = z1), stat = "identity") + +#' scale_fill_gradient(low = "yellow", high = "red", na.value = NA) +#' +#' ggplot(df_na, aes(x, y)) + +#' geom_point(aes(colour = z1)) + +#' scale_colour_gradient(low = "yellow", high = "red", na.value = NA) +#' scale_colour_gradient <- function(..., low = "#132B43", high = "#56B1F7", space = "Lab", na.value = "grey50", guide = "colourbar", aesthetics = "colour") { continuous_scale(aesthetics, "gradient", seq_gradient_pal(low, high, space), diff --git a/R/scale-manual.r b/R/scale-manual.r index 890ed96f1c..fc8a28f001 100644 --- a/R/scale-manual.r +++ b/R/scale-manual.r @@ -22,6 +22,20 @@ #' If unnamed, values will be matched in order (usually alphabetical) with #' the limits of the scale. Any data values that don't match will be #' given `na.value`. +#' @section Color Blindness: +#' Many color palettes derived from RGB combinations (like the "rainbow" color +#' palette) are not suitable to support all viewers, especially those with +#' color vision deficiencies. Using `viridis` type, which is perceptually +#' uniform in both colour and black-and-white display is an easy option to +#' ensure good perceptive properties of your visulizations. +#' The colorspace package offers functionalities +#' - to generate color palettes with good perceptive properties, +#' - to analyse a given color palette, like emulating color blindness, +#' - and to modify a given color palette for better perceptivity. +#' +#' For more information on color vision deficiencies and suitable color choices +#' see the [paper on the colorspace package](https://arxiv.org/abs/1903.06490) +#' and references therein. #' @examples #' p <- ggplot(mtcars, aes(mpg, wt)) + #' geom_point(aes(colour = factor(cyl))) diff --git a/R/scale-view.r b/R/scale-view.r new file mode 100644 index 0000000000..13afdba516 --- /dev/null +++ b/R/scale-view.r @@ -0,0 +1,128 @@ + +#' View scale constructor +#' +#' View scales are an implementation of `Scale` objects that have fixed +#' limits, dimension, breaks, labels, and minor breaks. 
They are used as +#' the immutable result of the trained scales that have been assigned +#' `limits` and a `continuous_range` from the coordinate system's +#' implementation of scale expantion. +#' +#' @param scale The scale from which to construct a view scale. +#' @param limits The final scale limits +#' @param continuous_range The final dimensions of the scale +#' +#' @noRd +view_scale_primary <- function(scale, limits = scale$get_limits(), + continuous_range = scale$dimension(limits = limits)) { + + if(!scale$is_discrete()) { + breaks <- scale$get_breaks(continuous_range) + breaks <- breaks[is.finite(breaks)] + minor_breaks <- scale$get_breaks_minor(b = breaks, limits = continuous_range) + } else { + breaks <- scale$get_breaks(limits) + minor_breaks <- scale$get_breaks_minor(b = breaks, limits = limits) + } + + ggproto(NULL, ViewScale, + scale = scale, + aesthetics = scale$aesthetics, + name = scale$name, + scale_is_discrete = scale$is_discrete(), + limits = limits, + continuous_range = continuous_range, + breaks = breaks, + labels = scale$get_labels(breaks), + minor_breaks = minor_breaks + ) +} + +# this function is a hack that is difficult to avoid given the complex implementation of second axes +view_scale_secondary <- function(scale, limits = scale$get_limits(), + continuous_range = scale$dimension(limits = limits)) { + if (is.null(scale$secondary.axis) || is.waive(scale$secondary.axis) || scale$secondary.axis$empty()) { + view_scale_empty() + } else { + scale$secondary.axis$init(scale) + break_info <- scale$secondary.axis$break_info(continuous_range, scale) + names(break_info) <- gsub("sec\\.", "", names(break_info)) + + ggproto(NULL, ViewScale, + scale = scale, + break_info = break_info, + aesthetics = paste0(scale$aesthetics, ".sec"), + name = scale$sec_name(), + make_title = function(self, title) self$scale$make_sec_title(title), + + dimension = function(self) self$break_info$range, + get_limits = function(self) self$break_info$range, + get_breaks = function(self) self$break_info$major_source, + get_breaks_minor = function(self) self$break_info$minor_source, + break_positions = function(self) self$break_info$major, + break_positions_minor = function(self) self$break_info$minor, + get_labels = function(self) self$break_info$labels, + rescale = function(x) rescale(x, from = break_info$range, to = c(0, 1)) + ) + } +} + +view_scale_empty <- function() { + ggproto(NULL, ViewScale, + is_empty = function() TRUE, + is_discrete = function() NA, + dimension = function() c(0, 1), + get_limits = function() c(0, 1), + get_breaks = function() NULL, + get_breaks_minor = function() NULL, + get_labels = function() NULL, + rescale = function(x) stop("Not implemented", call. = FALSE), + map = function(x) stop("Not implemented", call. 
= FALSE), + make_title = function(title) title, + break_positions = function() NULL, + break_positions_minor = function() NULL + ) +} + +ViewScale <- ggproto("ViewScale", NULL, + # map, rescale, and make_title need a reference + # to the original scale + scale = ggproto(NULL, Scale), + aesthetics = NULL, + name = waiver(), + scale_is_discrete = FALSE, + limits = NULL, + continuous_range = NULL, + breaks = NULL, + labels = NULL, + minor_breaks = NULL, + + is_empty = function(self) { + is.null(self$get_breaks()) && is.null(self$get_breaks_minor()) + }, + is_discrete = function(self) self$scale_is_discrete, + dimension = function(self) self$continuous_range, + get_limits = function(self) self$limits, + get_breaks = function(self) self$breaks, + get_breaks_minor = function(self) self$minor_breaks, + get_labels = function(self) self$labels, + rescale = function(self, x) { + self$scale$rescale(x, self$limits, self$continuous_range) + }, + map = function(self, x) { + self$scale$map(x, self$limits) + }, + make_title = function(self, title) { + self$scale$make_title(title) + }, + break_positions = function(self) { + self$rescale(self$get_breaks()) + }, + break_positions_minor = function(self) { + b <- self$get_breaks_minor() + if (is.null(b)) { + return(NULL) + } + + self$rescale(b) + } +) diff --git a/R/stat-bin.r b/R/stat-bin.r index b621c4426f..591034bbfb 100644 --- a/R/stat-bin.r +++ b/R/stat-bin.r @@ -1,10 +1,10 @@ #' @param binwidth The width of the bins. Can be specified as a numeric value -#' or as a function that calculates width from unscaled x. Here, "unscaled x" -#' refers to the original x values in the data, before application of any +#' or as a function that calculates width from unscaled x. Here, "unscaled x" +#' refers to the original x values in the data, before application of any #' scale transformation. When specifying a function along with a grouping -#' structure, the function will be called once per group. -#' The default is to use `bins` -#' bins that cover the range of the data. You should always override +#' structure, the function will be called once per group. +#' The default is to use the number of bins in `bins`, +#' covering the range of the data. You should always override #' this value, exploring multiple widths to find the best to illustrate the #' stories in your data. #' diff --git a/R/stat-contour.r b/R/stat-contour.r index 6b9850d8ab..ed287a5a27 100644 --- a/R/stat-contour.r +++ b/R/stat-contour.r @@ -1,4 +1,5 @@ #' @inheritParams stat_identity +#' @inheritParams geom_contour #' @export #' @eval rd_aesthetics("stat", "contour") #' @section Computed variables: @@ -11,6 +12,9 @@ stat_contour <- function(mapping = NULL, data = NULL, geom = "contour", position = "identity", ..., + bins = NULL, + binwidth = NULL, + breaks = NULL, na.rm = FALSE, show.legend = NA, inherit.aes = TRUE) { @@ -23,6 +27,38 @@ stat_contour <- function(mapping = NULL, data = NULL, show.legend = show.legend, inherit.aes = inherit.aes, params = list( + bins = bins, + binwidth = binwidth, + breaks = breaks, + na.rm = na.rm, + ... 
+ ) + ) +} + +#' @rdname geom_contour +#' @export +stat_contour_filled <- function(mapping = NULL, data = NULL, + geom = "polygon", position = "identity", + ..., + bins = NULL, + binwidth = NULL, + breaks = NULL, + na.rm = FALSE, + show.legend = NA, + inherit.aes = TRUE) { + layer( + data = data, + mapping = mapping, + stat = StatContourFilled, + geom = geom, + position = position, + show.legend = show.legend, + inherit.aes = inherit.aes, + params = list( + bins = bins, + binwidth = binwidth, + breaks = breaks, na.rm = na.rm, ... ) @@ -34,89 +70,188 @@ stat_contour <- function(mapping = NULL, data = NULL, #' @usage NULL #' @export StatContour <- ggproto("StatContour", Stat, + required_aes = c("x", "y", "z"), default_aes = aes(order = stat(level)), compute_group = function(data, scales, bins = NULL, binwidth = NULL, - breaks = NULL, complete = FALSE, na.rm = FALSE) { - # If no parameters set, use pretty bins - if (is.null(bins) && is.null(binwidth) && is.null(breaks)) { - breaks <- pretty(range(data$z), 10) - } - # If provided, use bins to calculate binwidth - if (!is.null(bins)) { - binwidth <- diff(range(data$z)) / bins - } - # If necessary, compute breaks from binwidth - if (is.null(breaks)) { - breaks <- fullseq(range(data$z), binwidth) - } - - contour_lines(data, breaks, complete = complete) + breaks = NULL, na.rm = FALSE) { + + z_range <- range(data$z, na.rm = TRUE, finite = TRUE) + breaks <- contour_breaks(z_range, bins, binwidth, breaks) + + isolines <- xyz_to_isolines(data, breaks) + path_df <- iso_to_path(isolines, data$group[1]) + + path_df$level <- as.numeric(path_df$level) + path_df$nlevel <- rescale_max(path_df$level) + + path_df } +) + +#' @rdname ggplot2-ggproto +#' @format NULL +#' @usage NULL +#' @export +StatContourFilled <- ggproto("StatContourFilled", Stat, + + required_aes = c("x", "y", "z"), + default_aes = aes(order = stat(level), fill = stat(level)), + compute_group = function(data, scales, bins = NULL, binwidth = NULL, breaks = NULL, na.rm = FALSE) { + + z_range <- range(data$z, na.rm = TRUE, finite = TRUE) + breaks <- contour_breaks(z_range, bins, binwidth) + + isobands <- xyz_to_isobands(data, breaks) + names(isobands) <- pretty_isoband_levels(names(isobands)) + path_df <- iso_to_path(isobands, data$group[1]) + + path_df$level <- factor(path_df$level, levels = names(isobands)) + + path_df + } ) +#' Calculate the breaks used for contouring +#' +#' @inheritParams geom_contour +#' @param z_range Range of values within which breaks should be calculated +#' +#' @return A vector of breaks +#' @noRd +#' +contour_breaks <- function(z_range, bins = NULL, binwidth = NULL, breaks = NULL) { + if (!is.null(breaks)) { + return(breaks) + } -# v3d <- reshape2::melt(volcano) -# names(v3d) <- c("x", "y", "z") -# -# breaks <- seq(95, 195, length.out = 10) -# contours <- contourLines(v3d, breaks) -# ggplot(contours, aes(x, y)) + -# geom_path() + -# facet_wrap(~piece) -contour_lines <- function(data, breaks, complete = FALSE) { - z <- tapply(data$z, data[c("x", "y")], identity) - - if (is.list(z)) { - stop("Contour requires single `z` at each combination of `x` and `y`.", - call. 
= FALSE) + # If no parameters set, use pretty bins + if (is.null(bins) && is.null(binwidth)) { + breaks <- pretty(z_range, 10) } - cl <- grDevices::contourLines( - x = sort(unique(data$x)), y = sort(unique(data$y)), z = z, - levels = breaks) + # If provided, use bins to calculate binwidth + if (!is.null(bins)) { + binwidth <- diff(z_range) / bins + } - if (length(cl) == 0) { - warning("Not possible to generate contour data", call. = FALSE) - return(new_data_frame()) + # If necessary, compute breaks from binwidth + if (is.null(breaks)) { + breaks <- fullseq(z_range, binwidth) } - # Convert list of lists into single data frame - lengths <- vapply(cl, function(x) length(x$x), integer(1)) - levels <- vapply(cl, "[[", "level", FUN.VALUE = double(1)) - xs <- unlist(lapply(cl, "[[", "x"), use.names = FALSE) - ys <- unlist(lapply(cl, "[[", "y"), use.names = FALSE) - pieces <- rep(seq_along(cl), lengths) - # Add leading zeros so that groups can be properly sorted later - groups <- paste(data$group[1], sprintf("%03d", pieces), sep = "-") - - new_data_frame(list( - level = rep(levels, lengths), - nlevel = rep(levels, lengths) / max(rep(levels, lengths), na.rm = TRUE), - x = xs, - y = ys, - piece = pieces, - group = factor(groups) - ), n = length(xs)) + breaks +} + +#' Compute isoband objects +#' +#' @param data A data frame with columns `x`, `y`, and `z`. +#' @param breaks A vector of breaks. These are the values for +#' which contour lines will be computed. +#' +#' @return An S3 "iso" object, which is a `list()` of `list(x, y, id)`s. +#' @noRd +#' +xyz_to_isolines <- function(data, breaks) { + isoband::isolines( + x = sort(unique(data$x)), + y = sort(unique(data$y)), + z = isoband_z_matrix(data), + levels = breaks + ) +} + +xyz_to_isobands <- function(data, breaks) { + isoband::isobands( + x = sort(unique(data$x)), + y = sort(unique(data$y)), + z = isoband_z_matrix(data), + levels_low = breaks[-length(breaks)], + levels_high = breaks[-1] + ) +} + +#' Compute input matrix for isoband functions +#' +#' Note that [grDevices::contourLines()] needs transposed +#' output to the matrix returned by this function. +#' +#' @param data A data frame with columns `x`, `y`, and `z`. +#' +#' @return A [matrix()] +#' @noRd +#' +isoband_z_matrix <- function(data) { + # Convert vector of data to raster + x_pos <- as.integer((data$x - min(data$x)) / resolution(data$x, FALSE)) + y_pos <- as.integer((max(data$y) - data$y) / resolution(data$y, FALSE)) + + nrow <- max(y_pos) + 1 + ncol <- max(x_pos) + 1 + + raster <- matrix(NA_real_, nrow = nrow, ncol = ncol) + raster[cbind(nrow - y_pos, x_pos + 1)] <- data$z + + raster } -# 1 = clockwise, -1 = counterclockwise, 0 = 0 area -# From http://stackoverflow.com/questions/1165647 -# x <- c(5, 6, 4, 1, 1) -# y <- c(0, 4, 5, 5, 0) -# poly_dir(x, y) -poly_dir <- function(x, y) { - xdiff <- c(x[-1], x[1]) - x - ysum <- c(y[-1], y[1]) + y - sign(sum(xdiff * ysum)) +#' Convert the output of isoband functions +#' +#' @param iso the output of [isoband::isolines()] or [isoband::isobands()] +#' @param group the name of the group +#' +#' @return A data frame that can be passed to [geom_path()] or [geom_polygon()]. +#' @noRd +#' +iso_to_path <- function(iso, group = 1) { + lengths <- vapply(iso, function(x) length(x$x), integer(1)) + + if (all(lengths == 0)) { + warning("stat_contour(): Zero contours were generated", call. 
= FALSE) + return(new_data_frame()) + } + + levels <- names(iso) + xs <- unlist(lapply(iso, "[[", "x"), use.names = FALSE) + ys <- unlist(lapply(iso, "[[", "y"), use.names = FALSE) + ids <- unlist(lapply(iso, "[[", "id"), use.names = FALSE) + item_id <- rep(seq_along(iso), lengths) + + # Add leading zeros so that groups can be properly sorted + groups <- paste(group, sprintf("%03d", item_id), sprintf("%03d", ids), sep = "-") + groups <- factor(groups) + + new_data_frame( + list( + level = rep(levels, lengths), + x = xs, + y = ys, + piece = as.integer(groups), + group = groups + ), + n = length(xs) + ) } -# To fix breaks and complete the polygons, we need to add 0-4 corner points. -# -# contours <- ddply(contours, "piece", mutate, dir = ggplot2:::poly_dir(x, y)) -# ggplot(contours, aes(x, y)) + -# geom_path(aes(group = piece, colour = factor(dir))) -# last_plot() + facet_wrap(~ level) +#' Pretty isoband level names +#' +#' @param isoband_levels `names()` of an [isoband::isobands()] object. +#' +#' @return A vector of labels like those used in +#' [cut()] and [cut_inverval()]. +#' @noRd +#' +pretty_isoband_levels <- function(isoband_levels, dig.lab = 3) { + interval_low <- gsub(":.*$", "", isoband_levels) + interval_high <- gsub("^[^:]*:", "", isoband_levels) + + label_low <- format(as.numeric(interval_low), digits = dig.lab, trim = TRUE) + label_high <- format(as.numeric(interval_high), digits = dig.lab, trim = TRUE) + # from the isoband::isobands() docs: + # the intervals specifying isobands are closed at their lower boundary + # and open at their upper boundary + sprintf("(%s, %s]", label_low, label_high) +} diff --git a/R/stat-density-2d.r b/R/stat-density-2d.r index 2bd736833a..1a5fcf0e7a 100644 --- a/R/stat-density-2d.r +++ b/R/stat-density-2d.r @@ -5,6 +5,10 @@ #' @param n number of grid points in each direction #' @param h Bandwidth (vector of length two). If `NULL`, estimated #' using [MASS::bandwidth.nrd()]. +#' @param adjust A multiplicative bandwidth adjustment to be used if 'h' is +#' 'NULL'. This makes it possible to adjust the bandwidth while still +#' using the a bandwidth estimator. For example, `adjust = 1/2` means +#' use half of the default bandwidth. #' @section Computed variables: #' Same as [stat_contour()] #' @@ -19,6 +23,7 @@ stat_density_2d <- function(mapping = NULL, data = NULL, contour = TRUE, n = 100, h = NULL, + adjust = c(1, 1), na.rm = FALSE, show.legend = NA, inherit.aes = TRUE) { @@ -35,6 +40,7 @@ stat_density_2d <- function(mapping = NULL, data = NULL, contour = contour, n = n, h = h, + adjust = adjust, ... 
) ) @@ -54,11 +60,12 @@ StatDensity2d <- ggproto("StatDensity2d", Stat, required_aes = c("x", "y"), - compute_group = function(data, scales, na.rm = FALSE, h = NULL, + compute_group = function(data, scales, na.rm = FALSE, h = NULL, adjust = c(1, 1), contour = TRUE, n = 100, bins = NULL, binwidth = NULL) { if (is.null(h)) { h <- c(MASS::bandwidth.nrd(data$x), MASS::bandwidth.nrd(data$y)) + h <- h * adjust } dens <- MASS::kde2d( diff --git a/R/stat-density.r b/R/stat-density.r index 0fbd706c17..6d18b8bb12 100644 --- a/R/stat-density.r +++ b/R/stat-density.r @@ -64,7 +64,7 @@ stat_density <- function(mapping = NULL, data = NULL, #' @export StatDensity <- ggproto("StatDensity", Stat, required_aes = "x", - default_aes = aes(y = stat(density), fill = NA), + default_aes = aes(y = stat(density), fill = NA, weight = NULL), compute_group = function(data, scales, bw = "nrd0", adjust = 1, kernel = "gaussian", n = 512, trim = FALSE, na.rm = FALSE) { @@ -85,6 +85,8 @@ compute_density <- function(x, w, from, to, bw = "nrd0", adjust = 1, nx <- length(x) if (is.null(w)) { w <- rep(1 / nx, nx) + } else { + w <- w / sum(w) } # if less than 2 points return data frame of NAs and a warning diff --git a/R/stat-ellipse.R b/R/stat-ellipse.R index fb94679849..9ad716ac4a 100644 --- a/R/stat-ellipse.R +++ b/R/stat-ellipse.R @@ -1,12 +1,12 @@ -#' Compute normal confidence ellipses +#' Compute normal data ellipses #' #' The method for calculating the ellipses has been modified from -#' `car::ellipse` (Fox and Weisberg, 2011) +#' `car::dataEllipse` (Fox and Weisberg, 2011) #' #' @references John Fox and Sanford Weisberg (2011). An \R Companion to #' Applied Regression, Second Edition. Thousand Oaks CA: Sage. URL: #' \url{http://socserv.socsci.mcmaster.ca/jfox/Books/Companion} -#' @param level The confidence level at which to draw an ellipse (default is 0.95), +#' @param level The level at which to draw an ellipse, #' or, if `type="euclid"`, the radius of the circle to be drawn. #' @param type The type of ellipse. #' The default `"t"` assumes a multivariate t-distribution, and diff --git a/R/stat-function.r b/R/stat-function.r index 89e98d97d2..8db4c2b64a 100644 --- a/R/stat-function.r +++ b/R/stat-function.r @@ -4,13 +4,13 @@ #' The function is called with a grid of evenly spaced values along the x axis, #' and the results are drawn (by default) with a line. #' -#' @eval rd_aesthetics("stat", "function") +#' #' @param fun Function to use. Either 1) an anonymous function in the base or #' rlang formula syntax (see [rlang::as_function()]) #' or 2) a quoted or character name referencing a function; see examples. Must #' be vectorised. #' @param n Number of points to interpolate along -#' @param args List of additional arguments to pass to `fun` +#' @param args List of additional arguments passed on to the function defined by `fun`. #' @param xlim Optionally, restrict the range of the function to this range. #' @inheritParams layer #' @inheritParams geom_point @@ -64,6 +64,15 @@ stat_function <- function(mapping = NULL, data = NULL, na.rm = FALSE, show.legend = NA, inherit.aes = TRUE) { + + # Warn if supplied mapping and/or data is going to be overwritten + if (!is.null(mapping)) { + warning("`mapping` is not used by stat_function()", call. = FALSE) + } + if (!is.null(data)) { + warning("`data` is not used by stat_function()", call. 
= FALSE) + } + layer( data = data, mapping = mapping, diff --git a/R/stat-smooth.r b/R/stat-smooth.r index 6e97dbb331..86e2e9dcab 100644 --- a/R/stat-smooth.r +++ b/R/stat-smooth.r @@ -5,7 +5,7 @@ #' For `method = "auto"` the smoothing method is chosen based on the #' size of the largest group (across all panels). [stats::loess()] is #' used for less than 1,000 observations; otherwise [mgcv::gam()] is -#' used with `formula = y ~ s(x, bs = "cs")`. Somewhat anecdotally, +#' used with `formula = y ~ s(x, bs = "cs")` with `method = "REML"`. Somewhat anecdotally, #' `loess` gives a better appearance, but is \eqn{O(N^{2})}{O(N^2)} in memory, #' so does not work for larger datasets. #' @@ -76,7 +76,6 @@ stat_smooth <- function(mapping = NULL, data = NULL, #' @usage NULL #' @export StatSmooth <- ggproto("StatSmooth", Stat, - setup_params = function(data, params) { if (identical(params$method, "auto")) { # Use loess for small datasets, gam with a cubic regression basis for @@ -90,17 +89,16 @@ StatSmooth <- ggproto("StatSmooth", Stat, params$method <- "gam" params$formula <- y ~ s(x, bs = "cs") } - message("`geom_smooth()` using method = '", params$method, - "' and formula '", deparse(params$formula), "'") - } - if (identical(params$method, "gam")) { - params$method <- mgcv::gam + message( + "`geom_smooth()` using method = '", params$method, + "' and formula '", deparse(params$formula), "'" + ) } params }, - compute_group = function(data, scales, method = "auto", formula = y~x, + compute_group = function(data, scales, method = "auto", formula = y ~ x, se = TRUE, n = 80, span = 0.75, fullrange = FALSE, xseq = NULL, level = 0.95, method.args = list(), na.rm = FALSE) { @@ -127,12 +125,23 @@ StatSmooth <- ggproto("StatSmooth", Stat, xseq <- seq(range[1], range[2], length.out = n) } } + # Special case span because it's the most commonly used model argument if (identical(method, "loess")) { method.args$span <- span } - if (is.character(method)) method <- match.fun(method) + if (is.character(method)) { + if (identical(method, "gam")) { + method <- mgcv::gam + } else { + method <- match.fun(method) + } + } + # If gam and gam's method is not specified by the user then use REML + if (identical(method, mgcv::gam) && is.null(method.args$method)) { + method.args$method <- "REML" + } base.args <- list(quote(formula), data = quote(data), weights = quote(weight)) model <- do.call(method, c(base.args, method.args)) diff --git a/R/theme-defaults.r b/R/theme-defaults.r index 876b5787e0..bed61b6f7a 100644 --- a/R/theme-defaults.r +++ b/R/theme-defaults.r @@ -169,7 +169,7 @@ theme_grey <- function(base_size = 11, base_family = "", legend.spacing.x = NULL, legend.spacing.y = NULL, legend.margin = margin(half_line, half_line, half_line, half_line), - legend.key = element_rect(fill = "grey95", colour = "white"), + legend.key = element_rect(fill = "grey95", colour = NA), legend.key.size = unit(1.2, "lines"), legend.key.height = NULL, legend.key.width = NULL, diff --git a/R/theme-elements.r b/R/theme-elements.r index 247c4c868c..f2cdd5c0fa 100644 --- a/R/theme-elements.r +++ b/R/theme-elements.r @@ -215,7 +215,7 @@ element_grob.element_text <- function(element, label = "", x = NULL, y = NULL, titleGrob(label, x, y, hjust = hj, vjust = vj, angle = angle, gp = modify_list(element_gp, gp), margin = margin, - margin_x = margin_x, margin_y = margin_y, debug = element$debug) + margin_x = margin_x, margin_y = margin_y, debug = element$debug, ...) 
} diff --git a/R/theme.r b/R/theme.r index f9d461f1da..2fa96b4c8a 100644 --- a/R/theme.r +++ b/R/theme.r @@ -612,21 +612,38 @@ merge_element.element <- function(new, old) { new } -# Combine the properties of two elements -# -# @param e1 An element object -# @param e2 An element object which e1 inherits from +#' Combine the properties of two elements +#' +#' @param e1 An element object +#' @param e2 An element object from which e1 inherits +#' +#' @noRd +#' combine_elements <- function(e1, e2) { # If e2 is NULL, nothing to inherit - if (is.null(e2) || inherits(e1, "element_blank")) return(e1) + if (is.null(e2) || inherits(e1, "element_blank")) { + return(e1) + } + # If e1 is NULL inherit everything from e2 - if (is.null(e1)) return(e2) + if (is.null(e1)) { + return(e2) + } + + # If neither of e1 or e2 are element_* objects, return e1 + if (!inherits(e1, "element") && !inherits(e2, "element")) { + return(e1) + } + # If e2 is element_blank, and e1 inherits blank inherit everything from e2, # otherwise ignore e2 if (inherits(e2, "element_blank")) { - if (e1$inherit.blank) return(e2) - else return(e1) + if (e1$inherit.blank) { + return(e2) + } else { + return(e1) + } } # If e1 has any NULL properties, inherit them from e2 diff --git a/R/utilities.r b/R/utilities.r index db8c16f932..6336ace4b8 100644 --- a/R/utilities.r +++ b/R/utilities.r @@ -176,97 +176,9 @@ rescale01 <- function(x) { (x - rng[1]) / (rng[2] - rng[1]) } -#' Similar to expand_range(), but taking a vector ‘expand’ -#' of *four* expansion values, where the 1st and 2nd -#' elements are used for the lower limit, and the 3rd and -#' 4th elements are used for the upper limit). -#' -#' The ‘expand’ argument can also be of length 2, -#' and the expansion values for the lower limit -#' are then reused for the upper limit. -# -#' @noRd -#' @keywords internal -expand_range4 <- function(limits, expand) { - stopifnot(is.numeric(expand) && (length(expand) %in% c(2,4))) - # If only two expansion constants are given (i.e. the old syntax), - # reuse them to generate a four-element expansion vector - if (length(expand) == 2) { expand <- c(expand, expand) } - - # Calculate separate range expansion for the lower and - # upper range limits, and then combine them into one vector - lower <- expand_range(limits, expand[1], expand[2])[1] - upper <- expand_range(limits, expand[3], expand[4])[2] - c(lower, upper) -} - -#' Generate expansion vector for scales. -#' -#' This is a convenience function for generating scale expansion vectors -#' for the \code{expand} argument of -#' \code{\link[=scale_x_continuous]{scale_*_continuous}} and -#' \code{\link[=scale_x_discrete]{scale_*_discrete}}. -#' The expansions vectors are used to add some space between -#' the data and the axes. -#' -#' @export -#' @param mult vector of multiplicative range expansion factors. -#' If length 1, both the lower and upper limits of the scale -#' are expanded outwards by \code{mult}. If length 2, the lower limit -#' is expanded by \code{mult[1]} and the upper limit by \code{mult[2]}. -#' @param add vector of additive range expansion constants. -#' If length 1, both the lower and upper limits of the scale -#' are expanded outwards by \code{add} units. If length 2, the -#' lower limit is expanded by \code{add[1]} and the upper -#' limit by \code{add[2]}. 
-#' @examples -#' # No space below the bars but 10% above them -#' ggplot(mtcars) + -#' geom_bar(aes(x = factor(cyl))) + -#' scale_y_continuous(expand = expand_scale(mult = c(0, .1))) -#' -#' # Add 2 units of space on the left and right of the data -#' ggplot(subset(diamonds, carat > 2), aes(cut, clarity)) + -#' geom_jitter() + -#' scale_x_discrete(expand = expand_scale(add = 2)) -#' -#' # Reproduce the default range expansion used -#' # when the 'expand' argument is not specified -#' ggplot(subset(diamonds, carat > 2), aes(cut, price)) + -#' geom_jitter() + -#' scale_x_discrete(expand = expand_scale(add = .6)) + -#' scale_y_continuous(expand = expand_scale(mult = .05)) -expand_scale = function(mult = 0, add = 0) { - stopifnot(is.numeric(mult) && is.numeric(add)) - stopifnot((length(mult) %in% 1:2) && (length(add) %in% 1:2)) - - mult <- rep(mult, length.out = 2) - add <- rep(add, length.out = 2) - c(mult[1], add[1], mult[2], add[2]) -} - - - #' Give a deprecation error, warning, or message, depending on version number. #' -#' Version numbers have the format .., like 0.9.2. -#' This function compares the current version number of ggplot2 against the -#' specified `version`, which is the most recent version before the -#' function (or other object) was deprecated. -#' -#' `gg_dep` will give an error, warning, or message, depending on the -#' difference between the current ggplot2 version and the specified -#' `version`. -#' -#' If the current major number is greater than `version`'s major number, -#' or if the current minor number is more than 1 greater than `version`'s -#' minor number, give an error. -#' -#' If the current minor number differs from `version`'s minor number by -#' one, give a warning. -#' -#' If the current subminor number differs from `version`'s subminor -#' number, print a message. +#' This function is deprecated. #' #' @param version The last version of ggplot2 where this function was good #' (in other words, the last version where it was not deprecated). 
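To make the expand_scale() to expansion() transition concrete: both helpers build the four-element expansion vector (lower multiplicative, lower additive, upper multiplicative, upper additive) consumed by position scales. A small sketch, assuming the patched ggplot2 that exports expansion():

library(ggplot2)
expansion(mult = c(0, 0.1))  # c(0, 0, 0.1, 0): no padding below, 10% padding above
expansion(add = 0.6)         # c(0, 0.6, 0, 0.6): the default for discrete position scales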
@@ -274,6 +186,7 @@ expand_scale = function(mult = 0, add = 0) { #' @keywords internal #' @export gg_dep <- function(version, msg) { + .Deprecated() v <- as.package_version(version) cv <- utils::packageVersion("ggplot2") diff --git a/R/zxx.r b/R/zxx.r index 32ab91f1bc..e13cbe7c1d 100644 --- a/R/zxx.r +++ b/R/zxx.r @@ -110,7 +110,7 @@ scale_color_brewer <- scale_colour_brewer scale_color_distiller <- scale_colour_distiller #' @export -#' @rdname scale_gradient +#' @rdname scale_colour_continuous #' @usage NULL scale_color_continuous <- scale_colour_continuous diff --git a/_pkgdown.yml b/_pkgdown.yml index f837806f04..235a1d6d44 100644 --- a/_pkgdown.yml +++ b/_pkgdown.yml @@ -94,7 +94,7 @@ reference: - labs - lims - expand_limits - - expand_scale + - expansion - starts_with("scale_") - title: "Guides: axes and legends" @@ -217,37 +217,31 @@ reference: - map_data navbar: - title: ~ - type: default - left: - - text: Reference - href: reference/index.html - - text: Articles - menu: - - text: Aesthetic specifications - href: articles/ggplot2-specs.html - - text: Extending ggplot2 - href: articles/extending-ggplot2.html - - text: News - menu: - - text: "Release notes" - - text: "Version 3.1.0" - href: https://www.tidyverse.org/articles/2018/10/ggplot2-3-1-0/ - - text: "Version 3.0.0" - href: https://www.tidyverse.org/articles/2018/07/ggplot2-3-0-0/ - - text: "Version 2.2.0" - href: https://blog.rstudio.com/2016/11/14/ggplot2-2-2-0/ - - text: "Version 2.1.0" - href: https://blog.rstudio.com/2016/03/03/ggplot2-2-1-0/ - - text: "Version 2.0.0" - href: https://blog.rstudio.com/2015/12/21/ggplot2-2-0-0/ - - text: "Version 1.0.0" - href: https://blog.rstudio.com/2015/01/09/ggplot2-updates/ - - text: "------------------" - - text: "Change log" - href: news/index.html - - text: Extensions - href: http://www.ggplot2-exts.org/gallery/ - right: - - icon: fa-github fa-lg - href: https://github.com/tidyverse/ggplot2 + structure: + right: [extensions, github] + components: + home: ~ + news: + text: News + menu: + - text: "Release notes" + - text: "Version 3.2.0" + href: https://www.tidyverse.org/articles/2019/06/ggplot2-3-2-0/ + - text: "Version 3.1.0" + href: https://www.tidyverse.org/articles/2018/10/ggplot2-3-1-0/ + - text: "Version 3.0.0" + href: https://www.tidyverse.org/articles/2018/07/ggplot2-3-0-0/ + - text: "Version 2.2.0" + href: https://blog.rstudio.com/2016/11/14/ggplot2-2-2-0/ + - text: "Version 2.1.0" + href: https://blog.rstudio.com/2016/03/03/ggplot2-2-1-0/ + - text: "Version 2.0.0" + href: https://blog.rstudio.com/2015/12/21/ggplot2-2-0-0/ + - text: "Version 1.0.0" + href: https://blog.rstudio.com/2015/01/09/ggplot2-updates/ + - text: "------------------" + - text: "Change log" + href: news/index.html + extensions: + text: Extensions + href: http://www.ggplot2-exts.org/gallery/ diff --git a/man/aes.Rd b/man/aes.Rd index 0f207c4fef..5b1c80daab 100644 --- a/man/aes.Rd +++ b/man/aes.Rd @@ -7,9 +7,13 @@ aes(x, y, ...) } \arguments{ -\item{x, y, ...}{List of name value pairs giving aesthetics to map to -variables. The names for x and y aesthetics are typically omitted because -they are so common; all other aesthetics must be named.} +\item{x, y, ...}{List of name-value pairs in the form \code{aesthetic = variable} +describing which variables in the layer data should be mapped to which +aesthetics used by the paired geom/stat. 
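A short sketch of the aes() guidance above (map bare column names; the expression is evaluated in the layer data):

library(ggplot2)
# preferred: `displ` and `hwy` are looked up in the layer data
ggplot(mpg, aes(displ, hwy)) + geom_point()
# discouraged: aes(mpg$displ, mpg$hwy) bypasses the layer data and breaks
# facetting and stats that subset or transform the data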
The expression \code{variable} is +evaluated within the layer data, so there is no need to refer to +the original dataset (i.e., use \code{ggplot(df, aes(variable))} +instead of \code{ggplot(df, aes(df$variable))}). The names for x and y aesthetics +are typically omitted because they are so common; all other aesthetics must be named.} } \value{ A list with class \code{uneval}. Components of the list are either @@ -22,8 +26,8 @@ properties (aesthetics) of geoms. Aesthetic mappings can be set in } \details{ This function also standardises aesthetic names by converting \code{color} to \code{colour} -(also in substrings, e.g. \code{point_color} to \code{point_colour}) and translating old style -R names to ggplot names (eg. \code{pch} to \code{shape}, \code{cex} to \code{size}). +(also in substrings, e.g., \code{point_color} to \code{point_colour}) and translating old style +R names to ggplot names (e.g., \code{pch} to \code{shape} and \code{cex} to \code{size}). } \section{Quasiquotation}{ diff --git a/man/continuous_scale.Rd b/man/continuous_scale.Rd index 36d46fb5af..6c5ee2a3fb 100644 --- a/man/continuous_scale.Rd +++ b/man/continuous_scale.Rd @@ -2,7 +2,7 @@ % Please edit documentation in R/scale-.r \name{continuous_scale} \alias{continuous_scale} -\title{Continuous scale constructor.} +\title{Continuous scale constructor} \usage{ continuous_scale(aesthetics, scale_name, palette, name = waiver(), breaks = waiver(), minor_breaks = waiver(), labels = waiver(), @@ -11,13 +11,14 @@ continuous_scale(aesthetics, scale_name, palette, name = waiver(), guide = "legend", position = "left", super = ScaleContinuous) } \arguments{ -\item{aesthetics}{The names of the aesthetics that this scale works with} +\item{aesthetics}{The names of the aesthetics that this scale works with.} -\item{scale_name}{The name of the scale} +\item{scale_name}{The name of the scale that should be used for error messages +associated with this scale.} \item{palette}{A palette function that when called with a numeric vector with -values between 0 and 1 returns the corresponding values in the range the -scale maps to.} +values between 0 and 1 returns the corresponding output values +(e.g., \code{\link[scales:area_pal]{scales::area_pal()}}).} \item{name}{The name of the scale. Used as the axis or legend title. If \code{waiver()}, the default, the name of the scale is taken from the first @@ -28,10 +29,10 @@ omitted.} \itemize{ \item \code{NULL} for no breaks \item \code{waiver()} for the default breaks computed by the -transformation object +\link[scales:trans_new]{transformation object} \item A numeric vector of positions \item A function that takes the limits as input and returns breaks -as output +as output (e.g., a function returned by \code{\link[scales:extended_breaks]{scales::extended_breaks()}}) }} \item{minor_breaks}{One of: @@ -60,45 +61,57 @@ as output Use \code{NA} to refer to the existing minimum or maximum \item A function that accepts the existing (automatic) limits and returns new limits +Note that setting limits on positional scales will \strong{remove} data outside of the limits. +If the purpose is to zoom, use the limit argument in the coordinate system +(see \code{\link[=coord_cartesian]{coord_cartesian()}}). }} -\item{rescaler}{Used by diverging and n colour gradients -(i.e. \code{\link[=scale_colour_gradient2]{scale_colour_gradient2()}}, \code{\link[=scale_colour_gradientn]{scale_colour_gradientn()}}). 
-A function used to scale the input values to the range [0, 1].} +\item{rescaler}{A function used to scale the input values to the +range [0, 1]. This is always \code{\link[scales:rescale]{scales::rescale()}}, except for +diverging and n colour gradients (i.e., \code{\link[=scale_colour_gradient2]{scale_colour_gradient2()}}, +\code{\link[=scale_colour_gradientn]{scale_colour_gradientn()}}). The \code{rescaler} is ignored by position +scales, which ways use \code{\link[scales:rescale]{scales::rescale()}}.} -\item{oob}{Function that handles limits outside of the scale limits -(out of bounds). The default replaces out of bounds values with \code{NA}.} +\item{oob}{One of: +\itemize{ +\item Function that handles limits outside of the scale limits +(out of bounds). +\item The default (\code{\link[scales:censor]{scales::censor()}}) replaces out of +bounds values with \code{NA}. +\item \code{\link[scales:squish]{scales::squish()}} for squishing out of bounds values into range. +\item \code{\link[scales:squish_infinite]{scales::squish_infinite()}} for squishing infitite values into range. +}} -\item{expand}{Vector of range expansion constants used to add some -padding around the data, to ensure that they are placed some distance -away from the axes. Use the convenience function \code{\link[=expand_scale]{expand_scale()}} +\item{expand}{For position scales, a vector of range expansion constants used to add some +padding around the data to ensure that they are placed some distance +away from the axes. Use the convenience function \code{\link[=expansion]{expansion()}} to generate the values for the \code{expand} argument. The defaults are to expand the scale by 5\% on each side for continuous variables, and by 0.6 units on each side for discrete variables.} \item{na.value}{Missing values will be replaced with this value.} -\item{trans}{Either the name of a transformation object, or the -object itself. Built-in transformations include "asn", "atanh", +\item{trans}{For continuous scales, the name of a transformation object +or the object itself. Built-in transformations include "asn", "atanh", "boxcox", "date", "exp", "hms", "identity", "log", "log10", "log1p", "log2", "logit", "modulus", "probability", "probit", "pseudo_log", "reciprocal", "reverse", "sqrt" and "time". A transformation object bundles together a transform, its inverse, and methods for generating breaks and labels. Transformation objects -are defined in the scales package, and are called \code{name_trans}, e.g. -\code{\link[scales:boxcox_trans]{scales::boxcox_trans()}}. You can create your own +are defined in the scales package, and are called \code{_trans} (e.g., +\code{\link[scales:boxcox_trans]{scales::boxcox_trans()}}). You can create your own transformation with \code{\link[scales:trans_new]{scales::trans_new()}}.} \item{guide}{A function used to create a guide or its name. See -\code{\link[=guides]{guides()}} for more info.} +\code{\link[=guides]{guides()}} for more information.} -\item{position}{The position of the axis. "left" or "right" for vertical -scales, "top" or "bottom" for horizontal scales} +\item{position}{For position scales, The position of the axis. +\code{left} or \code{right} for y axes, \code{top} or \code{bottom} for x axes.} \item{super}{The super class to use for the constructed scale} } \description{ -Continuous scale constructor. 
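A hedged sketch of the out-of-bounds behaviour documented above: the default scales::censor() turns out-of-range values into NA (displayed using na.value), while scales::squish() clamps them to the nearest limit:

library(ggplot2)
p <- ggplot(mpg, aes(displ, hwy, colour = cty)) + geom_point()
p + scale_colour_gradient(limits = c(10, 25))                        # censor: cty outside 10-25 becomes NA
p + scale_colour_gradient(limits = c(10, 25), oob = scales::squish)  # squish: clamped into 10-25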
+Continuous scale constructor } \keyword{internal} diff --git a/man/coord_trans.Rd b/man/coord_trans.Rd index 56b7282b34..d109704ae5 100644 --- a/man/coord_trans.Rd +++ b/man/coord_trans.Rd @@ -4,20 +4,31 @@ \alias{coord_trans} \title{Transformed Cartesian coordinate system} \usage{ -coord_trans(x = "identity", y = "identity", limx = NULL, - limy = NULL, clip = "on", xtrans, ytrans) +coord_trans(x = "identity", y = "identity", xlim = NULL, + ylim = NULL, limx = "DEPRECATED", limy = "DEPRECATED", + clip = "on", expand = TRUE) } \arguments{ -\item{x, y}{transformers for x and y axes} +\item{x, y}{Transformers for x and y axes or their names.} -\item{limx, limy}{limits for x and y axes. (Named so for backward -compatibility)} +\item{xlim}{Limits for the x and y axes.} + +\item{ylim}{Limits for the x and y axes.} + +\item{limx, limy}{\strong{Deprecated}: use \code{xlim} and \code{ylim} instead.} \item{clip}{Should drawing be clipped to the extent of the plot panel? A setting of \code{"on"} (the default) means yes, and a setting of \code{"off"} -means no. For details, please see \code{\link[=coord_cartesian]{coord_cartesian()}}.} +means no. In most cases, the default of \code{"on"} should not be changed, +as setting \code{clip = "off"} can cause unexpected results. It allows +drawing of data points anywhere on the plot, including in the plot margins. If +limits are set via \code{xlim} and \code{ylim} and some data points fall outside those +limits, then those data points may show up in places such as the axes, the +legend, the plot title, or the plot margins.} -\item{xtrans, ytrans}{Deprecated; use \code{x} and \code{y} instead.} +\item{expand}{If \code{TRUE}, the default, adds a small expansion factor to +the limits to ensure that data and axes don't overlap. If \code{FALSE}, +limits are taken exactly from the data or \code{xlim}/\code{ylim}.} } \description{ \code{coord_trans} is different to scale transformations in that it occurs after diff --git a/man/discrete_scale.Rd b/man/discrete_scale.Rd index 87f0fb8100..0aabfe02b1 100644 --- a/man/discrete_scale.Rd +++ b/man/discrete_scale.Rd @@ -2,7 +2,7 @@ % Please edit documentation in R/scale-.r \name{discrete_scale} \alias{discrete_scale} -\title{Discrete scale constructor.} +\title{Discrete scale constructor} \usage{ discrete_scale(aesthetics, scale_name, palette, name = waiver(), breaks = waiver(), labels = waiver(), limits = NULL, @@ -11,13 +11,14 @@ discrete_scale(aesthetics, scale_name, palette, name = waiver(), super = ScaleDiscrete) } \arguments{ -\item{aesthetics}{The names of the aesthetics that this scale works with} +\item{aesthetics}{The names of the aesthetics that this scale works with.} -\item{scale_name}{The name of the scale} +\item{scale_name}{The name of the scale that should be used for error messages +associated with this scale.} \item{palette}{A palette function that when called with a single integer argument (the number of levels in the scale) returns the values that -they should take.} +they should take (e.g., \code{\link[scales:hue_pal]{scales::hue_pal()}}).} \item{name}{The name of the scale. Used as the axis or legend title. 
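Relating to the coord_trans() changes above (new xlim, ylim, and expand arguments; limx/limy deprecated), a minimal sketch with the patched ggplot2, limits given in untransformed data units:

library(ggplot2)
ggplot(diamonds, aes(carat, price)) +
  geom_point(alpha = 0.1) +
  coord_trans(x = "log10", y = "log10", xlim = c(0.3, 3))  # zoom after transforming, data kept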
If \code{waiver()}, the default, the name of the scale is taken from the first @@ -27,8 +28,7 @@ omitted.} \item{breaks}{One of: \itemize{ \item \code{NULL} for no breaks -\item \code{waiver()} for the default breaks computed by the -transformation object +\item \code{waiver()} for the default breaks (the scale limits) \item A character vector of breaks \item A function that takes the limits as input and returns breaks as output @@ -47,9 +47,9 @@ as output \item{limits}{A character vector that defines possible values of the scale and their order.} -\item{expand}{Vector of range expansion constants used to add some -padding around the data, to ensure that they are placed some distance -away from the axes. Use the convenience function \code{\link[=expand_scale]{expand_scale()}} +\item{expand}{For position scales, a vector of range expansion constants used to add some +padding around the data to ensure that they are placed some distance +away from the axes. Use the convenience function \code{\link[=expansion]{expansion()}} to generate the values for the \code{expand} argument. The defaults are to expand the scale by 5\% on each side for continuous variables, and by 0.6 units on each side for discrete variables.} @@ -67,14 +67,14 @@ The default, \code{TRUE}, uses the levels that appear in the data; \code{FALSE} uses all the levels in the factor.} \item{guide}{A function used to create a guide or its name. See -\code{\link[=guides]{guides()}} for more info.} +\code{\link[=guides]{guides()}} for more information.} -\item{position}{The position of the axis. "left" or "right" for vertical -scales, "top" or "bottom" for horizontal scales} +\item{position}{For position scales, The position of the axis. +\code{left} or \code{right} for y axes, \code{top} or \code{bottom} for x axes.} \item{super}{The super class to use for the constructed scale} } \description{ -Discrete scale constructor. +Discrete scale constructor } \keyword{internal} diff --git a/man/expand_scale.Rd b/man/expansion.Rd similarity index 65% rename from man/expand_scale.Rd rename to man/expansion.Rd index f7ddb6b770..cc6d6aa2fc 100644 --- a/man/expand_scale.Rd +++ b/man/expansion.Rd @@ -1,9 +1,12 @@ % Generated by roxygen2: do not edit by hand -% Please edit documentation in R/utilities.r -\name{expand_scale} +% Please edit documentation in R/scale-expansion.r +\name{expansion} +\alias{expansion} \alias{expand_scale} -\title{Generate expansion vector for scales.} +\title{Generate expansion vector for scales} \usage{ +expansion(mult = 0, add = 0) + expand_scale(mult = 0, add = 0) } \arguments{ @@ -20,27 +23,26 @@ limit by \code{add[2]}.} } \description{ This is a convenience function for generating scale expansion vectors -for the \code{expand} argument of -\code{\link[=scale_x_continuous]{scale_*_continuous}} and -\code{\link[=scale_x_discrete]{scale_*_discrete}}. -The expansions vectors are used to add some space between -the data and the axes. +for the \code{expand} argument of \link[=scale_x_continuous]{scale_(x|y)_continuous} +and \link[=scale_x_discrete]{scale_(x|y)_discrete}. The expansion vectors are used to +add some space between the data and the axes. 
} \examples{ # No space below the bars but 10\% above them ggplot(mtcars) + geom_bar(aes(x = factor(cyl))) + - scale_y_continuous(expand = expand_scale(mult = c(0, .1))) + scale_y_continuous(expand = expansion(mult = c(0, .1))) # Add 2 units of space on the left and right of the data ggplot(subset(diamonds, carat > 2), aes(cut, clarity)) + geom_jitter() + - scale_x_discrete(expand = expand_scale(add = 2)) + scale_x_discrete(expand = expansion(add = 2)) # Reproduce the default range expansion used # when the 'expand' argument is not specified ggplot(subset(diamonds, carat > 2), aes(cut, price)) + geom_jitter() + - scale_x_discrete(expand = expand_scale(add = .6)) + - scale_y_continuous(expand = expand_scale(mult = .05)) + scale_x_discrete(expand = expansion(add = .6)) + + scale_y_continuous(expand = expansion(mult = .05)) + } diff --git a/man/facet_grid.Rd b/man/facet_grid.Rd index 26d483a69b..b711eb1929 100644 --- a/man/facet_grid.Rd +++ b/man/facet_grid.Rd @@ -38,12 +38,13 @@ before statistical summary.} \item{labeller}{A function that takes one data frame of labels and returns a list or data frame of character vectors. Each input column corresponds to one factor. Thus there will be more than -one with formulae of the type \code{~cyl + am}. Each output +one with \code{vars(cyl, am)}. Each output column gets displayed as one separate line in the strip label. This function should inherit from the "labeller" S3 class -for compatibility with \code{\link[=labeller]{labeller()}}. See -\code{\link[=label_value]{label_value()}} for more details and pointers to other -options.} +for compatibility with \code{\link[=labeller]{labeller()}}. You can use different labeling +functions for different kind of labels, for example use \code{\link[=label_parsed]{label_parsed()}} for +formatting facet labels. \code{\link[=label_value]{label_value()}} is used by default, +check it for more details and pointers to other options.} \item{as.table}{If \code{TRUE}, the default, the facets are laid out like a table with highest values at the bottom-right. If \code{FALSE}, the @@ -74,6 +75,7 @@ and \code{cols} instead.} \code{facet_grid()} forms a matrix of panels defined by row and column faceting variables. It is most useful when you have two discrete variables, and all combinations of the variables exist in the data. +If you have only one variable with many levels, try \code{\link[=facet_wrap]{facet_wrap()}}. } \examples{ p <- ggplot(mpg, aes(displ, cty)) + geom_point() @@ -83,13 +85,6 @@ p + facet_grid(rows = vars(drv)) p + facet_grid(cols = vars(cyl)) p + facet_grid(vars(drv), vars(cyl)) -# The historical formula interface is also available: -\donttest{ -p + facet_grid(. ~ cyl) -p + facet_grid(drv ~ .) -p + facet_grid(drv ~ cyl) -} - # To change plot order of facet grid, # change the order of variable levels with factor() @@ -108,7 +103,7 @@ p + mt <- ggplot(mtcars, aes(mpg, wt, colour = factor(cyl))) + geom_point() -mt + facet_grid(. ~ cyl, scales = "free") +mt + facet_grid(vars(cyl), scales = "free") # If scales and space are free, then the mapping between position # and values in the data will be the same across all panels. This diff --git a/man/facet_wrap.Rd b/man/facet_wrap.Rd index ccea530f4e..d6c3075988 100644 --- a/man/facet_wrap.Rd +++ b/man/facet_wrap.Rd @@ -30,12 +30,13 @@ before statistical summary.} \item{labeller}{A function that takes one data frame of labels and returns a list or data frame of character vectors. Each input column corresponds to one factor. 
Thus there will be more than -one with formulae of the type \code{~cyl + am}. Each output +one with \code{vars(cyl, am)}. Each output column gets displayed as one separate line in the strip label. This function should inherit from the "labeller" S3 class -for compatibility with \code{\link[=labeller]{labeller()}}. See -\code{\link[=label_value]{label_value()}} for more details and pointers to other -options.} +for compatibility with \code{\link[=labeller]{labeller()}}. You can use different labeling +functions for different kind of labels, for example use \code{\link[=label_parsed]{label_parsed()}} for +formatting facet labels. \code{\link[=label_value]{label_value()}} is used by default, +check it for more details and pointers to other options.} \item{as.table}{If \code{TRUE}, the default, the facets are laid out like a table with highest values at the bottom-right. If \code{FALSE}, the @@ -70,9 +71,6 @@ p <- ggplot(mpg, aes(displ, hwy)) + geom_point() # Use vars() to supply faceting variables: p + facet_wrap(vars(class)) -# The historical interface with formulas is also available: -p + facet_wrap(~class) - # Control the number of rows and columns with nrow and ncol p + facet_wrap(vars(class), nrow = 4) @@ -85,14 +83,14 @@ ggplot(mpg, aes(displ, hwy)) + # Use the `labeller` option to control how labels are printed: ggplot(mpg, aes(displ, hwy)) + geom_point() + - facet_wrap(c("cyl", "drv"), labeller = "label_both") + facet_wrap(vars(cyl, drv), labeller = "label_both") # To change the order in which the panels appear, change the levels # of the underlying factor. mpg$class2 <- reorder(mpg$class, mpg$displ) ggplot(mpg, aes(displ, hwy)) + geom_point() + - facet_wrap(~class2) + facet_wrap(vars(class2)) # By default, the same scales are used for all panels. You can allow # scales to vary across the panels with the `scales` argument. @@ -100,14 +98,14 @@ ggplot(mpg, aes(displ, hwy)) + # harder to compare across panels. ggplot(mpg, aes(displ, hwy)) + geom_point() + - facet_wrap(~class, scales = "free") + facet_wrap(vars(class), scales = "free") # To repeat the same data in every panel, simply construct a data frame # that does not contain the faceting variable. ggplot(mpg, aes(displ, hwy)) + geom_point(data = transform(mpg, class = NULL), colour = "grey85") + geom_point() + - facet_wrap(~class) + facet_wrap(vars(class)) # Use `strip.position` to display the facet labels at the side of your # choice. Setting it to `bottom` makes it act as a subtitle for the axis. @@ -115,7 +113,7 @@ ggplot(mpg, aes(displ, hwy)) + # strip labels. 
ggplot(economics_long, aes(date, value)) + geom_line() + - facet_wrap(~variable, scales = "free_y", nrow = 2, strip.position = "bottom") + + facet_wrap(vars(variable), scales = "free_y", nrow = 2, strip.position = "top") + theme(strip.background = element_blank(), strip.placement = "outside") } } diff --git a/man/geom_contour.Rd b/man/geom_contour.Rd index 3632c31f82..9c2b89d6f6 100644 --- a/man/geom_contour.Rd +++ b/man/geom_contour.Rd @@ -2,16 +2,30 @@ % Please edit documentation in R/geom-contour.r, R/stat-contour.r \name{geom_contour} \alias{geom_contour} +\alias{geom_contour_filled} \alias{stat_contour} +\alias{stat_contour_filled} \title{2d contours of a 3d surface} \usage{ geom_contour(mapping = NULL, data = NULL, stat = "contour", - position = "identity", ..., lineend = "butt", linejoin = "round", + position = "identity", ..., bins = NULL, binwidth = NULL, + breaks = NULL, lineend = "butt", linejoin = "round", linemitre = 10, na.rm = FALSE, show.legend = NA, inherit.aes = TRUE) +geom_contour_filled(mapping = NULL, data = NULL, + stat = "contour_filled", position = "identity", ..., bins = NULL, + binwidth = NULL, breaks = NULL, na.rm = FALSE, show.legend = NA, + inherit.aes = TRUE) + stat_contour(mapping = NULL, data = NULL, geom = "contour", - position = "identity", ..., na.rm = FALSE, show.legend = NA, + position = "identity", ..., bins = NULL, binwidth = NULL, + breaks = NULL, na.rm = FALSE, show.legend = NA, + inherit.aes = TRUE) + +stat_contour_filled(mapping = NULL, data = NULL, geom = "polygon", + position = "identity", ..., bins = NULL, binwidth = NULL, + breaks = NULL, na.rm = FALSE, show.legend = NA, inherit.aes = TRUE) } \arguments{ @@ -46,6 +60,14 @@ often aesthetics, used to set an aesthetic to a fixed value, like \code{colour = "red"} or \code{size = 3}. They may also be parameters to the paired geom/stat.} +\item{bins}{Number of contour bins. Overridden by \code{binwidth}.} + +\item{binwidth}{The width of the contour bins. Overridden by \code{breaks}.} + +\item{breaks}{Numeric vector to set the contour breaks. +Overrides \code{binwidth} and \code{bins}. By default, this is a vector of +length ten with \code{\link[=pretty]{pretty()}} breaks.} + \item{lineend}{Line end style (round, butt, square).} \item{linejoin}{Line join style (round, mitre, bevel).} @@ -123,6 +145,9 @@ ggplot(faithful, aes(waiting, eruptions)) + geom_density_2d() \donttest{ +# use geom_contour_filled() for filled contours +v + geom_contour_filled() + # Setting bins creates evenly spaced contours in the range of the data v + geom_contour(bins = 2) v + geom_contour(bins = 10) diff --git a/man/geom_density_2d.Rd b/man/geom_density_2d.Rd index c649a29884..9f88c1aa22 100644 --- a/man/geom_density_2d.Rd +++ b/man/geom_density_2d.Rd @@ -14,7 +14,8 @@ geom_density_2d(mapping = NULL, data = NULL, stat = "density2d", stat_density_2d(mapping = NULL, data = NULL, geom = "density_2d", position = "identity", ..., contour = TRUE, n = 100, h = NULL, - na.rm = FALSE, show.legend = NA, inherit.aes = TRUE) + adjust = c(1, 1), na.rm = FALSE, show.legend = NA, + inherit.aes = TRUE) } \arguments{ \item{mapping}{Set of aesthetic mappings created by \code{\link[=aes]{aes()}} or @@ -75,6 +76,11 @@ estimation} \item{h}{Bandwidth (vector of length two). If \code{NULL}, estimated using \code{\link[MASS:bandwidth.nrd]{MASS::bandwidth.nrd()}}.} + +\item{adjust}{A multiplicative bandwidth adjustment to be used if 'h' is +'NULL'. This makes it possible to adjust the bandwidth while still +using the a bandwidth estimator. 
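The adjust argument documented above multiplies the estimated bandwidth when h is NULL; a brief sketch with the patched ggplot2:

library(ggplot2)
d <- ggplot(faithful, aes(waiting, eruptions))
d + geom_density_2d()              # bandwidth from MASS::bandwidth.nrd()
d + geom_density_2d(adjust = 1/2)  # half the default bandwidth: tighter contours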
For example, \code{adjust = 1/2} means +use half of the default bandwidth.} } \description{ Perform a 2D kernel density estimation using \code{\link[MASS:kde2d]{MASS::kde2d()}} and diff --git a/man/geom_histogram.Rd b/man/geom_histogram.Rd index 99189ee2a9..1cf44f65cf 100644 --- a/man/geom_histogram.Rd +++ b/man/geom_histogram.Rd @@ -69,8 +69,8 @@ or as a function that calculates width from unscaled x. Here, "unscaled x" refers to the original x values in the data, before application of any scale transformation. When specifying a function along with a grouping structure, the function will be called once per group. -The default is to use \code{bins} -bins that cover the range of the data. You should always override +The default is to use the number of bins in \code{bins}, +covering the range of the data. You should always override this value, exploring multiple widths to find the best to illustrate the stories in your data. @@ -116,8 +116,10 @@ discrete, you probably want to use \code{\link[=stat_count]{stat_count()}}. By default, the underlying computation (\code{stat_bin()}) uses 30 bins; this is not a good default, but the idea is to get you experimenting with -different bin widths. You may need to look at a few to uncover the full -story behind your data. +different number of bins. You can also experiment modifying the \code{binwidth} with +\code{center} or \code{boundary} arguments. \code{binwidth} overrides \code{bins} so you should do +one change at a time. You may need to look at a few options to uncover +the full story behind your data. } \section{Aesthetics}{ diff --git a/man/geom_path.Rd b/man/geom_path.Rd index 3269ac29ef..87042c73f0 100644 --- a/man/geom_path.Rd +++ b/man/geom_path.Rd @@ -73,8 +73,9 @@ rather than combining with them. This is most useful for helper functions that define both data and aesthetics and shouldn't inherit behaviour from the default plot specification, e.g. \code{\link[=borders]{borders()}}.} -\item{direction}{direction of stairs: 'vh' for vertical then horizontal, or -'hv' for horizontal then vertical.} +\item{direction}{direction of stairs: 'vh' for vertical then horizontal, +'hv' for horizontal then vertical, or 'mid' for step half-way between +adjacent x-values.} } \description{ \code{geom_path()} connects the observations in the order in which they appear diff --git a/man/geom_ribbon.Rd b/man/geom_ribbon.Rd index af2288135e..f5142578ec 100644 --- a/man/geom_ribbon.Rd +++ b/man/geom_ribbon.Rd @@ -60,9 +60,10 @@ that define both data and aesthetics and shouldn't inherit behaviour from the default plot specification, e.g. \code{\link[=borders]{borders()}}.} } \description{ -For each x value, \code{geom_ribbon} displays a y interval defined -by \code{ymin} and \code{ymax}. \code{geom_area} is a special case of -\code{geom_ribbon}, where the \code{ymin} is fixed to 0. +For each x value, \code{geom_ribbon()} displays a y interval defined +by \code{ymin} and \code{ymax}. \code{geom_area()} is a special case of +\code{geom_ribbon}, where the \code{ymin} is fixed to 0 and \code{y} is used instead +of \code{ymax}. } \details{ An area plot is the continuous analogue of a stacked bar chart (see diff --git a/man/geom_smooth.Rd b/man/geom_smooth.Rd index e82e5b94b3..c29a0fa0d0 100644 --- a/man/geom_smooth.Rd +++ b/man/geom_smooth.Rd @@ -51,7 +51,7 @@ e.g. \code{"auto"}, \code{"lm"}, \code{"glm"}, \code{"gam"}, \code{"loess"} or a For \code{method = "auto"} the smoothing method is chosen based on the size of the largest group (across all panels). 
\code{\link[stats:loess]{stats::loess()}} is used for less than 1,000 observations; otherwise \code{\link[mgcv:gam]{mgcv::gam()}} is -used with \code{formula = y ~ s(x, bs = "cs")}. Somewhat anecdotally, +used with \code{formula = y ~ s(x, bs = "cs")} with \code{method = "REML"}. Somewhat anecdotally, \code{loess} gives a better appearance, but is \eqn{O(N^{2})}{O(N^2)} in memory, so does not work for larger datasets. diff --git a/man/gg_dep.Rd b/man/gg_dep.Rd index 3317018869..3a59a65785 100644 --- a/man/gg_dep.Rd +++ b/man/gg_dep.Rd @@ -13,24 +13,6 @@ gg_dep(version, msg) \item{msg}{The message to print.} } \description{ -Version numbers have the format .., like 0.9.2. -This function compares the current version number of ggplot2 against the -specified \code{version}, which is the most recent version before the -function (or other object) was deprecated. -} -\details{ -\code{gg_dep} will give an error, warning, or message, depending on the -difference between the current ggplot2 version and the specified -\code{version}. - -If the current major number is greater than \code{version}'s major number, -or if the current minor number is more than 1 greater than \code{version}'s -minor number, give an error. - -If the current minor number differs from \code{version}'s minor number by -one, give a warning. - -If the current subminor number differs from \code{version}'s subminor -number, print a message. +This function is deprecated. } \keyword{internal} diff --git a/man/ggplot2-ggproto.Rd b/man/ggplot2-ggproto.Rd index b7afad3c3a..df35148ece 100644 --- a/man/ggplot2-ggproto.Rd +++ b/man/ggplot2-ggproto.Rd @@ -107,6 +107,7 @@ \alias{StatBinhex} \alias{StatBoxplot} \alias{StatContour} +\alias{StatContourFilled} \alias{StatCount} \alias{StatDensity2d} \alias{StatDensity} @@ -401,12 +402,80 @@ that must be present for this position adjustment to work. \section{Scales}{ -All \code{scale_*} functions (like \code{scale_x_continuous}) return a -\code{Scale*} object (like \code{ScaleContinuous}). The \code{Scale*} -object represents a single scale. +All \code{scale_*} functions like \code{\link[=scale_x_continuous]{scale_x_continuous()}} return a \code{Scale*} +object like \code{ScaleContinuous}. Each of the \code{Scale*} objects is a \code{\link[=ggproto]{ggproto()}} +object, descended from the top-level \code{Scale}. -Each of the \code{Scale*} objects is a \code{\link[=ggproto]{ggproto()}} object, -descended from the top-level \code{Scale}. +Properties not documented in \code{\link[=continuous_scale]{continuous_scale()}} or \code{\link[=discrete_scale]{discrete_scale()}}: +\itemize{ +\item \code{call} The call to \code{\link[=continuous_scale]{continuous_scale()}} or \code{\link[=discrete_scale]{discrete_scale()}} that constructed +the scale. +\item \code{range} One of \code{continuous_range()} or \code{discrete_range()}. +} + +Methods: +\itemize{ +\item \code{is_discrete()} Returns \code{TRUE} if the scale is a discrete scale +\item \code{is_empty()} Returns \code{TRUE} if the scale contains no information (i.e., +it has no information with which to calculate its \code{limits}). +\item \code{clone()} Returns a copy of the scale that can be trained +independently without affecting the original scale. +\item \code{transform()} Transforms a vector of values using \code{self$trans}. +This occurs before the \code{Stat} is calculated. +\item \code{train()} Update the \code{self$range} of observed (transformed) data values with +a vector of (possibly) new values. 
+\item \code{reset()} Reset the \code{self$range} of observed data values. For discrete +position scales, only the continuous range is reset. +\item \code{map()} Map transformed data values to some output value as +determined by \code{self$rescale()} and \code{self$palette} (except for position scales, +which do not use the default implementation of this method). The output corresponds +to the transformed data value in aesthetic space (e.g., a color, line width, or size). +\item \code{rescale()} Rescale transformed data to the range 0, 1. This is most useful for +position scales. For continuous scales, \code{rescale()} uses the \code{rescaler} that +was provided to the constructor. \code{rescale()} does not apply \code{self$oob()} to +its input, which means that discrete values outside \code{limits} will be \code{NA}, and +values that are outside \code{range} will have values less than 0 or greater than 1. +This allows guides more control over how out-of-bounds values are displayed. +\item \code{transform_df()}, \code{train_df()}, \code{map_df()} These \code{_df} variants +accept a data frame, and apply the \code{transform}, \code{train}, and \code{map} methods +(respectively) to the columns whose names are in \code{self$aesthetics}. +\item \code{get_limits()} Calculates the final scale limits in transformed data space +based on the combination of \code{self$limits} and/or the range of observed values +(\code{self$range}). +\item \code{get_breaks()} Calculates the final scale breaks in transformed data space +based on the combination of \code{self$breaks}, \code{self$trans$breaks()} (for +continuous scales), and \code{limits}. Breaks outside of \code{limits} are assigned +a value of \code{NA} (continuous scales) or dropped (discrete scales). +\item \code{get_labels()} Calculates labels for a given set of (transformed) \code{breaks} +based on the combination of \code{self$labels} and \code{breaks}. +\item \code{get_breaks_minor()} For continuous scales, calculates the final scale minor breaks +in transformed data space based on the rescaled \code{breaks}, the value of \code{self$minor_breaks}, +and the value of \code{self$trans$minor_breaks()}. Discrete scales always return \code{NULL}. +\item \code{make_title()} Hook to modify the title that is calculated during guide construction +(for non-position scales) or when the \code{Layout} calculates the x and y labels +(position scales). +} + +These methods are only valid for position (x and y) scales: +\itemize{ +\item \code{dimension()} For continuous scales, the dimension is the same concept as the limits. +For discrete scales, \code{dimension()} returns a continuous range, where the limits +would be placed at integer positions. \code{dimension()} optionally expands +this range given an expansion of length 4 (see \code{\link[=expansion]{expansion()}}). +\item \code{break_info()} Returns a \code{list()} with calculated values needed for the \code{Coord} +to transform values in transformed data space. Axis and grid guides also use +these values to draw guides. This is called with +a (usually expanded) continuous range, such as that returned by \code{self$dimension()} +(even for discrete scales).
The list has components \code{major_source} +(\code{self$get_breaks()} for continuous scales, or \code{seq_along(self$get_breaks())} +for discrete scales), \code{major} (the rescaled value of \code{major_source}, ignoring +\code{self$rescaler}), \code{minor} (the rescaled value of \code{minor_source}, ignoring +\code{self$rescaler}), \code{range} (the range that was passed in to \code{break_info()}), +\code{labels} (the label values, one for each element in \code{breaks}). +\item \code{axis_order()} One of \code{c("primary", "secondary")} or \code{c("secondary", "primary")} +\item \code{make_sec_title()} Hook to modify the title for the second axis that is calculated +when the \code{Layout} calculates the x and y labels. +} } \seealso{ diff --git a/man/ggsave.Rd b/man/ggsave.Rd index e3b2dcda0e..72725d51fd 100644 --- a/man/ggsave.Rd +++ b/man/ggsave.Rd @@ -17,7 +17,9 @@ ggsave(filename, plot = last_plot(), device = NULL, path = NULL, (e.g. \code{\link[=png]{png()}}), or one of "eps", "ps", "tex" (pictex), "pdf", "jpeg", "tiff", "png", "bmp", "svg" or "wmf" (windows only).} -\item{path}{Path to save plot to (combined with filename).} +\item{path}{Path of the directory to save plot to: \code{path} and \code{filename} +are combined to create the fully qualified file name. Defaults to the +working directory.} \item{scale}{Multiplicative scaling factor.} diff --git a/man/labeller.Rd b/man/labeller.Rd index 1587e6309e..52ef1a196f 100644 --- a/man/labeller.Rd +++ b/man/labeller.Rd @@ -30,7 +30,7 @@ function.} used with lookup tables or non-labeller functions.} } \value{ -A labeller function to supply to \code{\link[=facet_grid]{facet_grid()}} +A labeller function to supply to \code{\link[=facet_grid]{facet_grid()}} or \code{\link[=facet_wrap]{facet_wrap()}} for the argument \code{labeller}. } \description{ diff --git a/man/lims.Rd b/man/lims.Rd index 9651f705ed..c9f05436bc 100644 --- a/man/lims.Rd +++ b/man/lims.Rd @@ -57,9 +57,34 @@ ggplot(small, aes(mpg, wt, colour = factor(cyl))) + ggplot(big, aes(mpg, wt, colour = factor(cyl))) + geom_point() + lims(colour = c("4", "6", "8")) + +# There are two ways of setting the axis limits: with limits or +# with coordinate systems. They work in two rather different ways. + +last_month <- Sys.Date() - 0:59 +df <- data.frame( + date = last_month, + price = c(rnorm(30, mean = 15), runif(30) + 0.2 * (1:30)) +) + +p <- ggplot(df, aes(date, price)) + + geom_line() + + stat_smooth() + +p + +# Setting the limits with the scale discards all data outside the range. +p + lims(x= c(Sys.Date() - 30, NA), y = c(10, 20)) + +# For changing x or y axis limits **without** dropping data +# observations use [coord_cartesian()]. Setting the limits on the +# coordinate system performs a visual zoom. +p + coord_cartesian(xlim =c(Sys.Date() - 30, NA), ylim = c(10, 20)) + } \seealso{ For changing x or y axis limits \strong{without} dropping data observations, see \code{\link[=coord_cartesian]{coord_cartesian()}}. To expand the range of -a plot to always include certain values, see \code{\link[=expand_limits]{expand_limits()}}. +a plot to always include certain values, see \code{\link[=expand_limits]{expand_limits()}}. For other +types of data, see \code{\link[=scale_x_discrete]{scale_x_discrete()}}, \code{\link[=scale_x_continuous]{scale_x_continuous()}}, \code{\link[=scale_x_date]{scale_x_date()}}. 
} diff --git a/man/position_dodge.Rd b/man/position_dodge.Rd index 708a72b4d5..a35191e4fa 100644 --- a/man/position_dodge.Rd +++ b/man/position_dodge.Rd @@ -28,7 +28,8 @@ This is useful if you're rotating both the plot and legend.} Dodging preserves the vertical position of an geom while adjusting the horizontal position. \code{position_dodge2} is a special case of \code{position_dodge} for arranging box plots, which can have variable widths. \code{position_dodge2} -also works with bars and rectangles. +also works with bars and rectangles. But unlike \code{position_dodge}, +\code{position_dodge2} works without a grouping variable in a layer. } \examples{ ggplot(mtcars, aes(factor(cyl), fill = factor(vs))) + diff --git a/man/scale_brewer.Rd b/man/scale_brewer.Rd index e65b97f618..c44e785402 100644 --- a/man/scale_brewer.Rd +++ b/man/scale_brewer.Rd @@ -30,8 +30,9 @@ limits, breaks, labels and so forth.} \item{type}{One of seq (sequential), div (diverging) or qual (qualitative)} -\item{palette}{If a string, will use that named palette. If a number, will -index into the list of palettes of appropriate \code{type}} +\item{palette}{If a string, will use that named palette. If a number, will index into +the list of palettes of appropriate \code{type}. The list of available palettes can be found +in the Palettes section.} \item{direction}{Sets the order of colours in the scale. If 1, the default, colours are as output by \code{\link[RColorBrewer:brewer.pal]{RColorBrewer::brewer.pal()}}. If -1, the @@ -79,6 +80,7 @@ The following palettes are available for use with these scales: \item{Sequential}{Blues, BuGn, BuPu, GnBu, Greens, Greys, Oranges, OrRd, PuBu, PuBuGn, PuRd, Purples, RdPu, Reds, YlGn, YlGnBu, YlOrBr, YlOrRd} } +Modify the palette through the \code{palette} argument. } \examples{ diff --git a/man/scale_colour_continuous.Rd b/man/scale_colour_continuous.Rd index bf7a287ac4..f0ad9fba5a 100644 --- a/man/scale_colour_continuous.Rd +++ b/man/scale_colour_continuous.Rd @@ -1,8 +1,9 @@ % Generated by roxygen2: do not edit by hand -% Please edit documentation in R/scale-colour.r +% Please edit documentation in R/scale-colour.r, R/zxx.r \name{scale_colour_continuous} \alias{scale_colour_continuous} \alias{scale_fill_continuous} +\alias{scale_color_continuous} \title{Continuous colour scales} \usage{ scale_colour_continuous(..., @@ -23,6 +24,25 @@ Colour scales for continuous data default to the values of the options are not present, \code{"gradient"} will be used. See \code{\link[=options]{options()}} for more information. } +\section{Color Blindness}{ + +Many color palettes derived from RGB combinations (like the "rainbow" color +palette) are not suitable to support all viewers, especially those with +color vision deficiencies. Using the \code{viridis} type, which is perceptually +uniform in both colour and black-and-white display, is an easy option to +ensure good perceptive properties of your visualizations. +The colorspace package offers functionalities +\itemize{ +\item to generate color palettes with good perceptive properties, +\item to analyse a given color palette, like emulating color blindness, +\item and to modify a given color palette for better perceptivity. +} + +For more information on color vision deficiencies and suitable color choices +see the \href{https://arxiv.org/abs/1903.06490}{paper on the colorspace package} +and references therein.
+} + \examples{ v <- ggplot(faithfuld, aes(waiting, eruptions, fill = density)) + geom_tile() diff --git a/man/scale_continuous.Rd b/man/scale_continuous.Rd index 1806550a1f..d904a781fe 100644 --- a/man/scale_continuous.Rd +++ b/man/scale_continuous.Rd @@ -43,10 +43,10 @@ omitted.} \itemize{ \item \code{NULL} for no breaks \item \code{waiver()} for the default breaks computed by the -transformation object +\link[scales:trans_new]{transformation object} \item A numeric vector of positions \item A function that takes the limits as input and returns breaks -as output +as output (e.g., a function returned by \code{\link[scales:extended_breaks]{scales::extended_breaks()}}) }} \item{minor_breaks}{One of: @@ -75,36 +75,46 @@ as output Use \code{NA} to refer to the existing minimum or maximum \item A function that accepts the existing (automatic) limits and returns new limits +Note that setting limits on positional scales will \strong{remove} data outside of the limits. +If the purpose is to zoom, use the limit argument in the coordinate system +(see \code{\link[=coord_cartesian]{coord_cartesian()}}). }} -\item{expand}{Vector of range expansion constants used to add some -padding around the data, to ensure that they are placed some distance -away from the axes. Use the convenience function \code{\link[=expand_scale]{expand_scale()}} +\item{expand}{For position scales, a vector of range expansion constants used to add some +padding around the data to ensure that they are placed some distance +away from the axes. Use the convenience function \code{\link[=expansion]{expansion()}} to generate the values for the \code{expand} argument. The defaults are to expand the scale by 5\% on each side for continuous variables, and by 0.6 units on each side for discrete variables.} -\item{oob}{Function that handles limits outside of the scale limits -(out of bounds). The default replaces out of bounds values with \code{NA}.} +\item{oob}{One of: +\itemize{ +\item Function that handles limits outside of the scale limits +(out of bounds). +\item The default (\code{\link[scales:censor]{scales::censor()}}) replaces out of +bounds values with \code{NA}. +\item \code{\link[scales:squish]{scales::squish()}} for squishing out of bounds values into range. +\item \code{\link[scales:squish_infinite]{scales::squish_infinite()}} for squishing infitite values into range. +}} \item{na.value}{Missing values will be replaced with this value.} -\item{trans}{Either the name of a transformation object, or the -object itself. Built-in transformations include "asn", "atanh", +\item{trans}{For continuous scales, the name of a transformation object +or the object itself. Built-in transformations include "asn", "atanh", "boxcox", "date", "exp", "hms", "identity", "log", "log10", "log1p", "log2", "logit", "modulus", "probability", "probit", "pseudo_log", "reciprocal", "reverse", "sqrt" and "time". A transformation object bundles together a transform, its inverse, and methods for generating breaks and labels. Transformation objects -are defined in the scales package, and are called \code{name_trans}, e.g. -\code{\link[scales:boxcox_trans]{scales::boxcox_trans()}}. You can create your own +are defined in the scales package, and are called \code{_trans} (e.g., +\code{\link[scales:boxcox_trans]{scales::boxcox_trans()}}). You can create your own transformation with \code{\link[scales:trans_new]{scales::trans_new()}}.} -\item{position}{The position of the axis. 
"left" or "right" for vertical -scales, "top" or "bottom" for horizontal scales} +\item{position}{For position scales, The position of the axis. +\code{left} or \code{right} for y axes, \code{top} or \code{bottom} for x axes.} -\item{sec.axis}{specify a secondary axis} +\item{sec.axis}{\code{\link[=sec_axis]{sec_axis()}} is used to specify a secondary axis.} \item{...}{Other arguments passed on to \code{scale_(x|y)_continuous()}} } @@ -173,8 +183,6 @@ p1 + scale_y_continuous(trans = scales::reciprocal_trans()) } \seealso{ -\code{\link[=sec_axis]{sec_axis()}} for how to specify secondary axes - Other position scales: \code{\link{scale_x_date}}, \code{\link{scale_x_discrete}} } diff --git a/man/scale_date.Rd b/man/scale_date.Rd index f09a0f339c..3b9e123d5a 100644 --- a/man/scale_date.Rd +++ b/man/scale_date.Rd @@ -95,25 +95,35 @@ like "2 weeks", or "10 years". If both \code{minor_breaks} and Use \code{NA} to refer to the existing minimum or maximum \item A function that accepts the existing (automatic) limits and returns new limits +Note that setting limits on positional scales will \strong{remove} data outside of the limits. +If the purpose is to zoom, use the limit argument in the coordinate system +(see \code{\link[=coord_cartesian]{coord_cartesian()}}). }} -\item{expand}{Vector of range expansion constants used to add some -padding around the data, to ensure that they are placed some distance -away from the axes. Use the convenience function \code{\link[=expand_scale]{expand_scale()}} +\item{expand}{For position scales, a vector of range expansion constants used to add some +padding around the data to ensure that they are placed some distance +away from the axes. Use the convenience function \code{\link[=expansion]{expansion()}} to generate the values for the \code{expand} argument. The defaults are to expand the scale by 5\% on each side for continuous variables, and by 0.6 units on each side for discrete variables.} -\item{position}{The position of the axis. "left" or "right" for vertical -scales, "top" or "bottom" for horizontal scales} +\item{position}{For position scales, The position of the axis. +\code{left} or \code{right} for y axes, \code{top} or \code{bottom} for x axes.} -\item{sec.axis}{specify a secondary axis} +\item{sec.axis}{\code{\link[=sec_axis]{sec_axis()}} is used to specify a secondary axis.} \item{timezone}{The timezone to use for display on the axes. The default (\code{NULL}) uses the timezone encoded in the data.} -\item{oob}{Function that handles limits outside of the scale limits -(out of bounds). The default replaces out of bounds values with \code{NA}.} +\item{oob}{One of: +\itemize{ +\item Function that handles limits outside of the scale limits +(out of bounds). +\item The default (\code{\link[scales:censor]{scales::censor()}}) replaces out of +bounds values with \code{NA}. +\item \code{\link[scales:squish]{scales::squish()}} for squishing out of bounds values into range. +\item \code{\link[scales:squish_infinite]{scales::squish_infinite()}} for squishing infitite values into range. 
+}} \item{na.value}{Missing values will be replaced with this value.} } diff --git a/man/scale_discrete.Rd b/man/scale_discrete.Rd index 457785b76f..b0f730cae2 100644 --- a/man/scale_discrete.Rd +++ b/man/scale_discrete.Rd @@ -14,12 +14,11 @@ scale_y_discrete(..., expand = waiver(), position = "left") \describe{ \item{palette}{A palette function that when called with a single integer argument (the number of levels in the scale) returns the values that -they should take.} +they should take (e.g., \code{\link[scales:hue_pal]{scales::hue_pal()}}).} \item{breaks}{One of: \itemize{ \item \code{NULL} for no breaks -\item \code{waiver()} for the default breaks computed by the -transformation object +\item \code{waiver()} for the default breaks (the scale limits) \item A character vector of breaks \item A function that takes the limits as input and returns breaks as output @@ -35,8 +34,9 @@ from a discrete scale, specify \code{na.translate = FALSE}.} \item{na.value}{If \code{na.translate = TRUE}, what value aesthetic value should missing be displayed as? Does not apply to position scales where \code{NA} is always placed at the far right.} - \item{aesthetics}{The names of the aesthetics that this scale works with} - \item{scale_name}{The name of the scale} + \item{aesthetics}{The names of the aesthetics that this scale works with.} + \item{scale_name}{The name of the scale that should be used for error messages +associated with this scale.} \item{name}{The name of the scale. Used as the axis or legend title. If \code{waiver()}, the default, the name of the scale is taken from the first mapping used for that aesthetic. If \code{NULL}, the legend title will be @@ -50,22 +50,35 @@ transformation object \item A function that takes the breaks as input and returns labels as output }} + \item{expand}{For position scales, a vector of range expansion constants used to add some +padding around the data to ensure that they are placed some distance +away from the axes. Use the convenience function \code{\link[=expansion]{expansion()}} +to generate the values for the \code{expand} argument. The defaults are to +expand the scale by 5\% on each side for continuous variables, and by +0.6 units on each side for discrete variables.} \item{guide}{A function used to create a guide or its name. See -\code{\link[=guides]{guides()}} for more info.} +\code{\link[=guides]{guides()}} for more information.} + \item{position}{For position scales, The position of the axis. +\code{left} or \code{right} for y axes, \code{top} or \code{bottom} for x axes.} \item{super}{The super class to use for the constructed scale} }} -\item{expand}{Vector of range expansion constants used to add some -padding around the data, to ensure that they are placed some distance -away from the axes. Use the convenience function \code{\link[=expand_scale]{expand_scale()}} +\item{expand}{For position scales, a vector of range expansion constants used to add some +padding around the data to ensure that they are placed some distance +away from the axes. Use the convenience function \code{\link[=expansion]{expansion()}} to generate the values for the \code{expand} argument. The defaults are to expand the scale by 5\% on each side for continuous variables, and by 0.6 units on each side for discrete variables.} -\item{position}{The position of the axis. \code{left} or \code{right} for y -axes, \code{top} or \code{bottom} for x axes} +\item{position}{For position scales, The position of the axis. 
+\code{left} or \code{right} for y axes, \code{top} or \code{bottom} for x axes.} } \description{ +\code{scale_x_discrete} and \code{scale_y_discrete} are used to set the values for +discrete x and y scale aesthetics. For simple manipulation of scale labels +and limits, you may wish to use \code{\link[=labs]{labs()}} and \code{\link[=lims]{lims()}} instead. +} +\details{ You can use continuous positions even with a discrete position scale - this allows you (e.g.) to place labels between bars in a bar chart. Continuous positions are numeric values starting at one for the first diff --git a/man/scale_gradient.Rd b/man/scale_gradient.Rd index 44237ace91..9569378fd7 100644 --- a/man/scale_gradient.Rd +++ b/man/scale_gradient.Rd @@ -11,7 +11,6 @@ \alias{scale_colour_date} \alias{scale_fill_datetime} \alias{scale_fill_date} -\alias{scale_color_continuous} \alias{scale_color_gradient} \alias{scale_color_gradient2} \alias{scale_color_gradientn} @@ -44,10 +43,11 @@ scale_fill_gradientn(..., colours, values = NULL, space = "Lab", \arguments{ \item{...}{Arguments passed on to \code{continuous_scale} \describe{ - \item{scale_name}{The name of the scale} + \item{scale_name}{The name of the scale that should be used for error messages +associated with this scale.} \item{palette}{A palette function that when called with a numeric vector with -values between 0 and 1 returns the corresponding values in the range the -scale maps to.} +values between 0 and 1 returns the corresponding output values +(e.g., \code{\link[scales:area_pal]{scales::area_pal()}}).} \item{name}{The name of the scale. Used as the axis or legend title. If \code{waiver()}, the default, the name of the scale is taken from the first mapping used for that aesthetic. If \code{NULL}, the legend title will be @@ -56,10 +56,10 @@ omitted.} \itemize{ \item \code{NULL} for no breaks \item \code{waiver()} for the default breaks computed by the -transformation object +\link[scales:trans_new]{transformation object} \item A numeric vector of positions \item A function that takes the limits as input and returns breaks -as output +as output (e.g., a function returned by \code{\link[scales:extended_breaks]{scales::extended_breaks()}}) }} \item{minor_breaks}{One of: \itemize{ @@ -85,32 +85,44 @@ as output Use \code{NA} to refer to the existing minimum or maximum \item A function that accepts the existing (automatic) limits and returns new limits +Note that setting limits on positional scales will \strong{remove} data outside of the limits. +If the purpose is to zoom, use the limit argument in the coordinate system +(see \code{\link[=coord_cartesian]{coord_cartesian()}}). +}} + \item{rescaler}{A function used to scale the input values to the +range [0, 1]. This is always \code{\link[scales:rescale]{scales::rescale()}}, except for +diverging and n colour gradients (i.e., \code{\link[=scale_colour_gradient2]{scale_colour_gradient2()}}, +\code{\link[=scale_colour_gradientn]{scale_colour_gradientn()}}). The \code{rescaler} is ignored by position +scales, which always use \code{\link[scales:rescale]{scales::rescale()}}.} + \item{oob}{One of: +\itemize{ +\item Function that handles limits outside of the scale limits +(out of bounds). +\item The default (\code{\link[scales:censor]{scales::censor()}}) replaces out of +bounds values with \code{NA}. +\item \code{\link[scales:squish]{scales::squish()}} for squishing out of bounds values into range. +\item \code{\link[scales:squish_infinite]{scales::squish_infinite()}} for squishing infinite values into range.
}} - \item{rescaler}{Used by diverging and n colour gradients -(i.e. \code{\link[=scale_colour_gradient2]{scale_colour_gradient2()}}, \code{\link[=scale_colour_gradientn]{scale_colour_gradientn()}}). -A function used to scale the input values to the range [0, 1].} - \item{oob}{Function that handles limits outside of the scale limits -(out of bounds). The default replaces out of bounds values with \code{NA}.} - \item{trans}{Either the name of a transformation object, or the -object itself. Built-in transformations include "asn", "atanh", + \item{trans}{For continuous scales, the name of a transformation object +or the object itself. Built-in transformations include "asn", "atanh", "boxcox", "date", "exp", "hms", "identity", "log", "log10", "log1p", "log2", "logit", "modulus", "probability", "probit", "pseudo_log", "reciprocal", "reverse", "sqrt" and "time". A transformation object bundles together a transform, its inverse, and methods for generating breaks and labels. Transformation objects -are defined in the scales package, and are called \code{name_trans}, e.g. -\code{\link[scales:boxcox_trans]{scales::boxcox_trans()}}. You can create your own +are defined in the scales package, and are called \code{_trans} (e.g., +\code{\link[scales:boxcox_trans]{scales::boxcox_trans()}}). You can create your own transformation with \code{\link[scales:trans_new]{scales::trans_new()}}.} - \item{position}{The position of the axis. "left" or "right" for vertical -scales, "top" or "bottom" for horizontal scales} - \item{super}{The super class to use for the constructed scale} - \item{expand}{Vector of range expansion constants used to add some -padding around the data, to ensure that they are placed some distance -away from the axes. Use the convenience function \code{\link[=expand_scale]{expand_scale()}} + \item{expand}{For position scales, a vector of range expansion constants used to add some +padding around the data to ensure that they are placed some distance +away from the axes. Use the convenience function \code{\link[=expansion]{expansion()}} to generate the values for the \code{expand} argument. The defaults are to expand the scale by 5\% on each side for continuous variables, and by 0.6 units on each side for discrete variables.} + \item{position}{For position scales, The position of the axis. 
+\code{left} or \code{right} for y axes, \code{top} or \code{bottom} for x axes.} + \item{super}{The super class to use for the constructed scale} }} \item{low, high}{Colours for low and high ends of the gradient.} @@ -160,6 +172,13 @@ df <- data.frame( z2 = abs(rnorm(100)) ) +df_na <- data.frame( + value = seq(1, 20), + x = runif(20), + y = runif(20), + z1 = c(rep(NA, 10), rnorm(10)) +) + # Default colour scale colours from light blue to dark blue ggplot(df, aes(x, y)) + geom_point(aes(colour = z2)) @@ -185,6 +204,16 @@ ggplot(df, aes(x, y)) + scale_colour_gradient(low = "white", high = "black") # Avoid red-green colour contrasts because ~10\% of men have difficulty # seeing them + +# Use `na.value = NA` to hide missing values but keep the original axis range +ggplot(df_na, aes(x = value, y)) + + geom_bar(aes(fill = z1), stat = "identity") + + scale_fill_gradient(low = "yellow", high = "red", na.value = NA) + + ggplot(df_na, aes(x, y)) + + geom_point(aes(colour = z1)) + + scale_colour_gradient(low = "yellow", high = "red", na.value = NA) + } \seealso{ \code{\link[scales:seq_gradient_pal]{scales::seq_gradient_pal()}} for details on underlying diff --git a/man/scale_grey.Rd b/man/scale_grey.Rd index cc75334ada..0f4b1c491b 100644 --- a/man/scale_grey.Rd +++ b/man/scale_grey.Rd @@ -17,12 +17,11 @@ scale_fill_grey(..., start = 0.2, end = 0.8, na.value = "red", \describe{ \item{palette}{A palette function that when called with a single integer argument (the number of levels in the scale) returns the values that -they should take.} +they should take (e.g., \code{\link[scales:hue_pal]{scales::hue_pal()}}).} \item{breaks}{One of: \itemize{ \item \code{NULL} for no breaks -\item \code{waiver()} for the default breaks computed by the -transformation object +\item \code{waiver()} for the default breaks (the scale limits) \item A character vector of breaks \item A function that takes the limits as input and returns breaks as output @@ -38,8 +37,9 @@ from a discrete scale, specify \code{na.translate = FALSE}.} \item{na.value}{If \code{na.translate = TRUE}, what value aesthetic value should missing be displayed as? Does not apply to position scales where \code{NA} is always placed at the far right.} - \item{aesthetics}{The names of the aesthetics that this scale works with} - \item{scale_name}{The name of the scale} + \item{aesthetics}{The names of the aesthetics that this scale works with.} + \item{scale_name}{The name of the scale that should be used for error messages +associated with this scale.} \item{name}{The name of the scale. Used as the axis or legend title. If \code{waiver()}, the default, the name of the scale is taken from the first mapping used for that aesthetic. If \code{NULL}, the legend title will be @@ -53,16 +53,16 @@ transformation object \item A function that takes the breaks as input and returns labels as output }} - \item{expand}{Vector of range expansion constants used to add some -padding around the data, to ensure that they are placed some distance -away from the axes. Use the convenience function \code{\link[=expand_scale]{expand_scale()}} + \item{expand}{For position scales, a vector of range expansion constants used to add some +padding around the data to ensure that they are placed some distance +away from the axes. Use the convenience function \code{\link[=expansion]{expansion()}} to generate the values for the \code{expand} argument. 
The defaults are to expand the scale by 5\% on each side for continuous variables, and by 0.6 units on each side for discrete variables.} \item{guide}{A function used to create a guide or its name. See -\code{\link[=guides]{guides()}} for more info.} - \item{position}{The position of the axis. "left" or "right" for vertical -scales, "top" or "bottom" for horizontal scales} +\code{\link[=guides]{guides()}} for more information.} + \item{position}{For position scales, The position of the axis. +\code{left} or \code{right} for y axes, \code{top} or \code{bottom} for x axes.} \item{super}{The super class to use for the constructed scale} }} diff --git a/man/scale_hue.Rd b/man/scale_hue.Rd index f491673668..e5c5d6ff36 100644 --- a/man/scale_hue.Rd +++ b/man/scale_hue.Rd @@ -22,12 +22,11 @@ scale_fill_hue(..., h = c(0, 360) + 15, c = 100, l = 65, \describe{ \item{palette}{A palette function that when called with a single integer argument (the number of levels in the scale) returns the values that -they should take.} +they should take (e.g., \code{\link[scales:hue_pal]{scales::hue_pal()}}).} \item{breaks}{One of: \itemize{ \item \code{NULL} for no breaks -\item \code{waiver()} for the default breaks computed by the -transformation object +\item \code{waiver()} for the default breaks (the scale limits) \item A character vector of breaks \item A function that takes the limits as input and returns breaks as output @@ -43,7 +42,8 @@ from a discrete scale, specify \code{na.translate = FALSE}.} \item{na.value}{If \code{na.translate = TRUE}, what value aesthetic value should missing be displayed as? Does not apply to position scales where \code{NA} is always placed at the far right.} - \item{scale_name}{The name of the scale} + \item{scale_name}{The name of the scale that should be used for error messages +associated with this scale.} \item{name}{The name of the scale. Used as the axis or legend title. If \code{waiver()}, the default, the name of the scale is taken from the first mapping used for that aesthetic. If \code{NULL}, the legend title will be @@ -57,16 +57,16 @@ transformation object \item A function that takes the breaks as input and returns labels as output }} - \item{expand}{Vector of range expansion constants used to add some -padding around the data, to ensure that they are placed some distance -away from the axes. Use the convenience function \code{\link[=expand_scale]{expand_scale()}} + \item{expand}{For position scales, a vector of range expansion constants used to add some +padding around the data to ensure that they are placed some distance +away from the axes. Use the convenience function \code{\link[=expansion]{expansion()}} to generate the values for the \code{expand} argument. The defaults are to expand the scale by 5\% on each side for continuous variables, and by 0.6 units on each side for discrete variables.} \item{guide}{A function used to create a guide or its name. See -\code{\link[=guides]{guides()}} for more info.} - \item{position}{The position of the axis. "left" or "right" for vertical -scales, "top" or "bottom" for horizontal scales} +\code{\link[=guides]{guides()}} for more information.} + \item{position}{For position scales, The position of the axis. 
+\code{left} or \code{right} for y axes, \code{top} or \code{bottom} for x axes.} \item{super}{The super class to use for the constructed scale} }} diff --git a/man/scale_linetype.Rd b/man/scale_linetype.Rd index 68fc2b88f9..dd4588a11e 100644 --- a/man/scale_linetype.Rd +++ b/man/scale_linetype.Rd @@ -17,12 +17,11 @@ scale_linetype_discrete(..., na.value = "blank") \describe{ \item{palette}{A palette function that when called with a single integer argument (the number of levels in the scale) returns the values that -they should take.} +they should take (e.g., \code{\link[scales:hue_pal]{scales::hue_pal()}}).} \item{breaks}{One of: \itemize{ \item \code{NULL} for no breaks -\item \code{waiver()} for the default breaks computed by the -transformation object +\item \code{waiver()} for the default breaks (the scale limits) \item A character vector of breaks \item A function that takes the limits as input and returns breaks as output @@ -35,8 +34,9 @@ The default, \code{TRUE}, uses the levels that appear in the data; \item{na.translate}{Unlike continuous scales, discrete scales can easily show missing values, and do so by default. If you want to remove missing values from a discrete scale, specify \code{na.translate = FALSE}.} - \item{aesthetics}{The names of the aesthetics that this scale works with} - \item{scale_name}{The name of the scale} + \item{aesthetics}{The names of the aesthetics that this scale works with.} + \item{scale_name}{The name of the scale that should be used for error messages +associated with this scale.} \item{name}{The name of the scale. Used as the axis or legend title. If \code{waiver()}, the default, the name of the scale is taken from the first mapping used for that aesthetic. If \code{NULL}, the legend title will be @@ -51,7 +51,7 @@ transformation object as output }} \item{guide}{A function used to create a guide or its name. See -\code{\link[=guides]{guides()}} for more info.} +\code{\link[=guides]{guides()}} for more information.} \item{super}{The super class to use for the constructed scale} }} diff --git a/man/scale_manual.Rd b/man/scale_manual.Rd index f69428d917..9f0df47334 100644 --- a/man/scale_manual.Rd +++ b/man/scale_manual.Rd @@ -30,12 +30,11 @@ scale_discrete_manual(aesthetics, ..., values) \describe{ \item{palette}{A palette function that when called with a single integer argument (the number of levels in the scale) returns the values that -they should take.} +they should take (e.g., \code{\link[scales:hue_pal]{scales::hue_pal()}}).} \item{breaks}{One of: \itemize{ \item \code{NULL} for no breaks -\item \code{waiver()} for the default breaks computed by the -transformation object +\item \code{waiver()} for the default breaks (the scale limits) \item A character vector of breaks \item A function that takes the limits as input and returns breaks as output @@ -51,7 +50,8 @@ from a discrete scale, specify \code{na.translate = FALSE}.} \item{na.value}{If \code{na.translate = TRUE}, what value aesthetic value should missing be displayed as? Does not apply to position scales where \code{NA} is always placed at the far right.} - \item{scale_name}{The name of the scale} + \item{scale_name}{The name of the scale that should be used for error messages +associated with this scale.} \item{name}{The name of the scale. Used as the axis or legend title. If \code{waiver()}, the default, the name of the scale is taken from the first mapping used for that aesthetic. 
If \code{NULL}, the legend title will be @@ -66,7 +66,7 @@ transformation object as output }} \item{guide}{A function used to create a guide or its name. See -\code{\link[=guides]{guides()}} for more info.} +\code{\link[=guides]{guides()}} for more information.} \item{super}{The super class to use for the constructed scale} }} @@ -94,6 +94,25 @@ have an optional \code{aesthetics} argument that can be used to define both \cod \code{scale_discrete_manual()} is a generic scale that can work with any aesthetic or set of aesthetics provided via the \code{aesthetics} argument. } +\section{Color Blindness}{ + +Many color palettes derived from RGB combinations (like the "rainbow" color +palette) are not suitable to support all viewers, especially those with +color vision deficiencies. Using the \code{viridis} type, which is perceptually +uniform in both colour and black-and-white display, is an easy option to +ensure good perceptive properties of your visualizations. +The colorspace package offers functionalities +\itemize{ +\item to generate color palettes with good perceptive properties, +\item to analyse a given color palette, like emulating color blindness, +\item and to modify a given color palette for better perceptivity. +} + +For more information on color vision deficiencies and suitable color choices +see the \href{https://arxiv.org/abs/1903.06490}{paper on the colorspace package} +and references therein. +} + \examples{ p <- ggplot(mtcars, aes(mpg, wt)) + geom_point(aes(colour = factor(cyl))) diff --git a/man/scale_shape.Rd b/man/scale_shape.Rd index 58bb2e265e..2763e5e984 100644 --- a/man/scale_shape.Rd +++ b/man/scale_shape.Rd @@ -14,12 +14,11 @@ scale_shape(..., solid = TRUE) \describe{ \item{palette}{A palette function that when called with a single integer argument (the number of levels in the scale) returns the values that -they should take.} +they should take (e.g., \code{\link[scales:hue_pal]{scales::hue_pal()}}).} \item{breaks}{One of: \itemize{ \item \code{NULL} for no breaks -\item \code{waiver()} for the default breaks computed by the -transformation object +\item \code{waiver()} for the default breaks (the scale limits) \item A character vector of breaks \item A function that takes the limits as input and returns breaks as output @@ -35,8 +34,9 @@ from a discrete scale, specify \code{na.translate = FALSE}.} \item{na.value}{If \code{na.translate = TRUE}, what value aesthetic value should missing be displayed as? Does not apply to position scales where \code{NA} is always placed at the far right.} - \item{aesthetics}{The names of the aesthetics that this scale works with} - \item{scale_name}{The name of the scale} + \item{aesthetics}{The names of the aesthetics that this scale works with.} + \item{scale_name}{The name of the scale that should be used for error messages +associated with this scale.} \item{name}{The name of the scale. Used as the axis or legend title. If \code{waiver()}, the default, the name of the scale is taken from the first mapping used for that aesthetic. If \code{NULL}, the legend title will be @@ -51,7 +51,7 @@ transformation object as output }} \item{guide}{A function used to create a guide or its name.
See -\code{\link[=guides]{guides()}} for more info.} +\code{\link[=guides]{guides()}} for more information.} \item{super}{The super class to use for the constructed scale} }} diff --git a/man/scale_size.Rd b/man/scale_size.Rd index 75820050ce..17614721fc 100644 --- a/man/scale_size.Rd +++ b/man/scale_size.Rd @@ -31,10 +31,10 @@ omitted.} \itemize{ \item \code{NULL} for no breaks \item \code{waiver()} for the default breaks computed by the -transformation object +\link[scales:trans_new]{transformation object} \item A numeric vector of positions \item A function that takes the limits as input and returns breaks -as output +as output (e.g., a function returned by \code{\link[scales:extended_breaks]{scales::extended_breaks()}}) }} \item{labels}{One of: @@ -54,25 +54,28 @@ as output Use \code{NA} to refer to the existing minimum or maximum \item A function that accepts the existing (automatic) limits and returns new limits +Note that setting limits on positional scales will \strong{remove} data outside of the limits. +If the purpose is to zoom, use the limit argument in the coordinate system +(see \code{\link[=coord_cartesian]{coord_cartesian()}}). }} \item{range}{a numeric vector of length 2 that specifies the minimum and maximum size of the plotting symbol after transformation.} -\item{trans}{Either the name of a transformation object, or the -object itself. Built-in transformations include "asn", "atanh", +\item{trans}{For continuous scales, the name of a transformation object +or the object itself. Built-in transformations include "asn", "atanh", "boxcox", "date", "exp", "hms", "identity", "log", "log10", "log1p", "log2", "logit", "modulus", "probability", "probit", "pseudo_log", "reciprocal", "reverse", "sqrt" and "time". A transformation object bundles together a transform, its inverse, and methods for generating breaks and labels. Transformation objects -are defined in the scales package, and are called \code{name_trans}, e.g. -\code{\link[scales:boxcox_trans]{scales::boxcox_trans()}}. You can create your own +are defined in the scales package, and are called \code{_trans} (e.g., +\code{\link[scales:boxcox_trans]{scales::boxcox_trans()}}). You can create your own transformation with \code{\link[scales:trans_new]{scales::trans_new()}}.} \item{guide}{A function used to create a guide or its name. See -\code{\link[=guides]{guides()}} for more info.} +\code{\link[=guides]{guides()}} for more information.} \item{...}{Arguments passed on to \code{continuous_scale} \describe{ @@ -84,10 +87,10 @@ omitted.} \itemize{ \item \code{NULL} for no breaks \item \code{waiver()} for the default breaks computed by the -transformation object +\link[scales:trans_new]{transformation object} \item A numeric vector of positions \item A function that takes the limits as input and returns breaks -as output +as output (e.g., a function returned by \code{\link[scales:extended_breaks]{scales::extended_breaks()}}) }} \item{minor_breaks}{One of: \itemize{ @@ -113,32 +116,42 @@ as output Use \code{NA} to refer to the existing minimum or maximum \item A function that accepts the existing (automatic) limits and returns new limits +Note that setting limits on positional scales will \strong{remove} data outside of the limits. +If the purpose is to zoom, use the limit argument in the coordinate system +(see \code{\link[=coord_cartesian]{coord_cartesian()}}). +}} + \item{oob}{One of: +\itemize{ +\item Function that handles limits outside of the scale limits +(out of bounds). 
+\item The default (\code{\link[scales:censor]{scales::censor()}}) replaces out of +bounds values with \code{NA}. +\item \code{\link[scales:squish]{scales::squish()}} for squishing out of bounds values into range. +\item \code{\link[scales:squish_infinite]{scales::squish_infinite()}} for squishing infitite values into range. }} - \item{oob}{Function that handles limits outside of the scale limits -(out of bounds). The default replaces out of bounds values with \code{NA}.} \item{na.value}{Missing values will be replaced with this value.} - \item{trans}{Either the name of a transformation object, or the -object itself. Built-in transformations include "asn", "atanh", + \item{trans}{For continuous scales, the name of a transformation object +or the object itself. Built-in transformations include "asn", "atanh", "boxcox", "date", "exp", "hms", "identity", "log", "log10", "log1p", "log2", "logit", "modulus", "probability", "probit", "pseudo_log", "reciprocal", "reverse", "sqrt" and "time". A transformation object bundles together a transform, its inverse, and methods for generating breaks and labels. Transformation objects -are defined in the scales package, and are called \code{name_trans}, e.g. -\code{\link[scales:boxcox_trans]{scales::boxcox_trans()}}. You can create your own +are defined in the scales package, and are called \code{_trans} (e.g., +\code{\link[scales:boxcox_trans]{scales::boxcox_trans()}}). You can create your own transformation with \code{\link[scales:trans_new]{scales::trans_new()}}.} \item{guide}{A function used to create a guide or its name. See -\code{\link[=guides]{guides()}} for more info.} - \item{position}{The position of the axis. "left" or "right" for vertical -scales, "top" or "bottom" for horizontal scales} - \item{super}{The super class to use for the constructed scale} - \item{expand}{Vector of range expansion constants used to add some -padding around the data, to ensure that they are placed some distance -away from the axes. Use the convenience function \code{\link[=expand_scale]{expand_scale()}} +\code{\link[=guides]{guides()}} for more information.} + \item{expand}{For position scales, a vector of range expansion constants used to add some +padding around the data to ensure that they are placed some distance +away from the axes. Use the convenience function \code{\link[=expansion]{expansion()}} to generate the values for the \code{expand} argument. The defaults are to expand the scale by 5\% on each side for continuous variables, and by 0.6 units on each side for discrete variables.} + \item{position}{For position scales, The position of the axis. +\code{left} or \code{right} for y axes, \code{top} or \code{bottom} for x axes.} + \item{super}{The super class to use for the constructed scale} }} \item{max_size}{Size of largest points.} diff --git a/man/scale_viridis.Rd b/man/scale_viridis.Rd index 621aa8c6f3..a3817e2e30 100644 --- a/man/scale_viridis.Rd +++ b/man/scale_viridis.Rd @@ -59,7 +59,7 @@ other values are deprecated.} \item{na.value}{Missing values will be replaced with this value.} \item{guide}{A function used to create a guide or its name. 
See -\code{\link[=guides]{guides()}} for more info.} +\code{\link[=guides]{guides()}} for more information.} } \description{ The \code{viridis} scales provide colour maps that are perceptually uniform in both diff --git a/man/stat_ellipse.Rd b/man/stat_ellipse.Rd index aa1a6cd163..0923ff478b 100644 --- a/man/stat_ellipse.Rd +++ b/man/stat_ellipse.Rd @@ -2,7 +2,7 @@ % Please edit documentation in R/stat-ellipse.R \name{stat_ellipse} \alias{stat_ellipse} -\title{Compute normal confidence ellipses} +\title{Compute normal data ellipses} \usage{ stat_ellipse(mapping = NULL, data = NULL, geom = "path", position = "identity", ..., type = "t", level = 0.95, @@ -47,7 +47,7 @@ The default \code{"t"} assumes a multivariate t-distribution, and representing the euclidean distance from the center. This ellipse probably won't appear circular unless \code{coord_fixed()} is applied.} -\item{level}{The confidence level at which to draw an ellipse (default is 0.95), +\item{level}{The level at which to draw an ellipse, or, if \code{type="euclid"}, the radius of the circle to be drawn.} \item{segments}{The number of segments to be used in drawing the ellipse.} @@ -68,7 +68,7 @@ the default plot specification, e.g. \code{\link[=borders]{borders()}}.} } \description{ The method for calculating the ellipses has been modified from -\code{car::ellipse} (Fox and Weisberg, 2011) +\code{car::dataEllipse} (Fox and Weisberg, 2011) } \examples{ ggplot(faithful, aes(waiting, eruptions)) + diff --git a/man/stat_function.Rd b/man/stat_function.Rd index 38793a0577..44c54d7c83 100644 --- a/man/stat_function.Rd +++ b/man/stat_function.Rd @@ -49,7 +49,7 @@ be vectorised.} \item{n}{Number of points to interpolate along} -\item{args}{List of additional arguments to pass to \code{fun}} +\item{args}{List of additional arguments passed on to the function defined by \code{fun}.} \item{na.rm}{If \code{FALSE}, the default, missing values are removed with a warning. If \code{TRUE}, missing values are silently removed.} @@ -70,16 +70,6 @@ This stat makes it easy to superimpose a function on top of an existing plot. The function is called with a grid of evenly spaced values along the x axis, and the results are drawn (by default) with a line. } -\section{Aesthetics}{ - -\code{stat_function()} understands the following aesthetics (required aesthetics are in bold): -\itemize{ -\item \code{group} -\item \code{y} -} -Learn more about setting these aesthetics in \code{vignette("ggplot2-specs")}. -} - \section{Computed variables}{ \describe{ diff --git a/man/stat_summary.Rd b/man/stat_summary.Rd index b1facb5f16..3acd3a2588 100644 --- a/man/stat_summary.Rd +++ b/man/stat_summary.Rd @@ -64,8 +64,8 @@ or as a function that calculates width from unscaled x. Here, "unscaled x" refers to the original x values in the data, before application of any scale transformation. When specifying a function along with a grouping structure, the function will be called once per group. -The default is to use \code{bins} -bins that cover the range of the data. You should always override +The default is to use the number of bins in \code{bins}, +covering the range of the data. You should always override this value, exploring multiple widths to find the best to illustrate the stories in your data. 
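A minimal sketch of the workflow the binwidth/bins documentation above describes for geom_histogram() and stat_summary_bin(); the diamonds dataset and the specific values are only illustrative:

library(ggplot2)

p <- ggplot(diamonds, aes(carat))

# The default of 30 bins is only a starting point for exploration.
p + geom_histogram()

# Try an explicit number of bins, or set binwidth directly
# (binwidth overrides bins, so change only one of them at a time).
p + geom_histogram(bins = 50)
p + geom_histogram(binwidth = 0.1)

# boundary/center shift where the bins are anchored without changing their width.
p + geom_histogram(binwidth = 0.1, boundary = 0)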
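The scale documentation above repeatedly distinguishes scale limits (which remove data) from coordinate limits (which only zoom), and points to the expansion() helper and the scales out-of-bounds functions. A small sketch pulling those pieces together; the mpg dataset and the cut-off values are only illustrative:

library(ggplot2)

p <- ggplot(mpg, aes(displ, hwy)) + geom_point() + geom_smooth(method = "loess")

# Scale limits remove observations outside the range before statistics are computed ...
p + scale_y_continuous(limits = c(20, 40))

# ... while coordinate limits only zoom the view and keep all of the data.
p + coord_cartesian(ylim = c(20, 40))

# expansion() (replacing expand_scale()) sets the padding around the data, and
# scales::squish() keeps out-of-bounds values at the range ends instead of NA.
p + scale_y_continuous(limits = c(20, 40), oob = scales::squish,
                       expand = expansion(mult = c(0, 0.05)))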
[vdiffr test figure updates — SVG markup omitted; only plot titles, axis labels, and tick positions change:
- tests/figs/coord-sf/sf-polygons.svg and tests/figs/geom-sf/north-carolina-county-boundaries.svg: updated; the graticule now spans roughly 36.25–36.6° N and 81.7–81.3° W instead of the full state extent (34–36.5° N, 84–76° W).
- tests/figs/geom-sf/labels-for-north-carolina.svg and tests/figs/geom-sf/texts-for-north-carolina.svg: updated; a single "ashe" label replaces the Ashe/Alleghany/Surry labels.
- tests/figs/geom-sf/spatial-points.svg ("spatial points"): updated.
- tests/figs/coord-trans/basic-coord-trans-plot.svg ("basic coord_trans() plot", hwy by class) and tests/figs/coord-trans/sec-axis-with-coord-trans.svg ("sec_axis with coord_trans()", cty vs hwy with a log2(hwy) secondary axis): new figures.
- tests/figs/geom-polygon/basic-polygon-plot.svg ("basic polygon plot"): new figure.
- tests/figs/geom-polygon/stat-density2d-with-filled-polygons.svg and stat-density2d-with-paths.svg: deleted.
- tests/figs/guides/axis-guides-*.svg (basic, check-overlap, positive/negative rotation, vertical rotation variants, text-dodged-into-rows-cols, zero-breaks, zero-rotation): new figures.
- tests/figs/themes/theme-gray.svg and theme-gray-large.svg: updated (small markup changes).
- tests/figs/themes/ticks_length.svg: deleted.]
--- a/tests/figs/themes/ticks_length.svg +++ /dev/null @@ -1,74 +0,0 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -2.5 -5.0 -7.5 -10.0 - - - - -2.5 -5.0 -7.5 -10.0 - - - - - - - - -2.5 -5.0 -7.5 -10.0 - - - - -2.5 -5.0 -7.5 -10.0 -1:10 -1:10 -1:10 -1:10 - diff --git a/tests/testthat/test-aes.r b/tests/testthat/test-aes.r index be8bd96c51..043ed8f641 100644 --- a/tests/testthat/test-aes.r +++ b/tests/testthat/test-aes.r @@ -111,6 +111,46 @@ test_that("aes standardises aesthetic names", { expect_warning(aes(color = x, colour = y), "Duplicated aesthetics") }) +test_that("warn_for_aes_extract_usage() warns for discouraged uses of $ and [[ within aes()", { + + df <- data_frame(x = 1:5, nested_df = data_frame(x = 6:10)) + + expect_warning( + warn_for_aes_extract_usage(aes(df$x), df), + "Use of `df\\$x` is discouraged" + ) + + expect_warning( + warn_for_aes_extract_usage(aes(df[["x"]]), df), + 'Use of `df\\[\\["x"\\]\\]` is discouraged' + ) +}) + +test_that("warn_for_aes_extract_usage() does not evaluate function calls", { + df <- data_frame(x = 1:5, nested_df = data_frame(x = 6:10)) + returns_df <- function() df + + expect_warning(warn_for_aes_extract_usage(aes(df$x), df)) + expect_silent(warn_for_aes_extract_usage(aes(returns_df()$x), df)) +}) + +test_that("warn_for_aes_extract_usage() does not warn for valid uses of $ and [[ within aes()", { + df <- data_frame(x = 1:5, nested_df = data_frame(x = 6:10)) + + # use of .data + expect_silent(warn_for_aes_extract_usage(aes(.data$x), df)) + expect_silent(warn_for_aes_extract_usage(aes(.data[["x"]]), df)) + + # use of $ for a nested data frame column + expect_silent(warn_for_aes_extract_usage(aes(nested_df$x), df)) + expect_silent(warn_for_aes_extract_usage(aes(nested_df[["x"]]), df)) +}) + +test_that("Warnings are issued when plots use discouraged extract usage within aes()", { + df <- data_frame(x = 1:3, y = 1:3) + p <- ggplot(df, aes(df$x, y)) + geom_point() + expect_warning(ggplot_build(p), "Use of `df\\$x` is discouraged") +}) # Visual tests ------------------------------------------------------------ diff --git a/tests/testthat/test-coord-transform.R b/tests/testthat/test-coord-transform.R index 5e089bc154..1b2ecfcc11 100644 --- a/tests/testthat/test-coord-transform.R +++ b/tests/testthat/test-coord-transform.R @@ -20,3 +20,104 @@ test_that("no warnings are generated when original data has Inf values, but no n expect_silent(benchplot(p)) }) + +test_that("coord_trans() expands axes identically to coord_cartesian()", { + p <- ggplot(mpg, aes(class, hwy)) + geom_point() + built_cartesian <- ggplot_build(p + coord_cartesian()) + built_trans <- ggplot_build(p + coord_trans()) + + cartesian_params <- built_cartesian$layout$panel_params[[1]] + trans_params <- built_trans$layout$panel_params[[1]] + + expect_identical(cartesian_params$x.range, trans_params$x.range) + expect_identical(cartesian_params$y.range, trans_params$y.range) +}) + +test_that("coord_trans(expand = FALSE) expands axes identically to coord_cartesian(expand = FALSE)", { + p <- ggplot(mpg, aes(class, hwy)) + geom_point() + built_cartesian <- ggplot_build(p + coord_cartesian(expand = FALSE)) + built_trans <- ggplot_build(p + coord_trans(expand = FALSE)) + + cartesian_params <- built_cartesian$layout$panel_params[[1]] + trans_params <- built_trans$layout$panel_params[[1]] + + expect_identical(cartesian_params$x.range, trans_params$x.range) + expect_identical(cartesian_params$y.range, trans_params$y.range) +}) + +test_that("coord_trans(y = 'log10') expands the x axis 
identically to scale_y_log10()", { + p <- ggplot(mpg, aes(class, hwy)) + geom_point() + built_cartesian <- ggplot_build(p + scale_y_log10()) + built_trans <- ggplot_build(p + coord_trans(y = "log10")) + + cartesian_params <- built_cartesian$layout$panel_params[[1]] + trans_params <- built_trans$layout$panel_params[[1]] + + expect_identical(cartesian_params$y.range, trans_params$y.range) +}) + +test_that("coord_trans() expands axes outside the domain of the axis trans", { + # sqrt_trans() has a lower limit of 0 + df <- data_frame(x = 1, y = c(0, 1, 2)) + p <- ggplot(df, aes(x, y)) + geom_point() + built_cartesian <- ggplot_build(p + scale_y_sqrt()) + built_trans <- ggplot_build(p + coord_trans(y = "sqrt")) + + cartesian_params <- built_cartesian$layout$panel_params[[1]] + trans_params <- built_trans$layout$panel_params[[1]] + + expect_identical(cartesian_params$y.range, trans_params$y.range) +}) + +test_that("coord_trans() works with the reverse transformation", { + df <- data_frame(x = c("1-one", "2-two", "3-three"), y = c(20, 30, 40)) + + p <- ggplot(df, aes(x, y)) + geom_point() + built_cartesian <- ggplot_build(p + scale_y_reverse()) + built_trans <- ggplot_build(p + coord_trans(y = "reverse")) + + cartesian_params <- built_cartesian$layout$panel_params[[1]] + trans_params <- built_trans$layout$panel_params[[1]] + + expect_identical(cartesian_params$y.range, trans_params$y.range) +}) + +test_that("coord_trans() can reverse discrete axes", { + df <- data_frame(x = c("1-one", "2-two", "3-three"), y = c(20, 30, 40)) + + p <- ggplot(df, aes(x, y)) + geom_point() + built_cartesian <- ggplot_build(p) + built_trans <- ggplot_build(p + coord_trans(x = "reverse")) + + cartesian_params <- built_cartesian$layout$panel_params[[1]] + trans_params <- built_trans$layout$panel_params[[1]] + + expect_identical(cartesian_params$x.range, -rev(trans_params$x.range)) +}) + +test_that("basic coord_trans() plot displays both continuous and discrete axes", { + expect_doppelganger( + "basic coord_trans() plot", + ggplot(mpg, aes(class, hwy)) + + geom_point() + + coord_trans(y = "log10") + ) +}) + +test_that("second axes display in coord_trans()", { + expect_doppelganger( + "sec_axis with coord_trans()", + ggplot(mpg, aes(cty, hwy)) + + geom_point() + + scale_y_continuous( + sec.axis = sec_axis( + trans = ~log2(.), + breaks = c(3.5, 4, 4.5, 5, 5.5), + name = "log2(hwy)" + ), + breaks = 2^c(3.5, 4, 4.5, 5, 5.5) + ) + + scale_x_continuous(sec.axis = dup_axis()) + + coord_trans(y = "log2") + ) +}) diff --git a/tests/testthat/test-coord_sf.R b/tests/testthat/test-coord_sf.R index 0c0272e56e..cf4d325b25 100644 --- a/tests/testthat/test-coord_sf.R +++ b/tests/testthat/test-coord_sf.R @@ -3,16 +3,20 @@ context("coord_sf") test_that("basic plot builds without error", { skip_if_not_installed("sf") - nc <- sf::st_read(system.file("shape/nc.shp", package = "sf"), quiet = TRUE) - plot <- ggplot(nc) + - geom_sf() + - coord_sf() + nc_tiny_coords <- matrix( + c(-81.473, -81.741, -81.67, -81.345, -81.266, -81.24, -81.473, + 36.234, 36.392, 36.59, 36.573, 36.437, 36.365, 36.234), + ncol = 2 + ) - # Perform minimal test as long as vdiffr test is disabled - expect_error(regexp = NA, ggplot_build(plot)) + nc <- sf::st_as_sf( + data_frame( + NAME = "ashe", + geometry = sf::st_sfc(sf::st_polygon(list(nc_tiny_coords)), crs = 4326) + ) + ) - skip("sf tests are currently unstable") - expect_doppelganger("sf-polygons", plot) + expect_doppelganger("sf-polygons", ggplot(nc) + geom_sf() + coord_sf()) }) test_that("graticule lines can be 
removed via theme", { diff --git a/tests/testthat/test-data.r b/tests/testthat/test-data.r index 83733e60e1..abe4d88943 100644 --- a/tests/testthat/test-data.r +++ b/tests/testthat/test-data.r @@ -8,7 +8,7 @@ test_that("stringsAsFactors doesn't affect results", { dat.factor <- data_frame(x = letters[5:1], y = 1:5, stringsAsFactors = TRUE) base <- ggplot(mapping = aes(x, y)) + geom_point() - xlabels <- function(x) x$layout$panel_params[[1]]$x.labels + xlabels <- function(x) x$layout$panel_params[[1]]$x$get_labels() options(stringsAsFactors = TRUE) char_true <- ggplot_build(base %+% dat.character) diff --git a/tests/testthat/test-facet-.r b/tests/testthat/test-facet-.r index 54913001a3..13ac177aed 100644 --- a/tests/testthat/test-facet-.r +++ b/tests/testthat/test-facet-.r @@ -204,6 +204,103 @@ test_that("facet gives clear error if ", { ) }) +# Variable combinations --------------------------------------------------- + +test_that("zero-length vars in combine_vars() generates zero combinations", { + df <- data_frame(letter = c("a", "b")) + expect_equal(nrow(combine_vars(list(df), vars = vars())), 0) + expect_equal(ncol(combine_vars(list(df), vars = vars())), 0) +}) + +test_that("at least one layer must contain all facet variables in combine_vars()", { + df <- data_frame(letter = c("a", "b")) + expect_silent(combine_vars(list(df), vars = vars(letter = letter))) + expect_error( + combine_vars(list(df), vars = vars(letter = number)), + "At least one layer" + ) +}) + +test_that("at least one combination must exist in combine_vars()", { + df <- data_frame(letter = character(0)) + expect_error( + combine_vars(list(df), vars = vars(letter = letter)), + "Faceting variables must have at least one value" + ) +}) + +test_that("combine_vars() generates the correct combinations", { + df_one <- data_frame( + letter = c("a", "b"), + number = c(1, 2), + boolean = c(TRUE, FALSE), + factor = factor(c("level1", "level2")) + ) + + df_all <- expand.grid( + letter = c("a", "b"), + number = c(1, 2), + boolean = c(TRUE, FALSE), + factor = factor(c("level1", "level2")), + stringsAsFactors = FALSE + ) + + vars_all <- vars(letter = letter, number = number, boolean = boolean, factor = factor) + + expect_equivalent( + combine_vars(list(df_one), vars = vars_all), + df_one + ) + + expect_equivalent( + combine_vars(list(df_all), vars = vars_all), + df_all + ) + + # with drop = FALSE the rows are ordered in the opposite order + # NAs are dropped with drop = FALSE (except for NA factor values); + # NAs are kept with with drop = TRUE + # drop keeps all combinations of data, regardless of the combinations in which + # they appear in the data (in addition to keeping unused factor levels) + expect_equivalent( + combine_vars(list(df_one), vars = vars_all, drop = FALSE), + df_all[order(df_all$letter, df_all$number, df_all$boolean, df_all$factor), ] + ) +}) + +test_that("drop = FALSE in combine_vars() keeps unused factor levels", { + df <- data_frame(x = factor("a", levels = c("a", "b"))) + expect_equivalent( + combine_vars(list(df), vars = vars(x = x), drop = TRUE), + data_frame(x = factor("a")) + ) + expect_equivalent( + combine_vars(list(df), vars = vars(x = x), drop = FALSE), + data_frame(x = factor(c("a", "b"))) + ) +}) + +test_that("combine_vars() generates the correct combinations with multiple data frames", { + df <- expand.grid(letter = c("a", "b"), number = c(1, 2), boolean = c(TRUE, FALSE)) + + vars <- vars(letter = letter, number = number) + expect_identical( + combine_vars(list(df), vars = vars), + combine_vars(list(df, 
df), vars = vars) + ) + expect_identical( + combine_vars(list(df), vars = vars), + combine_vars(list(df, df[character(0)]), vars = vars) + ) + expect_identical( + combine_vars(list(df), vars = vars), + combine_vars(list(df, df["letter"]), vars = vars) + ) + expect_identical( + combine_vars(list(df), vars = vars), + combine_vars(list(df, df[c("letter", "number")]), vars = vars) + ) +}) # Visual tests ------------------------------------------------------------ diff --git a/tests/testthat/test-geom-hline-vline-abline.R b/tests/testthat/test-geom-hline-vline-abline.R index 83bb42cced..746bfff144 100644 --- a/tests/testthat/test-geom-hline-vline-abline.R +++ b/tests/testthat/test-geom-hline-vline-abline.R @@ -46,30 +46,76 @@ test_that("curved lines in map projections", { # Warning tests ------------------------------------------------------------ +test_that("warn_overwritten_args() produces gramatically correct error messages", { + expect_warning( + warn_overwritten_args("fun_test", "is_overwritten", "provided"), + "fun_test: Ignoring `is_overwritten` because `provided` was provided." + ) + expect_warning( + warn_overwritten_args("fun_test", "is_overwritten", c("provided1", "provided2")), + "fun_test: Ignoring `is_overwritten` because `provided1` and/or `provided2` were provided." + ) + expect_warning( + warn_overwritten_args("fun_test", "is_overwritten", c("provided1", "provided2", "provided3")), + "fun_test: Ignoring `is_overwritten` because `provided1`, `provided2`, and/or `provided3` were provided." + ) +}) + test_that("Warning if a supplied mapping is going to be overwritten", { expect_warning( geom_vline(xintercept = 3, aes(colour = colour)), - "Using both" + "Ignoring `mapping`" ) expect_warning( geom_hline(yintercept = 3, aes(colour = colour)), - "Using both" + "Ignoring `mapping`" ) expect_warning( geom_abline(intercept = 3, aes(colour = colour)), - "Using " + "Ignoring `mapping`" ) expect_warning( geom_abline(intercept = 3, slope = 0.5, aes(colour = colour)), - "Using " + "Ignoring `mapping`" + ) + + expect_warning( + geom_abline(slope = 0.5, aes(colour = colour)), + "Ignoring `mapping`" + ) +}) + + +test_that("Warning if supplied data is going to be overwritten", { + + sample_data <- data_frame(x = 1) + + expect_warning( + geom_vline(xintercept = 3, data = sample_data), + "Ignoring `data`" + ) + + expect_warning( + geom_hline(yintercept = 3, data = sample_data), + "Ignoring `data`" + ) + + expect_warning( + geom_abline(intercept = 3, data = sample_data), + "Ignoring `data`" + ) + + expect_warning( + geom_abline(intercept = 3, slope = 0.5, data = sample_data), + "Ignoring `data`" ) expect_warning( - geom_abline(slope=0.5, aes(colour = colour)), - "Using " + geom_abline(slope = 0.5, data = sample_data), + "Ignoring `data`" ) }) diff --git a/tests/testthat/test-geom-path.R b/tests/testthat/test-geom-path.R index 1f8007a8e1..a0f5e94308 100644 --- a/tests/testthat/test-geom-path.R +++ b/tests/testthat/test-geom-path.R @@ -8,6 +8,40 @@ test_that("keep_mid_true drops leading/trailing FALSE", { }) +# Tests on stairstep() ------------------------------------------------------------ + +test_that("stairstep() does not error with too few observations", { + df <- data_frame(x = 1, y = 1) + expect_silent(stairstep(df)) +}) + +test_that("stairstep() exists with error when an invalid `direction` is given", { + df <- data_frame(x = 1:3, y = 1:3) + expect_error(stairstep(df, direction="invalid")) +}) + +test_that("stairstep() output is correct for direction = 'vh'", { + df <- data_frame(x = 1:3, y = 
1:3) + stepped_expected <- data_frame(x = c(1L, 1L, 2L, 2L, 3L), y = c(1L, 2L, 2L, 3L, 3L)) + stepped <- stairstep(df, direction = "vh") + expect_equal(stepped, stepped_expected) +}) + +test_that("stairstep() output is correct for direction = 'hv'", { + df <- data_frame(x = 1:3, y = 1:3) + stepped_expected <- data_frame(x = c(1L, 2L, 2L, 3L, 3L), y = c(1L, 1L, 2L, 2L, 3L)) + stepped <- stairstep(df, direction = "hv") + expect_equal(stepped, stepped_expected) +}) + +test_that("stairstep() output is correct for direction = 'mid'", { + df <- data_frame(x = 1:3, y = 1:3) + stepped_expected <- data_frame(x = c(1, 1.5, 1.5, 2.5, 2.5, 3), y = c(1L, 1L, 2L, 2L, 3L, 3L)) + stepped <- stairstep(df, direction = "mid") + expect_equal(stepped, stepped_expected) +}) + + # Visual tests ------------------------------------------------------------ test_that("geom_path draws correctly", { diff --git a/tests/testthat/test-geom-polygon.R b/tests/testthat/test-geom-polygon.R index 65e5f67c25..964a8af45a 100644 --- a/tests/testthat/test-geom-polygon.R +++ b/tests/testthat/test-geom-polygon.R @@ -3,15 +3,26 @@ context("geom-polygon") # Visual tests ------------------------------------------------------------ +skip_if(utils::packageVersion('grid') < "3.6") test_that("geom_polygon draws correctly", { - expect_doppelganger("stat_density2d with paths", - ggplot(faithful, aes(x = eruptions, y = waiting)) + - stat_density_2d(aes(colour = stat(level)), geom = "path") + - xlim(0.5, 6) + ylim(40, 110) - ) - expect_doppelganger("stat_density2d with filled polygons", - ggplot(faithful, aes(x = eruptions, y = waiting)) + - stat_density2d(aes(fill = stat(level)), geom = "polygon", colour = "white") + - xlim(0.5, 6) + ylim(40, 110) + + tbl <- data_frame( + x = c( + 0, 10, 10, 0, + 20, 30, 30, 20, + 22, 28, 28, 22 + ), + y = c( + 0, 0, 10, 10, + 20, 20, 30, 30, + 22, 22, 28, 28 + ), + group = c(rep(1, 4), rep(2, 8)), + subgroup = c(rep(1, 8), rep(2, 4)) ) + + p <- ggplot(tbl, aes(x, y, group = group, subgroup = subgroup)) + + geom_polygon() + + expect_doppelganger("basic polygon plot", p) }) diff --git a/tests/testthat/test-geom-sf.R b/tests/testthat/test-geom-sf.R index c46d6a2e82..eb1c2aa54f 100644 --- a/tests/testthat/test-geom-sf.R +++ b/tests/testthat/test-geom-sf.R @@ -7,21 +7,25 @@ test_that("geom_sf draws correctly", { skip_if_not_installed("sf") if (packageVersion("sf") < "0.5.3") skip("Need sf 0.5.3") - f <- system.file("gpkg/nc.gpkg", package="sf") - nc <- sf::read_sf(f) + nc_tiny_coords <- matrix( + c(-81.473, -81.741, -81.67, -81.345, -81.266, -81.24, -81.473, + 36.234, 36.392, 36.59, 36.573, 36.437, 36.365, 36.234), + ncol = 2 + ) + nc <- sf::st_as_sf( + data_frame( + NAME = "ashe", + geometry = sf::st_sfc(sf::st_polygon(list(nc_tiny_coords)), crs = 4326) + ) + ) - # Perform minimal tests as long as vdiffr tests are disabled - plot <- ggplot() + geom_sf(data = nc) - expect_error(regexp = NA, ggplot_build(plot)) + # Perform minimal tests pts <- sf::st_sf(a = 1:2, geometry = sf::st_sfc(sf::st_point(0:1), sf::st_point(1:2))) plot <- ggplot() + geom_sf(data = pts) expect_error(regexp = NA, ggplot_build(plot)) - - skip("sf tests are currently unstable") - expect_doppelganger("North Carolina county boundaries", ggplot() + geom_sf(data = nc) + coord_sf(datum = 4326) ) @@ -36,25 +40,27 @@ test_that("geom_sf_text() and geom_sf_label() draws correctly", { skip_if_not_installed("sf") if (packageVersion("sf") < "0.5.3") skip("Need sf 0.5.3") - f <- system.file("gpkg/nc.gpkg", package="sf") - nc <- sf::read_sf(f) - # In order 
to avoid warning, trnasform to a projected coordinate system + nc_tiny_coords <- matrix( + c(-81.473, -81.741, -81.67, -81.345, -81.266, -81.24, -81.473, + 36.234, 36.392, 36.59, 36.573, 36.437, 36.365, 36.234), + ncol = 2 + ) + + nc <- sf::st_as_sf( + data_frame( + NAME = "ashe", + geometry = sf::st_sfc(sf::st_polygon(list(nc_tiny_coords)), crs = 4326) + ) + ) + + # In order to avoid warning, transform to a projected coordinate system nc_3857 <- sf::st_transform(nc, "+init=epsg:3857") - # Perform minimal tests as long as vdiffr tests are disabled - plot <- ggplot() + geom_sf_text(data = nc_3857[1:3, ], aes(label = NAME)) - expect_error(regexp = NA, ggplot_build(plot)) - - plot <- ggplot() + geom_sf_label(data = nc_3857[1:3, ], aes(label = NAME)) - expect_error(regexp = NA, ggplot_build(plot)) - - skip("sf tests are currently unstable") - expect_doppelganger("Texts for North Carolina", - ggplot() + geom_sf_text(data = nc_3857[1:3, ], aes(label = NAME)) + ggplot() + geom_sf_text(data = nc_3857, aes(label = NAME)) ) expect_doppelganger("Labels for North Carolina", - ggplot() + geom_sf_label(data = nc_3857[1:3, ], aes(label = NAME)) + ggplot() + geom_sf_label(data = nc_3857, aes(label = NAME)) ) }) diff --git a/tests/testthat/test-geom-smooth.R b/tests/testthat/test-geom-smooth.R index f7bce014f9..0c378eae31 100644 --- a/tests/testthat/test-geom-smooth.R +++ b/tests/testthat/test-geom-smooth.R @@ -37,7 +37,7 @@ test_that("default smoothing methods for small and large data sets work", { y = x^2 + 0.5 * rnorm(1001) ) - m <- mgcv::gam(y ~ s(x, bs = "cs"), data = df) + m <- mgcv::gam(y ~ s(x, bs = "cs"), data = df, method = "REML") range <- range(df$x, na.rm = TRUE) xseq <- seq(range[1], range[2], length.out = 80) out <- predict(m, data_frame(x = xseq)) diff --git a/tests/testthat/test-guides.R b/tests/testthat/test-guides.R index 1800a87cbe..2bd7d0b508 100644 --- a/tests/testthat/test-guides.R +++ b/tests/testthat/test-guides.R @@ -46,10 +46,92 @@ test_that("show.legend handles named vectors", { expect_equal(n_legends(p), 0) }) +test_that("axis_label_overlap_priority always returns the correct number of elements", { + expect_identical(axis_label_priority(0), numeric(0)) + expect_setequal(axis_label_priority(1), seq_len(1)) + expect_setequal(axis_label_priority(5), seq_len(5)) + expect_setequal(axis_label_priority(10), seq_len(10)) + expect_setequal(axis_label_priority(100), seq_len(100)) +}) + +test_that("axis_label_element_overrides errors when angles are outside the range [0, 90]", { + expect_is(axis_label_element_overrides("bottom", 0), "element") + expect_error(axis_label_element_overrides("bottom", 91), "`angle` must") + expect_error(axis_label_element_overrides("bottom", -91), "`angle` must") +}) # Visual tests ------------------------------------------------------------ test_that("axis guides are drawn correctly", { + theme_test_axis <- theme_test() + theme(axis.line = element_line(size = 0.5)) + test_draw_axis <- function(n_breaks = 3, + break_positions = seq_len(n_breaks) / (n_breaks + 1), + labels = as.character, + positions = c("top", "right", "bottom", "left"), + theme = theme_test_axis, + ...) { + + break_labels <- labels(seq_along(break_positions)) + + # create the axes + axes <- lapply(positions, function(position) { + draw_axis(break_positions, break_labels, axis_position = position, theme = theme, ...) 
+ }) + axes_grob <- gTree(children = do.call(gList, axes)) + + # arrange them so there's some padding on each side + gt <- gtable( + widths = unit(c(0.05, 0.9, 0.05), "npc"), + heights = unit(c(0.05, 0.9, 0.05), "npc") + ) + gt <- gtable_add_grob(gt, list(axes_grob), 2, 2, clip = "off") + plot(gt) + } + + # basic + expect_doppelganger("axis guides basic", function() test_draw_axis()) + expect_doppelganger("axis guides, zero breaks", function() test_draw_axis(n_breaks = 0)) + + # overlapping text + expect_doppelganger( + "axis guides, check overlap", + function() test_draw_axis(20, labels = function(b) comma(b * 1e9), check.overlap = TRUE) + ) + + # rotated text + expect_doppelganger( + "axis guides, zero rotation", + function() test_draw_axis(10, labels = function(b) comma(b * 1e3), angle = 0) + ) + + expect_doppelganger( + "axis guides, positive rotation", + function() test_draw_axis(10, labels = function(b) comma(b * 1e3), angle = 45) + ) + + expect_doppelganger( + "axis guides, negative rotation", + function() test_draw_axis(10, labels = function(b) comma(b * 1e3), angle = -45) + ) + + expect_doppelganger( + "axis guides, vertical rotation", + function() test_draw_axis(10, labels = function(b) comma(b * 1e3), angle = 90) + ) + + expect_doppelganger( + "axis guides, vertical negative rotation", + function() test_draw_axis(10, labels = function(b) comma(b * 1e3), angle = -90) + ) + + # dodged text + expect_doppelganger( + "axis guides, text dodged into rows/cols", + function() test_draw_axis(10, labels = function(b) comma(b * 1e9), n_dodge = 2) + ) +}) + +test_that("axis guides are drawn correctly in plots", { expect_doppelganger("align facet labels, facets horizontal", qplot(hwy, reorder(model, hwy), data = mpg) + facet_grid(manufacturer ~ ., scales = "free", space = "free") + diff --git a/tests/testthat/test-position-stack.R b/tests/testthat/test-position-stack.R index d83b045992..c889775df2 100644 --- a/tests/testthat/test-position-stack.R +++ b/tests/testthat/test-position-stack.R @@ -52,3 +52,9 @@ test_that("data with no extent is stacked correctly", { expect_equal(layer_data(p0)$y, c(-75, -115)) expect_equal(layer_data(p1)$y, c(0, -75)) }) + +test_that("position_stack() can stack correctly when ymax is NA", { + df <- data_frame(x = c(1, 1), y = c(1, 1)) + p <- ggplot(df, aes(x, y, ymax = NA_real_)) + geom_point(position = "stack") + expect_equal(layer_data(p)$y, c(1, 2)) +}) diff --git a/tests/testthat/test-scale-discrete.R b/tests/testthat/test-scale-discrete.R index 094d97ce84..79b5c74a17 100644 --- a/tests/testthat/test-scale-discrete.R +++ b/tests/testthat/test-scale-discrete.R @@ -69,3 +69,9 @@ test_that("discrete scale shrinks to range when setting limits", { expect_equal(layer_scales(p)$x$dimension(c(0, 1)), c(0, 3)) }) + +test_that("discrete position scales can accept functional limits", { + scale <- scale_x_discrete(limits = rev) + scale$train(c("a", "b", "c")) + expect_identical(scale$get_limits(), c("c", "b", "a")) +}) diff --git a/tests/testthat/test-scale-expansion.r b/tests/testthat/test-scale-expansion.r new file mode 100644 index 0000000000..f4f0e829b5 --- /dev/null +++ b/tests/testthat/test-scale-expansion.r @@ -0,0 +1,105 @@ + +test_that("expand_scale() produces a deprecation warning", { + expect_warning(expand_scale(), "deprecated") +}) + +# Expanding continuous scales ----------------------------------------- + +test_that("expand_limits_continuous() can override limits", { + expect_identical(expand_limits_continuous(c(1, 2), coord_limits = c(NA, NA)), c(1, 2)) + 
expect_identical(expand_limits_continuous(c(1, 2), coord_limits = c(NA, 3)), c(1, 3)) + expect_identical(expand_limits_continuous(c(1, 2), coord_limits = c(0, NA)), c(0, 2)) +}) + +test_that("expand_limits_continuous() expands limits", { + expect_identical(expand_limits_continuous(c(1, 2), expand = expansion(add = 1)), c(0, 3)) +}) + +test_that("expand_limits_continuous() expands coord-supplied limits", { + expect_identical( + expand_limits_continuous(c(1, 2), coord_limits = c(0, 4), expand = expansion(add = 1)), + c(-1, 5) + ) +}) + +test_that("expand_limits_continuous_trans() expands limits in coordinate space", { + limit_info <- expand_limits_continuous_trans( + c(1, 2), + expand = expansion(add = 0.5), + trans = log10_trans() + ) + + expect_identical( + limit_info$continuous_range, + 10^(expand_range4(log10(c(1, 2)), expansion(add = 0.5))) + ) + + expect_identical( + limit_info$continuous_range_coord, + expand_range4(log10(c(1, 2)), expansion(add = 0.5)) + ) +}) + +test_that("introduced non-finite values fall back on scale limits", { + limit_info <- expand_limits_continuous_trans( + c(1, 100), + expand = expansion(add = 2), + trans = sqrt_trans() + ) + + expect_identical(limit_info$continuous_range, c(1, (sqrt(100) + 2)^2)) + expect_identical(limit_info$continuous_range_coord, c(-1, sqrt(100) + 2)) +}) + +# Expanding discrete scales ----------------------------------------- + +test_that("expand_limits_discrete() can override limits with an empty range", { + expect_identical(expand_limits_discrete(NULL, coord_limits = c(-1, 8)), c(-1, 8)) +}) + +test_that("expand_limits_discrete() can override limits with a discrete range", { + expect_identical(expand_limits_discrete(c("one", "two"), coord_limits = c(NA, NA)), c(1, 2)) + expect_identical(expand_limits_discrete(c("one", "two"), coord_limits = c(NA, 3)), c(1, 3)) + expect_identical(expand_limits_discrete(c("one", "two"), coord_limits = c(3, NA)), c(3, 2)) +}) + +test_that("expand_limits_discrete() can override limits with a continuous range", { + expect_identical( + expand_limits_discrete(NULL, coord_limits = c(NA, NA), range_continuous = c(1, 2)), + c(1, 2) + ) + expect_identical( + expand_limits_discrete(NULL, coord_limits = c(NA, 3), range_continuous = c(1, 2)), + c(1, 3) + ) + expect_identical( + expand_limits_discrete(NULL, coord_limits = c(0, NA), range_continuous = c(1, 2)), + c(0, 2) + ) +}) + +test_that("expand_limits_discrete() can override limits with a both discrete and continuous ranges", { + expect_identical( + expand_limits_discrete(c("one", "two"), coord_limits = c(NA, NA), range_continuous = c(1, 2)), + c(1, 2) + ) + expect_identical( + expand_limits_discrete(c("one", "two"), coord_limits = c(NA, 3), range_continuous = c(1, 2)), + c(1, 3) + ) + expect_identical( + expand_limits_discrete(c("one", "two"), coord_limits = c(0, NA), range_continuous = c(1, 2)), + c(0, 2) + ) +}) + +test_that("expand_limits_continuous_trans() works with inverted transformations", { + limit_info <- expand_limits_continuous_trans( + c(1, 2), + expand = expansion(add = 1), + trans = reverse_trans() + ) + + expect_identical(limit_info$continuous_range, c(0, 3)) + expect_identical(limit_info$continuous_range_coord, c(0, -3)) +}) diff --git a/tests/testthat/test-scale_date.R b/tests/testthat/test-scale_date.R index ceccb90955..02192180b9 100644 --- a/tests/testthat/test-scale_date.R +++ b/tests/testthat/test-scale_date.R @@ -4,6 +4,9 @@ context("scale_date") # Visual tests ------------------------------------------------------------ test_that("date 
scale draws correctly", { + # datetime labels are locale dependent + withr::local_locale(c(LC_TIME = "C")) + set.seed(321) df <- data_frame( dx = seq(as.Date("2012-02-29"), length.out = 100, by = "1 day")[sample(100, 50)], diff --git a/tests/testthat/test-sec-axis.R b/tests/testthat/test-sec-axis.R index 612f2f6ab3..221d1468c9 100644 --- a/tests/testthat/test-sec-axis.R +++ b/tests/testthat/test-sec-axis.R @@ -16,8 +16,15 @@ test_that("dup_axis() works", { scale <- layer_scales(p)$x expect_equal(scale$sec_name(), scale$name) breaks <- scale$break_info() - expect_equal(breaks$minor, breaks$sec.minor) - expect_equal(breaks$major_source, breaks$sec.major_source) + expect_equal(breaks$minor_source, breaks$sec.minor_source_user) + expect_equal(breaks$major_source, breaks$sec.major_source_user) + + # these aren't exactly equal because the sec_axis trans is based on a + # (default) 1000-point approximation + expect_true(all(abs(breaks$major_source - round(breaks$sec.major_source) <= 1))) + expect_true(all(abs(breaks$minor_source - round(breaks$sec.minor_source) <= 1))) + expect_equal(round(breaks$major, 3), round(breaks$major, 3)) + expect_equal(round(breaks$minor, 3), round(breaks$minor, 3)) }) test_that("sec_axis() works with subtraction", { @@ -29,8 +36,15 @@ test_that("sec_axis() works with subtraction", { scale <- layer_scales(p)$y expect_equal(scale$sec_name(), scale$name) breaks <- scale$break_info() - expect_equal(breaks$minor, breaks$sec.minor) - expect_equal(breaks$major_source, breaks$sec.major_source) + expect_equal(breaks$minor_source, breaks$sec.minor_source_user) + expect_equal(breaks$major_source, breaks$sec.major_source_user) + + # these aren't exactly equal because the sec_axis trans is based on a + # (default) 1000-point approximation + expect_true(all(abs(breaks$major_source - round(breaks$sec.major_source) <= 1))) + expect_true(all(abs(breaks$minor_source - round(breaks$sec.minor_source) <= 1))) + expect_equal(round(breaks$major, 3), round(breaks$major, 3)) + expect_equal(round(breaks$minor, 3), round(breaks$minor, 3)) }) test_that("sex axis works with division (#1804)", { @@ -59,7 +73,9 @@ test_that("sec_axis() breaks work for log-transformed scales", { breaks <- scale$break_info() # test value - expect_equal(breaks$major_source, log10(breaks$sec.major_source)) + expect_equal(breaks$major_source, log10(breaks$sec.major_source_user)) + expect_equal(round(breaks$major_source, 2), round(breaks$sec.major_source, 2)) + # test position expect_equal(breaks$major, round(breaks$sec.major, 1)) @@ -72,7 +88,9 @@ test_that("sec_axis() breaks work for log-transformed scales", { breaks <- scale$break_info() # test value - expect_equal(breaks$major_source, log10(breaks$sec.major_source) - 2) + expect_equal(breaks$major_source, log10(breaks$sec.major_source_user) - 2) + expect_equal(breaks$major_source, round(breaks$sec.major_source, 2)) + # test position expect_equal(breaks$major, round(breaks$sec.major, 1)) @@ -87,7 +105,7 @@ test_that("sec_axis() breaks work for log-transformed scales", { breaks <- scale$break_info() expect_equal(breaks$major_source, log(custom_breaks, base = 10)) - expect_equal(log_breaks()(df$y) * 100, breaks$sec.major_source) + expect_equal(log_breaks()(df$y) * 100, breaks$sec.major_source_user) }) test_that("custom breaks work", { @@ -103,7 +121,7 @@ test_that("custom breaks work", { ) scale <- layer_scales(p)$x breaks <- scale$break_info() - expect_equal(custom_breaks, breaks$sec.major_source) + expect_equal(custom_breaks, breaks$sec.major_source_user) }) 
test_that("sec axis works with skewed transform", { @@ -149,7 +167,7 @@ test_that("sec axis works with tidy eval", { breaks <- scale$break_info() # test transform - expect_equal(breaks$major_source / 10, breaks$sec.major_source) + expect_equal(breaks$major_source / 10, breaks$sec.major_source_user) # test positioning expect_equal(round(breaks$major, 2), round(breaks$sec.major, 2)) }) @@ -226,6 +244,9 @@ test_that("sec_axis() respects custom transformations", { }) test_that("sec_axis works with date/time/datetime scales", { + # datetime labels are locale dependent + withr::local_locale(c(LC_TIME = "C")) + df <- data_frame( dx = seq(as.POSIXct("2012-02-29 12:00:00", tz = "UTC", @@ -243,7 +264,7 @@ test_that("sec_axis works with date/time/datetime scales", { scale_x_datetime(sec.axis = dup_axis()) scale <- layer_scales(dt)$x breaks <- scale$break_info() - expect_equal(breaks$major_source, breaks$sec.major_source) + expect_equal(breaks$major_source, breaks$sec.major_source_user) # datetime scale dt <- ggplot(df, aes(date, price)) + @@ -251,7 +272,7 @@ test_that("sec_axis works with date/time/datetime scales", { scale_x_date(sec.axis = dup_axis()) scale <- layer_scales(dt)$x breaks <- scale$break_info() - expect_equal(breaks$major_source, breaks$sec.major_source) + expect_equal(breaks$major_source, breaks$sec.major_source_user) # sec_axis dt <- ggplot(df, aes(dx, price)) + @@ -267,7 +288,7 @@ test_that("sec_axis works with date/time/datetime scales", { expect_equal( as.numeric(breaks$major_source) + 12 * 60 * 60, - as.numeric(breaks$sec.major_source) + as.numeric(breaks$sec.major_source_user) ) # visual test, datetime scales, reprex #1936 diff --git a/tests/testthat/test-stat-bin.R b/tests/testthat/test-stat-bin.R index 5e70617b8b..c76520a4c6 100644 --- a/tests/testthat/test-stat-bin.R +++ b/tests/testthat/test-stat-bin.R @@ -168,6 +168,6 @@ test_that("stat_count preserves x order for continuous and discrete", { mtcars$carb3 <- factor(mtcars$carb, levels = c(4,1,2,3,6,8)) b <- ggplot_build(ggplot(mtcars, aes(carb3)) + geom_bar()) expect_identical(b$data[[1]]$x, 1:6) - expect_identical(b$layout$panel_params[[1]]$x.labels, c("4","1","2","3","6","8")) + expect_identical(b$layout$panel_params[[1]]$x$get_labels(), c("4","1","2","3","6","8")) expect_identical(b$data[[1]]$y, c(10,7,10,3,1,1)) }) diff --git a/tests/testthat/test-stat-contour.R b/tests/testthat/test-stat-contour.R new file mode 100644 index 0000000000..7512f1abd4 --- /dev/null +++ b/tests/testthat/test-stat-contour.R @@ -0,0 +1,54 @@ + +context("stat-contour") + +test_that("a warning is issued when there is more than one z per x+y", { + tbl <- data_frame(x = c(1, 1, 2), y = c(1, 1, 2), z = 3) + p <- ggplot(tbl, aes(x, y, z = z)) + geom_contour() + expect_warning(ggplot_build(p), "Zero contours were generated") +}) + +test_that("contouring sparse data results in a warning", { + tbl <- data_frame(x = c(1, 27, 32), y = c(1, 1, 30), z = c(1, 2, 3)) + p <- ggplot(tbl, aes(x, y, z = z)) + geom_contour() + expect_warning(ggplot_build(p), "Number of x coordinates must match") +}) + +test_that("contour breaks can be set manually and by bins and binwidth", { + range <- c(0, 1) + expect_equal(contour_breaks(range), pretty(range, 10)) + expect_identical(contour_breaks(range, breaks = 1:3), 1:3) + expect_length(contour_breaks(range, bins = 5), 6) + expect_equal(resolution(contour_breaks(range, binwidth = 0.3)), 0.3) +}) + +test_that("geom_contour_filled() and stat_contour_filled() result in identical layer data", { + p <- ggplot(faithfuld, 
aes(waiting, eruptions, z = density)) + p1 <- p + stat_contour_filled() + p2 <- p + geom_contour_filled() + expect_identical(layer_data(p1), layer_data(p2)) +}) + +test_that("geom_contour() and stat_contour() result in identical layer data", { + p <- ggplot(faithfuld, aes(waiting, eruptions, z = density)) + p1 <- p + stat_contour() + p2 <- p + geom_contour() + expect_identical(layer_data(p1), layer_data(p2)) +}) + +test_that("basic stat_contour() plot builds", { + p <- ggplot(faithfuld, aes(waiting, eruptions)) + + geom_contour(aes(z = density, col = factor(stat(level)))) + + # stat_contour() visual tests are unstable due to the + # implementation in isoband + expect_silent(ggplot_build(p)) +}) + +test_that("basic stat_contour_filled() plot builds", { + p <- ggplot(faithfuld, aes(waiting, eruptions)) + + stat_contour_filled(aes(z = density)) + + # stat_contour() visual tests are unstable due to the + # implementation in isoband + expect_silent(ggplot_build(p)) +}) diff --git a/tests/testthat/test-stat-density2d.R b/tests/testthat/test-stat-density2d.R index 2eb8b35a98..340cfa3ccb 100644 --- a/tests/testthat/test-stat-density2d.R +++ b/tests/testthat/test-stat-density2d.R @@ -15,3 +15,15 @@ test_that("uses scale limits, not data limits", { expect_true(min(ret$y) < 8) expect_true(max(ret$y) > 35) }) + +# Visual tests -------------------------------------- + +test_that("stat_density2d can produce contour and raster data", { + p <- ggplot(faithful, aes(x = eruptions, y = waiting)) + + p_contour <- p + stat_density_2d() + p_raster <- p + stat_density_2d(contour = FALSE) + + expect_true("level" %in% names(layer_data(p_contour))) + expect_true("density" %in% names(layer_data(p_raster))) +}) diff --git a/tests/testthat/test-stats-function.r b/tests/testthat/test-stats-function.r index 2444f4ef99..7fdc1fd48c 100644 --- a/tests/testthat/test-stats-function.r +++ b/tests/testthat/test-stats-function.r @@ -47,3 +47,11 @@ test_that("works with formula syntax", { expect_equal(ret$x, s) expect_equal(ret$y, s^2) }) + +test_that("`mapping` is not used by stat_function()", { + expect_warning(stat_function(aes(), fun = identity), "`mapping` is not used") +}) + +test_that("`data` is not used by stat_function()", { + expect_warning(stat_function(data = mtcars, fun = identity), "`data` is not used") +}) diff --git a/tests/testthat/test-theme.r b/tests/testthat/test-theme.r index 121004a65b..a9f8afb3e5 100644 --- a/tests/testthat/test-theme.r +++ b/tests/testthat/test-theme.r @@ -215,6 +215,12 @@ test_that("elements can be merged", { ) }) +test_that("theme elements that don't inherit from element can be combined", { + expect_identical(combine_elements(1, NULL), 1) + expect_identical(combine_elements(NULL, 1), 1) + expect_identical(combine_elements(1, 0), 1) +}) + test_that("complete plot themes shouldn't inherit from default", { default_theme <- theme_gray() + theme(axis.text.x = element_text(colour = "red")) base <- qplot(1, 1) diff --git a/vignettes/ggplot2-in-packages.Rmd b/vignettes/ggplot2-in-packages.Rmd new file mode 100644 index 0000000000..20a17adde9 --- /dev/null +++ b/vignettes/ggplot2-in-packages.Rmd @@ -0,0 +1,239 @@ +--- +title: "Using ggplot2 in packages" +output: rmarkdown::html_vignette +vignette: > + %\VignetteIndexEntry{Using ggplot2 in packages} + %\VignetteEngine{knitr::rmarkdown} + %\VignetteEncoding{UTF-8} +--- + +```{r, include = FALSE} +knitr::opts_chunk$set(collapse = TRUE, comment = "#>", fig.show = "hide") +library(ggplot2) +``` + +This vignette is intended for package developers who 
use ggplot2 within their package code. As of this writing, this includes over 2,000 packages on CRAN and many more elsewhere! Programming with ggplot2 within a package adds several constraints, particularly if you would like to submit the package to CRAN. In particular, programming within an R package changes the way you refer to functions from ggplot2 and how you use ggplot2's non-standard evaluation within `aes()` and `vars()`. + +## Referring to ggplot2 functions + +As with any function from another package, you will have to list ggplot2 in your `DESCRIPTION` under `Imports` and refer to its functions using `::` (e.g., `ggplot2::function_name`): + +```{r} +mpg_drv_summary <- function() { + ggplot2::ggplot(ggplot2::mpg) + + ggplot2::geom_bar(ggplot2::aes(x = .data$drv)) + + ggplot2::coord_flip() +} +``` + +```{r, include=FALSE} +# make sure this function runs! +mpg_drv_summary() +``` + +If you use ggplot2 functions frequently, you may wish to import one or more functions from ggplot2 into your `NAMESPACE`. If you use [roxygen2](https://cran.r-project.org/package=roxygen2), you can include `#' @importFrom ggplot2 ` in any roxygen comment block (this will not work for datasets like `mpg`). + +```{r} +#' @importFrom ggplot2 ggplot aes geom_bar coord_flip +mpg_drv_summary <- function() { + ggplot(ggplot2::mpg) + + geom_bar(aes(x = drv)) + + coord_flip() +} +``` + +```{r, include=FALSE} +# make sure this function runs! +mpg_drv_summary() +``` + +Even if you use many ggplot2 functions in your package, it is unwise to use ggplot2 in `Depends` or import the entire package into your `NAMESPACE` (e.g. with `#' @import ggplot2`). Using ggplot2 in `Depends` will attach ggplot2 when your package is attached, which includes when your package is tested. This makes it difficult to ensure that others can use the functions in your package without attaching it (i.e., using `::`). Similarly, importing all 450 of ggplot2's exported objects into your namespace makes it difficult to separate the responsibility of your package and the responsibility of ggplot2, in addition to making it difficult for readers of your code to figure out where functions are coming from! + +## Using `aes()` and `vars()` in a package function + +To create any graphic using ggplot2 you will probably need to use `aes()` at least once. If your graphic uses facets, you might be using `vars()` to refer to columns in the plot/layer data. Both of these functions use non-standard evaluation, so if you try to use them in a function within a package they will result in a CMD check note: + +```{r} +mpg_drv_summary <- function() { + ggplot(ggplot2::mpg) + + geom_bar(aes(x = drv)) + + coord_flip() +} +``` + +``` +N checking R code for possible problems (2.7s) + mpg_drv_summary: no visible binding for global variable ‘drv’ + Undefined global functions or variables: + drv +``` + +There are three situations in which you will encounter this problem: + +- You already know the column name or expression in advance. +- You have the column name as a character vector. +- The user specifies the column name or expression, and you want your function to use the same kind of non-standard evaluation used by `aes()` and `vars()`. + +If you already know the mapping in advance (like the above example) you should use the `.data` pronoun from [rlang](https://rlang.r-lib.org/) to make it explicit that you are referring to the `drv` in the layer data and not some other variable named `drv` (which may or may not exist elsewhere). 
To avoid a similar note from the CMD check about `.data`, use `#' @importFrom rlang .data` in any roxygen code block (typically this should be in the package documentation as generated by `usethis::use_package_doc()`). + +```{r} +mpg_drv_summary <- function() { + ggplot(ggplot2::mpg) + + geom_bar(aes(x = .data$drv)) + + coord_flip() +} +``` + +If you have the column name as a character vector (e.g., `col = "drv"`), use `.data[[col]]`: + +```{r} +col_summary <- function(df, col) { + ggplot(df) + + geom_bar(aes(x = .data[[col]])) + + coord_flip() +} + +col_summary(mpg, "drv") +``` + +If the column name or expression is supplied by the user, you can also pass it to `aes()` or `vars()` using `{{ col }}`. This tidy eval operator captures the expression supplied by the user and forwards it to another tidy eval-enabled function such as `aes()` or `vars()`. + +```{r, eval = (packageVersion("rlang") >= "0.3.4.9003")} +col_summary <- function(df, col) { + ggplot(df) + + geom_bar(aes(x = {{ col }})) + + coord_flip() +} + +col_summary(mpg, drv) +``` + +To summarise: + +- If you know the mapping or facet specification is `col` in advance, use `aes(.data$col)` or `vars(.data$col)`. +- If `col` is a variable that contains the column name as a character vector, use `aes(.data[[col]]` or `vars(.data[[col]])`. +- If you would like the behaviour of `col` to look and feel like it would within `aes()` and `vars()`, use `aes({{ col }})` or `vars({{ col }})`. + +You will see a lot of other ways to do this in the wild, but the syntax we use here is the only one we can guarantee will work in the future! In particular, don't use `aes_()` or `aes_string()`, as they are deprecated and may be removed in a future version. Finally, don't skip the step of creating a data frame and a mapping to pass in to `ggplot()` or its layers! You will see other ways of doing this, but these may rely on undocumented behaviour and can fail in unexpected ways. + +## Best practices for common tasks + +### Using ggplot2 to visualize an object + +ggplot2 is commonly used in packages to visualize objects (e.g., in a `plot()`-style function). For example, a package might define an S3 class that represents the probability of various discrete values: + +```{r} +mpg_drv_dist <- structure( + c( + "4" = 103 / 234, + "f" = 106 / 234, + "r" = 25 / 234 + ), + class = "discrete_distr" +) +``` + +Many S3 classes in R have a `plot()` method, but it is unrealistic to expect that a single `plot()` method can provide the visualization every one of your users is looking for. It is useful, however, to provide a `plot()` method as a visual summary that users can call to understand the essence of an object. To satisfy all your users, we suggest writing a function that transforms the object into a data frame (or a `list()` of data frames if your object is more complicated). A good example of this approach is [ggdendro](https://cran.r-project.org/package=ggdendro), which creates dendrograms using ggplot2 but also computes the data necessary for users to make their own. For the above example, the function might look like this: + +```{r} +discrete_distr_data <- function(x) { + tibble::tibble( + value = names(x), + probability = as.numeric(x) + ) +} + +discrete_distr_data(mpg_drv_dist) +``` + +In general, users of `plot()` call it for its side-effects: it results in a graphic being displayed. This is different than the behaviour of a `ggplot()`, which is not displayed unless it is explicitly `print()`ed. 
Because of this, ggplot2 defines its own generic `autoplot()`, a call to which is expected to return a `ggplot()` (with no side effects).
+
+```{r}
+#' @importFrom ggplot2 autoplot
+autoplot.discrete_distr <- function(object, ...) {
+  plot_data <- discrete_distr_data(object)
+  ggplot(plot_data, aes(.data$value, .data$probability)) +
+    geom_col() +
+    coord_flip() +
+    labs(x = "Value", y = "Probability")
+}
+```
+
+Once an `autoplot()` method has been defined, a `plot()` method can then consist of `print()`ing the result of `autoplot()`:
+
+```{r}
+#' @importFrom graphics plot
+plot.discrete_distr <- function(x, ...) {
+  print(autoplot(x, ...))
+}
+```
+
+It is considered bad practice to provide a method for an S3 generic like `plot()` or `autoplot()` if you don't own the S3 class, as it makes it hard for the package developer who does control the S3 class to implement the method themselves. This shouldn't stop you from creating your own functions to visualize these objects!
+
+### Creating a new theme
+
+When creating a new theme, it's always good practice to start with an existing theme (e.g., `theme_grey()`) and then `%+replace%` the elements that should be changed. This is the right strategy even if seemingly all elements are replaced, as not doing so makes it difficult for us to improve themes by adding new elements. There are many excellent examples of themes in the [ggthemes](https://cran.r-project.org/package=ggthemes) package.
+
+```{r}
+#' @importFrom ggplot2 %+replace%
+theme_custom <- function(...) {
+  theme_grey(...) %+replace%
+    theme(
+      panel.border = element_rect(size = 1, fill = NA),
+      panel.background = element_blank(),
+      panel.grid = element_line(colour = "grey80")
+    )
+}
+
+mpg_drv_summary() + theme_custom()
+```
+
+It is important that the theme be calculated after the package is loaded. If not, the theme object is stored in the compiled bytecode of the built package, which may or may not align with the installed version of ggplot2! If your package has a default theme for its visualizations, the correct way to load it is to have a function that returns the default theme:
+
+```{r}
+default_theme <- function() {
+  theme_custom()
+}
+
+mpg_drv_summary2 <- function() {
+  mpg_drv_summary() + default_theme()
+}
+```
+
+### Testing ggplot2 output
+
+We suggest testing the output of ggplot2 using the [vdiffr](https://cran.r-project.org/package=vdiffr) package, which is a tool to manage visual test cases (this is one of the ways we test ggplot2). If changes in ggplot2 or your code introduce a change in the visual output of a ggplot, tests will fail when you run them locally or on Travis. To use vdiffr, make sure you are using [testthat](https://testthat.r-lib.org/) (you can use `usethis::use_testthat()` to get started) and add vdiffr to `Suggests` in your `DESCRIPTION`. Then, use `vdiffr::expect_doppelganger(<title>, <plot>)` to make a test that fails if there are visual changes in `<plot>`.
+
+```r
+test_that("output of ggplot() is stable", {
+  vdiffr::expect_doppelganger("A blank plot", ggplot())
+})
+```
+
+### ggplot2 in `Suggests`
+
+If you use ggplot2 in your package, most likely you will want to list it under `Imports`. If you would like to list ggplot2 in `Suggests` instead, you will not be able to `#' @importFrom ggplot2 ...` (i.e., you must refer to ggplot2 objects using `::`). If you use infix operators from ggplot2 like `%+replace%` and you want to keep ggplot2 in `Suggests`, you can assign the operator within the function before it is used:
+
+```{r}
+theme_custom <- function(...)
{ + `%+replace%` <- ggplot2::`%+replace%` + + ggplot2::theme_grey(...) %+replace% + ggplot2::theme(panel.background = ggplot2::element_blank()) +} +``` + +```{r, include=FALSE} +# make sure this function runs! +mpg_drv_summary() + theme_custom() +``` + +Generally, if you add a method for a ggplot2 generic like `autoplot()`, ggplot2 should be in `Imports`. If for some reason you would like to keep ggplot2 in `Suggests`, it is possible to register your generics only if ggplot2 is installed using `vctrs::s3_register()`. If you do this, you should copy and paste the source of `vctrs::s3_register()` into your own package to avoid adding a [vctrs](https://vctrs.r-lib.org/) dependency. + +```{r, eval=FALSE} +.onLoad <- function(...) { + if (requireNamespace("ggplot2", quietly = TRUE)) { + vctrs::s3_register("ggplot2::autoplot", "discrete_distr") + } +} +```
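+
+Once registered this way, the method is dispatched as usual for users who have ggplot2 installed. A minimal sketch, reusing the `mpg_drv_dist` object and the `autoplot.discrete_distr()` method defined above:
+
+```{r, eval=FALSE}
+# dispatches to autoplot.discrete_distr() only when ggplot2 is installed,
+# because the method is registered conditionally in .onLoad()
+ggplot2::autoplot(mpg_drv_dist)
+```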