The goal of defmacro is to experiment with compile-time macros in R. The idea is to add a macro expansion step during the .onLoad step of the package.
A macro is a function that takes code and returns code.
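Before looking at defmacro itself, the idea can be illustrated in a few lines of base R (a sketch, not part of the package): the "macro" receives an unevaluated expression and returns a new expression built from it.

```r
# Sketch in base R: a "macro" maps code to code.
# `twice` takes a language object and returns a new expression
# in which the input appears twice.
twice <- function(expr) {
  bquote(.(expr) + .(expr))
}
twice(quote(x * 2))
#> x * 2 + x * 2
```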
An example package is here.
For example, in a package you can define a macro that evaluates an expression at "compile time":
constexpr <- defmacro::defmacro(function(expr) {
  eval(expr)
})
Then you might have a regular function that tests whether a value exceeds a quantile of the standard normal distribution:
is_invalid <- function(value) {
  value > constexpr(qnorm(0.975))
}
After macro expansion during .onLoad, the following function is exported to the user:
defmacro::expand_function(is_invalid)
#> function (value)
#> {
#> value > 1.95996398454005
#> }
#> <environment: 0x1386bc130>
Thus the call to qnorm never happens at runtime, as it was already evaluated during package load.
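Wiring this up in a package might look roughly like the following. This is a sketch under assumptions: it reuses defmacro::expand_function() from above and rebinds the expanded function in the package namespace by hand; the actual defmacro package may provide a dedicated helper for this step, and the function name here is just the example from above.

```r
# Hypothetical .onLoad hook (sketch): expand macro calls in selected
# functions at load time, using only defmacro::expand_function().
.onLoad <- function(libname, pkgname) {
  ns <- asNamespace(pkgname)
  for (name in c("is_invalid")) {  # functions that contain macro calls
    expanded <- defmacro::expand_function(get(name, envir = ns))
    if (bindingIsLocked(name, ns)) unlockBinding(name, ns)
    assign(name, expanded, envir = ns)
    lockBinding(name, ns)
  }
}
```

Rebinding via unlockBinding/lockBinding is needed because package namespaces are locked once loaded.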
You could also define your own pipe operator and have all of its overhead removed at runtime:
`%>%` <- defmacro::defmacro(function(lhs, rhs) {
  fun_args <- c(list(lhs), unlist(as.list(rhs[-1L]), FALSE))
  rlang::get_expr(rlang::quo(`!!`(rhs[[1L]])(!!!(fun_args))))
})
analyze_dataset <- function(data) {
  data %>%
    dplyr::filter(hp > constexpr(50 + 50)) %>%
    dplyr::group_by(cyl) %>%
    dplyr::summarise(dplyr::n())
}
defmacro::expand_function(analyze_dataset)
#> function (data)
#> {
#> dplyr::summarise(dplyr::group_by(dplyr::filter(data, hp >
#> 100), cyl), dplyr::n())
#> }
#> <environment: 0x1386bc130>
This can also be used to elide parts of your code, akin to #if in C:
dash_if <- defmacro::defmacro(function(code, condition) {
  if (condition) code
})
conditional <- function() {
  dash_if(kept(), TRUE)
  dash_if(removed(), FALSE)
}
defmacro::expand_function(conditional)
#> function ()
#> {
#> kept()
#> }
#> <environment: 0x1386bc130>