export vec_duplicate_split() #514
Would it make sense to go ahead and export vec_duplicate_split()? I think it would be nice to even return a data frame, with key holding the actual key values rather than their locations. With this in mind it could also be given a different name.

library(vctrs)
library(tibble)
by <- mtcars[c("vs", "am")]
ki <- vctrs:::vec_duplicate_split(by)
# current return value
ki
#> $key
#> [1] 1 3 4 5
#>
#> $idx
#> $idx[[1]]
#> [1] 1 2 27 29 30 31
#>
#> $idx[[2]]
#> [1] 3 18 19 20 26 28 32
#>
#> $idx[[3]]
#> [1] 4 6 8 9 10 11 21
#>
#> $idx[[4]]
#> [1] 5 7 12 13 14 15 16 17 22 23 24 25
# proposed return value
ki$key <- vec_slice(by, ki$key)
as_tibble(new_data_frame(ki, n = vec_size(ki$key)))
#> # A tibble: 4 x 2
#> key$vs $am idx
#> <dbl> <dbl> <list>
#> 1 0 1 <int [6]>
#> 2 1 1 <int [7]>
#> 3 1 0 <int [7]>
#> 4 0 0 <int [12]>
# similar to
as_tibble(vec_split(mtcars, by))
#> # A tibble: 4 x 2
#> key$vs $am val
#> <dbl> <dbl> <list<df[,11]>>
#> 1 0 1 [6 × 11]
#> 2 1 1 [7 × 11]
#> 3 1 0 [7 × 11]
#> 4 0 0 [12 × 11]

Created on 2019-07-30 by the reprex package (v0.2.1)
I like this. Having it be a data frame with a data frame column for the key adds some structural information that is nice to have. I suppose a …
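As a rough sketch of what that structure buys you (reusing the internal vctrs:::vec_duplicate_split() and the by object from the reprex above; res is just an illustrative name, not a proposed API), the data frame column keeps the key's variables addressable as ordinary columns:

library(vctrs)
library(tibble)

# Rebuild the proposed return value from the reprex above (sketch only).
by  <- mtcars[c("vs", "am")]
ki  <- vctrs:::vec_duplicate_split(by)
res <- as_tibble(new_data_frame(
  list(key = vec_slice(by, ki$key), idx = ki$idx),
  n = vec_size(ki$key)
))

res$key                            # one row per group, with the original vs/am columns
res$key$am                         # individual key variables stay addressable
vec_slice(mtcars, res$idx[[1]])    # rows belonging to the first group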
Would it make sense to document …? (This might go against the general vctrs API, since we use …
For e.g. tidyverse/dplyr#4504, vec_duplicate_split() gives exactly what is needed for a vctrs-based implementation of group_by():

- key gives the indices I can use to vec_slice() the data to get the first columns
- idx is exactly the .rows column

tidyverse/dplyr#4504 then goes a bit further to reveal empty groups when .drop = FALSE, but that does not need to be vctrs' concern.
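For concreteness, here is a minimal sketch (not dplyr's actual implementation, and empty groups for .drop = FALSE are deliberately ignored) of how that output could be rearranged into a group_by()-style structure, assuming the internal vctrs:::vec_duplicate_split() from the reprex above:

library(vctrs)

# Sketch: a group_by()-like grouping structure built from vec_duplicate_split(),
# i.e. the unique key rows alongside a .rows list column of row locations.
by <- mtcars[c("vs", "am")]
ki <- vctrs:::vec_duplicate_split(by)

groups <- new_data_frame(
  c(
    vec_slice(by, ki$key),   # grouping columns, one row per group
    list(.rows = ki$idx)     # locations of each group's rows
  ),
  n = vec_size(ki$key)
)
groups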