Added db_save_query.bigquery so compute() now works #52
Hello everyone! This pull request features an implementation of `db_save_query.bigquery`, so that `compute()` now works with BigQuery sources.
NOTE: I had to resubmit this pull request after moving it from master to a separate branch in my repository. (The original pull request is #51.)
The motivation for this pull request is that the current implementation of `compute()` does not work with BigQuery sources.
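For context, the general shape of such a method might look like the sketch below. This is a hedged illustration, not the actual patch: the method name comes from the PR title, but the connection fields (`con$project`, `con$dataset`) and the use of `query_exec()` with a destination table are assumptions about bigrquery's internals at the time.

```r
# Hypothetical sketch (NOT the actual patch): an S3 method that executes
# the SQL and materializes the result into a named destination table,
# which is what dplyr's compute() needs on the database side.
db_save_query.bigquery <- function(con, sql, name, temporary = TRUE, ...) {
  # Assumed connection fields: con$project and con$dataset.
  destination <- paste(con$dataset, name, sep = ".")
  # Assumed: query_exec() accepts a destination_table argument that
  # tells BigQuery where to store the query result.
  query_exec(sql, project = con$project, destination_table = destination)
  name
}
```

A design question this raises (discussed below) is how to honor the `temporary` flag, since BigQuery already places query results in temporary tables by default.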
The code below demonstrates a simple test of `compute()` on a BigQuery source:
```r
library(dplyr)
library(bigrquery)

BILLING_PROJECT <- "<YOUR_BILLING_PROJECT>"
BILLING_DATASET <- "<DATASET_INSIDE_THAT_PROJECT>"

bq <- src_bigquery(BILLING_PROJECT, BILLING_DATASET)

# Upload the test data frame to your project
df <- data.frame(a = 1:100, b = 101:200)
test_df <- copy_to(bq, df, name = "test_df")

# Perform a simple query and save the result remotely
test_df2 <- test_df %>%
  summarise(n = n()) %>%
  compute(name = "test_df2", temporary = FALSE)

# Display the result
test_df2 %>% collect()
```
Please note that running the test requires a Google Cloud billing project to be set up already.
Thank you for your consideration!
This was referenced Jul 1, 2015
Thanks for the patch! (And sorry I've been slow at getting a chance to look at it.)

As a high-level question, I'm curious why you'd want to default to a non-temporary table -- the default is for query results to live in temporary tables, so it seems like the easiest approach might be to just go with that?

A few related questions are coming up inline.
@craigcitro Thanks for the feedback! Yes, agreed. The code was ignoring the `temporary` argument.
As of now, I see that one solution here could be to add an error for two cases:
I am not sure I understood how to solve the issue with