
flyio - Make data fly to R

Input and output data from R: download, upload, read and write objects on AWS S3, Google Cloud Storage or the local file system through a single interface.


flyio provides a common interface to interact with data on cloud storage providers or local storage directly from R. It currently supports AWS S3 and Google Cloud Storage, thanks to the API wrappers provided by cloudyr. flyio also supports reading and writing tables, rasters, shapefiles and R objects between the data source and memory.

  • flyio_set_datasource(): Set the data source (GCS, S3 or local) for all the other functions in flyio.
  • flyio_auth(): Authenticate data source (GCS or S3) so that you have access to the data. In a single session, different data sources can be authenticated.
  • flyio_set_bucket(): Set the bucket name once for any or both data sources so that you don't need to write it in each function.
  • list_files(): List the files in the bucket/folder.
  • file_exists(): Check if a file exists in the bucket/folder.
  • export_[file/folder](): Upload a file/folder to S3 or GCS from R.
  • import_file(): Download a file from S3 or GCS.
  • import_[table/raster/stack/shp/rds/rda/st](): Read a file into memory from the set data source and bucket, using a user-defined function.
  • export_[table/raster/shp/rds/rda/st](): Write an object from memory to the set data source and bucket, using a user-defined function.
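
A typical session chains these functions together. The sketch below assumes GCS as the data source; the key file path and bucket name are placeholders, not values from this README:

```r
library(flyio)

# Point flyio at Google Cloud Storage and authenticate
flyio_set_datasource("gcs")
flyio_auth("service-account-key.json")  # placeholder key file
flyio_set_bucket("my-bucket")           # placeholder bucket name

# List files in a folder, read one into memory, write it back out
list_files(path = "data")
df <- import_table("data/example.csv", FUN = read.csv)
export_table(df, "data/example-copy.csv", FUN = write.csv)
```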

For global usage, the data source, authentication keys and bucket can be set in the machine's environment variables so that they do not have to be entered every time.

  • For data source: CLOUD_STORAGE_NAME
  • For bucket name: flyioBucketS3 or flyioBucketGcs
  • For authentication: GCS_AUTH_FILE or AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION (for AWS S3, this step is not needed if the awscli is already authenticated)
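
For example, these variables could go in the machine's ~/.Renviron file (values below are placeholders):

```r
# ~/.Renviron -- placeholder values for a GCS setup
CLOUD_STORAGE_NAME="gcs"
flyioBucketGcs="my-gcs-bucket"
GCS_AUTH_FILE="/path/to/service-account-key.json"
```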


# Install the stable version from CRAN:
install.packages("flyio")

# Install the latest dev version from GitHub:
# install.packages("devtools")
devtools::install_github("atlanhq/flyio")

# Load the library
library(flyio)

If you encounter a bug, please file an issue on GitHub with steps to reproduce it. Please use issues for feature requests, enhancements and suggestions as well.


# Setting the data source
flyio_set_datasource("gcs")

# Verify if the data source is set
flyio_get_datasource()

# Authenticate the default data source and set bucket
flyio_auth("service-account-key.json")
flyio_set_bucket("atlanhq-flyio")

# Authenticate S3 also
flyio_auth(data_source = "s3")
flyio_set_bucket("atlanhq-flyio", data_source = "s3")

# Listing the files in GCS
list_files(path = "test", pattern = "*csv")

# Saving mtcars to all the data sources using default function write.csv
export_table(mtcars, "~/Downloads/mtcars.csv", data_source = "local")
export_table(mtcars, "test/mtcars.csv") # saving to GCS; data source need not be specified as it is set globally
export_table(mtcars, "test/mtcars.csv", data_source = "s3")

# Check if the file written exists in GCS
file_exists("test/mtcars.csv")

# Read the file from GCS using the readr library
mtcars <- import_table("test/mtcars.csv", FUN = readr::read_csv)
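
The FUN argument accepts any function that reads from or writes to a file path, so other readers and writers can be plugged in the same way. A sketch, assuming data.table and readr are installed:

```r
# Any file-reading function can be passed as FUN
mtcars_dt <- import_table("test/mtcars.csv", FUN = data.table::fread)

# Likewise for writers
export_table(mtcars, "test/mtcars.csv", FUN = readr::write_csv)
```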

