R interface for Apache Spark
Updated May 8, 2024 - R
Apache Spark is an open-source, distributed, general-purpose cluster-computing framework. It provides an interface for programming entire clusters with implicit data parallelism and fault tolerance.
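As a brief sketch of what programming Spark from R looks like in practice (assuming the sparklyr package listed below, a local Spark installation, and the nycflights13 sample dataset — none of which the listing itself specifies):

```r
# Minimal sparklyr sketch: connect to Spark, push data to the cluster,
# and run dplyr verbs that are translated to Spark SQL.
library(sparklyr)
library(dplyr)

sc <- spark_connect(master = "local")   # local cluster for illustration

# Copy an example data frame into Spark (nycflights13 is an assumption)
flights_tbl <- copy_to(sc, nycflights13::flights, "flights")

# The aggregation below executes on the cluster; collect() brings
# only the small result back into R
flights_tbl %>%
  group_by(carrier) %>%
  summarise(mean_delay = mean(dep_delay, na.rm = TRUE)) %>%
  collect()

spark_disconnect(sc)
```

The data stays distributed on the cluster; only the final summarised result is materialised in the R session.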
R interface for Apache Spark
Brings sf to Spark in production
R interface for XGBoost on Spark
R interface to Spark TensorFlow Connector
Old repository for the R interface to GraphFrames
A sparklyr extension that makes the functionality of the Flint time series library (https://github.com/twosigma/flint) easily accessible from R
Enables spatial functions in Spark through the `sparklyr` package
Mirror of https://gitlab.com/zero323/dlt
A sparklyr extension to analyze genome datasets
R workloads running at scale on Google Cloud
Projects created using R
This repository contains intermediate-level code for data cleaning, exploratory analysis, handling of missing data, outlier detection, and various visualization techniques using the graphics, ggplot2, tidycharts, and ggExtra packages. Particular parts of the scripts also provide basic information about…
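A small illustrative sketch of the kind of cleaning and visualization the repository above describes (the data frame, column names, and thresholds here are hypothetical, not taken from that repository):

```r
# Hypothetical example: impute missing values, flag outliers with the
# 1.5 * IQR rule, and visualize the result with ggplot2.
library(ggplot2)

df <- data.frame(x = c(1, 2, NA, 4, 5, 100),
                 y = c(2, 4, 6, NA, 10, 12))

# Handle missing data points: impute each column with its median
df$x[is.na(df$x)] <- median(df$x, na.rm = TRUE)
df$y[is.na(df$y)] <- median(df$y, na.rm = TRUE)

# Outlier detection: values beyond 1.5 * IQR from the quartiles
iqr    <- IQR(df$x)
bounds <- quantile(df$x, c(0.25, 0.75)) + c(-1.5, 1.5) * iqr
df$outlier <- df$x < bounds[1] | df$x > bounds[2]

# Visualization with ggplot2, coloring flagged outliers
ggplot(df, aes(x, y, colour = outlier)) + geom_point()
```

The median imputation and IQR rule are just two common choices; the repository may use different techniques for the same steps.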
Created by Matei Zaharia
Released May 26, 2014