apache-spark
Apache Spark is an open-source, distributed, general-purpose cluster-computing framework. It provides an interface for programming entire clusters with implicit data parallelism and fault tolerance.
Here are 13 public repositories matching this topic.
Projects created using R
Updated Mar 19, 2018 · R
Old repo for the R interface to GraphFrames
Updated Mar 21, 2018 · R
Enable spatial functions in Spark through the `sparklyr` package
Updated Feb 4, 2019 · R
A sparklyr extension to analyze genome datasets
Updated Jun 14, 2019 · R
R workloads running at scale on Google Cloud
Updated Apr 25, 2020 · R
This repository contains intermediate-level code for cleaning, exploratory analysis, handling of missing data points, outlier detection, and various visualization techniques using the graphics, ggplot2, tidycharts, and ggExtra packages. A particular part of the script also provides basic information about…
Updated Jun 23, 2021 · R
R interface to the Spark TensorFlow Connector
Updated Sep 13, 2021 · R
Bring sf to Spark in production
Updated Dec 13, 2021 · R
A sparklyr extension making the Flint time series library (https://github.com/twosigma/flint) easily accessible from R
Updated Jan 11, 2022 · R
Mirror of https://gitlab.com/zero323/dlt
Updated Nov 25, 2022 · R
R interface for XGBoost on Spark
Updated May 1, 2024 · R
R interface for Apache Spark
Updated Jun 23, 2024 · R
Created by Matei Zaharia
Released May 26, 2014
- Followers: 420
- Repository: apache/spark
- Website: spark.apache.org
- Wikipedia