Time Series for Spark (The spark-ts Package)

A Scala / Java / Python library for interacting with time series data on Apache Spark.

Post questions and comments to the Google group, or email them directly to spark-ts@googlegroups.com.

Note: The spark-ts library is no longer under active development by me (Sandy). I unfortunately no longer have bandwidth to develop features, answer all questions on the mailing list, or fix all bugs that are filed.

That said, I remain happy to review pull requests and do whatever I can to aid others in advancing the library.

Docs are available at http://sryza.github.io/spark-timeseries.

Or check out the Scaladoc, Javadoc, or Python doc.

The aim here is to provide

  • A set of abstractions for manipulating large time series data sets, similar to what's provided for smaller data sets in Pandas, Matlab, and R's zoo and xts packages.
  • Models, tests, and functions that enable dealing with time series from a statistical perspective, similar to what's provided in StatsModels and a variety of Matlab and R packages.

The library builds on several other excellent Java and Scala libraries.
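To make the "abstractions for manipulating large time series" concrete, here is a minimal, self-contained sketch of the core idea behind a uniform date-time index: timestamps map to integer positions, so a series can be stored as a flat vector of values. This is plain Scala written for illustration, not the spark-ts API; the `UniformIndex` class and its method names are invented here (the library's own Scaladoc documents the real `DateTimeIndex` types).

```scala
import java.time.{Duration, ZonedDateTime, ZoneId}

// Illustrative stand-in for a uniform date-time index: a start instant,
// a fixed frequency, and a length. Not part of spark-ts.
case class UniformIndex(start: ZonedDateTime, frequency: Duration, size: Int) {
  // Timestamp at integer position i.
  def dateTimeAt(i: Int): ZonedDateTime =
    start.plus(frequency.multipliedBy(i))

  // Integer position of a timestamp, or -1 if it falls off the grid
  // or outside the index bounds.
  def locAt(dt: ZonedDateTime): Int = {
    val offset = Duration.between(start, dt)
    if (offset.isNegative || offset.toMillis % frequency.toMillis != 0) -1
    else {
      val i = offset.toMillis / frequency.toMillis
      if (i < size) i.toInt else -1
    }
  }
}

// A daily index covering ten days, starting 2017-01-01 UTC.
val idx = UniformIndex(
  ZonedDateTime.of(2017, 1, 1, 0, 0, 0, 0, ZoneId.of("Z")),
  Duration.ofDays(1), 10)
```

The round trip `locAt(dateTimeAt(i)) == i` is what lets vector arithmetic stand in for timestamp joins, which is the property the library's distributed representations exploit.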

Using this Repo


We use Maven for building Java / Scala. To compile, run tests, and build jars:

mvn package

To run Python tests (requires nose):

cd python
export SPARK_HOME=<location of local Spark installation>
nosetests


To run a spark-shell with spark-ts and its dependencies on the classpath:

spark-shell --jars target/sparkts-$VERSION-SNAPSHOT-jar-with-dependencies.jar


To publish docs, the easiest approach is to clone a separate copy of this repo into a location we'll refer to as $DOCS_REPO. Then:

# Build main doc
mvn site -Ddependency.locations.enabled=false

# Build scaladoc
mvn scala:doc

# Build javadoc
mvn javadoc:javadoc

# Build Python doc
cd python
export SPARK_HOME=<location of local Spark installation>
make html
cd ..

cp -r target/site/* $DOCS_REPO
cp -r python/build/html/ $DOCS_REPO/pydoc
cd $DOCS_REPO
git checkout gh-pages
git add -A
git commit -m "Some message that includes the hash of the relevant commit in master"
git push origin gh-pages

To build a Python source distribution, first build with Maven, then:

cp target/sparkts-$VERSION-SNAPSHOT-jar-with-dependencies.jar python/sparkts/
cd python
python setup.py sdist

To release Java/Scala packages (based on http://oryxproject.github.io/oryx/docs/how-to-release.html):

mvn -Darguments="-DskipTests" -DreleaseVersion=$VERSION \
    -DdevelopmentVersion=$VERSION-SNAPSHOT release:prepare

mvn -s private-settings.xml -Darguments="-DskipTests" release:perform

To release Python packages (based on http://peterdowns.com/posts/first-time-with-pypi.html):

python setup.py register -r pypi
python setup.py sdist upload -r pypi