sbt-spark-package

Sbt Plugin for Spark Packages

sbt-spark-package is an sbt plugin that aims to simplify the use and development of Spark Packages.

Please upgrade to version 0.2.4+ as spark-packages now supports SSL.


Requirements:

  • sbt


The sbt way

Simply add the following to <your_project>/project/plugins.sbt:

  resolvers += "bintray-spark-packages" at ""

  addSbtPlugin("org.spark-packages" % "sbt-spark-package" % "0.2.6")


Spark Package Developers

In your build.sbt file include the appropriate values for:

  • spName := "organization/my-awesome-spark-package" // the name of your Spark Package

Please specify any Spark dependencies using sparkVersion and sparkComponents. For example:

  • sparkVersion := "2.1.0" // the Spark Version your package depends on.

Spark Core will be included by default if no value for sparkComponents is supplied. You can add sparkComponents as:

  • sparkComponents += "mllib" // creates a dependency on spark-mllib.


  • sparkComponents ++= Seq("streaming", "sql")
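Putting these settings together, a minimal build.sbt might look like the sketch below; the organization, package name, and versions are placeholders:

```scala
// build.sbt -- a minimal sketch; all names and version numbers are placeholders
spName := "organization/my-awesome-spark-package" // the name of your Spark Package

version := "0.1.0"

scalaVersion := "2.11.8"

sparkVersion := "2.1.0" // the Spark version your package depends on

sparkComponents ++= Seq("streaming", "sql") // spark-core is included by default
```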

You can make a zip archive ready for a release on the Spark Packages website by simply calling sbt spDist. This command will include any Python files related to your package in the jar inside this archive. When this jar is added to your PYTHONPATH, you will be able to use your Python files.

By default, the zip file will be produced in <project>/target, but you can override this by providing a value for spDistDirectory like:

spDistDirectory := file("Users") / "foo" / "Documents" / "bar"

On Windows, keep the forward slashes; do not switch them to backslashes.

You may publish your package locally for testing with sbt spPublishLocal.

In addition, sbt console will create a SparkContext for you, so you can test your code much as you would in the spark-shell.

If you want to make a release of your package against multiple Scala versions (e.g. 2.10, 2.11), you may set spAppendScalaVersion := true in your build file.
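As a sketch, cross-building could be configured like this; crossScalaVersions is standard sbt, and the exact version numbers are illustrative:

```scala
// build.sbt -- illustrative Scala versions
crossScalaVersions := Seq("2.10.6", "2.11.8")

spAppendScalaVersion := true // append the Scala binary version to the artifact name
```

You can then use sbt's `+` prefix to run tasks against every version in crossScalaVersions.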

If you cannot specify your Spark dependencies using sparkComponents (e.g. you have exclusion rules) and want to configure them as provided (e.g. to build a standalone jar for a demo), you may set spIgnoreProvided := true so that the assembly plugin handles them properly.
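For example, a build that declares Spark directly with an exclusion rule, rather than via sparkComponents, might look like this sketch; the specific exclusion shown is only illustrative:

```scala
// build.sbt -- sketch; the excluded module is illustrative
libraryDependencies += ("org.apache.spark" %% "spark-sql" % "2.1.0" % "provided")
  .exclude("org.slf4j", "slf4j-log4j12") // your exclusion rules here

spIgnoreProvided := true // so the plugin works properly with sbt-assembly
```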

Including shaded dependencies

Sometimes you may require shading for your package to work in certain environments. sbt-spark-package supports publishing shaded dependencies built through the sbt-assembly plugin. To achieve this, you will need two projects, one for building the shaded dependency, and one for building the distribution ready package.

lazy val shaded = Project("shaded", file(".")).settings(
  libraryDependencies ++= dependenciesToShade ++
    nonShadedDependencies.map(_ % "provided"), // don't include any other dependency in your assembly jar
  target := target.value / "shaded", // have a separate target directory to make sbt happy
  assemblyShadeRules in assembly := Seq(
    ShadeRule.rename("blah.**" -> "bleh.@1").inAll
  )
) // add all other settings

lazy val distribute = Project("distribution", file(".")).settings(
  spName := ..., // your spark package name
  target := target.value / "distribution",
  assembly in spPackage := (assembly in shaded).value, // this will pick up the shaded jar for distribution
  libraryDependencies := nonShadedDependencies // have all your non shaded dependencies here so that we can
                                               // generate a clean pom.
) // add all other settings

Now you may use distribution/spDist to build your zip file, or distribution/spPublish to publish a new release. For more details on publishing, please refer to the next section.

Registering and publishing Spark Packages


In order to use spRegister or spPublish to register or publish a release of your Spark Package, you have to specify your Github credentials. You may specify your credentials through a file (recommended) or directly in your build file like below:

credentials += Credentials(Path.userHome / ".ivy2" / ".sbtcredentials") // A file containing credentials
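The credentials file uses sbt's standard key=value format. A sketch is shown below: the realm matches the "Spark Packages Realm" used by the plugin, while the host value is an assumption and the user and password values are placeholders for your own GitHub username and personal access token:

```
realm=Spark Packages Realm
host=spark-packages.org
user=<your-github-username>
password=<your-github-personal-access-token>
```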

credentials += Credentials("Spark Packages Realm",
                           "spark-packages.org",
                           "<your-github-username>",
                           "<your-github-personal-access-token>")

More can be found in the sbt documentation.

Using these functions requires "read:org" GitHub access to authenticate ownership of the repo. Documentation on generating a GitHub Personal Access Token can be found here.


You can register your Spark Package for the first time using this plugin with the command sbt spRegister. In order to register your package, you must have logged in to the Spark Packages website at least once, and you must supply values for the following settings in your build file:

spShortDescription := "My awesome Spark Package" // Your one line description of your package

spDescription := """My long description.
                    |Could be multiple lines long.
                    | - My package can do this,
                    | - My package can do that.""".stripMargin

credentials += // Your credentials, see above.

The homepage of your package is by default the web page for the Github repository. You can change the default homepage by using:

spHomepage := // Set this if you want to specify a web page other than your github repository.


You can publish a new release using sbt spPublish. The HEAD commit on your local repository will be used as the git commit sha for your release. Therefore, please make sure that your local commit is indeed the version you would like to make a release for, and that you have pushed that commit to the master branch on your remote.

The required settings for spPublish are:

// You must have an Open Source License. Some common licenses can be found in:
licenses += "Apache-2.0" -> url("")

// If you published your package to Maven Central for this release (must be done prior to spPublish)
spIncludeMaven := true

credentials += // Your credentials, see above.

Spark Package Users

Any Spark Packages your package depends on can be added as:

  • spDependencies += "databricks/spark-avro:0.1" // format is spark-package-name:version

We also recommend that you use sparkVersion and sparkComponents to manage your Spark dependencies. In addition, you can use sbt assembly to create an uber jar of your project.


If you encounter bugs or want to contribute, feel free to submit an issue or pull request.