nose2-spark

nose2 plugin to run tests with support for pyspark (Apache Spark).

Features:

  1. Makes "pyspark" importable in your code executed by nose2.
  2. Adds a list of py-file dependencies of your pyspark application (the ones usually supplied via spark-submit --py-files ...).

Install

$ pip install nose2-spark

Usage

Load the "nose2-spark" plugin into nose2 by creating a nose2.cfg in your project directory:

[unittest]
plugins = nose2_spark

Run tests with nose2-spark activated (pyspark and friends are added to the Python path):

$ nose2 --pyspark
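
With the plugin active, test modules can import pyspark directly. A minimal sketch, assuming the tests create their own local SparkContext (the file and test names here are hypothetical):

# test_spark.py
import unittest

from pyspark import SparkContext

class SparkTest(unittest.TestCase):

    @classmethod
    def setUpClass(cls):
        # nose2-spark only makes pyspark importable; the tests still
        # create (and stop) their own local context.
        cls.sc = SparkContext("local[2]", "nose2-spark-example")

    @classmethod
    def tearDownClass(cls):
        cls.sc.stop()

    def test_count(self):
        self.assertEqual(self.sc.parallelize([1, 2, 3]).count(), 3)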

nose2-spark will try to locate pyspark by looking at:

  1. The SPARK_HOME environment variable.
  2. Some common Spark installation locations.
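
For example, to point the plugin at a specific installation via the environment for a single run (the path matches the configuration example below):

$ SPARK_HOME=/opt/spark nose2 --pyspark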

If none of these methods finds Spark, you can set the location manually by adding a "nose2-spark" section to nose2.cfg:

[nose2-spark]
spark_home = /opt/spark

You can add a list of py-files required by your code:

[nose2-spark]
pyfiles = package1.zip
          package2.zip
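
Modules packaged in the listed archives then become importable from your tests, mirroring the spark-submit --py-files behaviour mentioned above. A hypothetical sketch, where "mymodule" stands in for a module shipped inside package1.zip:

# test_deps.py
import unittest

class DependencyTest(unittest.TestCase):

    def test_import(self):
        # Importable because nose2-spark adds package1.zip to the path.
        import mymodule  # hypothetical module name
        self.assertIsNotNone(mymodule)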

Example

An example nose2.cfg with spark_home defined, one py-file dependency, and the nose2-spark plugin always on:

[unittest]
plugins = nose2_spark

[nose2-spark]
always-on = True
spark_home = /opt/spark
pyfiles = package1.zip

This allows the tests to be run with a single command:

$ nose2
