Commit 3d932b4: Add python tests and travis integration
1 parent: 6d49fb3
Showing 19 changed files with 1,710 additions and 1,098 deletions.
```diff
@@ -2,4 +2,8 @@ project/project
 project/target
 target
 .idea
+.vscode
+metastore_db
+derby.log
+python/spark
 **/.cache
```
New file (+56 lines):
<!--
#
# Copyright 2017 TWO SIGMA OPEN SOURCE, LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
-->
# Python tests

## Overview
This directory contains the tests for the Python code. They use the `unittest` module.

## Prerequisites
The tests need a local Spark distribution to run. An easy way to get one is to go to the
[Apache Spark download page](https://spark.apache.org/downloads.html) and select version 2.1.1 (May 02 2017), pre-built for Apache Hadoop 2.7 and later.

Extract the tarball in a local directory and set the following environment variable:
```
export SPARK_HOME=<local-spark-directory>
```
One-time preparation for the Python tests can be done by running the following
from the root Flint directory:
```
scripts/prepare_python_tests.sh
```

## Running tests
To run the tests, issue the following command from the root Flint directory:
```
scripts/run_python_tests.sh
```

## Code
The code for the tests is found in this `tests` directory. The content of the files is as follows:

* `base_test_case.py` contains the `BaseTestCase` abstract class, the ancestor of all the test cases.
* `spark_test_case.py` contains a concrete class, `SparkTestCase`, that inherits from `BaseTestCase` and sets up a local `SparkContext`. This is the default class to inherit test cases from.
* `test_dataframe.py` contains about 50 test cases for `TimeSeriesDataFrame`.
* `test_data.py` contains constant data for the tests.
* `utils.py` contains specialized assert functions and Pandas DataFrame creation helpers.

## Extending
If the test setup done in the default class, `SparkTestCase`, does not fit the needs of a particular environment, a new class can be written. The name of the new class, say `MyTestCase`, is then exported in the `FLINT_BASE_TESTCASE` environment variable before the tests are run:
```
export FLINT_BASE_TESTCASE=<Name of new class>
```
Empty file.
New file (+72 lines):
```python
#
# Copyright 2017 TWO SIGMA OPEN SOURCE, LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
'''
The base class code for all Flint unit tests
'''
import unittest
from abc import ABCMeta, abstractclassmethod
from functools import lru_cache

import tests.utils as test_utils
from tests.ts.test_data import (FORECAST_DATA, PRICE_DATA, VOL_DATA, VOL2_DATA,
                                VOL3_DATA, INTERVALS_DATA)


class BaseTestCase(unittest.TestCase, metaclass=ABCMeta):
    ''' Abstract base class for all Flint tests
    '''
    @abstractclassmethod
    def setUpClass(cls):
        ''' The automatic setup method for subclasses '''
        return

    @abstractclassmethod
    def tearDownClass(cls):
        ''' The automatic tear down method for subclasses '''
        return

    @lru_cache(maxsize=None)
    def forecast(self):
        return self.flintContext.read.pandas(
            test_utils.make_pdf(FORECAST_DATA, ["time", "id", "forecast"]))

    @lru_cache(maxsize=None)
    def vol(self):
        return self.flintContext.read.pandas(
            test_utils.make_pdf(VOL_DATA, ["time", "id", "volume"]))

    @lru_cache(maxsize=None)
    def vol2(self):
        return self.flintContext.read.pandas(
            test_utils.make_pdf(VOL2_DATA, ["time", "id", "volume"]))

    @lru_cache(maxsize=None)
    def vol3(self):
        return self.flintContext.read.pandas(
            test_utils.make_pdf(VOL3_DATA, ["time", "id", "volume"]))

    @lru_cache(maxsize=None)
    def price(self):
        return self.flintContext.read.pandas(
            test_utils.make_pdf(PRICE_DATA, ["time", "id", "price"]))

    @lru_cache(maxsize=None)
    def intervals(self):
        return self.flintContext.read.pandas(
            test_utils.make_pdf(INTERVALS_DATA, ['time']))

    def clocks(self):
        from ts.flint import clocks
        return clocks
```
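The `@lru_cache(maxsize=None)` pattern above builds each fixture once per test-case instance, because the bound `self` is part of the cache key. A minimal standalone demonstration (class and method names are illustrative, not from Flint):

```python
from functools import lru_cache


class FixtureHolder:
    """Illustrative only: lru_cache on a zero-argument method runs the
    body once per instance and returns the same cached object after."""

    def __init__(self):
        self.builds = 0

    @lru_cache(maxsize=None)
    def fixture(self):
        self.builds += 1          # counts how often the body actually runs
        return {"rows": [1, 2, 3]}


holder = FixtureHolder()
first = holder.fixture()
second = holder.fixture()
print(holder.builds, first is second)  # → 1 True
```

One known tradeoff of this pattern: the cache holds a reference to `self`, so instances stay alive as long as the cached entries do, which is usually acceptable in short-lived test processes.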