diff --git a/README.md b/README.md
index 03db358..41f8f28 100644
--- a/README.md
+++ b/README.md
@@ -40,6 +40,8 @@ TensorFrames is available as a
   [official instructions](https://www.tensorflow.org/versions/r0.7/get_started/os_setup.html#download-and-setup)
   on how to get the latest release of TensorFlow.
+ - (Optional) pandas >= 0.19.1 if you want to use the python interface
+ - (Optional) the [Nix package manager](http://nixos.org/nix/) if you want to guarantee a fully reproducible build environment. This is the environment that will be used for reproducing bugs.
 
 Additionally, if you want to run unit tests for python, you need the following dependencies:
 
@@ -52,7 +54,7 @@ Additionally, if you want to run unit tests for python, you need the following d
 Assuming that `SPARK_HOME` is set, you can use PySpark like any other Spark package.
 
 ```bash
-$SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.2.9-rc0-s_2.11
+$SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.2.9-rc1-s_2.11
 ```
 
 Here is a small program that uses Tensorflow to add 3 to an existing column.
@@ -150,7 +152,7 @@ The scala support is a bit more limited than python. In scala, operations can be
 You simply use the published package:
 
 ```bash
-$SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.2.9-rc0
+$SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.2.9-rc1
 ```
 
 Here is the same program as before:
@@ -200,14 +202,14 @@ build/sbt distribution/spDist
 Assuming that SPARK_HOME is set and that you are in the root directory of the project:
 
 ```bash
-$SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc0.jar
+$SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc1.jar
 ```
 
 If you want to run the python version:
 
 ```bash
-PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc0.jar \
-$SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc0.jar
+PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc1.jar \
+$SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc1.jar
 ```
 
 ## Acknowledgements
diff --git a/project/Build.scala b/project/Build.scala
index a2ed318..daefcfb 100644
--- a/project/Build.scala
+++ b/project/Build.scala
@@ -11,7 +11,7 @@ object Shading extends Build {
 
   lazy val commonSettings = Seq(
-    version := "0.2.9-rc0",
+    version := "0.2.9-rc1",
     name := "tensorframes",
     scalaVersion := sys.props.getOrElse("scala.version", "2.11.8"),
     organization := "databricks",
diff --git a/python/requirements.txt b/python/requirements.txt
index 1d15f1e..d23e599 100644
--- a/python/requirements.txt
+++ b/python/requirements.txt
@@ -1,3 +1,3 @@
 # This file should list any python package dependencies.
-nose==1.3.3
-pandas==0.19.0
+nose>=1.3.3
+pandas>=0.19.1