Merged
12 changes: 7 additions & 5 deletions README.md
@@ -40,6 +40,8 @@ TensorFrames is available as a
[official instructions](https://www.tensorflow.org/versions/r0.7/get_started/os_setup.html#download-and-setup)
on how to get the latest release of TensorFlow.

- (Optional) pandas >= 0.19.1 if you want to use the python interface

- (Optional) the [Nix package manager](http://nixos.org/nix/) if you want to guarantee a fully reproducible build environment. This is the environment that will be used for reproducing bugs.

Additionally, if you want to run unit tests for python, you need the following dependencies:
@@ -52,7 +54,7 @@ Additionally, if you want to run unit tests for python, you need the following d
Assuming that `SPARK_HOME` is set, you can use PySpark like any other Spark package.

```bash
-$SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.2.9-rc0-s_2.11
+$SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.2.9-rc1-s_2.11
```

Here is a small program that uses TensorFlow to add 3 to an existing column.
@@ -150,7 +152,7 @@ The scala support is a bit more limited than python. In scala, operations can be
You simply use the published package:

```bash
-$SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.2.9-rc0
+$SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.2.9-rc1
```

Here is the same program as before:
@@ -200,14 +202,14 @@ build/sbt distribution/spDist
Assuming that SPARK_HOME is set and that you are in the root directory of the project:

```bash
-$SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc0.jar
+$SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc1.jar
```

If you want to run the python version:

```bash
-PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc0.jar \
-$SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc0.jar
+PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc1.jar \
+$SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.2.9-rc1.jar
```
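The `PYTHONPATH` line above works because the assembly jar also bundles the tensorframes python sources, and Python's import machinery can load modules directly from zip archives (a jar is a zip file). A standalone sketch of that mechanism, using a hypothetical `demo_pkg` module in a throwaway archive rather than the real assembly jar:

```python
import os
import sys
import tempfile
import zipfile

# Build a tiny "jar" (a jar is just a zip archive) containing a python
# module, mirroring how the assembly jar ships python sources.
tmp = tempfile.mkdtemp()
jar_path = os.path.join(tmp, "demo-assembly.jar")
with zipfile.ZipFile(jar_path, "w") as jar:
    jar.writestr("demo_pkg/__init__.py", "VERSION = '0.2.9-rc1'\n")

# Putting the archive on sys.path (which is what PYTHONPATH does at
# startup) makes the packaged module importable via zipimport.
sys.path.insert(0, jar_path)
import demo_pkg

print(demo_pkg.VERSION)  # 0.2.9-rc1
```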

## Acknowledgements
2 changes: 1 addition & 1 deletion project/Build.scala
@@ -11,7 +11,7 @@ object Shading extends Build {


lazy val commonSettings = Seq(
-    version := "0.2.9-rc0",
+    version := "0.2.9-rc1",
name := "tensorframes",
scalaVersion := sys.props.getOrElse("scala.version", "2.11.8"),
organization := "databricks",
4 changes: 2 additions & 2 deletions python/requirements.txt
@@ -1,3 +1,3 @@
# This file should list any python package dependencies.
-nose==1.3.3
-pandas==0.19.0
+nose>=1.3.3
+pandas>=0.19.1
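
The switch from pinned (`==`) to minimum (`>=`) versions means any pandas release at or above 0.19.1 now satisfies the requirement, instead of exactly one release. A minimal sketch of that semantics, using a hypothetical `satisfies` helper that is not part of the project:

```python
def satisfies(installed, spec):
    """Check a version string against a single '==' or '>=' specifier."""
    op, required = spec[:2], spec[2:]
    to_tuple = lambda v: tuple(int(p) for p in v.split("."))
    if op == "==":
        return to_tuple(installed) == to_tuple(required)
    if op == ">=":
        return to_tuple(installed) >= to_tuple(required)
    raise ValueError("unsupported specifier: " + spec)

# pandas 0.19.2 fails the old pin but passes the new floor:
print(satisfies("0.19.2", "==0.19.0"))  # False
print(satisfies("0.19.2", ">=0.19.1"))  # True
```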