8 changes: 4 additions & 4 deletions .travis.yml
@@ -18,11 +18,11 @@ cache:

env:
matrix:
- - SCALA_BINARY_VERSION=2.11.8 SPARK_VERSION=2.3.1 SPARK_BUILD="spark-2.3.1-bin-hadoop2.7"
-   SPARK_BUILD_URL="https://dist.apache.org/repos/dist/release/spark/spark-2.3.1/spark-2.3.1-bin-hadoop2.7.tgz"
+ - SCALA_BINARY_VERSION=2.11.8 SPARK_VERSION=2.4.0 SPARK_BUILD="spark-2.4.0-bin-hadoop2.7"
+   SPARK_BUILD_URL="https://dist.apache.org/repos/dist/release/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz"
    PYTHON_VERSION=2.7.13
- - SCALA_BINARY_VERSION=2.11.8 SPARK_VERSION=2.3.1 SPARK_BUILD="spark-2.3.1-bin-hadoop2.7"
-   SPARK_BUILD_URL="https://dist.apache.org/repos/dist/release/spark/spark-2.3.1/spark-2.3.1-bin-hadoop2.7.tgz"
+ - SCALA_BINARY_VERSION=2.11.8 SPARK_VERSION=2.4.0 SPARK_BUILD="spark-2.4.0-bin-hadoop2.7"
+   SPARK_BUILD_URL="https://dist.apache.org/repos/dist/release/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz"
    PYTHON_VERSION=3.6.2

before_install:
16 changes: 8 additions & 8 deletions README.md
@@ -18,7 +18,7 @@ Supported platforms:

See the file `project/Dependencies.scala` for adding your own platform.

- Officially TensorFrames supports Spark 2.3+ and Scala 2.11.
+ Officially TensorFrames supports Spark 2.4+ and Scala 2.11.

See the [user guide](https://github.com/databricks/tensorframes/wiki/TensorFrames-user-guide) for
extensive information about the API.
@@ -30,11 +30,11 @@ TensorFrames is available as a

## Requirements

- - A working version of Apache Spark (2.3 or greater)
+ - A working version of Apache Spark (2.4 or greater)

- Java 8+

- - (Optional) python 2.7+/3.4+ if you want to use the python interface.
+ - (Optional) python 2.7+/3.6+ if you want to use the python interface.

- (Optional) the python TensorFlow package if you want to use the python interface. See the
[official instructions](https://www.tensorflow.org/install/)
@@ -54,7 +54,7 @@ Additionally, for development, you need the following dependencies:
Assuming that `SPARK_HOME` is set, you can use PySpark like any other Spark package.

```bash
- $SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.5.0-s_2.11
+ $SPARK_HOME/bin/pyspark --packages databricks:tensorframes:0.6.0-s_2.11
```

Here is a small program that uses TensorFlow to add 3 to an existing column.
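(The program itself is collapsed in this diff view.) For reference, a minimal sketch of such an "add 3" program, assuming the `tensorframes` Python package is importable and an active `SparkSession` named `spark`; the exact listing in the README may differ:

```python
import tensorflow as tf
import tensorframes as tfs
from pyspark.sql import Row

# A one-column DataFrame of doubles (assumes an active SparkSession named `spark`).
df = spark.createDataFrame([Row(x=float(x)) for x in range(10)])

with tf.Graph().as_default():
    # Placeholder bound to column 'x'; its shape is inferred from the DataFrame.
    x = tfs.block(df, "x")
    # The output tensor: x + 3, exposed as column 'z'.
    z = tf.add(x, 3, name="z")
    # Append the computed column to the DataFrame.
    df2 = tfs.map_blocks(z, df)

# Like most DataFrame operations, the transform is lazy; collect() triggers it.
df2.collect()
```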
@@ -151,7 +151,7 @@ The scala support is a bit more limited than python. In scala, operations can be
You simply use the published package:

```bash
- $SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.5.0-s_2.11
+ $SPARK_HOME/bin/spark-shell --packages databricks:tensorframes:0.6.0-s_2.11
```

Here is the same program as before:
@@ -204,14 +204,14 @@ build/sbt distribution/spDist
Assuming that SPARK_HOME is set and that you are in the root directory of the project:

```bash
- $SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.5.1-SNAPSHOT.jar
+ $SPARK_HOME/bin/spark-shell --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.6.1-SNAPSHOT.jar
```

If you want to run the python version:

```bash
- PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.5.1-SNAPSHOT.jar \
- $SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.5.1-SNAPSHOT.jar
+ PYTHONPATH=$PWD/target/testing/scala-2.11/tensorframes-assembly-0.6.1-SNAPSHOT.jar \
+ $SPARK_HOME/bin/pyspark --jars $PWD/target/testing/scala-2.11/tensorframes-assembly-0.6.1-SNAPSHOT.jar
```

## Acknowledgements
4 changes: 2 additions & 2 deletions project/Dependencies.scala
@@ -1,5 +1,5 @@
object Dependencies {
// The spark version
val targetSparkVersion = sys.props.getOrElse("spark.version", "2.3.1")
val targetTensorFlowVersion = "1.10.0"
val targetSparkVersion = sys.props.getOrElse("spark.version", "2.4.0")
val targetTensorFlowVersion = "1.12.0"
}
2 changes: 1 addition & 1 deletion python/requirements.txt
@@ -3,4 +3,4 @@ nose>=1.3.3
pandas>=0.19.1
# The proto files under src/main/protobuf must be in sync with the TF version here.
# You can use update-tf-proto.sh under dev/ to update the files.
- tensorflow==1.10.0
+ tensorflow==1.12.0
5 changes: 5 additions & 0 deletions src/main/protobuf/tensorflow/core/framework/step_stats.proto
@@ -67,6 +67,11 @@ message NodeExecStats {
uint32 thread_id = 10;
repeated AllocationDescription referenced_tensor = 11;
MemoryStats memory_stats = 12;
+   int64 all_start_nanos = 13;
+   int64 op_start_rel_nanos = 14;
+   int64 op_end_rel_nanos = 15;
+   int64 all_end_rel_nanos = 16;
+   int64 scheduled_nanos = 17;
};

message DeviceStepStats {
2 changes: 1 addition & 1 deletion version.sbt
@@ -1 +1 @@
- version in ThisBuild := "0.5.1-SNAPSHOT"
+ version in ThisBuild := "0.6.0-SNAPSHOT"