Commit: release v0.15
LucaCanali committed Aug 8, 2019
1 parent 4c0dc23 commit 43f9add
Showing 2 changed files with 3 additions and 3 deletions.
README.md: 4 changes (2 additions, 2 deletions)
@@ -9,7 +9,7 @@ working example of how to use Spark Listeners for collecting Spark task metrics
* Main author and contact: Luca.Canali@cern.ch + credits to Viktor.Khristenko@cern.ch + thanks to PR contributors
* Compatibility: use with Spark 2.1.x and higher. Compiles with Scala 2.11 and 2.12
* How to use: deploy [sparkMeasure from Maven Central](https://mvnrepository.com/artifact/ch.cern.sparkmeasure/spark-measure)
-  - Example: `bin/spark-shell --packages ch.cern.sparkmeasure:spark-measure_2.11:0.14`
+  - Example: `bin/spark-shell --packages ch.cern.sparkmeasure:spark-measure_2.11:0.15`
- PySpark users: in addition, install the Python wrapper APIs: `pip install sparkmeasure`
- Bleeding edge: build from master using sbt: `sbt +package`
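The PySpark bullet in the hunk above can be sketched as follows. This is a hedged illustration only: it assumes the `sparkmeasure` pip package named in the diff, plus a live SparkSession `spark` obtained by launching PySpark with the JVM jar on the classpath (e.g. `bin/pyspark --packages ch.cern.sparkmeasure:spark-measure_2.11:0.15`), so it is not standalone-runnable.

```python
# Sketch, assuming a running SparkSession `spark` and the pip-installed
# Python wrapper (`pip install sparkmeasure`, as in the README bullet above).
from sparkmeasure import StageMetrics

stagemetrics = StageMetrics(spark)
# runandmeasure executes the code string in the given namespace and
# collects stage-level task metrics for the actions it triggers
stagemetrics.runandmeasure(
    globals(),
    'spark.sql("select count(*) from range(1000) cross join range(1000)").show()',
)
```

The Python wrapper only drives the Scala instrumentation, which is why both the pip package and the `--packages` jar are needed.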

@@ -60,7 +60,7 @@ A list of [docs](docs) and [examples](examples):

- CLI: Scala REPL/spark-shell
```
-bin/spark-shell --packages ch.cern.sparkmeasure:spark-measure_2.11:0.14
+bin/spark-shell --packages ch.cern.sparkmeasure:spark-measure_2.11:0.15
val stageMetrics = ch.cern.sparkmeasure.StageMetrics(spark)
stageMetrics.runAndMeasure(spark.sql("select count(*) from range(1000) cross join range(1000) cross join range(1000)").show())
```
build.sbt: 2 changes (1 addition, 1 deletion)
@@ -1,6 +1,6 @@
name := "spark-measure"

-version := "0.15-SNAPSHOT"
+version := "0.15"

scalaVersion := "2.11.12"
crossScalaVersions := Seq("2.11.12", "2.12.8")
