version bump to 0.1.6
menishmueli committed Mar 25, 2024
1 parent c43c870 commit 61cd317
Showing 4 changed files with 8 additions and 8 deletions.
8 changes: 4 additions & 4 deletions README.md
@@ -47,7 +47,7 @@ See [Our Features](https://dataflint.gitbook.io/dataflint-for-spark/overview/our

Install DataFlint via sbt:
```sbt
libraryDependencies += "io.dataflint" %% "spark" % "0.1.5"
libraryDependencies += "io.dataflint" %% "spark" % "0.1.6"
```

Then instruct spark to load the DataFlint plugin:
@@ -65,7 +65,7 @@ Add these 2 configs to your pyspark session builder:
```python
builder = pyspark.sql.SparkSession.builder
...
.config("spark.jars.packages", "io.dataflint:spark_2.12:0.1.5") \
.config("spark.jars.packages", "io.dataflint:spark_2.12:0.1.6") \
.config("spark.plugins", "io.dataflint.spark.SparkDataflintPlugin") \
...
```
@@ -76,7 +76,7 @@ Alternatively, install DataFlint with **no code change** as a spark ivy package

```bash
spark-submit
--packages io.dataflint:spark_2.12:0.1.5 \
--packages io.dataflint:spark_2.12:0.1.6 \
--conf spark.plugins=io.dataflint.spark.SparkDataflintPlugin \
...
```
@@ -89,7 +89,7 @@ After the installations you will see a "DataFlint" button in Spark UI, click on

### Additional installation options

* There is also support for scala 2.13, if your spark cluster is using scala 2.13 change package name to io.dataflint:spark_**2.13**:0.1.5
* There is also support for scala 2.13, if your spark cluster is using scala 2.13 change package name to io.dataflint:spark_**2.13**:0.1.6
* For more installation options, including for **python** and **k8s spark-operator**, see [Install on Spark docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-spark)
* For installing DataFlint in **spark history server** for observability on completed runs see [install on spark history server docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-spark-history-server)
* For installing DataFlint on **DataBricks** see [install on databricks docs](https://dataflint.gitbook.io/dataflint-for-spark/getting-started/install-on-databricks)
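As an aside on the bullets above (not part of this commit's diff): the package coordinate follows a fixed pattern of Scala binary version plus DataFlint version, which is why the 2.13 variant only swaps the `spark_2.12` suffix. A minimal Python sketch, with the helper name being our own illustration:

```python
# Illustrative helper (not from the DataFlint codebase): assembles the
# coordinate passed to --packages / spark.jars.packages.
def dataflint_coordinate(scala_binary_version: str, dataflint_version: str) -> str:
    """Build a coordinate like io.dataflint:spark_2.12:0.1.6."""
    return f"io.dataflint:spark_{scala_binary_version}:{dataflint_version}"

print(dataflint_coordinate("2.12", "0.1.6"))  # io.dataflint:spark_2.12:0.1.6
print(dataflint_coordinate("2.13", "0.1.6"))  # io.dataflint:spark_2.13:0.1.6
```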
2 changes: 1 addition & 1 deletion spark-plugin/build.sbt
@@ -1,6 +1,6 @@
import xerial.sbt.Sonatype._

lazy val versionNum: String = "0.1.5"
lazy val versionNum: String = "0.1.6"
lazy val scala212 = "2.12.18"
lazy val scala213 = "2.13.12"
lazy val supportedScalaVersions = List(scala212, scala213)
4 changes: 2 additions & 2 deletions spark-ui/package-lock.json

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion spark-ui/package.json
@@ -1,6 +1,6 @@
{
"name": "dataflint-ui",
"version": "0.1.5",
"version": "0.1.6",
"homepage": "./",
"private": true,
"dependencies": {
