update readme (#73)
sadikovi committed Mar 17, 2017
1 parent be4c1e7 commit c107705
Showing 1 changed file with 6 additions and 4 deletions.
README.md: 10 changes (6 additions & 4 deletions)
@@ -52,8 +52,8 @@ Currently only these types are supported for indexed columns:
- Certain Spark versions are supported (see table below)

## Requirements
-| Spark version | `parquet-index` latest version |
-|---------------|--------------------------------|
+| Spark version | parquet-index latest version |
+|---------------|------------------------------|
| 1.6.x | Not supported |
| 2.0.0 | [0.2.2](http://spark-packages.org/package/lightcopy/parquet-index) |
| 2.0.1 | [0.2.2](http://spark-packages.org/package/lightcopy/parquet-index) |
@@ -197,14 +197,16 @@ context.index.delete.parquet('table.parquet')

### Persistent tables API
Package also supports index for persistent tables that are saved using `saveAsTable()` in Parquet
-format and accessible using `spark.table(tableName)`. API is available in Scala, Java and Python.
+format and accessible using `spark.table(tableName)`. When using with persistent tables, just
+replace `.parquet(path_to_the_file)` with `.table(table_name)`. API is available in Scala, Java and
+Python.

#### Scala
```scala
import com.github.lightcopy.implicits._

// Create index for table name that exists in Spark catalog
-spark.index.create.indexByAll("col1", "col2", "col3").table("table_name")
+spark.index.create.indexBy("col1", "col2", "col3").table("table_name")

// Check if index exists for persistent table
val exists: Boolean = spark.index.exists.table("table_name")
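// --- Illustrative sketch, not part of this commit's diff ---
// The reworded paragraph above says to replace `.parquet(path_to_the_file)` with
// `.table(table_name)` when working with persistent tables. A minimal end-to-end sketch,
// assuming a running SparkSession `spark` with the package on the classpath; the table
// name, column names and file path are placeholders (the path must point to an existing
// Parquet file for the file-based call to succeed).
import org.apache.spark.sql.SaveMode
import com.github.lightcopy.implicits._

// Save a DataFrame as a persistent Parquet table registered in the Spark catalog
val df = spark.range(0, 100).selectExpr("id as col1", "id as col2", "id as col3")
df.write.mode(SaveMode.Overwrite).format("parquet").saveAsTable("table_name")

// File-based call: index a Parquet file by its path
spark.index.create.indexBy("col1", "col2", "col3").parquet("/path/to/table.parquet")

// Persistent-table call: the same builder, with `.parquet(path)` replaced by `.table(name)`
spark.index.create.indexBy("col1", "col2", "col3").table("table_name")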
