Link to spark-connector Slack channel at DataStax Academy Slack #1114

Merged · 1 commit · May 18, 2017
37 changes: 21 additions & 16 deletions README.md
@@ -1,6 +1,7 @@
# Spark Cassandra Connector [![Build Status](https://travis-ci.org/datastax/spark-cassandra-connector.svg)](http://travis-ci.org/datastax/spark-cassandra-connector)
### [Spark Cassandra Connector Spark Packages Website](http://spark-packages.org/package/datastax/spark-cassandra-connector)
Chat with us at [DataStax Academy #spark-cassandra-connector](#datastax-academy)

Chat with us at [DataStax Academy's #spark-connector Slack channel](#Slack)

### Most Recent Release Scala Docs

@@ -29,7 +30,7 @@ execute arbitrary CQL queries in your Spark applications.
- Partition RDDs according to Cassandra replication using the `repartitionByCassandraReplica` call (see the sketch after this list)
- Converts data types between Cassandra and Scala
- Supports all Cassandra data types including collections
- Filters rows on the server side via the CQL `WHERE` clause
- Allows for execution of arbitrary CQL statements
- Plays nice with Cassandra Virtual Nodes
- Works with PySpark DataFrames
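
A minimal sketch of two of the features above, assuming a hypothetical keyspace `test` and table `kv` with a text partition key `key` (the connection host is also an assumption):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._

val conf = new SparkConf()
  .setAppName("connector-feature-sketch")
  .set("spark.cassandra.connection.host", "127.0.0.1") // assumption: local Cassandra node
val sc = new SparkContext(conf)

// Server-side filtering: the predicate is pushed down to Cassandra as a CQL WHERE clause.
val filtered = sc.cassandraTable("test", "kv").where("key = ?", "some-key")

// Replica-aware repartitioning: move each key to a Spark partition on a node that
// replicates it, then join locally against the Cassandra table.
case class KvKey(key: String)
val keys = sc.parallelize(Seq(KvKey("some-key"), KvKey("another-key")))
val colocated = keys.repartitionByCassandraReplica("test", "kv")
val joined = colocated.joinWithCassandraTable("test", "kv")
```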
@@ -81,7 +82,7 @@ API documentation for the Scala and Java interfaces is available online:
* [Spark-Cassandra-Connector-Java](http://datastax.github.io/spark-cassandra-connector/ApiDocs/1.3.1/spark-cassandra-connector-java/)
* [Embedded-Cassandra](http://datastax.github.io/spark-cassandra-connector/ApiDocs/1.3.1/spark-cassandra-connector-embedded/)

### 1.2.0
* [Spark-Cassandra-Connector](http://datastax.github.io/spark-cassandra-connector/ApiDocs/1.2.0/spark-cassandra-connector/)
* [Spark-Cassandra-Connector-Java](http://datastax.github.io/spark-cassandra-connector/ApiDocs/1.2.0/spark-cassandra-connector-java/)
* [Embedded-Cassandra](http://datastax.github.io/spark-cassandra-connector/ApiDocs/1.2.0/spark-cassandra-connector-embedded/)
@@ -91,17 +92,17 @@ This project is available on Spark Packages; this is the easiest way to start using it.
http://spark-packages.org/package/datastax/spark-cassandra-connector

This project has also been published to the Maven Central Repository.
For SBT to download the connector binaries, sources and javadoc, put this in your project
SBT config:

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0"

* The default Scala version for Spark 2.0+ is 2.11; please choose the appropriate build. See the
[FAQ](doc/FAQ.md) for more information.
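
For example, a minimal `build.sbt` along these lines (the Scala and Spark versions shown are assumptions; match them to your cluster):

```scala
// build.sbt -- minimal sketch; the versions below are assumptions, not requirements.
name := "spark-cassandra-demo"

scalaVersion := "2.11.8" // Spark 2.0+ distributions are built against Scala 2.11 by default

libraryDependencies ++= Seq(
  "org.apache.spark"   %% "spark-core"                % "2.0.2" % "provided",
  "org.apache.spark"   %% "spark-sql"                 % "2.0.2" % "provided",
  "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0"
)
```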

## Building
See [Building And Artifacts](doc/12_building_and_artifacts.md)

## Documentation

- [Quick-start guide](doc/0_quick_start.md)
@@ -125,25 +126,29 @@ See [Building And Artifacts](doc/12_building_and_artifacts.md)
- [Tips for Developing the Spark Cassandra Connector](doc/developers.md)

## Online Training

### DataStax Academy

DataStax Academy provides free online training for Apache Cassandra and DataStax Enterprise. In [DS320: Analytics with Spark](https://academy.datastax.com/courses/ds320-analytics-with-apache-spark), you will learn how to effectively and efficiently solve analytical problems with Apache Spark, Apache Cassandra, and DataStax Enterprise. You will learn about the Spark API, the Spark Cassandra Connector, Spark SQL, Spark Streaming, and crucial performance optimization techniques.

## Community

### Reporting Bugs

New issues may be reported using [JIRA](https://datastax-oss.atlassian.net/browse/SPARKC/). Please include
all relevant details including versions of Spark, Spark Cassandra Connector, Cassandra and/or DSE. A minimal
reproducible case with sample code is ideal.

### Mailing List

Questions and requests for help may be submitted to the [user mailing list](http://groups.google.com/a/lists.datastax.com/forum/#!forum/spark-connector-user).

### Gitter
DataStax is consolidating its chat resources into Slack at [DataStax Academy](#datastax-academy).
### Slack

The Gitter room will be shut down in the near future.
[![Join the chat at https://gitter.im/datastax/spark-cassandra-connector](https://badges.gitter.im/datastax/spark-cassandra-connector.svg)](https://gitter.im/datastax/spark-cassandra-connector?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
The project uses Slack to facilitate conversation in our community. Find us in the `#spark-connector` channel at [DataStax Academy Slack](https://academy.datastax.com/slack).

### IRC

\#spark-cassandra-connector on irc.freenode.net. If you are new to IRC, you can use a [web-based client](http://webchat.freenode.net/?channels=#spark-cassandra-connector).

## Contributing
@@ -174,20 +179,20 @@ To run unit and integration tests:
By default, integration tests start up a separate, single Cassandra instance and run Spark in local mode.
It is possible to run integration tests with your own Cassandra and/or Spark cluster.
First, prepare a jar with testing code:

./sbt/sbt test:package

Then copy the generated test jar to your Spark nodes and run:

export IT_TEST_CASSANDRA_HOST=<IP of one of the Cassandra nodes>
export IT_TEST_SPARK_MASTER=<Spark Master URL>
./sbt/sbt it:test

## Generating Documents
To generate the Reference Document, use

./sbt/sbt spark-cassandra-connector-unshaded/run (outputLocation)

`outputLocation` defaults to `doc/reference.md`

## License