Commit 803a4fb

Removing datastax term
1 parent 2f40a0b commit 803a4fb

15 files changed: +26 -26 lines changed

articles/cosmos-db/cassandra-api-load-data.md

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ ms.reviewer: sngun

 # Load sample data into an Azure Cosmos DB Cassandra API table

-This tutorial shows how to load sample user data to a table in Azure Cosmos DB Cassandra API account by using a java application. The java application uses the [Datastax Java driver](https://github.com/datastax/java-driver) and loads user data such as user ID, user name, user city.
+This tutorial shows how to load sample user data to a table in Azure Cosmos DB Cassandra API account by using a java application. The java application uses the [Java driver](https://github.com/datastax/java-driver) and loads user data such as user ID, user name, user city.

 This tutorial covers the following tasks:
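
For orientation, a minimal sketch of the kind of loader this tutorial describes, written in the style of the 3.x Java driver. The host, credentials, keyspace `uprofile`, and table `user` are placeholders, not the tutorial's actual sample:

```scala
import com.datastax.driver.core.Cluster

object LoadUserSample {
  def main(args: Array[String]): Unit = {
    // Cosmos DB Cassandra API endpoints listen on port 10350 and require SSL.
    val cluster = Cluster.builder()
      .addContactPoint("YOUR_ACCOUNT.cassandra.cosmosdb.azure.com")
      .withPort(10350)
      .withCredentials("YOUR_ACCOUNT", "YOUR_PRIMARY_KEY")
      .withSSL()
      .build()
    val session = cluster.connect()

    // Assumes the keyspace/table already exist, for example:
    //   CREATE TABLE uprofile.user (user_id int PRIMARY KEY, user_name text, user_bcity text)
    session.execute("INSERT INTO uprofile.user (user_id, user_name, user_bcity) VALUES (1, 'Lyudmila', 'Seattle')")

    session.close()
    cluster.close()
  }
}
```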

articles/cosmos-db/cassandra-api-query-data.md

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ ms.date: 09/24/2018

 # Query data from an Azure Cosmos DB Cassandra API account

-This tutorial shows how to query user data from Azure Cosmos DB Cassandra API account by using a Java application. The Java application uses the [Datastax Java driver](https://github.com/datastax/java-driver) and queries user data such as user ID, user name, user city.
+This tutorial shows how to query user data from Azure Cosmos DB Cassandra API account by using a Java application. The Java application uses the [Java driver](https://github.com/datastax/java-driver) and queries user data such as user ID, user name, user city.

 This tutorial covers the following tasks:
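
A minimal query sketch with the same Java driver `Session` used for loading; the keyspace and table names remain placeholders:

```scala
import com.datastax.driver.core.Session
import scala.collection.JavaConverters._

def printUsers(session: Session): Unit = {
  val rows = session.execute("SELECT user_id, user_name, user_bcity FROM uprofile.user")
  for (row <- rows.asScala) {
    println(s"${row.getInt("user_id")}, ${row.getString("user_name")}, ${row.getString("user_bcity")}")
  }
}
```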

articles/cosmos-db/cassandra-introduction.md

Lines changed: 1 addition & 1 deletion
@@ -16,7 +16,7 @@ ms.reviewer: sngun

 # Introduction to the Azure Cosmos DB Cassandra API

-Azure Cosmos DB Cassandra API can be used as the data store for apps written for [Apache Cassandra](https://cassandra.apache.org/) and DataStax. This means that by using existing [Apache drivers](https://cassandra.apache.org/doc/latest/getting_started/drivers.html?highlight=driver) compliant with CQLv4, your existing Cassandra application can now communicate with the Azure Cosmos DB Cassandra API. In many cases, you can switch from using Apache Cassandra or DataStax to using Azure Cosmos DB 's Cassandra API, by just changing a connection string.
+Azure Cosmos DB Cassandra API can be used as the data store for apps written for [Apache Cassandra](https://cassandra.apache.org/). This means that by using existing [Apache drivers](https://cassandra.apache.org/doc/latest/getting_started/drivers.html?highlight=driver) compliant with CQLv4, your existing Cassandra application can now communicate with the Azure Cosmos DB Cassandra API. In many cases, you can switch from using Apache Cassandra to using Azure Cosmos DB 's Cassandra API, by just changing a connection string.

 The Cassandra API enables you to interact with data stored in Azure Cosmos DB using the Cassandra Query Language (CQL) , Cassandra-based tools (like cqlsh) and Cassandra client drivers that you’re already familiar with.

articles/cosmos-db/cassandra-spark-aggregation-ops.md

Lines changed: 1 addition & 1 deletion
@@ -25,7 +25,7 @@ This article describes basic aggregation operations against Azure Cosmos DB Cass

 ```scala
 import org.apache.spark.sql.cassandra._
-//datastax Spark connector
+//Spark connector
 import com.datastax.spark.connector._
 import com.datastax.spark.connector.cql.CassandraConnector
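
// A minimal aggregation sketch using these imports. It assumes an active Spark
// session (`spark`), Cassandra API connection settings already present in the Spark
// configuration, and a hypothetical keyspace "books_ks" with a table "books".
import org.apache.spark.sql.functions.{avg, count}

val booksDf = spark.read
  .format("org.apache.spark.sql.cassandra")
  .options(Map("keyspace" -> "books_ks", "table" -> "books"))
  .load()

booksDf.agg(count("book_id"), avg("book_price")).show()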

articles/cosmos-db/cassandra-spark-create-ops.md

Lines changed: 1 addition & 1 deletion
@@ -21,7 +21,7 @@ This article describes how to insert sample data into a table in Azure Cosmos DB

 ```scala
 import org.apache.spark.sql.cassandra._
-//datastax Spark connector
+//Spark connector
 import com.datastax.spark.connector._
 import com.datastax.spark.connector.cql.CassandraConnector
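
// A minimal insert sketch using these imports, in a notebook/REPL where `spark` is
// available. It assumes connection settings are in the Spark configuration and that
// the (hypothetical) target table already exists, for example:
//   CREATE TABLE books_ks.books (book_id text PRIMARY KEY, book_name text, book_price double)
import spark.implicits._

val newBooks = Seq(
  ("b00001", "A Study in Scarlet", 12.25),
  ("b00023", "The Sign of Four", 11.15)
).toDF("book_id", "book_name", "book_price")

newBooks.write
  .format("org.apache.spark.sql.cassandra")
  .mode("append") // append when writing into an existing Cassandra table
  .options(Map("keyspace" -> "books_ks", "table" -> "books"))
  .save()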

articles/cosmos-db/cassandra-spark-databricks.md

Lines changed: 5 additions & 5 deletions
@@ -29,9 +29,9 @@ This article details how to workwith Azure Cosmos DB Cassandra API from Spark on

 * [Use cqlsh for validation if you so prefer](cassandra-spark-generic.md#connecting-to-azure-cosmos-db-cassandra-api-from-spark)

-* **Cassandra API instance configuration for Datastax Cassandra connector:**
+* **Cassandra API instance configuration for Cassandra connector:**

-The Datastax connector for Cassandra requires the Cassandra connection details to be initialized as part of the spark context. When you launch a Databricks notebook, the spark context is already initialized and it is not advisable to stop and reinitialize it. One solution is to add the Cassandra API instance configuration at a cluster level, in the cluster spark configuration. This is a one-time activity per cluster. Add the following code to the Spark configuration as a space separated key value pair:
+The connector for Cassandra API requires the Cassandra connection details to be initialized as part of the spark context. When you launch a Databricks notebook, the spark context is already initialized and it is not advisable to stop and reinitialize it. One solution is to add the Cassandra API instance configuration at a cluster level, in the cluster spark configuration. This is a one-time activity per cluster. Add the following code to the Spark configuration as a space separated key value pair:

 ```scala
 spark.cassandra.connection.host YOUR_COSMOSDB_ACCOUNT_NAME.cassandra.cosmosdb.azure.com

@@ -43,11 +43,11 @@ This article details how to workwith Azure Cosmos DB Cassandra API from Spark on

 ## Add the required dependencies

-* **Datastax Cassandra Spark connector:** - To integrate with Azure Cosmos DB Cassandra API with Spark, the Datastax Cassandra connector should be attached to the Azure Databricks cluster. To attach the cluster:
+* **Cassandra Spark connector:** - To integrate Azure Cosmos DB Cassandra API with Spark, the Cassandra connector should be attached to the Azure Databricks cluster. To attach the cluster:

-* Review the Databricks runtime version, the Spark version. Then find the [maven coordinates](https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector) that are compatible with the Datastax Cassandra Spark connector, and attach it to the cluster. See ["Upload a Maven package or Spark package"](https://docs.databricks.com/user-guide/libraries.html) article to attach the connector library to the cluster. For example, maven coordinate for "Databricks Runtime version 4.3", "Spark 2.3.1", and "Scala 2.11" is `spark-cassandra-connector_2.11-2.3.1`
+* Review the Databricks runtime version, the Spark version. Then find the [maven coordinates](https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector) that are compatible with the Cassandra Spark connector, and attach it to the cluster. See ["Upload a Maven package or Spark package"](https://docs.databricks.com/user-guide/libraries.html) article to attach the connector library to the cluster. For example, maven coordinate for "Databricks Runtime version 4.3", "Spark 2.3.1", and "Scala 2.11" is `spark-cassandra-connector_2.11-2.3.1`

-* **Azure Cosmos DB Cassandra API-specific library:** - A custom connection factory is required to configure the retry policy from the Datastax Spark connector to Azure Cosmos DB Cassandra API. Add the `com.microsoft.azure.cosmosdb:azure-cosmos-cassandra-spark-helper:1.0.0`[maven coordinates](https://search.maven.org/artifact/com.microsoft.azure.cosmosdb/azure-cosmos-cassandra-spark-helper/1.0.0/jar) to attach the library to the cluster.
+* **Azure Cosmos DB Cassandra API-specific library:** - A custom connection factory is required to configure the retry policy from the Cassandra Spark connector to Azure Cosmos DB Cassandra API. Add the `com.microsoft.azure.cosmosdb:azure-cosmos-cassandra-spark-helper:1.0.0`[maven coordinates](https://search.maven.org/artifact/com.microsoft.azure.cosmosdb/azure-cosmos-cassandra-spark-helper/1.0.0/jar) to attach the library to the cluster.
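
For reference, the cluster-level Spark configuration described above typically amounts to a handful of space-separated key value pairs such as the following; the account name and key are placeholders, and the ssl/auth entries are standard Spark Cassandra connector settings:

```
spark.cassandra.connection.host YOUR_COSMOSDB_ACCOUNT_NAME.cassandra.cosmosdb.azure.com
spark.cassandra.connection.port 10350
spark.cassandra.connection.ssl.enabled true
spark.cassandra.auth.username YOUR_COSMOSDB_ACCOUNT_NAME
spark.cassandra.auth.password YOUR_COSMOSDB_KEY
```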

 ## Sample notebooks

articles/cosmos-db/cassandra-spark-ddl-ops.md

Lines changed: 1 addition & 1 deletion
@@ -22,7 +22,7 @@ This article details keyspace and table DDL operations against Azure Cosmos DB C
 ```scala
 import org.apache.spark.sql.cassandra._

-//datastax Spark connector
+//Spark connector
 import com.datastax.spark.connector._
 import com.datastax.spark.connector.cql.CassandraConnector
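
// A minimal DDL sketch using these imports. It assumes `sc` is the active SparkContext
// and that connection settings are already in the Spark configuration; the keyspace,
// table, and schema below are placeholders.
CassandraConnector(sc.getConf).withSessionDo { session =>
  session.execute("CREATE KEYSPACE IF NOT EXISTS books_ks WITH REPLICATION = {'class': 'SimpleStrategy', 'replication_factor': 1}")
  session.execute("CREATE TABLE IF NOT EXISTS books_ks.books (book_id text PRIMARY KEY, book_name text, book_price double)")
}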

articles/cosmos-db/cassandra-spark-delete-ops.md

Lines changed: 1 addition & 1 deletion
@@ -21,7 +21,7 @@ This article describes how to delete data in Azure Cosmos DB Cassandra API table

 ```scala
 import org.apache.spark.sql.cassandra._
-//datastax Spark connector
+//Spark connector
 import com.datastax.spark.connector._
 import com.datastax.spark.connector.cql.CassandraConnector
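
// A minimal delete sketch using these imports: remove a single row by primary key with
// a CQL statement. Assumes `sc` is the active SparkContext, connection settings are in
// the Spark configuration, and the keyspace/table/key values are placeholders.
CassandraConnector(sc.getConf).withSessionDo { session =>
  session.execute("DELETE FROM books_ks.books WHERE book_id = 'b00001'")
}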

articles/cosmos-db/cassandra-spark-generic.md

Lines changed: 6 additions & 6 deletions
@@ -24,11 +24,11 @@ This article is one among a series of articles on Azure Cosmos DB Cassandra API
 * Provision your choice of Spark environment [[Azure Databricks](https://docs.microsoft.com/azure/azure-databricks/quickstart-create-databricks-workspace-portal) | [Azure HDInsight-Spark](https://docs.microsoft.com/azure/hdinsight/spark/apache-spark-jupyter-spark-sql) | Others].

 ## Dependencies for connectivity
-* **Datastax Spark connector for Cassandra:**
-Datastax Spark connector is used to connect to Azure Cosmos DB Cassandra API. Identify and use the version of the connector located in [Maven central]( https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector) that is compatible with the Spark and Scala versions of your Spark environment.
+* **Spark connector for Cassandra:**
+Spark connector is used to connect to Azure Cosmos DB Cassandra API. Identify and use the version of the connector located in [Maven central]( https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector) that is compatible with the Spark and Scala versions of your Spark environment.

 * **Azure Cosmos DB helper library for Cassandra API:**
-In addition to the Datastax connector, you need another library called [azure-cosmos-cassandra-spark-helper]( https://search.maven.org/artifact/com.microsoft.azure.cosmosdb/azure-cosmos-cassandra-spark-helper/1.0.0/jar) from Azure Cosmos DB. This library contains a connection factory and a custom retry policy classes.
+In addition to the Spark connector, you need another library called [azure-cosmos-cassandra-spark-helper]( https://search.maven.org/artifact/com.microsoft.azure.cosmosdb/azure-cosmos-cassandra-spark-helper/1.0.0/jar) from Azure Cosmos DB. This library contains a connection factory and a custom retry policy classes.

 The retry policy in Azure Cosmos DB is configured to handle HTTP status code 429("Request Rate Large") exceptions. The Azure Cosmos DB Cassandra API translates these exceptions into overloaded errors on the Cassandra native protocol, and you can retry with back-offs. Because Azure Cosmos DB uses provisioned throughput model, request rate limiting exceptions occur when the ingress/egress rates increase. The retry policy protects your spark jobs against data spikes that momentarily exceed the throughput allocated for your collection.

@@ -39,7 +39,7 @@ This article is one among a series of articles on Azure Cosmos DB Cassandra API

 ## Spark connector throughput configuration parameters

-The following table lists Azure Cosmos DB Cassandra API-specific throughput configuration parameters provided by the connector. For a detailed list of all configuration parameters, see [configuration reference](https://github.com/datastax/spark-cassandra-connector/blob/master/doc/reference.md) page of the DataStax Spark Cassandra Connector GitHub repository.
+The following table lists Azure Cosmos DB Cassandra API-specific throughput configuration parameters provided by the connector. For a detailed list of all configuration parameters, see [configuration reference](https://github.com/datastax/spark-cassandra-connector/blob/master/doc/reference.md) page of the Spark Cassandra Connector GitHub repository.

 | **Property Name** | **Default value** | **Description** |
 |---------|---------|---------|

@@ -79,14 +79,14 @@ While the sections above were specific to Azure Spark-based PaaS services, this

 #### Connector dependencies:

-1. Add the maven coordinates for the [Datastax Cassandra connector for Spark](cassandra-spark-generic.md#dependencies-for-connectivity)
+1. Add the maven coordinates to get the [Cassandra connector for Spark](cassandra-spark-generic.md#dependencies-for-connectivity)
 2. Add the maven coordinates for the [Azure Cosmos DB helper library](cassandra-spark-generic.md#dependencies-for-connectivity) for Cassandra API

 #### Imports:

 ```scala
 import org.apache.spark.sql.cassandra._
-//datastax Spark connector
+//Spark connector
 import com.datastax.spark.connector._
 import com.datastax.spark.connector.cql.CassandraConnector
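
// A sketch of supplying the connection and a couple of throughput-related connector
// settings programmatically when you build your own SparkSession. The account name,
// key, and tuning values are illustrative placeholders; see the configuration
// reference linked above for the full list of parameters.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("cosmosdb-cassandra-sample")
  .config("spark.cassandra.connection.host", "YOUR_COSMOSDB_ACCOUNT_NAME.cassandra.cosmosdb.azure.com")
  .config("spark.cassandra.connection.port", "10350")
  .config("spark.cassandra.connection.ssl.enabled", "true")
  .config("spark.cassandra.auth.username", "YOUR_COSMOSDB_ACCOUNT_NAME")
  .config("spark.cassandra.auth.password", "YOUR_COSMOSDB_KEY")
  .config("spark.cassandra.output.batch.size.rows", "1")      // rows per batch; small batches help stay under provisioned throughput
  .config("spark.cassandra.output.concurrent.writes", "100")  // batches executed in parallel by a single Spark task
  .getOrCreate()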

articles/cosmos-db/cassandra-spark-hdinsight.md

Lines changed: 2 additions & 2 deletions
@@ -29,7 +29,7 @@ This article covers how to access Azure Cosmos DB Cassandra API from Spark on YA

 * [Use cqlsh for validation if you so prefer](cassandra-spark-generic.md##connecting-to-azure-cosmos-db-cassandra-api-from-spark)

-* **Cassandra API configuration in Spark2** - The Datastax connector for Cassandra requires that the Cassandra connection details to be initialized as part of the Spark context. When you launch a Jupyter notebook, the spark session and context are already initialized and it is not advisable to stop and reinitialize the Spark context unless it's complete with every configuration set as part of the HDInsight default Jupyter notebook start-up. One workaround is to add the Cassandra instance details to Ambari, Spark2 service configuration directly. This is a one-time activity per cluster that requires a Spark2 service restart.
+* **Cassandra API configuration in Spark2** - The Spark connector for Cassandra requires that the Cassandra connection details to be initialized as part of the Spark context. When you launch a Jupyter notebook, the spark session and context are already initialized and it is not advisable to stop and reinitialize the Spark context unless it's complete with every configuration set as part of the HDInsight default Jupyter notebook start-up. One workaround is to add the Cassandra instance details to Ambari, Spark2 service configuration directly. This is a one-time activity per cluster that requires a Spark2 service restart.

 1. Go to Ambari, Spark2 service and click on configs

@@ -65,7 +65,7 @@ Spark shell is used for testing/exploration purposes.
 import org.apache.spark.sql.types.{StructType, StructField, StringType, IntegerType,LongType,FloatType,DoubleType, TimestampType}
 import org.apache.spark.sql.cassandra._

-//datastax Spark connector
+//Spark connector
 import com.datastax.spark.connector._
 import com.datastax.spark.connector.cql.CassandraConnector
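
// A quick smoke test from the Spark shell using these imports, assuming the Cassandra
// API connection details were added to the Spark2 configuration in Ambari as described
// above; the keyspace and table names are placeholders.
val rdd = sc.cassandraTable("books_ks", "books")
rdd.take(5).foreach(println)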
