
Add .NET Core 3.1 Support Warning #33061


Merged
merged 5 commits on Dec 15, 2022
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/broadcast-guide.md
@@ -3,7 +3,7 @@ title: Use broadcast variables in .NET for Apache Spark
description: Learn how to use broadcast variables in .NET for Apache Spark applications.
ms.author: nidutta
author: Niharikadutta
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -14,6 +14,8 @@ In this article, you learn how to use broadcast variables in .NET for Apache Spa

Because the data is sent only once, broadcast variables have performance benefits when compared to local variables that are shipped to the executors with each task. Refer to the [official broadcast variable documentation](https://spark.apache.org/docs/2.2.0/rdd-programming-guide.html#broadcast-variables) to get a deeper understanding of broadcast variables and why they are used.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Create broadcast variables

To create a broadcast variable, call `SparkContext.Broadcast(v)` for any variable `v`. The broadcast variable is a wrapper around the variable `v`, and its value can be accessed by calling the `Value()` method.
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/connect-to-azure-storage.md
@@ -3,7 +3,7 @@ title: Connect to Remote Storage from your local machine
description: Connect to Azure Storage Accounts using .NET for Apache Spark from your local machine.
ms.author: nidutta
author: Niharikadutta
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -12,6 +12,8 @@ ms.custom: mvc,how-to

In this article, you learn how to connect to an Azure Data Lake Storage (ADLS) Gen 2 or Windows Azure Storage Blob (WASB) account through an instance of [.NET for Apache Spark](https://github.com/dotnet/spark) running locally on your Windows machine.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Set up the environment

1. Download the Apache Spark distribution built without Hadoop from the [official website](https://archive.apache.org/dist/spark/) (choose a version [supported by .NET for Apache Spark](https://github.com/dotnet/spark#supported-apache-spark)), and extract it to a directory. Set the environment variable `SPARK_HOME` to this directory.
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/connect-to-event-hub.md
@@ -3,7 +3,7 @@ title: Connect .NET for Apache Spark to Azure Event Hubs
description: Learn how to connect to Azure Event Hub from local .NET for Apache Spark instance.
ms.author: nidutta
author: Niharikadutta
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -12,6 +12,8 @@ ms.custom: mvc,how-to

In this article, you will learn how to connect your [.NET for Apache Spark](https://github.com/dotnet/spark) application with Azure Event Hubs to read and write Apache Kafka streams.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

Have an Event Hubs namespace ready with an event hub. For a step-by-step guide, refer to [Quickstart: Create an event hub using Azure portal](/azure/event-hubs/event-hubs-create). Make sure to select the Standard pricing tier while creating the Event Hubs namespace.
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/connect-to-mongo-db.md
@@ -3,7 +3,7 @@ title: Connect .NET for Apache Spark to MongoDB
description: Learn how to connect to your MongoDB instance from your .NET for Apache Spark application.
ms.author: nidutta
author: Niharikadutta
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -12,6 +12,8 @@ ms.custom: mvc,how-to

In this article, you learn how to connect to a MongoDB instance from your .NET for Apache Spark application.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

- Have a MongoDB server up and running with a [database and some collection](https://docs.mongodb.com/manual/core/databases-and-collections/) added to it. (Download [this community server](https://www.mongodb.com/try/download/community) for a local server, or try [MongoDB Atlas](https://www.mongodb.com/cloud/atlas) for a cloud MongoDB service.)
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/connect-to-sql-server.md
@@ -3,7 +3,7 @@ title: Connect .NET for Apache Spark to SQL Server
description: Learn how to connect to a SQL Server instance from your .NET for Apache Spark application.
ms.author: nidutta
author: Niharikadutta
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -12,6 +12,8 @@ ms.custom: mvc,how-to

In this article, you learn how to connect to a SQL Server instance from your [.NET for Apache Spark](https://github.com/dotnet/spark) application.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Configure SQL Server to grant your application access

1. Add a login user and password choosing SQL Server authentication to your SQL Server instance.
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/databricks-deploy-methods.md
@@ -1,7 +1,7 @@
---
title: Submit a .NET for Apache Spark job to Databricks
description: Learn how to submit a .NET for Apache Spark job to Databricks using spark-submit and Set Jar.
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -10,6 +10,8 @@ ms.custom: mvc,how-to

You can run your .NET for Apache Spark jobs on Databricks clusters, but this capability is not available out of the box. There are two ways to deploy your .NET for Apache Spark job to Databricks: `spark-submit` and Set Jar.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Deploy using spark-submit

You can use the [spark-submit](https://spark.apache.org/docs/latest/submitting-applications.html) command to submit .NET for Apache Spark jobs to Databricks. `spark-submit` allows submission only to a cluster that gets created on-demand.
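As a sketch of what such a submission looks like (the jar version, archive name, and executable name below are placeholders, not values taken from these docs), the command can be assembled and reviewed before it's actually run:

```shell
# Sketch of a spark-submit invocation for a .NET for Apache Spark job.
# The jar version, archive name, and executable name are assumed
# placeholders -- substitute the values from your own build.
SPARK_DOTNET_JAR="microsoft-spark-3-2_2.12-2.1.1.jar"
APP_ARCHIVE="mySparkApp.zip"
APP_EXECUTABLE="mySparkApp"

CMD="spark-submit \
  --deploy-mode cluster \
  --class org.apache.spark.deploy.dotnet.DotnetRunner \
  --files $APP_ARCHIVE \
  $SPARK_DOTNET_JAR $APP_EXECUTABLE"

# Print the command for review instead of running it directly.
echo "$CMD"
```

`DotnetRunner` is the JVM entry point that launches the .NET worker; the published app archive and its entry-point executable name follow the jar as positional arguments.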
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/debug.md
@@ -1,7 +1,7 @@
---
title: Debug a .NET for Apache Spark application on Windows
description: Learn how to debug your .NET for Apache Spark application on Windows.
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -10,6 +10,8 @@ ms.custom: mvc,how-to

This how-to provides the steps to debug your .NET for Apache Spark application on Windows.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Debug your application

Open a new command prompt window and run the following command:
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/deploy-worker-udf-binaries.md
@@ -1,7 +1,7 @@
---
title: Deploy .NET for Apache Spark worker and user-defined function binaries
description: Learn how to deploy .NET for Apache Spark worker and user-defined function binaries.
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -10,6 +10,8 @@ ms.custom: mvc,how-to

This how-to provides general instructions on how to deploy .NET for Apache Spark worker and user-defined function binaries. You learn which environment variables to set, as well as some commonly used parameters for launching applications with `spark-submit`.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Configurations

Configurations show the general environment variables and parameters settings in order to deploy .NET for Apache Spark worker and user-defined function binaries.
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/dotnet-interactive-udf-issue.md
@@ -3,7 +3,7 @@ title: Write and call UDFs in .NET for Apache Spark interactive environments.
description: Learn how to write and call UDFs in .NET for Apache Spark interactive shells.
ms.author: nidutta
author: Niharikadutta
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -12,6 +12,8 @@ ms.custom: mvc,how-to

In this article, you will learn how to use user-defined functions (UDFs) in a .NET for Apache Spark interactive environment.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

1. Install [.NET Interactive](https://github.com/dotnet/interactive)
@@ -4,7 +4,7 @@ titleSuffix: .NET for Apache Spark
description: Use .NET for Apache Spark in interactive environments like Jupyter Notebook, Jupyter Lab, or Visual Studio Code (VS Code)
ms.author: luquinta
author: luisquintanilla
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc, how-to
---
@@ -13,6 +13,8 @@ ms.custom: mvc, how-to

In this article, you learn how to run .NET for Apache Spark jobs interactively in Jupyter Notebook and Visual Studio Code (VS Code) with .NET Interactive.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## About Jupyter

[Jupyter](https://jupyter.org/) is an open-source, cross-platform computing environment that provides a way for users to prototype and develop applications interactively. You can interact with Jupyter through a wide variety of interfaces such as Jupyter Notebook, Jupyter Lab, and VS Code.
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/hdinsight-deploy-methods.md
@@ -1,7 +1,7 @@
---
title: Submit a .NET for Apache Spark job to Azure HDInsight
description: Learn how to submit a .NET for Apache Spark job to Azure HDInsight using spark-submit and Apache Livy.
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -10,6 +10,8 @@ ms.custom: mvc,how-to

There are two ways to deploy your .NET for Apache Spark job to HDInsight: `spark-submit` and Apache Livy.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Deploy using spark-submit

You can use the [spark-submit](https://spark.apache.org/docs/latest/submitting-applications.html) command to submit .NET for Apache Spark jobs to Azure HDInsight.
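For the Apache Livy route, a batch submission is an HTTP POST to the cluster's `/livy/batches` endpoint. The sketch below only builds and prints the request (the cluster name, jar path, and app archive are assumed placeholders); run the printed `curl` command with your own cluster credentials when ready:

```shell
# Sketch of a .NET for Apache Spark job submission through Apache Livy.
# CLUSTER, JAR, and APP are assumed placeholder values.
CLUSTER="mycluster"
JAR="wasbs:///microsoft-spark-3-2_2.12-2.1.1.jar"
APP="wasbs:///mySparkApp.zip"

# Livy batch payload: the jar's DotnetRunner class launches the .NET app.
PAYLOAD=$(printf '{"file":"%s","className":"org.apache.spark.deploy.dotnet.DotnetRunner","args":["%s","mySparkApp"]}' "$JAR" "$APP")

# Print the request for review; add -u admin:<password> when running it.
echo curl -k -H "Content-Type: application/json" \
  -X POST -d "$PAYLOAD" "https://$CLUSTER.azurehdinsight.net/livy/batches"
```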
@@ -1,7 +1,7 @@
---
title: Install .NET for Apache Spark on Jupyter Notebooks on Azure HDInsight Spark clusters
description: Learn how to install .NET for Apache Spark on Azure HDInsight's Jupyter Notebooks.
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -17,6 +17,8 @@ To enable .NET for Apache Spark through the Jupyter Notebooks experience, you ne
> [!NOTE]
> This feature is *experimental* and is not supported by the HDInsight Spark team.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

If you don't already have one, create an [Azure HDInsight Spark](/azure/hdinsight/spark/apache-spark-jupyter-spark-sql-use-portal#create-an-apache-spark-cluster-in-hdinsight) cluster.
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/java-udf-from-dotnet.md
@@ -3,7 +3,7 @@ title: Invoke Java UDFs from .NET for Apache Spark application
description: Learn how to call a Java UDF from a .NET for Apache Spark application.
ms.author: nidutta
author: Niharikadutta
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -15,6 +15,8 @@ In this article, you learn how to call a Java User-Defined Function (UDF) from y
1. Define your Java UDFs and compile them into a jar. This step isn't needed if you already have a UDF defined in a jar file; in that case, all you need is the full name of the UDF function, including the package.
2. Register and call your Java UDF in your .NET for Apache Spark application.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Define and compile your Java UDFs

1. Create a Maven or SBT project and add the following dependencies into the project configuration file:
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/ubuntu-instructions.md
@@ -1,7 +1,7 @@
---
title: Build a .NET for Apache Spark application on Ubuntu
description: Learn how to build your .NET for Apache Spark application on Ubuntu
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -11,6 +11,8 @@ ms.custom: mvc,how-to

This article teaches you how to build your .NET for Apache Spark applications on Ubuntu.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

If you already have all of the following prerequisites, skip to the [build](#build) steps.
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/udf-guide.md
@@ -3,7 +3,7 @@ title: Create user-defined functions (UDF) in .NET for Apache Spark
description: Learn how to implement user-defined functions (UDF) in .NET for Apache Spark applications.
ms.author: nidutta
author: Niharikadutta
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: mvc,how-to
---
@@ -12,6 +12,8 @@ ms.custom: mvc,how-to

In this article, you learn how to use user-defined functions (UDF) in .NET for Apache Spark. [UDFs](https://spark.apache.org/docs/latest/api/java/org/apache/spark/sql/expressions/UserDefinedFunction.html) are a Spark feature that allows you to use custom functions to extend the system's built-in functionality. UDFs transform values from a single row within a table to produce a single corresponding output value per row based on the logic defined in the UDF.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Define UDFs

Review the following UDF definition:
4 changes: 3 additions & 1 deletion docs/spark/how-to-guides/windows-instructions.md
@@ -1,7 +1,7 @@
---
title: Build a .NET for Apache Spark application on Windows
description: Learn how to build your .NET for Apache Spark application on Windows.
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: conceptual
ms.custom: how-to
---
@@ -10,6 +10,8 @@ ms.custom: how-to

This article teaches you how to build your .NET for Apache Spark applications on Windows.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

If you already have all of the following prerequisites, skip to the [build](#build) steps.
2 changes: 2 additions & 0 deletions docs/spark/includes/net-core-31-spark.md
@@ -0,0 +1,2 @@
> [!WARNING]
> .NET for Apache Spark targets an out-of-support version of .NET (.NET Core 3.1). For more details, see the [.NET Support Policy](https://dotnet.microsoft.com/platform/support/policy/dotnet-core).
4 changes: 3 additions & 1 deletion docs/spark/tutorials/amazon-emr-spark-deployment.md
@@ -1,7 +1,7 @@
---
title: Deploy a .NET for Apache Spark application to Amazon EMR Spark
description: Discover how to deploy a .NET for Apache Spark application to Amazon EMR Spark.
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: tutorial
ms.custom: mvc
recommendations: false
@@ -24,6 +24,8 @@ In this tutorial, you learn how to:
> [!Note]
> AWS EMR Spark is Linux-based. Therefore, if you are interested in deploying your app to AWS EMR Spark, make sure your app is .NET Standard compatible and that you use the .NET Core compiler to compile your app.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

Before you start, do the following:
4 changes: 3 additions & 1 deletion docs/spark/tutorials/batch-processing.md
@@ -3,7 +3,7 @@ title: Batch processing with .NET for Apache Spark tutorial
description: Learn how to do batch processing using .NET for Apache Spark.
author: mamccrea
ms.author: mamccrea
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: tutorial
recommendations: false
---
@@ -22,6 +22,8 @@ In this tutorial, you learn how to:
> * Read data into a DataFrame and prepare it for analysis
> * Process the data using Spark SQL

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

If this is your first time using .NET for Apache Spark, check out the [Get started with .NET for Apache Spark](get-started.md) tutorial to learn how to prepare your environment and run your first .NET for Apache Spark application.
4 changes: 3 additions & 1 deletion docs/spark/tutorials/databricks-deployment.md
@@ -1,7 +1,7 @@
---
title: Deploy a .NET for Apache Spark application to Databricks
description: Discover how to deploy a .NET for Apache Spark application to Databricks.
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: tutorial
ms.custom: mvc
recommendations: false
@@ -24,6 +24,8 @@ In this tutorial, you learn how to:
> [!IMPORTANT]
> [.NET for Apache Spark](https://github.com/dotnet/spark) is an open source project under the [.NET Foundation](https://dotnetfoundation.org/) and does not come with Microsoft Support unless otherwise noted. For issues with or questions about .NET for Apache Spark, please [create an issue in its GitHub repository](https://github.com/dotnet/spark/issues). The community is active and is monitoring submissions.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

Before you start, do the following tasks:
2 changes: 2 additions & 0 deletions docs/spark/tutorials/get-started.md
@@ -22,6 +22,8 @@ In this tutorial, you learn how to:
> * Write your first .NET for Apache Spark application
> * Build and run your .NET for Apache Spark application

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prepare your environment

Before you begin writing your app, you need to set up some prerequisite dependencies. If you can run `dotnet`, `java`, `spark-shell` from your command line environment, then your environment is already prepared and you can skip to the next section. If you cannot run any or all of the commands, do the following steps.
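That environment check can be scripted; a minimal sketch that reports which of the three prerequisite tools are reachable on `PATH`:

```shell
# Check whether the prerequisite tools (dotnet, java, spark-shell)
# are reachable on PATH; prints one "found"/"missing" line per tool.
check_tools() {
  for tool in "$@"; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "found: $tool"
    else
      echo "missing: $tool"
    fi
  done
}

check_tools dotnet java spark-shell
```

Note that `spark-shell` is only found after the Spark distribution's `bin` directory has been added to `PATH`, which is part of the setup steps below.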
4 changes: 3 additions & 1 deletion docs/spark/tutorials/hdinsight-deployment.md
@@ -1,7 +1,7 @@
---
title: Deploy a .NET for Apache Spark application to Azure HDInsight
description: Discover how to deploy a .NET for Apache Spark application to HDInsight.
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: tutorial
ms.custom: mvc
recommendations: false
@@ -22,6 +22,8 @@ In this tutorial, you learn how to:
> * Create and run an HDInsight script action.
> * Run a .NET for Apache Spark app on an HDInsight cluster.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

Before you start, do the following tasks:
4 changes: 3 additions & 1 deletion docs/spark/tutorials/ml-sentiment-analysis.md
@@ -3,7 +3,7 @@ title: Sentiment analysis with .NET for Apache Spark and ML.NET tutorial
description: In this tutorial, you learn how to use ML.NET with .NET for Apache Spark for sentiment analysis.
author: mamccrea
ms.author: mamccrea
ms.date: 10/09/2020
ms.date: 12/16/2022
ms.topic: tutorial
recommendations: false
---
@@ -21,6 +21,8 @@ In this tutorial, you learn how to:
> * Write and implement a user-defined function.
> * Run a .NET for Apache Spark console app.

[!INCLUDE [.NET Core 3.1 Warning](../includes/net-core-31-spark.md)]

## Prerequisites

* If you haven't developed a .NET for Apache Spark application before, start with the [Getting Started tutorial](get-started.md) to become familiar with the basics. Complete all of the prerequisites for the Getting Started tutorial before you continue with this tutorial.