---
title: Quickstart - Create a Stream Analytics job using Visual Studio Code
description: This quickstart shows you how to create a Stream Analytics job using the ASA extension for Visual Studio Code.
ms.service: stream-analytics
author: ahartoon
ms.author: anboisve
ms.date: 07/17/2023
ms.topic: quickstart
ms.custom: mvc, mode-ui
---

# Quickstart: Create a Stream Analytics job using Visual Studio Code

This quickstart shows you how to create, run, and submit an Azure Stream Analytics (ASA) job by using the ASA Tools extension for Visual Studio Code on your local machine. You learn to build an ASA job that reads real-time streaming data from IoT Hub and filters out events with a temperature greater than 27 degrees. The output results are sent to a file in blob storage. The input data used in this quickstart is generated by a Raspberry Pi online simulator.

> [!NOTE]
> Visual Studio Code tools don't support jobs in the China East, China North, Germany Central, and Germany NorthEast regions.

## Prerequisites

### Install the Azure Stream Analytics Tools extension

  1. Open Visual Studio Code (VS Code).

  2. From Extensions on the left pane, search for stream analytics and select Install on the Azure Stream Analytics Tools extension.

    :::image type="content" source="./media/quick-create-visual-studio-code/install-extension.png" alt-text="Screenshot showing the Extensions page of Visual Studio Code with an option to install Stream Analytics extension.":::

  3. After it's installed, select the Azure icon on the activity bar and sign in to Azure.

    :::image type="content" source="./media/quick-create-visual-studio-code/azure-sign-in.png" alt-text="Screenshot showing how to sign in to Azure.":::

  4. Once you're signed in, you can see the subscriptions under your Azure account.

> [!NOTE]
> The ASA Tools extension automatically signs you in every time you open VS Code. If your account has two-factor authentication, we recommend that you use phone authentication rather than a PIN. To sign out of your Azure account, press Ctrl + Shift + P and enter Azure: Sign Out.

## Prepare the input data

Before you define the Stream Analytics job, prepare the input data. The real-time sensor data is ingested into IoT Hub, which is later configured as the job input. To prepare the input data that the job requires, follow these steps:

  1. Sign in to the Azure portal.

  2. Select Create a resource > Internet of Things > IoT Hub.

    :::image type="content" source="./media/quick-create-visual-studio-code/create-resource-iot-hub-menu.png" alt-text="Screenshot showing the Create Resource page for IoT Hub.":::

  3. On the IoT Hub page, enter the following information:

    - For Subscription, select your Azure subscription.
    - For Resource group, select an existing resource group or create a new one.
    - For IoT hub name, enter a name for your IoT hub.
    - For Region, select the region that's closest to you.

    :::image type="content" source="./media/quick-create-visual-studio-code/create-iot-hub.png" alt-text="Screenshot showing the IoT Hub page for creation.":::

  4. Go to the Management page. For Pricing and scale tier, select F1: Free tier if it's still available on your subscription. For more information, see Azure IoT Hub pricing.

    :::image type="content" source="./media/quick-create-visual-studio-code/iot-management-page.png" alt-text="Screenshot showing the IoT Hub management page.":::

  5. Select Review + create. Review your IoT hub information, and then select Create. It can take a few minutes to deploy your IoT hub.

  6. After your IoT hub is created, select Go to resource to navigate to the IoT Hub page.

  7. On the IoT Hub page, select Devices on the left menu, and then select + Add Device.

    :::image type="content" source="./media/quick-create-visual-studio-code/add-device-menu.png" alt-text="Screenshot showing the Add Device button on the Devices page.":::

  8. Enter a Device ID and select Save.

    :::image type="content" source="./media/quick-create-visual-studio-code/add-device-iot-hub.png" alt-text="Screenshot showing the Add Device page.":::

  9. Once the device is created, you should see it in the IoT devices list. Select the Refresh button on the page if you don't see it.

    :::image type="content" source="./media/quick-create-visual-studio-code/select-device.png" alt-text="Screenshot showing the selection of the device on the Devices page.":::

  10. Select your device from the list. Copy the Primary Connection String and save it to a notepad to use later.

    :::image type="content" source="./media/quick-create-visual-studio-code/save-iot-device-connection-string.png" alt-text="Screenshot showing the primary connection string of the device you created.":::

## Run the IoT simulator

  1. Open the Raspberry Pi Azure IoT Online Simulator in a new browser tab.

  2. Replace the placeholder in line 15 with the IoT hub device connection string that you saved earlier.

  3. Select Run. The output should show the sensor data and messages that are being sent to your IoT hub.

    :::image type="content" source="./media/quick-create-visual-studio-code/ras-pi-connection-string.png" lightbox="./media/quick-create-visual-studio-code/ras-pi-connection-string.png" alt-text="Screenshot showing the Raspberry Pi Azure IoT Online Simulator with output.":::
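The browser-based simulator is the quickest option, but if you'd rather send test telemetry from a local script, here's a minimal Python sketch that uses the `azure-iot-device` package (`pip install azure-iot-device`). The message shape mirrors the simulator output shown later in this quickstart; the connection string placeholder and the value ranges are illustrative assumptions, not part of the original article.

```python
import json
import random
import time

def build_telemetry(message_id: int) -> dict:
    """Build one telemetry message shaped like the online simulator's output."""
    return {
        "messageId": message_id,
        "deviceId": "Raspberry Pi Web Client",
        "temperature": 20 + random.random() * 15,  # roughly 20-35 degrees
        "humidity": 60 + random.random() * 20,     # roughly 60-80 percent
    }

def run(connection_string: str, count: int = 10) -> None:
    """Send `count` telemetry messages to IoT Hub, one every two seconds."""
    # Requires: pip install azure-iot-device
    from azure.iot.device import IoTHubDeviceClient, Message

    client = IoTHubDeviceClient.create_from_connection_string(connection_string)
    client.connect()
    try:
        for i in range(1, count + 1):
            client.send_message(Message(json.dumps(build_telemetry(i))))
            time.sleep(2)
    finally:
        client.shutdown()

# Usage (replace the placeholder with the primary connection string you saved earlier):
# run("HostName=<your-iot-hub>.azure-devices.net;DeviceId=<device-id>;SharedAccessKey=<key>")
```

Either way, the goal is the same: a steady stream of JSON telemetry arriving at the IoT hub for the job to read.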

## Create blob storage

  1. From the upper-left corner of the Azure portal, select Create a resource > Storage > Storage account.

    :::image type="content" source="./media/quick-create-visual-studio-code/create-storage-account-menu.png" alt-text="Screenshot showing the Create storage account menu.":::

  2. In the Create storage account pane, enter a storage account name, location, and resource group. Choose the same location and resource group as the IoT hub that you created. Then select Review + create to create the storage account.

    :::image type="content" source="./media/quick-create-visual-studio-code/create-storage-account.png" alt-text="Screenshot showing the Create storage account page.":::

  3. On the Storage account page, select Containers on the left menu, and then select + Container on the command bar.

    :::image type="content" source="./media/quick-create-visual-studio-code/add-blob-container-menu.png" alt-text="Screenshot showing the Containers page.":::

  4. From the New container page, provide a name for your container, leave Public access level as Private (no anonymous access), and select OK.

    :::image type="content" source="./media/quick-create-visual-studio-code/create-blob-container.png" alt-text="Screenshot showing the creation of a blob container page.":::

## Create a Stream Analytics project

  1. In Visual Studio Code, press Ctrl+Shift+P and enter ASA: Create New Project.

    :::image type="content" source="./media/quick-create-visual-studio-code/create-new-project.png" alt-text="Screenshot showing the selection of ASA: Create New Project in the command palette.":::

  2. Enter your project name, like myASAproj, and select a folder for your project.

    :::image type="content" source="./media/quick-create-visual-studio-code/create-project-name.png" alt-text="Screenshot showing entering an ASA project name.":::

  3. An ASA project is added to your workspace. It consists of three folders: Inputs, Outputs, and Functions. It also has the query script (*.asaql), a JobConfig.json file, and an asaproj.json configuration file.

    :::image type="content" source="./media/quick-create-visual-studio-code/asa-project-files.png" alt-text="Screenshot showing Stream Analytics project files in Visual Studio Code.":::

    The asaproj.json file contains the inputs, outputs, and job configuration settings for submitting the Stream Analytics job to Azure.

    > [!NOTE]
    > When you're adding inputs and outputs from the command palette, the corresponding paths are added to asaproj.json automatically. If you add or remove inputs or outputs on disk directly, you need to manually add or remove them from asaproj.json. You can choose to put the inputs and outputs in one place and then reference them in different jobs by specifying the paths in each asaproj.json file.

## Define the transformation query

  1. Open the myASAproj.asaql file and add the following query:

    ```sql
    SELECT *
    INTO Output
    FROM Input
    WHERE Temperature > 27
    ```

    :::image type="content" source="./media/quick-create-visual-studio-code/query.png" lightbox="./media/quick-create-visual-studio-code/query.png" alt-text="Screenshot showing the transformation query.":::
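The query passes through every event whose Temperature value is greater than 27. Outside of Stream Analytics, the same filter can be sketched in plain Python as a quick sanity check; the events below are hand-made samples shaped like the simulator's telemetry, not real job input:

```python
# Hand-made events shaped like the simulator's telemetry (values are illustrative).
events = [
    {"messageId": 1, "temperature": 25.3, "humidity": 70.1},
    {"messageId": 2, "temperature": 28.2, "humidity": 76.9},
    {"messageId": 3, "temperature": 30.8, "humidity": 64.9},
]

# Equivalent of: SELECT * INTO Output FROM Input WHERE Temperature > 27
output = [event for event in events if event["temperature"] > 27]

print([event["messageId"] for event in output])  # → [2, 3]
```

Events at or below 27 degrees (like messageId 1 here) are dropped; everything else flows to the output unchanged.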

## Configure job input

  1. Right-click the Inputs folder in your Stream Analytics project. Then select ASA: Add Input from the context menu.

    :::image type="content" source="./media/quick-create-visual-studio-code/add-input-from-inputs-folder.png" lightbox="./media/quick-create-visual-studio-code/add-input-from-inputs-folder.png" alt-text="Screenshot showing the ASA: Add input menu in Visual Studio Code.":::

    Or press Ctrl+Shift+P to open the command palette and enter ASA: Add Input.

  2. Choose IoT Hub for the input type.

    :::image type="content" source="./media/quick-create-visual-studio-code/iot-hub.png" lightbox="./media/quick-create-visual-studio-code/iot-hub.png" alt-text="Screenshot showing the selection of your IoT hub in VS Code command palette.":::

  3. Select the ASA script (*.asaql) and your Azure subscription from the drop-down menus, and then press Enter.

  4. Under the Inputs folder, you see that an IoTHub1.json file is created. Replace the settings with the following suggested values, and keep the default values for fields not mentioned here.

    | Setting | Suggested value | Description |
    | --- | --- | --- |
    | Name | Input | This input name is used for the FROM statement in the query. |
    | IotHubNamespace | spiothub | Name of your IoT hub. IoT hub names are automatically detected if you select them from your subscription. |
    | SharedAccessPolicyName | iothubowner | |

    :::image type="content" source="./media/quick-create-visual-studio-code/iothub-configuration.png" lightbox="./media/quick-create-visual-studio-code/iothub-configuration.png" alt-text="Screenshot showing the IoT Hub configuration in VS Code.":::

  5. Select Preview data to see whether the input data is successfully configured for your job. It fetches a sample of the data from your IoT hub and shows it in the preview window.

    :::image type="content" source="./media/quick-create-visual-studio-code/preview-live-input.png" lightbox="./media/quick-create-visual-studio-code/preview-live-input.png" alt-text="Screenshot showing the preview of input data in your IoT hub.":::

## Configure job output

  1. Press Ctrl+Shift+P to open the command palette and enter ASA: Add Output.

  2. Choose Data Lake Storage Gen2/Blob Storage for the sink type.

  3. Select the query script that uses this output.

  4. Enter BlobStorage1 as the output file name.

  5. Edit the settings using the following values. Keep default values for fields not mentioned here.

    | Setting | Suggested value | Description |
    | --- | --- | --- |
    | Name | Output | This output name is used for the INTO statement in the query. |
    | Storage Account | spstorageaccount0901 | Choose or enter the name of your storage account. Storage account names are automatically detected if they're created in the same subscription. |
    | Container | spcontainer | Select the existing container that you created in your storage account. |

:::image type="content" source="./media/quick-create-visual-studio-code/configure-output.png" lightbox="./media/quick-create-visual-studio-code/configure-output.png" alt-text="Screenshot showing the configuration of output for the Stream Analytics job.":::

## Compile the script and submit to Azure

Script compilation checks syntax and generates the Azure Resource Manager templates for automatic deployment.

  1. Right-click the script and select ASA: Compile Script.

    :::image type="content" source="./media/quick-create-visual-studio-code/compile-script-2.png" lightbox="./media/quick-create-visual-studio-code/compile-script-2.png" alt-text="Screenshot showing the compilation of script option from the Stream Analytics explorer in VS Code.":::

  2. After compilation, you see a Deploy folder under your project with two Azure Resource Manager templates. These two files are used for automatic deployment.

    :::image type="content" source="./media/quick-create-visual-studio-code/deployment-templates.png" lightbox="./media/quick-create-visual-studio-code/deployment-templates.png" alt-text="Screenshot showing the generated deployment templates in the project folder.":::

  3. Select Submit to Azure in the query editor.

    :::image type="content" source="./media/quick-create-visual-studio-code/submit-job.png" lightbox="./media/quick-create-visual-studio-code/submit-job.png" alt-text="Screenshot showing the submit job button to submit the Stream Analytics job to Azure.":::

    Then follow the instructions to complete the process: Select subscription > Select a job > Create New Job > Enter job name > Choose resource group and region.

  4. Select Publish to Azure and complete the remaining steps. Wait for a new Cloud Job View tab to open and show your job's status.

    :::image type="content" source="./media/quick-create-visual-studio-code/publish-to-azure.png" lightbox="./media/quick-create-visual-studio-code/publish-to-azure.png" alt-text="Screenshot showing the publish to Azure button in VS Code.":::

## Start the Stream Analytics job and check output

  1. On the Cloud Job View tab, select Start to run your job in the cloud. This process may take a few minutes to complete.

    :::image type="content" source="./media/quick-create-visual-studio-code/start-asa-job-vs-code.png" lightbox="./media/quick-create-visual-studio-code/start-asa-job-vs-code.png" alt-text="Screenshot showing the Start job button in the Cloud view page.":::

  2. If your job starts successfully, the job status changes to Running. You can see a logical diagram showing how your ASA job is running.

    :::image type="content" source="./media/quick-create-visual-studio-code/job-running-status.png" lightbox="./media/quick-create-visual-studio-code/job-running-status.png" alt-text="Screenshot showing the job running status in VS Code.":::

  3. To view the output results, you can open the blob storage in the Visual Studio Code extension or in the Azure portal.

    :::image type="content" source="./media/quick-create-visual-studio-code/output-files.png" lightbox="./media/quick-create-visual-studio-code/output-files.png" alt-text="Screenshot showing the output file in the Blob container.":::

    Download and open the file to see the output:

    ```json
    {"messageId":11,"deviceId":"Raspberry Pi Web Client","temperature":28.165519323167562,"humidity":76.875393581654379,"EventProcessedUtcTime":"2022-09-01T22:53:58.1015921Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:52:57.6250000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:52:57.6290000Z"}}
    {"messageId":14,"deviceId":"Raspberry Pi Web Client","temperature":29.014941877871451,"humidity":64.93477299527828,"EventProcessedUtcTime":"2022-09-01T22:53:58.2421545Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:53:03.6100000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:53:03.6140000Z"}}
    {"messageId":17,"deviceId":"Raspberry Pi Web Client","temperature":28.032846241745975,"humidity":66.146114343897338,"EventProcessedUtcTime":"2022-09-01T22:53:58.2421545Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:53:19.5960000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:53:19.5830000Z"}}
    {"messageId":18,"deviceId":"Raspberry Pi Web Client","temperature":30.176185593576143,"humidity":72.697359909427419,"EventProcessedUtcTime":"2022-09-01T22:53:58.2421545Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:53:21.6120000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:53:21.6140000Z"}}
    {"messageId":20,"deviceId":"Raspberry Pi Web Client","temperature":27.851894248213021,"humidity":71.610229530268214,"EventProcessedUtcTime":"2022-09-01T22:53:58.2421545Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:53:25.6270000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:53:25.6140000Z"}}
    {"messageId":21,"deviceId":"Raspberry Pi Web Client","temperature":27.718624694772238,"humidity":66.540445035685153,"EventProcessedUtcTime":"2022-09-01T22:53:58.2421545Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:53:48.0820000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:53:48.0830000Z"}}
    {"messageId":22,"deviceId":"Raspberry Pi Web Client","temperature":27.7849054424326,"humidity":74.300662748167085,"EventProcessedUtcTime":"2022-09-01T22:54:09.3393532Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:54:09.2390000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:54:09.2400000Z"}}
    {"messageId":28,"deviceId":"Raspberry Pi Web Client","temperature":30.839892925680324,"humidity":76.237611741451786,"EventProcessedUtcTime":"2022-09-01T22:54:47.8053253Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:54:47.6180000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:54:47.6150000Z"}}
    {"messageId":29,"deviceId":"Raspberry Pi Web Client","temperature":30.561040300759053,"humidity":78.3845172058103,"EventProcessedUtcTime":"2022-09-01T22:54:49.8070489Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:54:49.6030000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:54:49.5990000Z"}}
    {"messageId":31,"deviceId":"Raspberry Pi Web Client","temperature":28.163585438418679,"humidity":60.0511571297096,"EventProcessedUtcTime":"2022-09-01T22:55:25.1528729Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:55:24.9050000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:55:24.9120000Z"}}
    {"messageId":32,"deviceId":"Raspberry Pi Web Client","temperature":31.00503387156985,"humidity":78.68821066044552,"EventProcessedUtcTime":"2022-09-01T22:55:43.2652127Z","PartitionId":3,"EventEnqueuedUtcTime":"2022-09-01T22:55:43.0480000Z","IoTHub":{"MessageId":null,"CorrelationId":null,"ConnectionDeviceId":"MyASAIoTDevice","ConnectionDeviceGenerationId":"637976642928634103","EnqueuedTime":"2022-09-01T22:55:43.0520000Z"}}
    ```
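Each line of the output file is a self-contained JSON document (line-delimited JSON). As a quick check that the filter held, a short Python sketch can load such a file and inspect the temperatures; the two sample lines below are abbreviated copies of the output above:

```python
import json

# Two abbreviated lines from the sample output above (nested IoTHub fields omitted).
raw = "\n".join([
    '{"messageId":11,"deviceId":"Raspberry Pi Web Client","temperature":28.165519323167562,"humidity":76.875393581654379}',
    '{"messageId":14,"deviceId":"Raspberry Pi Web Client","temperature":29.014941877871451,"humidity":64.93477299527828}',
])

# Parse one JSON document per line, skipping blanks.
events = [json.loads(line) for line in raw.splitlines() if line.strip()]

# Every event the job wrote should satisfy the WHERE Temperature > 27 clause.
assert all(event["temperature"] > 27 for event in events)
print(min(event["temperature"] for event in events))  # lowest temperature that passed the filter
```

The same loop works on a downloaded output file by replacing `raw` with `open("output.json").read()`.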

## Clean up resources

When no longer needed, delete the resource group, the Stream Analytics job, and all related resources. Deleting the job avoids billing for the streaming units that it consumes. If you plan to use the job in the future, you can stop it and restart it later. If you aren't going to continue to use this job, delete all resources that this quickstart created by following these steps:

  1. From the left menu in the Azure portal, select Resource groups, and then select the name of the resource group that you created.

  2. On your resource group page, select Delete. Enter the name of the resource group in the text box, and then select Delete.

## Next steps

To learn more about the ASA Tools extension for Visual Studio Code, continue to the following articles: