
Commit f375257 (parent 18c21b9)

Validation - alt text duplicates

13 files changed: +50 -50 lines

articles/data-factory/author-global-parameters.md (2 additions, 2 deletions)

```diff
@@ -20,11 +20,11 @@ Global parameters are constants across a data factory that can be consumed by a
 To create a global parameter, go to the *Global parameters* tab in the *Manage* section. Select **New** to open the creation side-nav.
-![Create global parameters](media/author-global-parameters/create-global-parameter-1.png)
+![Screenshot that highlights the New button you select to create global parameters.](media/author-global-parameters/create-global-parameter-1.png)
 In the side-nav, enter a name, select a data type, and specify the value of your parameter.
-![Create global parameters](media/author-global-parameters/create-global-parameter-2.png)
+![Screenshot that shows where you add the name, data type, and value for the new global parameter.](media/author-global-parameters/create-global-parameter-2.png)
 After a global parameter is created, you can edit it by clicking the parameter's name. To alter multiple parameters at once, select **Edit all**.
```
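As background for the steps above (nothing in this commit defines one): global parameters live on the factory resource itself. A minimal sketch of the factory JSON for a hypothetical string parameter named `environment` might look roughly like this:

```json
{
  "name": "YourDataFactory",
  "properties": {
    "globalParameters": {
      "environment": {
        "type": "String",
        "value": "dev"
      }
    }
  }
}
```

Any pipeline in the factory can then read it through the expression `@pipeline().globalParameters.environment`.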

articles/data-factory/author-management-hub.md (1 addition, 1 deletion)

```diff
@@ -51,7 +51,7 @@ To override the generated Resource Manager template parameters when publishing f
 Triggers determine when a pipeline run should be kicked off. Currently triggers can be on a wall clock schedule, operate on a periodic interval, or depend on an event. For more information, learn about [trigger execution](concepts-pipeline-execution-triggers.md#trigger-execution). In the management hub, you can create, edit, delete, or view the current state of a trigger.
-![Manage custom params](media/author-management-hub/management-hub-triggers.png)
+![Screenshot that shows where to create, edit, delete, or view the current state of a trigger.](media/author-management-hub/management-hub-triggers.png)
 ### Global parameters
```
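For context on the trigger types this paragraph mentions, a wall-clock schedule trigger is defined along these lines; the name, recurrence, and pipeline reference below are illustrative placeholders, not values from this commit:

```json
{
  "name": "DailyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2021-01-01T00:00:00Z",
        "timeZone": "UTC"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "YourPipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```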

articles/data-factory/monitor-integration-runtime.md (2 additions, 2 deletions)

```diff
@@ -253,13 +253,13 @@ If you join your Azure-SSIS IR to a VNet, you'll see the **VALIDATE VNET / SUBNE
 On the **DIAGNOSE CONNECTIVITY** tile of your Azure-SSIS IR monitoring page, you can select the **Test connection** link to pop up a window, where you can check the connections between your Azure-SSIS IR and relevant package/configuration/data stores, as well as management services, via their fully qualified domain name (FQDN)/IP address and designated port (see [Testing connections from your Azure-SSIS IR](https://docs.microsoft.com/azure/data-factory/ssis-integration-runtime-diagnose-connectivity-faq)).
-![Monitor your Azure-SSIS IR - DIAGNOSE tile](media/monitor-integration-runtime/monitor-azure-ssis-integration-runtime-diagnose.png)
+![Screenshot that shows where you can test the connections between your Azure-SSIS IR and relevant package/configuration/data stores.](media/monitor-integration-runtime/monitor-azure-ssis-integration-runtime-diagnose.png)
 #### STATIC PUBLIC IP ADDRESSES tile
 If you bring your own static public IP addresses for Azure-SSIS IR, you'll see the **STATIC PUBLIC IP ADDRESSES** tile on your Azure-SSIS IR monitoring page (see [Bringing your own static public IP addresses for Azure-SSIS IR](https://docs.microsoft.com/azure/data-factory/join-azure-ssis-integration-runtime-virtual-network#publicIP)). On this tile, you can select links designating your first/second static public IP addresses for Azure-SSIS IR to pop up a window, where you can copy their resource ID (`/subscriptions/YourAzureSubscription/resourceGroups/YourResourceGroup/providers/Microsoft.Network/publicIPAddresses/YourPublicIPAddress`) from a text box. On the pop-up window, you can also select the **See your first/second static public IP address settings** link to manage your first/second static public IP address in Azure portal.
-![Monitor your Azure-SSIS IR - DIAGNOSE tile](media/monitor-integration-runtime/monitor-azure-ssis-integration-runtime-static.png)
+![Screenshot that shows where you can designate your first/second static public IP addresses.](media/monitor-integration-runtime/monitor-azure-ssis-integration-runtime-static.png)
 #### PACKAGE STORES tile
```
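For reference, the bring-your-own static public IP addresses described in that paragraph are supplied when the Azure-SSIS IR is provisioned inside a virtual network. A rough sketch of the relevant integration runtime JSON, with placeholder resource IDs (assumptions, not values from this commit):

```json
{
  "name": "YourAzureSsisIr",
  "properties": {
    "type": "Managed",
    "typeProperties": {
      "computeProperties": {
        "vNetProperties": {
          "vNetId": "/subscriptions/YourAzureSubscription/resourceGroups/YourResourceGroup/providers/Microsoft.Network/virtualNetworks/YourVNet",
          "subnet": "YourSubnet",
          "publicIPs": [
            "/subscriptions/YourAzureSubscription/resourceGroups/YourResourceGroup/providers/Microsoft.Network/publicIPAddresses/YourFirstPublicIP",
            "/subscriptions/YourAzureSubscription/resourceGroups/YourResourceGroup/providers/Microsoft.Network/publicIPAddresses/YourSecondPublicIP"
          ]
        }
      }
    }
  }
}
```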

articles/data-factory/monitor-visually.md (2 additions, 2 deletions)

```diff
@@ -131,7 +131,7 @@ You can also view rerun history for a particular pipeline run.
 You can see the resources consumed by a pipeline run by clicking the consumption icon next to the run.
-![Monitor consumption](media/monitor-visually/monitor-consumption-1.png)
+![Screenshot that shows where you can see the resources consumed by a pipeline.](media/monitor-visually/monitor-consumption-1.png)
 Clicking the icon opens a consumption report of resources used by that pipeline run.
@@ -185,7 +185,7 @@ For a seven-minute introduction and demonstration of this feature, watch the fol
 ![Box for target criteria](media/monitor-visually/add-criteria-1.png)
-![List of criteria](media/monitor-visually/add-criteria-2.png)
+![Screenshot that shows where you select one metric to set up the alert condition.](media/monitor-visually/add-criteria-2.png)
 ![List of criteria](media/monitor-visually/add-criteria-3.png)
```
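The alert-criteria screenshots above map to an Azure Monitor metric alert on the data factory. As a rough, illustrative sketch only (the schema details and metric choice are assumptions, not taken from this commit), a rule that fires on any failed pipeline run might look like:

```json
{
  "type": "Microsoft.Insights/metricAlerts",
  "name": "DataFactoryFailedRuns",
  "properties": {
    "severity": 3,
    "enabled": true,
    "scopes": [ "/subscriptions/YourAzureSubscription/resourceGroups/YourResourceGroup/providers/Microsoft.DataFactory/factories/YourDataFactory" ],
    "evaluationFrequency": "PT5M",
    "windowSize": "PT5M",
    "criteria": {
      "odata.type": "Microsoft.Azure.Monitor.SingleResourceMultipleMetricCriteria",
      "allOf": [
        {
          "name": "FailedRuns",
          "metricName": "PipelineFailedRuns",
          "operator": "GreaterThan",
          "threshold": 0,
          "timeAggregation": "Total"
        }
      ]
    }
  }
}
```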

articles/data-factory/tutorial-control-flow-portal.md (3 additions, 3 deletions)

```diff
@@ -180,7 +180,7 @@ In this step, you create a pipeline with one Copy activity and two Web activitie
 ![Drag-drop copy activity](./media/tutorial-control-flow-portal/drag-drop-copy-activity.png)
 5. In the **Properties** window for the **Copy** activity at the bottom, switch to the **Source** tab, and click **+ New**. You create a source dataset for the copy activity in this step.
-![Source dataset](./media/tutorial-control-flow-portal/new-source-dataset-button.png)
+![Screenshot that shows how to create a source dataset for the copy activity.](./media/tutorial-control-flow-portal/new-source-dataset-button.png)
 6. In the **New Dataset** window, select **Azure Blob Storage**, and click **Finish**.
 ![Select Azure Blob Storage](./media/tutorial-control-flow-portal/select-azure-blob-storage.png)
@@ -269,7 +269,7 @@ In this step, you create a pipeline with one Copy activity and two Web activitie
 ![Settings for the second Web activity](./media/tutorial-control-flow-portal/web-activity2-settings.png)
 22. Select the **Copy** activity in the pipeline designer, click the **+->** button, and select **Error**.
-![Settings for the second Web activity](./media/tutorial-control-flow-portal/select-copy-failure-link.png)
+![Screenshot that shows how to select Error on the Copy activity in the pipeline designer.](./media/tutorial-control-flow-portal/select-copy-failure-link.png)
 23. Drag the **red** button next to the Copy activity to the second Web activity **SendFailureEmailActivity**. You can move the activities around so that the pipeline looks like the following image:
 ![Full pipeline with all activities](./media/tutorial-control-flow-portal/full-pipeline.png)
@@ -300,7 +300,7 @@ In this step, you create a pipeline with one Copy activity and two Web activitie
 ![Successful pipeline run](./media/tutorial-control-flow-portal/monitor-success-pipeline-run.png)
 2. To **view activity runs** associated with this pipeline run, click the first link in the **Actions** column. You can switch back to the previous view by clicking **Pipelines** at the top. Use the **Refresh** button to refresh the list.
-![Activity runs](./media/tutorial-control-flow-portal/activity-runs-success.png)
+![Screenshot that shows how to view the list of activity runs.](./media/tutorial-control-flow-portal/activity-runs-success.png)
 ## Trigger a pipeline run that fails
 1. Switch to the **Edit** tab on the left.
```
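The Error path wired up in steps 22 and 23 corresponds to a `Failed` dependency condition on the downstream Web activity. A minimal sketch of the resulting activity JSON, assuming the Copy activity is named `CopyBlobtoBlob` (that name and the Logic App URL are assumptions; the `dependencyConditions` shape is the point here):

```json
{
  "name": "SendFailureEmailActivity",
  "type": "WebActivity",
  "dependsOn": [
    {
      "activity": "CopyBlobtoBlob",
      "dependencyConditions": [ "Failed" ]
    }
  ],
  "typeProperties": {
    "url": "https://your-logic-app-endpoint.example.com",
    "method": "POST",
    "body": {
      "message": "@{activity('CopyBlobtoBlob').error.message}"
    }
  }
}
```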

articles/data-factory/tutorial-data-flow.md (19 additions, 19 deletions)

```diff
@@ -75,30 +75,30 @@ In this step, you'll create a pipeline that contains a Data Flow activity.
 ![Data Flow Activity](media/tutorial-data-flow/dataflow1.png)
 1. In the **Activities** pane, expand the **Move and Transform** accordion. Drag and drop the **Data Flow** activity from the pane to the pipeline canvas.
-![Data Flow Activity](media/tutorial-data-flow/activity1.png)
+![Screenshot that shows the pipeline canvas where you can drop the Data Flow activity.](media/tutorial-data-flow/activity1.png)
 1. In the **Adding Data Flow** pop-up, select **Create new Data Flow** and then name your data flow **TransformMovies**. Click Finish when done.
-![Data Flow Activity](media/tutorial-data-flow/activity2.png)
+![Screenshot that shows where you name your data flow when you create a new data flow.](media/tutorial-data-flow/activity2.png)
 ## Build transformation logic in the data flow canvas
 Once you create your Data Flow, you'll be automatically sent to the data flow canvas. In this step, you'll build a data flow that takes the moviesDB.csv in ADLS storage and aggregates the average rating of comedies from 1910 to 2000. You'll then write this file back to the ADLS storage.
 1. In the data flow canvas, add a source by clicking on the **Add Source** box.
-![Data Flow Canvas](media/tutorial-data-flow/dataflow2.png)
+![Screenshot that shows the Add Source box.](media/tutorial-data-flow/dataflow2.png)
 1. Name your source **MoviesDB**. Click on **New** to create a new source dataset.
-![Data Flow Canvas](media/tutorial-data-flow/dataflow3.png)
+![Screenshot that shows where you select New after you name your source.](media/tutorial-data-flow/dataflow3.png)
 1. Choose **Azure Data Lake Storage Gen2**. Click Continue.
-![Dataset](media/tutorial-data-flow/dataset1.png)
+![Screenshot that shows the Azure Data Lake Storage Gen2 tile.](media/tutorial-data-flow/dataset1.png)
 1. Choose **DelimitedText**. Click Continue.
-![Dataset](media/tutorial-data-flow/dataset2.png)
+![Screenshot that shows the DelimitedText tile.](media/tutorial-data-flow/dataset2.png)
 1. Name your dataset **MoviesDB**. In the linked service dropdown, choose **New**.
-![Dataset](media/tutorial-data-flow/dataset3.png)
+![Screenshot that shows the Linked service dropdown list.](media/tutorial-data-flow/dataset3.png)
 1. In the linked service creation screen, name your ADLS Gen2 linked service **ADLSGen2** and specify your authentication method. Then enter your connection credentials. In this tutorial, we're using Account key to connect to our storage account. You can click **Test connection** to verify your credentials were entered correctly. Click Create when finished.
 ![Linked Service](media/tutorial-data-flow/ls1.png)
```
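The linked service created in the last step above is an Azure Data Lake Storage Gen2 (`AzureBlobFS`) definition. A minimal sketch of the account-key variant, with a placeholder URL and key (assumptions, not from this commit):

```json
{
  "name": "ADLSGen2",
  "properties": {
    "type": "AzureBlobFS",
    "typeProperties": {
      "url": "https://yourstorageaccount.dfs.core.windows.net",
      "accountKey": {
        "type": "SecureString",
        "value": "<your-account-key>"
      }
    }
  }
}
```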
```diff
@@ -107,13 +107,13 @@ Once you create your Data Flow, you'll be automatically sent to the data flow ca
 ![Datasets](media/tutorial-data-flow/dataset4.png)
 1. If your debug cluster has started, go to the **Data Preview** tab of the source transformation and click **Refresh** to get a snapshot of the data. You can use data preview to verify your transformation is configured correctly.
-![Data Flow Canvas](media/tutorial-data-flow/dataflow4.png)
+![Screenshot that shows where you can preview your data to verify your transformation is configured correctly.](media/tutorial-data-flow/dataflow4.png)
 1. Next to your source node on the data flow canvas, click on the plus icon to add a new transformation. The first transformation you're adding is a **Filter**.
 ![Data Flow Canvas](media/tutorial-data-flow/dataflow5.png)
 1. Name your filter transformation **FilterYears**. Click on the expression box next to **Filter on** to open the expression builder. Here you'll specify your filtering condition.
-![Filter](media/tutorial-data-flow/filter1.png)
+![Screenshot that shows the Filter on expression box.](media/tutorial-data-flow/filter1.png)
 1. The data flow expression builder lets you interactively build expressions to use in various transformations. Expressions can include built-in functions, columns from the input schema, and user-defined parameters. For more information on how to build expressions, see [Data Flow expression builder](concepts-data-flow-expression-builder.md).
 In this tutorial, you want to filter movies of genre comedy that came out between the years 1910 and 2000. As year is currently a string, you need to convert it to an integer using the ```toInteger()``` function. Use the greater than or equals to (>=) and less than or equals to (<=) operators to compare against literal year values 1910 and 2000. Union these expressions together with the and (&&) operator. The expression comes out as:
```
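The hunk ends before the expression itself. For context, the condition this paragraph describes comes out along the lines of ```toInteger(year) >= 1910 && toInteger(year) <= 2000 && rlike(genres, 'Comedy')```, where the **genres** column name and the ```rlike()``` match are assumptions, since the diff doesn't show the final expression.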
```diff
@@ -132,35 +132,35 @@ Once you create your Data Flow, you'll be automatically sent to the data flow ca
 1. Fetch a **Data Preview** to verify the filter is working correctly.
-![Filter](media/tutorial-data-flow/filter3.png)
+![Screenshot that shows the Data Preview that you fetched.](media/tutorial-data-flow/filter3.png)
 1. The next transformation you'll add is an **Aggregate** transformation under **Schema modifier**.
-![Aggregate](media/tutorial-data-flow/agg1.png)
+![Screenshot that shows the Aggregate schema modifier.](media/tutorial-data-flow/agg1.png)
 1. Name your aggregate transformation **AggregateComedyRatings**. In the **Group by** tab, select **year** from the dropdown to group the aggregations by the year the movie came out.
-![Aggregate](media/tutorial-data-flow/agg2.png)
+![Screenshot that shows the year option in the Group by tab under Aggregate Settings.](media/tutorial-data-flow/agg2.png)
 1. Go to the **Aggregates** tab. In the left text box, name the aggregate column **AverageComedyRating**. Click on the right expression box to enter the aggregate expression via the expression builder.
-![Aggregate](media/tutorial-data-flow/agg3.png)
+![Screenshot that shows the year option in the Aggregates tab under Aggregate Settings.](media/tutorial-data-flow/agg3.png)
 1. To get the average of column **Rating**, use the ```avg()``` aggregate function. As **Rating** is a string and ```avg()``` takes in a numerical input, we must convert the value to a number via the ```toInteger()``` function. The expression looks like:
 ```avg(toInteger(Rating))```
 Click **Save and Finish** when done.
-![Aggregate](media/tutorial-data-flow/agg4.png)
+![Screenshot that shows the saved expression.](media/tutorial-data-flow/agg4.png)
 1. Go to the **Data Preview** tab to view the transformation output. Notice only two columns are there, **year** and **AverageComedyRating**.
 ![Aggregate](media/tutorial-data-flow/agg3.png)
 1. Next, you want to add a **Sink** transformation under **Destination**.
-![Sink](media/tutorial-data-flow/sink1.png)
+![Screenshot that shows where to add a sink transformation under Destination.](media/tutorial-data-flow/sink1.png)
 1. Name your sink **Sink**. Click **New** to create your sink dataset.
-![Sink](media/tutorial-data-flow/sink2.png)
+![Screenshot that shows where you can name your sink and create a new sink dataset.](media/tutorial-data-flow/sink2.png)
 1. Choose **Azure Data Lake Storage Gen2**. Click Continue.
-![Dataset](media/tutorial-data-flow/dataset1.png)
+![Screenshot that shows the Azure Data Lake Storage Gen2 tile you can choose.](media/tutorial-data-flow/dataset1.png)
 1. Choose **DelimitedText**. Click Continue.
 ![Dataset](media/tutorial-data-flow/dataset2.png)
```
```diff
@@ -176,13 +176,13 @@ You can debug a pipeline before you publish it. In this step, you're going to tr
 1. Go to the pipeline canvas. Click **Debug** to trigger a debug run.
-![Pipeline](media/tutorial-data-flow/pipeline1.png)
+![Screenshot that shows the pipeline canvas with Debug highlighted.](media/tutorial-data-flow/pipeline1.png)
 1. Pipeline debug of Data Flow activities uses the active debug cluster but still takes at least a minute to initialize. You can track the progress via the **Output** tab. Once the run is successful, click on the eyeglasses icon to open the monitoring pane.
 ![Pipeline](media/tutorial-data-flow/pipeline2.png)
 1. In the monitoring pane, you can see the number of rows and time spent in each transformation step.
-![Monitoring](media/tutorial-data-flow/pipeline3.png)
+![Screenshot that shows the monitoring pane where you can see the number of rows and time spent in each transformation step.](media/tutorial-data-flow/pipeline3.png)
 1. Click on a transformation to get detailed information about the columns and partitioning of the data.
 ![Monitoring](media/tutorial-data-flow/pipeline4.png)
```
