From 0fc088d8bb00d4ed153f1ae09223a13d06df7283 Mon Sep 17 00:00:00 2001
From: John Kerski <85596845+kerski@users.noreply.github.com>
Date: Tue, 24 May 2022 21:39:24 -0400
Subject: [PATCH 1/2] Update dataflows-azure-data-lake-storage-integration.md

Based on my experience on May 24th, 2022 and reading the top search result
for an Access Denied issue:
https://community.powerbi.com/t5/Community-Blog/Connecting-to-an-Azure-Data-Lake-Gen-2-at-a-Power-BI-Workspace/ba-p/1412362
... I am submitting an update to the prerequisites. Thank you.
---
 .../dataflows/dataflows-azure-data-lake-storage-integration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/powerbi-docs/transform-model/dataflows/dataflows-azure-data-lake-storage-integration.md b/powerbi-docs/transform-model/dataflows/dataflows-azure-data-lake-storage-integration.md
index a9cecc34c9..c24ca9a353 100644
--- a/powerbi-docs/transform-model/dataflows/dataflows-azure-data-lake-storage-integration.md
+++ b/powerbi-docs/transform-model/dataflows/dataflows-azure-data-lake-storage-integration.md
@@ -28,7 +28,7 @@ There are two ways to configure which ADLS Gen 2 store to use: you can use a ten
 
 - The storage account must be created in the same Azure Active Directory tenant as the Power BI tenant.
 
-- The user must have Azure Blob Data Contributor role, and a Owner role at the storage account level.
+- The user must have the Azure Blob Data Owner role, the Azure Blob Data Reader role, and an Owner role at the storage account level (scope should be 'this resource' and not inherited). Note: Applied role changes may take a few minutes to sync before the steps below can be completed in the Power BI Service.
 
 - The Power BI workspace tenant region should be the same as the storage account region.

From 2f46d7a4fec34dc2bc11e2d546fbafb0b2731f53 Mon Sep 17 00:00:00 2001
From: David Iseminger <1276752+davidiseminger@users.noreply.github.com>
Date: Fri, 8 Jul 2022 09:17:36 -0700
Subject: [PATCH 2/2] Update dataflows-azure-data-lake-storage-integration.md

Fixed note, cleaned up text a bit.
---
 .../dataflows/dataflows-azure-data-lake-storage-integration.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/powerbi-docs/transform-model/dataflows/dataflows-azure-data-lake-storage-integration.md b/powerbi-docs/transform-model/dataflows/dataflows-azure-data-lake-storage-integration.md
index 558eae8ca7..0ee924858f 100644
--- a/powerbi-docs/transform-model/dataflows/dataflows-azure-data-lake-storage-integration.md
+++ b/powerbi-docs/transform-model/dataflows/dataflows-azure-data-lake-storage-integration.md
@@ -28,7 +28,7 @@ There are two ways to configure which ADLS Gen 2 store to use: you can use a ten
 
 - The storage account must be created in the same Azure Active Directory tenant as the [Power BI tenant](/power-bi/admin/service-admin-where-is-my-tenant-located#how-to-find-the-default-region-for-your-organization).
 
-- The user must have the Azure Blob Data Owner role, the Azure Blob Data Reader role, and an Owner role at the storage account level (scope should be 'this resource' and not inherited). Note: Applied role changes may take a few minutes to sync before the steps below can be completed in the Power BI Service.
+- The user must have the Azure Blob Data Owner role, the Azure Blob Data Reader role, and an Owner role at the storage account level (scope should be *this resource* and not inherited). Any applied role changes may take a few minutes to sync, and must sync before the following steps can be completed in the Power BI service.
 
- The Power BI workspace tenant region should be the same as the storage account region.
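
For reference, the prerequisite introduced above can be granted directly at the storage account scope. Below is a minimal Azure CLI sketch, not the documented procedure itself; the user, subscription, resource group, and storage account values are placeholders, and the built-in Azure RBAC role names corresponding to the prerequisite are assumed to be *Storage Blob Data Owner* and *Storage Blob Data Reader*.

```azurecli
# Hypothetical scope -- replace the subscription, resource group, and
# storage account placeholders with your own values.
SCOPE="/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"

# Data-plane roles referenced in the prerequisite (built-in RBAC role names).
az role assignment create --assignee "user@contoso.com" --role "Storage Blob Data Owner" --scope "$SCOPE"
az role assignment create --assignee "user@contoso.com" --role "Storage Blob Data Reader" --scope "$SCOPE"

# Owner role assigned directly at the storage account scope
# ("this resource"), not inherited from a resource group or subscription.
az role assignment create --assignee "user@contoso.com" --role "Owner" --scope "$SCOPE"
```

As the prerequisite notes, these role assignments may take a few minutes to sync before the ADLS Gen 2 configuration steps can be completed in the Power BI service.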