From 2bf9e9d1a151b34d91657c6826461b90a7007982 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Mon, 20 Oct 2025 15:24:34 +0200
Subject: [PATCH 01/38] Create Introduction to Azure Region Selection Toolkit

Added a comprehensive introduction to the Region Selection Toolkit, detailing its purpose, functionalities, and the factors it evaluates for optimal Azure region selection.

---
 ...ction-to-Azure-Region-Selection-Toolkit.md | 32 +++++++++++++++++++
 1 file changed, 32 insertions(+)
 create mode 100644 docs/wiki/Introduction-to-Azure-Region-Selection-Toolkit.md

diff --git a/docs/wiki/Introduction-to-Azure-Region-Selection-Toolkit.md b/docs/wiki/Introduction-to-Azure-Region-Selection-Toolkit.md
new file mode 100644
index 0000000..5add5d5
--- /dev/null
+++ b/docs/wiki/Introduction-to-Azure-Region-Selection-Toolkit.md
@@ -0,0 +1,32 @@
+# Introduction to the Region Selection Toolkit
+
+Selecting the right Azure region for a workload is a **critical decision** in any cloud deployment. Azure offers dozens of regions worldwide (each with unique capabilities and constraints), and region choice directly affects compliance, performance, resiliency, and cost.
+A poor region choice can lead to issues such as legal/regulatory problems, higher latency or poor user experience, and unnecessary expenses.
+The **Region Selection Toolkit** is designed to simplify this complex decision. It helps you **identify the optimal Azure region** for your workloads by automating a multi-factor analysis that would be tedious and error-prone to do manually.
+By considering technical, business, and even environmental factors, the toolkit provides data-driven recommendations and helps you confidently plan scenarios such as moving to a new region, expanding an application into additional regions, or choosing a region for a new deployment.
+ +## What the Toolkit Does +The Region Selection Toolkit **evaluates multiple key factors** to recommend the best region(s) for a given set of Azure resources. +Its holistic approach ensures you don’t overlook important criteria when comparing cloud regions. In particular, the toolkit performs: + +- **Inventory Collection:** It can automatically gather an inventory of your existing Azure resources (for example, via Azure Resource Graph) or accept input from an Azure Migrate assessment. +This inventory of services and components is the foundation for region analysis. + +- **Multi-Factor Region Analysis:** For each candidate region, the toolkit analyzes a wide range of criteria that are crucial for decision-making: + + - _Service Availability & Roadmap:_ Verifies that all Azure services used by your workload are available (or planned) in the target region. It cross-references your workload’s resource types against Azure’s _products-by-region_ lists to avoid deploying into a region where required services are not supported. This factor helps prevent incompatibility or missing service issues. + + - _Cost Differences:_ Compares estimated costs of running the workload in different regions. Azure service pricing can vary by region, so the toolkit pulls pricing for your resource inventory in each region, enabling side-by-side cost comparisons. This helps you weigh budget impacts and identify cost-effective regions without manual price research. + + - _[In progress] Compliance and Geopolitical Factors:_ Accounts for data residency and regulatory requirements tied to geographic location. The toolkit flags if a region belongs to a special sovereignty (e.g. EU, US Gov, China) or has specific compliance certifications. This ensures your choice aligns with laws and policies (for example, GDPR in Europe or other regional regulations). 
In short, it helps you **choose a region that meets your organization’s compliance mandates and avoids legal risk.**
+
+    - _[In progress] Performance and Resiliency:_ Provides insight into performance-related considerations like network latency and infrastructure resiliency for each region. For example, it notes whether a region supports Availability Zones and identifies its paired region for disaster recovery purposes. These details help evaluate reliability (high availability and DR options) and potential latency impacts on end-users when choosing or moving to that region. (Future versions may integrate more detailed latency testing and capacity data.)
+
+    - _[In progress] Sustainability Metrics:_ Highlights the sustainability considerations of each region, such as regional carbon intensity or the availability of renewable energy. While this data may not always be available for every location, the toolkit surfaces whatever sustainability metrics it can (e.g. the relative carbon footprint of running in Region A vs Region B). This helps organizations factor in environmental impact when selecting an Azure region, supporting corporate sustainability goals.
+
+- **Recommendation Report:** After analyzing the above factors, the toolkit generates a clear **Recommendation Report**. This report lists region choices for your workload and provides the reasoning behind each recommendation. Each recommendation is backed by data, allowing you to confidently present options to stakeholders or use the report as a blueprint for the actual deployment/migration.
+
+The toolkit is regularly updated to reflect new Azure regions and services, helping teams make informed, balanced decisions on region selection with speed and confidence.
+
+> [!NOTE]
+> The Region Selection Toolkit is modular and extensible. Not all features are fully implemented yet; _Compliance and Geopolitical Factors_, _Performance and Resiliency_, _Sustainability Metrics_, and _Capacity Planning_ are still in progress.
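To make the service-availability check concrete, the following PowerShell sketch shows the kind of cross-reference the toolkit automates. It is illustrative only — the variable names and the availability map are invented for this example and are not the toolkit's actual data structures:

```powershell
# Illustrative only: compare a workload's resource types against a
# hypothetical products-by-region map to find unsupported services.
$workloadTypes = @('microsoft.compute/virtualmachines', 'microsoft.sql/servers/databases')

# Hypothetical availability map: region -> supported resource types
$productsByRegion = @{
    'swedencentral' = @('microsoft.compute/virtualmachines', 'microsoft.sql/servers/databases')
    'norwaywest'    = @('microsoft.compute/virtualmachines')
}

foreach ($region in $productsByRegion.Keys) {
    # A region is incompatible if any required type is missing from its list
    $missing = $workloadTypes | Where-Object { $productsByRegion[$region] -notcontains $_ }
    if ($missing) {
        Write-Output "$($region): missing $($missing -join ', ')"
    } else {
        Write-Output "$($region): all required services available"
    }
}
```

In the real toolkit this comparison is driven by Azure's _products-by-region_ data rather than a hand-written map.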
From 270757bdc096c809813819f2e5e700882229090d Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Mon, 20 Oct 2025 17:06:42 +0200 Subject: [PATCH 02/38] Create guide for using Region Selection Toolkit Added a comprehensive guide for using the Region Selection Toolkit, including prerequisites, installation steps, input data requirements, and step-by-step execution instructions. --- docs/wiki/How-to-Use-Toolkit.md | 83 +++++++++++++++++++++++++++++++++ 1 file changed, 83 insertions(+) create mode 100644 docs/wiki/How-to-Use-Toolkit.md diff --git a/docs/wiki/How-to-Use-Toolkit.md b/docs/wiki/How-to-Use-Toolkit.md new file mode 100644 index 0000000..bb83a15 --- /dev/null +++ b/docs/wiki/How-to-Use-Toolkit.md @@ -0,0 +1,83 @@ +# How to Use the Region Selection Toolkit +_This page is a practical guide for running the Region Selection Toolkit, helping you evaluate and choose the optimal Azure region for your workloads._ + +## Prerequisites +Before using the toolkit, ensure the following prerequisites are met: +- **Azure Subscription Access:** You should have access to the Azure subscription(s) containing the workload you want to analyse. At minimum, read permissions (e.g. **Azure Reader role**) on the relevant resources are required to gather inventory. If analysing a planned deployment (with no existing Azure resources yet), you can skip resource access but will need an Azure Migrate assessment export (see Input Data below). + +- **Environment:** Prepare a PowerShell environment to run the toolkit. The toolkit is implemented in PowerShell scripts, so you can run it on Windows, Linux, or in the Azure Cloud Shell (which comes pre-configured with Azure PowerShell). Ensure you have **PowerShell 7.x (Core)** installed (PowerShell 7.5.1 or later is recommended). + +- **Azure PowerShell Modules:** Install the necessary Azure PowerShell modules (if using Azure Cloud Shell, these are already available). 
The key modules needed include: + + - **Az.Accounts** (for logging in and selecting subscriptions) – version 4.1.0 or later + + - **Az.ResourceGraph** (for inventory queries) – version 1.2.0 or later + + - (Optional) **Az.Monitor** (for any performance metrics, if used) – version 5.2.2 or later + +- **Azure Login:** You must be able to authenticate to Azure. If running locally, use Connect-AzAccount to sign in with your Azure credentials (or ensure your Azure CLI/PowerShell context is already logged in). + +## Installation (Getting the Toolkit) +To obtain the Region Selection Toolkit on your machine or environment: +1. **Download or Clone** the toolkit’s repository (e.g., via Git): The toolkit is provided as a set of scripts in a GitHub repository (e.g. `Azure/AzRegionSelection`). You can clone it using `git clone https://github.com/Azure/AzRegionSelection.git`, or download the repository ZIP and extract it. +2. **Directory Structure:** After retrieval, you should have a directory containing the toolkit scripts. Key sub-folders include `1-Collect`, `2-AvailabilityCheck`, `3-CostInformation`, and `7-Report` (these correspond to different stages of the analysis). It’s important to keep this structure intact. You do **not** need to compile anything – the toolkit is ready to run via PowerShell scripts. + +> [!NOTE] +> Ensure your environment (local machine or Cloud Shell) has network access to Azure endpoints. The toolkit may call Azure APIs (for resource data and pricing information), so an internet connection is required when running it. + +## Input Data: Providing a Workload Inventory +The first step in using the toolkit is to provide an inventory of the workload’s Azure resources. 
The Region Selection Toolkit supports two main input methods for this inventory:
+
+- **A.** **Automatic Inventory via Azure Resource Graph:** If the workload is already deployed in Azure (or you have an existing Azure environment you want to evaluate), the toolkit can automatically collect the resource list. In this case, you’ll run the `1-Collect` script which uses Azure Resource Graph to retrieve all resources in the specified subscription or resource group. This requires the prerequisites above (Azure login and appropriate permissions). You will specify which subscription (or other scope) to query.
+
+- **B.** **Import from Azure Migrate Assessment:** If you are planning a migration (for example, moving on-premises or other cloud workloads to Azure) and have used Azure Migrate to assess your environment, you can use that data as input. First, export the Azure Migrate assessment results (Azure Migrate allows exporting discovered VM and resource metadata to files such as Excel/CSV). Then, the toolkit’s `1-Collect` stage can ingest this file to create an inventory of resources. Ensure the exported data is in a format the toolkit expects (check the toolkit documentation for the exact file format or template required). For instance, you might need to supply a parameter like `-InventoryFile <path-to-export-file>` when running the collection script, pointing to the Azure Migrate output.
+
+
+## Running the Toolkit Step-by-Step
+Once your environment is ready and you have determined the input method, follow these steps to run the Region Selection Toolkit. It’s important to run the stages in order, as each stage uses data from the previous one. The steps below assume you’re using PowerShell:
+
+### 1. Authenticate and Set Context
+Open a PowerShell prompt in the toolkit’s directory. If you’re in Azure Cloud Shell, you can navigate to the folder where you cloned the toolkit.
+- **Log in to Azure:** Run `Connect-AzAccount` if you haven’t already authenticated.
This will open a browser prompt (or use device code flow in Cloud Shell) for Azure login. After logging in, your session is connected to Azure.
+- **Select the target subscription:** If you have multiple subscriptions, ensure the correct one is active. Use `Select-AzSubscription -SubscriptionId <subscription-id>` to switch the context to the subscription that contains the resources you want to analyse. This ensures all subsequent operations run against the intended subscription. (If you have only one subscription or have already set the context, this step is done automatically by Connect-AzAccount.)
+
+### 2. Run 1-Collect (Inventory Collection)
+Next, gather the inventory of resources that will be evaluated:
+- **If using Azure Resource Graph (existing Azure resources):** Run the collection script with your target scope. For example:
+```powershell
+# Run inventory collection for a subscription
+PS> cd 1-Collect
+PS> .\1-Collect.ps1 -SubscriptionId "<subscription-id>" -OutputFile "inventory.json"
+```
+Replace `<subscription-id>` with your Azure Subscription ID (or you might use `-SubscriptionName "Name"` if supported by the script). This will query Azure Resource Graph and collect details of resources in that subscription. The script will likely output the collected inventory to a file (e.g., `inventory.json` or a similar format) or to an in-memory object that subsequent scripts will use. If your scope is a resource group or management group, use the appropriate parameters (check script help by running `Get-Help .\1-Collect.ps1 -Full` for available options).
+
+- **If using an Azure Migrate export:** Ensure the Azure Migrate data file is accessible (for example, copied into the toolkit directory or a known path). Run the collection script with a parameter to import that file.
For example: +```powershell +PS> cd 1-Collect +PS> .\1-Collect.ps1 -ImportFile "MyMigrateExport.csv" -OutputFile "inventory.json" +``` +The script will parse the Azure Migrate assessment data and produce the inventory in the format needed for the next steps. +The output of `1-Collect` is a consolidated list of your workload’s resources and their attributes, which will be used to evaluate region compatibility. Once this step completes, you should have an inventory object or file ready. The console may show a summary (e.g., “100 resources collected from subscription XYZ”). If there are errors (like permission issues or file parse issues), address them before proceeding. + +### 2. Run 2-AvailabilityCheck (Service Availability and Compliance Analysis) +With the inventory in hand, run the second stage to analyse Azure regions against service availability and related factors: +```powershell +PS> cd ..\2-AvailabilityCheck +PS> .\2-AvailabilityCheck.ps1 -Inventory "inventory.json" -Regions "all" +``` +This script takes the inventory (from the previous step) and checks which Azure regions can support those resources. It will likely compare each resource’s Azure service against a global services-by-region list. By default, the toolkit may evaluate **all public Azure regions** or a broad set of regions. You might have the option to limit the regions in consideration (for example, using a `-Regions` parameter as shown, where you could specify a list like `"westeurope,eastus2"` if you only want specific regions evaluated). If unspecified, it defaults to all relevant regions. + +During this step, the toolkit will: +- Identify any regions where a required service is **not available**. Such regions would be marked as incompatible for your workload (or flagged for missing services). 
+- Check for compliance or sovereignty flags (for example, if your inventory includes resources subject to data residency rules, the script might note which regions are in the same geography or meet specific compliance requirements). It may use predefined mappings (e.g., marking government cloud regions, EU regions, etc.). + +The output of this stage is typically an interim analysis which shows, for each region (or each resource vs region matrix), the availability result. This might be saved to an output file like `availability.json` or similar. The details might include lists of any **unsupported services per region** or compliance notes (e.g., “Region X is in Gov cloud, skipped unless specifically needed” as a note). + +### 3. Run 3-CostInformation (Cost Analysis) +Next, run the cost comparison stage to evaluate cost differences across regions: +```powershell +PS> cd ..\3-CostInformation +PS> .\3-CostInformation.ps1 -Inventory "inventory.json" -RegionAnalysis "availability.json" -OutputFile "costs.json" +``` + + From f9e0ac163ac887fba78c7567e99f3fa72b73b325 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 10:54:36 +0200 Subject: [PATCH 03/38] Revise How-to-Use Toolkit documentation Updated the toolkit documentation to clarify prerequisites and running steps. Enhanced instructions for inventory collection and service availability checks. 
--- docs/wiki/How-to-Use-Toolkit.md | 142 ++++++++++++++++++++++++++------ 1 file changed, 116 insertions(+), 26 deletions(-) diff --git a/docs/wiki/How-to-Use-Toolkit.md b/docs/wiki/How-to-Use-Toolkit.md index bb83a15..453de02 100644 --- a/docs/wiki/How-to-Use-Toolkit.md +++ b/docs/wiki/How-to-Use-Toolkit.md @@ -4,6 +4,7 @@ _This page is a practical guide for running the Region Selection Toolkit, helpin ## Prerequisites Before using the toolkit, ensure the following prerequisites are met: - **Azure Subscription Access:** You should have access to the Azure subscription(s) containing the workload you want to analyse. At minimum, read permissions (e.g. **Azure Reader role**) on the relevant resources are required to gather inventory. If analysing a planned deployment (with no existing Azure resources yet), you can skip resource access but will need an Azure Migrate assessment export (see Input Data below). +- To run `3-CostInformation`, ensure that you have **Cost Management Reader access** to all subscriptions in scope. - **Environment:** Prepare a PowerShell environment to run the toolkit. The toolkit is implemented in PowerShell scripts, so you can run it on Windows, Linux, or in the Azure Cloud Shell (which comes pre-configured with Azure PowerShell). Ensure you have **PowerShell 7.x (Core)** installed (PowerShell 7.5.1 or later is recommended). @@ -36,48 +37,137 @@ The first step in using the toolkit is to provide an inventory of the workload ## Running the Toolkit Step-by-Step Once your environment is ready and you have determined the input method, follow these steps to run the Region Selection Toolkit. It’s important to run the stages in order, as each stage uses data from the previous one. The steps below assume you’re using PowerShell: -### 1. Authenticate and Set Context +### Authenticate and Set Context Open a PowerShell prompt in the toolkit’s directory. If you’re in Azure Cloud Shell, you can navigate to the folder where you cloned the toolkit. 
- **Log in to Azure:** Run `Connect-AzAccount` if you haven’t already authenticated. This will open a browser prompt (or use device code flow in Cloud Shell) for Azure login. After logging in, your session is connected to Azure.
- **Select the target subscription:** If you have multiple subscriptions, ensure the correct one is active. Use `Select-AzSubscription -SubscriptionId <subscription-id>` to switch the context to the subscription that contains the resources you want to analyse. This ensures all subsequent operations run against the intended subscription. (If you have only one subscription or have already set the context, this step is done automatically by Connect-AzAccount.)

-### 2. Run 1-Collect (Inventory Collection)
-Next, gather the inventory of resources that will be evaluated:
-- **If using Azure Resource Graph (existing Azure resources):** Run the collection script with your target scope. For example:
+### 1. Run 1-Collect (Inventory Collection)
+Next, gather the inventory of resources that will be evaluated. Run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties for your relevant scope (resource group, subscription, or multiple subscriptions). The script will generate a `resources.json` and a `summary.json` file in the same directory. The `resources.json` file contains the full inventory of resources and their properties, while the `summary.json` file contains a summary of the resources collected.
+
+**If using Azure Resource Graph:** Run the collection script with your target scope.
For example:
+
+- To collect the inventory for a single resource group, run the script as follows:
+
 ```powershell
-# Run inventory collection for a subscription
-PS> cd 1-Collect
-PS> .\1-Collect.ps1 -SubscriptionId "<subscription-id>" -OutputFile "inventory.json"
+Get-AzureServices.ps1 -scopeType resourceGroup -resourceGroupName <resource-group-name> -subscriptionId <subscription-id>
 ```
-Replace `<subscription-id>` with your Azure Subscription ID (or you might use `-SubscriptionName "Name"` if supported by the script). This will query Azure Resource Graph and collect details of resources in that subscription. The script will likely output the collected inventory to a file (e.g., `inventory.json` or a similar format) or to an in-memory object that subsequent scripts will use. If your scope is a resource group or management group, use the appropriate parameters (check script help by running `Get-Help .\1-Collect.ps1 -Full` for available options).

-- **If using an Azure Migrate export:** Ensure the Azure Migrate data file is accessible (for example, copied into the toolkit directory or a known path). Run the collection script with a parameter to import that file. For example:
+- To collect the inventory for a single subscription, run the script as follows:
+
 ```powershell
-PS> cd 1-Collect
-PS> .\1-Collect.ps1 -ImportFile "MyMigrateExport.csv" -OutputFile "inventory.json"
+Get-AzureServices.ps1 -scopeType subscription -subscriptionId <subscription-id>
 ```
-The script will parse the Azure Migrate assessment data and produce the inventory in the format needed for the next steps.
-The output of `1-Collect` is a consolidated list of your workload’s resources and their attributes, which will be used to evaluate region compatibility. Once this step completes, you should have an inventory object or file ready. The console may show a summary (e.g., “100 resources collected from subscription XYZ”). If there are errors (like permission issues or file parse issues), address them before proceeding.

-### 2.
Run 2-AvailabilityCheck (Service Availability and Compliance Analysis)
-With the inventory in hand, run the second stage to analyse Azure regions against service availability and related factors:
+- To collect the inventory for multiple subscriptions, you will need to create a JSON file containing the subscription IDs in scope. See [here](./subscriptions.json) for a sample JSON file. Once the file is created, run the script as follows:
+
 ```powershell
-PS> cd ..\2-AvailabilityCheck
-PS> .\2-AvailabilityCheck.ps1 -Inventory "inventory.json" -Regions "all"
+Get-AzureServices.ps1 -multiSubscription -workloadFile <path-to-subscriptions.json>
 ```
-This script takes the inventory (from the previous step) and checks which Azure regions can support those resources. It will likely compare each resource’s Azure service against a global services-by-region list. By default, the toolkit may evaluate **all public Azure regions** or a broad set of regions. You might have the option to limit the regions in consideration (for example, using a `-Regions` parameter as shown, where you could specify a list like `"westeurope,eastus2"` if you only want specific regions evaluated). If unspecified, it defaults to all relevant regions.

-During this step, the toolkit will:
-- Identify any regions where a required service is **not available**. Such regions would be marked as incompatible for your workload (or flagged for missing services).
-- Check for compliance or sovereignty flags (for example, if your inventory includes resources subject to data residency rules, the script might note which regions are in the same geography or meet specific compliance requirements). It may use predefined mappings (e.g., marking government cloud regions, EU regions, etc.).

+**If using an Azure Migrate export:** Ensure the Azure Migrate data file is accessible.
Run `Get-RessourcesFromAM.ps1` against an Azure Migrate `Assessment.xlsx` file to convert the VM & Disk SKUs into the same output as `Get-AzureServices.ps1`. For example:
-The output of this stage is typically an interim analysis which shows, for each region (or each resource vs region matrix), the availability result. This might be saved to an output file like `availability.json` or similar. The details might include lists of any **unsupported services per region** or compliance notes (e.g., “Region X is in Gov cloud, skipped unless specifically needed” as a note).
+```powershell
+Get-RessourcesFromAM.ps1 -filePath "C:\path\to\Assessment.xlsx" -outputFile "C:\path\to\summary.json"
+```
+> [!NOTE]
+> Before proceeding, make sure that the output files were successfully generated in the `1-Collect` folder with the names `resources.json` and `summary.json`.
+
+### 2. Run 2-AvailabilityCheck (Service Availability)
+After collecting inventory, continue with `2-AvailabilityCheck/Get-AvailabilityInformation.ps1`. This script evaluates the availability of Azure services, resources, and SKUs across different regions. When combined with the output from the `1-Collect` script, it provides a comprehensive overview of potential migration destinations, identifying feasible regions and the reasons for their suitability or limitations, such as availability constraints per region.
+
+It will generate a `services.json` file in the same directory, which contains the availability information for the services in the target region. Note that this functionality is not yet complete and is a work in progress.
+
+Currently, this script associates every resource with its regional availability. Additionally, it maps the following SKUs to the regions where they are supported:
+* microsoft.compute/disks
+* microsoft.compute/virtualmachines
+* microsoft.sql/managedinstances
+* microsoft.sql/servers/databases
+* microsoft.storage/storageaccounts
+
+1.
Navigate to the `2-AvailabilityCheck` folder and run the script using `.\Get-AvailabilityInformation.ps1`. The script will generate report files in the `2-AvailabilityCheck` folder.
+
+#### Per region filter script
+To check the availability of services in a specific region, first run the `Get-AvailabilityInformation.ps1` script, which collects service availability for all regions. The resulting JSON files are then used with the `Get-Region.ps1` script to determine service availability for one or more specific regions, which can eventually be used for reporting. Note that the `Get-AvailabilityInformation.ps1` script only needs to be run once to collect the availability information for all regions, which takes a little while.
+
+After that, you can use the `Get-Region.ps1` script to check the availability of services in specific regions. Availability information is available in the `Availability_Mapping_<region>.json` file, which is generated in the same directory as the script.

-### 3. Run 3-CostInformation (Cost Analysis)
-Next, run the cost comparison stage to evaluate cost differences across regions:
 ```powershell
+Get-AvailabilityInformation.ps1
+# Wait for the script to complete, this may take a while.
+Get-Region.ps1 -region <region>
+# Example1: Get-Region.ps1 -region "east us"
+# Example2: Get-Region.ps1 -region "west us"
+# Example3: Get-Region.ps1 -region "sweden central"
 ```
-PS> cd ..\3-CostInformation
-PS> .\3-CostInformation.ps1 -Inventory "inventory.json" -RegionAnalysis "availability.json" -OutputFile "costs.json"
-```
+
+### 3. Run 3-CostInformation (Cost Analysis)
+This script uses a public API to compare cost between the existing resource region and one or more target regions. For this we use the Microsoft.CostManagement provider of each subscription. It will query the cost information for the resources collected in the previous step, compare cost differences of the regions in scope, and generate a `cost.json` file in the same directory.
Note that this is just standard pricing, which means customer discounts are **not** included.
+
+The input file is `resources.json` produced by the `1-Collect` script.
+
+1. Requires the Az.CostManagement module version 0.4.2:
+`PS> Install-Module -Name Az.CostManagement`
+
+2. Navigate to the `3-CostInformation` folder and run the script using `.\Get-CostInformation.ps1`. The script will generate a CSV file in the current folder.
+
+#### Perform-RegionComparison.ps1
+
+This script builds on the collection step by comparing pricing across Azure regions for the meter IDs retrieved earlier.
+The Azure public pricing API is used, meaning that:
+- No login is needed for this step
+- Prices are *not* customer-specific, but are only used to calculate the relative cost difference between regions for each meter
+
+As customer discounts tend to be linear (for example, ACD is a flat-rate discount across all PAYG Azure spend), the relative price difference between regions can still be used to make an intelligent estimate of the cost impact of a workload move.
+
+Instructions for use:
+
+1. Prepare a list of target regions for comparison. This can be provided at the command line or stored in a variable before calling the script.
+2. Ensure the `resources.json` file is present (from running the collector script).
+3. Run the script using `.\Perform-RegionComparison.ps1`. The script will generate output files in the current folder.
For example:
```powershell
$regions = @("eastus", "brazilsouth", "australiaeast")
.\Perform-RegionComparison.ps1 -regions $regions -outputType json
```

From f6dab3f02860ee0c386a06e1c604a47d7b5645d5 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 10:59:10 +0200
Subject: [PATCH 04/38] Improve documentation for 3-CostInformation script

Clarified the explanation of the cost analysis scripts and their functionality.

---
 docs/wiki/How-to-Use-Toolkit.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/docs/wiki/How-to-Use-Toolkit.md b/docs/wiki/How-to-Use-Toolkit.md
index 453de02..2f74f51 100644
--- a/docs/wiki/How-to-Use-Toolkit.md
+++ b/docs/wiki/How-to-Use-Toolkit.md
@@ -102,7 +102,9 @@ Get-Region.ps1 -region <region>
 ```
 ### 3. Run 3-CostInformation (Cost Analysis)
-This script uses a public API to compare cost between the existing resource region and one or more target regions. For this we use the Microsoft.CostManagement provider of each subscription. It will query the cost information for the resources collected in the previous step, compare cost differences of the regions in scope, and generate a `cost.json` file in the same directory. Note that this is just standard pricing, which means customer discounts are **not** included.
+This step contains two scripts: one that retrieves cost information about the resources in scope, and a second that uses a public API to compare cost between the existing resource region and one or more target regions.
+
+For this we use the Microsoft.CostManagement provider of each subscription. It will query the cost information for the resources collected in the previous step, compare cost differences of the regions in scope, and generate a `cost.json` file in the same directory. Note that this is just standard pricing, which means customer discounts are **not** included.
The input file is `resources.json` produced by the `1-Collect` script.

From 2d7f6d1af3c30e2c4c02ec7a5c6b06aaad2bb234 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 11:23:04 +0200
Subject: [PATCH 05/38] Create Setup-and-Prerequisites.md

---
 docs/wiki/Setup-and-Prerequisites.md | 1 +
 1 file changed, 1 insertion(+)
 create mode 100644 docs/wiki/Setup-and-Prerequisites.md

diff --git a/docs/wiki/Setup-and-Prerequisites.md b/docs/wiki/Setup-and-Prerequisites.md
new file mode 100644
index 0000000..8b13789
--- /dev/null
+++ b/docs/wiki/Setup-and-Prerequisites.md
@@ -0,0 +1 @@
+

From 8b3277d1dd81df51bfb28543a0287585331b48db Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 11:34:39 +0200
Subject: [PATCH 06/38] Document setup and prerequisites for toolkit

Added setup and prerequisites information for the Region Selection Toolkit.

---
 docs/wiki/Setup-and-Prerequisites.md | 30 ++++++++++++++++++++++++++++
 1 file changed, 30 insertions(+)

diff --git a/docs/wiki/Setup-and-Prerequisites.md b/docs/wiki/Setup-and-Prerequisites.md
index 8b13789..0b55577 100644
--- a/docs/wiki/Setup-and-Prerequisites.md
+++ b/docs/wiki/Setup-and-Prerequisites.md
@@ -1 +1,31 @@
+# Getting Started
+
+_This page is a guide for setup and prerequisites needed before running the Region Selection Toolkit._
+
+## Prerequisites
+Before using the toolkit, ensure the following prerequisites are met:
+- **Azure Subscription Access:** You should have access to the Azure subscription(s) containing the workload you want to analyse. At minimum, read permissions (e.g. **Azure Reader role**) on the relevant resources are required to gather inventory. If analysing a planned deployment (with no existing Azure resources yet), you can skip resource access but will need an Azure Migrate assessment export (see Input Data below).
+- To run `3-CostInformation`, ensure that you have **Cost Management Reader access** to all subscriptions in scope.
+
+- **Environment:** Prepare a PowerShell environment to run the toolkit. The toolkit is implemented in PowerShell scripts, so you can run it on Windows, Linux, or in the Azure Cloud Shell. Ensure you have **PowerShell Core 7.5.1** or later installed.
+
+- **Azure PowerShell Modules:** Install the necessary Azure PowerShell modules (if using Azure Cloud Shell, these are already available).
+
+  - Azure PowerShell module `Az.ResourceGraph 1.2.0` or later
+  - Azure PowerShell module `Az.Accounts 4.1.0` or later
+
+  If using Azure Migrate as the input file:
+  - Azure PowerShell module `Az.Monitor 5.2.2` or later
+  - Azure PowerShell `ImportExcel` module for the Azure Migrate script
+
+- **Azure Login:** You must be able to authenticate to Azure. If running locally, use `Connect-AzAccount` to sign in with your Azure credentials.
+
+## Installation (Getting the Toolkit)
+To obtain the Region Selection Toolkit on your machine or environment:
+1. **Download or Clone** the toolkit’s repository (e.g., via Git): The toolkit is provided as a set of scripts in a GitHub repository (e.g. `Azure/AzRegionSelection`). You can clone it using `git clone https://github.com/Azure/AzRegionSelection.git`, or download the repository ZIP and extract it.
+2. **Directory Structure:** After retrieval, you should have a directory containing the toolkit scripts. Key sub-folders include `1-Collect`, `2-AvailabilityCheck`, `3-CostInformation`, and `7-Report` (these correspond to different stages of the analysis). It’s important to keep this structure intact. You do **not** need to compile anything – the toolkit is ready to run via PowerShell scripts.
+
+> [!NOTE]
+> Ensure your environment (local machine or Cloud Shell) has network access to Azure endpoints. The toolkit may call Azure APIs (for resource data and pricing information), so an internet connection is required when running it.
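The module prerequisites above can be checked and installed with a few lines of PowerShell; a minimal sketch (the module names and minimum versions are the ones listed above):

```powershell
#Requires -Version 7.5
# Install the Azure PowerShell modules listed above if they are missing or outdated.
$required = @{
    'Az.Accounts'      = '4.1.0'
    'Az.ResourceGraph' = '1.2.0'
}
foreach ($name in $required.Keys) {
    $minVersion = [version]$required[$name]
    $have = Get-Module -ListAvailable -Name $name |
        Where-Object { $_.Version -ge $minVersion }
    if (-not $have) {
        Install-Module -Name $name -MinimumVersion $minVersion -Scope CurrentUser
    }
}
Connect-AzAccount   # sign in before running the toolkit
```

If you use the Azure Migrate input path, add `Az.Monitor` and `ImportExcel` to the table in the same way.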
+ From f617048a92662228ff5811358bf64ba951008d90 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 11:39:47 +0200 Subject: [PATCH 07/38] Document input data methods for workload inventory Added instructions for providing a workload inventory using automatic collection and import methods. --- docs/wiki/Setup-and-Prerequisites.md | 5 +++++ 1 file changed, 5 insertions(+) diff --git a/docs/wiki/Setup-and-Prerequisites.md b/docs/wiki/Setup-and-Prerequisites.md index 0b55577..7ffce8b 100644 --- a/docs/wiki/Setup-and-Prerequisites.md +++ b/docs/wiki/Setup-and-Prerequisites.md @@ -28,4 +28,9 @@ To obtain the Region Selection Toolkit on your machine or environment: > [!NOTE] > Ensure your environment (local machine or Cloud Shell) has network access to Azure endpoints. The toolkit may call Azure APIs (for resource data and pricing information), so an internet connection is required when running it. +## Input Data: Providing a Workload Inventory +The first step in using the toolkit is to provide an inventory of the workload’s Azure resources. The Region Selection Toolkit supports two main input methods for this inventory: +**A.** **Automatic Inventory via Azure Resource Graph:** If the workload is already deployed in Azure, the toolkit can automatically collect the resource list. In this case, you’ll run the `1-Collect` script which uses Azure Resource Graph to retrieve all resources in the specified subscription or resource group. This requires the prerequisites above (Azure login and appropriate permissions). You will specify which subscription (or other scope) to query. + +**B.** **Import from Azure Migrate Assessment:** If you are planning a migration (for example, moving on-premises or other cloud workloads to Azure) and have used Azure Migrate to assess your environment, you can use that data as input. 
First, export the Azure Migrate assessment results (Azure Migrate allows exporting discovered VM and resource metadata to files such as Excel/CSV). Then, the toolkit’s `1-Collect` stage can ingest this file to create an inventory of resources. Ensure the exported data is in a format the toolkit expects (check the toolkit documentation for the exact file format or template required). From d6f7aa9c68a9b2bd60ac6bc733b189545fb1a9d0 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 13:38:53 +0200 Subject: [PATCH 08/38] Improve formatting of setup prerequisites section Formatted the prerequisites section for clarity and added indentation for better readability. --- docs/wiki/Setup-and-Prerequisites.md | 6 ++++-- 1 file changed, 4 insertions(+), 2 deletions(-) diff --git a/docs/wiki/Setup-and-Prerequisites.md b/docs/wiki/Setup-and-Prerequisites.md index 7ffce8b..fbc999b 100644 --- a/docs/wiki/Setup-and-Prerequisites.md +++ b/docs/wiki/Setup-and-Prerequisites.md @@ -4,8 +4,9 @@ _This page is a guide for setup and prerequisites needed bafore running the Regi ## Prerequisites Before using the toolkit, ensure the following prerequisites are met: -- **Azure Subscription Access:** You should have access to the Azure subscription(s) containing the workload you want to analyse. At minimum, read permissions (e.g. **Azure Reader role**) on the relevant resources are required to gather inventory. If analysing a planned deployment (with no existing Azure resources yet), you can skip resource access but will need an Azure Migrate assessment export (see Input Data below). -- To run `3-CostInformation`, ensure that you have **Cost Management Reader access** to all subscriptions in scope. +- **Azure Subscription Access:** + - You should have access to the Azure subscription(s) containing the workload you want to analyse. At minimum, read permissions (e.g. 
**Azure Reader role**) on the relevant resources are required to gather inventory. If analysing a planned deployment (with no existing Azure resources yet), you can skip resource access but will need an Azure Migrate assessment export (see Input Data below). + - To run `3-CostInformation`, ensure that you have **Cost Management Reader access** to all subscriptions in scope. - **Environment:** Prepare a PowerShell environment to run the toolkit. The toolkit is implemented in PowerShell scripts, so you can run it on Windows, Linux, or in the Azure Cloud Shell. Ensure you have **PowerShell Core 7.5.1** or later installed. @@ -23,6 +24,7 @@ Before using the toolkit, ensure the following prerequisites are met: ## Installation (Getting the Toolkit) To obtain the Region Selection Toolkit on your machine or environment: 1. **Download or Clone** the toolkit’s repository (e.g., via Git): The toolkit is provided as a set of scripts in a GitHub repository (e.g. `Azure/AzRegionSelection`). You can clone it using `git clone https://github.com/Azure/AzRegionSelection.git`, or download the repository ZIP and extract it. + 2. **Directory Structure:** After retrieval, you should have a directory containing the toolkit scripts. Key sub-folders include `1-Collect`, `2-AvailabilityCheck`, `3-CostInformation`, and `7-Report` (these correspond to different stages of the analysis). It’s important to keep this structure intact. You do **not** need to compile anything – the toolkit is ready to run via PowerShell scripts. > [!NOTE] From cf37fd041c0d0e7929d77c186411b0e44bd7c1e5 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 13:39:30 +0200 Subject: [PATCH 09/38] Update Setup-and-Prerequisites.md to remove note Removed note about network access to Azure endpoints. 
--- docs/wiki/Setup-and-Prerequisites.md | 3 --- 1 file changed, 3 deletions(-) diff --git a/docs/wiki/Setup-and-Prerequisites.md b/docs/wiki/Setup-and-Prerequisites.md index fbc999b..02b9e96 100644 --- a/docs/wiki/Setup-and-Prerequisites.md +++ b/docs/wiki/Setup-and-Prerequisites.md @@ -27,9 +27,6 @@ To obtain the Region Selection Toolkit on your machine or environment: 2. **Directory Structure:** After retrieval, you should have a directory containing the toolkit scripts. Key sub-folders include `1-Collect`, `2-AvailabilityCheck`, `3-CostInformation`, and `7-Report` (these correspond to different stages of the analysis). It’s important to keep this structure intact. You do **not** need to compile anything – the toolkit is ready to run via PowerShell scripts. -> [!NOTE] -> Ensure your environment (local machine or Cloud Shell) has network access to Azure endpoints. The toolkit may call Azure APIs (for resource data and pricing information), so an internet connection is required when running it. - ## Input Data: Providing a Workload Inventory The first step in using the toolkit is to provide an inventory of the workload’s Azure resources. The Region Selection Toolkit supports two main input methods for this inventory: From 4621cb9ab0237c841cdfd0e5be9e0c6f1647ef6b Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 13:44:32 +0200 Subject: [PATCH 10/38] Enhance setup documentation with Azure Migrate import Added instructions for importing Azure Migrate assessment data. 
--- docs/wiki/Setup-and-Prerequisites.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/wiki/Setup-and-Prerequisites.md b/docs/wiki/Setup-and-Prerequisites.md index 02b9e96..d61a47b 100644 --- a/docs/wiki/Setup-and-Prerequisites.md +++ b/docs/wiki/Setup-and-Prerequisites.md @@ -33,3 +33,5 @@ The first step in using the toolkit is to provide an inventory of the workload **A.** **Automatic Inventory via Azure Resource Graph:** If the workload is already deployed in Azure, the toolkit can automatically collect the resource list. In this case, you’ll run the `1-Collect` script which uses Azure Resource Graph to retrieve all resources in the specified subscription or resource group. This requires the prerequisites above (Azure login and appropriate permissions). You will specify which subscription (or other scope) to query. **B.** **Import from Azure Migrate Assessment:** If you are planning a migration (for example, moving on-premises or other cloud workloads to Azure) and have used Azure Migrate to assess your environment, you can use that data as input. First, export the Azure Migrate assessment results (Azure Migrate allows exporting discovered VM and resource metadata to files such as Excel/CSV). Then, the toolkit’s `1-Collect` stage can ingest this file to create an inventory of resources. Ensure the exported data is in a format the toolkit expects (check the toolkit documentation for the exact file format or template required). + +## Next Up: [How to use Region Selection Toolkit](Step-by-Step-Guide.md) From 5781642e076c1d45affc88de6b139c37c869240e Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 13:59:33 +0200 Subject: [PATCH 11/38] Revise Step-by-Step Guide for Region Selection Toolkit Updated the Step-by-Step Guide for the Region Selection Toolkit, including prerequisites, installation instructions, and running the toolkit. Enhanced clarity on input data methods and script execution. 
--- ...o-Use-Toolkit.md => Step-by-Step-Guide.md} | 58 +++++-------------- 1 file changed, 14 insertions(+), 44 deletions(-) rename docs/wiki/{How-to-Use-Toolkit.md => Step-by-Step-Guide.md} (50%) diff --git a/docs/wiki/How-to-Use-Toolkit.md b/docs/wiki/Step-by-Step-Guide.md similarity index 50% rename from docs/wiki/How-to-Use-Toolkit.md rename to docs/wiki/Step-by-Step-Guide.md index 2f74f51..225c940 100644 --- a/docs/wiki/How-to-Use-Toolkit.md +++ b/docs/wiki/Step-by-Step-Guide.md @@ -1,51 +1,21 @@ # How to Use the Region Selection Toolkit -_This page is a practical guide for running the Region Selection Toolkit, helping you evaluate and choose the optimal Azure region for your workloads._ - -## Prerequisites -Before using the toolkit, ensure the following prerequisites are met: -- **Azure Subscription Access:** You should have access to the Azure subscription(s) containing the workload you want to analyse. At minimum, read permissions (e.g. **Azure Reader role**) on the relevant resources are required to gather inventory. If analysing a planned deployment (with no existing Azure resources yet), you can skip resource access but will need an Azure Migrate assessment export (see Input Data below). -- To run `3-CostInformation`, ensure that you have **Cost Management Reader access** to all subscriptions in scope. - -- **Environment:** Prepare a PowerShell environment to run the toolkit. The toolkit is implemented in PowerShell scripts, so you can run it on Windows, Linux, or in the Azure Cloud Shell (which comes pre-configured with Azure PowerShell). Ensure you have **PowerShell 7.x (Core)** installed (PowerShell 7.5.1 or later is recommended). - -- **Azure PowerShell Modules:** Install the necessary Azure PowerShell modules (if using Azure Cloud Shell, these are already available). 
The key modules needed include: - - - **Az.Accounts** (for logging in and selecting subscriptions) – version 4.1.0 or later - - - **Az.ResourceGraph** (for inventory queries) – version 1.2.0 or later - - - (Optional) **Az.Monitor** (for any performance metrics, if used) – version 5.2.2 or later -- **Azure Login:** You must be able to authenticate to Azure. If running locally, use Connect-AzAccount to sign in with your Azure credentials (or ensure your Azure CLI/PowerShell context is already logged in). - -## Installation (Getting the Toolkit) -To obtain the Region Selection Toolkit on your machine or environment: -1. **Download or Clone** the toolkit’s repository (e.g., via Git): The toolkit is provided as a set of scripts in a GitHub repository (e.g. `Azure/AzRegionSelection`). You can clone it using `git clone https://github.com/Azure/AzRegionSelection.git`, or download the repository ZIP and extract it. -2. **Directory Structure:** After retrieval, you should have a directory containing the toolkit scripts. Key sub-folders include `1-Collect`, `2-AvailabilityCheck`, `3-CostInformation`, and `7-Report` (these correspond to different stages of the analysis). It’s important to keep this structure intact. You do **not** need to compile anything – the toolkit is ready to run via PowerShell scripts. - -> [!NOTE] -> Ensure your environment (local machine or Cloud Shell) has network access to Azure endpoints. The toolkit may call Azure APIs (for resource data and pricing information), so an internet connection is required when running it. - -## Input Data: Providing a Workload Inventory -The first step in using the toolkit is to provide an inventory of the workload’s Azure resources. 
The Region Selection Toolkit supports two main input methods for this inventory:
-
-- **A.** **Automatic Inventory via Azure Resource Graph:** If the workload is already deployed in Azure (or you have an existing Azure environment you want to evaluate), the toolkit can automatically collect the resource list. In this case, you’ll run the `1-Collect` script which uses Azure Resource Graph to retrieve all resources in the specified subscription or resource group. This requires the prerequisites above (Azure login and appropriate permissions). You will specify which subscription (or other scope) to query.
-
-- **B.** **Import from Azure Migrate Assessment:** If you are planning a migration (for example, moving on-premises or other cloud workloads to Azure) and have used Azure Migrate to assess your environment, you can use that data as input. First, export the Azure Migrate assessment results (Azure Migrate allows exporting discovered VM and resource metadata to files such as Excel/CSV). Then, the toolkit’s `1-Collect` stage can ingest this file to create an inventory of resources. Ensure the exported data is in a format the toolkit expects (check the toolkit documentation for the exact file format or template required). For instance, you might need to supply a parameter like `-InventoryFile ` when running the collection script, pointing to the Azure Migrate output.
+_This page is a practical guide for running the Region Selection Toolkit, helping you evaluate and choose the optimal Azure region for your workloads._
+Before proceeding with this Step-by-Step Guide, make sure you’ve completed the prerequisites and initial setup in [Getting Started](Setup-and-Prerequisites.md).

## Running the Toolkit Step-by-Step
Once your environment is ready and you have determined the input method, follow these steps to run the Region Selection Toolkit. It’s important to run the stages in order, as each stage uses data from the previous one.
The steps below assume you’re using PowerShell:

### Authenticate and Set Context
Open a PowerShell prompt in the toolkit’s directory. If you’re in Azure Cloud Shell, you can navigate to the folder where you cloned the toolkit.
-- **Log in to Azure:** Run `Connect-AzAccount` if you haven’t already authenticated. This will open a browser prompt (or use device code flow in Cloud Shell) for Azure login. After logging in, your session is connected to Azure.
-- **Select the target subscription:** If you have multiple subscriptions, ensure the correct one is active. Use `Select-AzSubscription` `-SubscriptionId ` to switch the context to the subscription that contains the resources you want to analyse. This ensures all subsequent operations run against the intended subscription. (If you have only one subscription or have already set the context, this step is done automatically by Connect-AzAccount.)
+- **Log in to Azure:** Run `Connect-AzAccount` to authenticate to Azure.
+- **Select the target subscription(s):** If you have multiple subscriptions, ensure the correct one is active. Use `Select-AzSubscription` `-SubscriptionId ` to switch the context to the subscription that contains the resources you want to analyse. This ensures all subsequent operations run against the intended subscription.

### 1. Run 1-Collect (Inventory Collection)
Next, gather the inventory of resources that will be evaluated. Run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties, for your relevant scope (resource group, subscription or multiple subscriptions). The script will generate a `resources.json` and a `summary.json` file in the same directory. The `resources.json` file contains the full inventory of resources and their properties, while the `summary.json` file contains a summary of the resources collected.

-**If using Azure Resource Graph:** Run the collection script with your target scope.
For example:
+**If using Azure Resource Graph:** Run the `Get-AzureServices.ps1` script with your target scope. For example:

- To collect the inventory for a single resource group, run the script as follows:

@@ -65,25 +35,25 @@ Get-AzureServices.ps1 -scopeType subscription -subscriptionId
Get-AzureServices.ps1 -multiSubscription -workloadFile
```

-**If using an Azure Migrate export:** Ensure the Azure Migrate data file is accessible. Run `Get-RessourcesFromAM.ps1` against an Azure Migrate `Assessment.xlsx` file to convert the VM & Disk SKUs into the same output as `Get-AzureServices.ps1` For example:
+**If using an Azure Migrate export:** Run `Get-RessourcesFromAM.ps1` against an Azure Migrate `Assessment.xlsx` file to convert the VM & Disk SKUs into the same output as `Get-AzureServices.ps1`. For example:

```powershell
Get-RessourcesFromAM.ps1 -filePath "C:\path\to\Assessment.xlsx" -outputFile "C:\path\to\summary.json"
```
> [!NOTE]
-> Before proceeding, get sure that the output files are successful generated in the `1-Collect` folder with the name `resources.json` as well as `summary.json`.
+> Before proceeding, make sure that the output files are successfully generated in the `1-Collect` folder with the name `resources.json` as well as `summary.json`.

### 2. Run 2-AvailabilityCheck (Service Availability)
-After collecting inventory, continue with `2-AvailabilityCheck/Get-AvailabilityInformation.ps1`. This script evaluates the availability of Azure services, resources, and SKUs across different regions.
When combined with the output `resources.json`, it provides a comprehensive overview of potential migration destinations, identifying feasible regions based on Service Availability. Note that this functionality is not yet complete and is a work in progress. -It will generate a `services.json` file in the same directory, which contains the availability information for the services in the target region. Note that this functionality is not yet complete and is a work in progress. +It will generate a `services.json` file in the same directory, which contains the availability information for the services in the target region. Currently, this script associates every resource with its regional availability. Additionally, it maps the following SKUs to the regions where they are supported: -* microsoft.compute/disks -* microsoft.compute/virtualmachines -* microsoft.sql/managedinstances -* microsoft.sql/servers/databases -* microsoft.storage/storageaccounts +- microsoft.compute/disks +- microsoft.compute/virtualmachines +- microsoft.sql/managedinstances +- microsoft.sql/servers/databases +- microsoft.storage/storageaccounts 1. Navigate to the `2-AvailabilityCheck` folder and run the script using `.\Get-AvailabilityInformation.ps1`. The script will generate report files in the `2-AvailabilityCheck` folder. From cc7763aafb228ad87f13d8814f45a3d20bc78145 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 21:41:20 +0200 Subject: [PATCH 12/38] Refine Step-by-Step Guide for Azure scripts Updated the Step-by-Step Guide with clearer instructions and corrected typos. Enhanced explanations for the scripts and their outputs. 
--- docs/wiki/Step-by-Step-Guide.md | 58 +++++++++++++++------------------ 1 file changed, 26 insertions(+), 32 deletions(-) diff --git a/docs/wiki/Step-by-Step-Guide.md b/docs/wiki/Step-by-Step-Guide.md index 225c940..ebc6e40 100644 --- a/docs/wiki/Step-by-Step-Guide.md +++ b/docs/wiki/Step-by-Step-Guide.md @@ -17,6 +17,12 @@ Next, gather the inventory of resources that will be evaluated. Run the script ` **If using Azure Resource Graph:** Run the `Get-AzureServices.ps1` script with your target scope. For example: +- To include Cost Information add parameter `-includeCost $true`. If you include this parameter, it will also generate a CSV file in the same directory. This CSV file can be used later in `3-CostInformation`. Note: This might take some time depending on how long it takes to download the cost information. + +```powershell +Get-AzureServices.ps1 -includeCost $true +``` + - To collect the inventory for a single resource group, run the script as follows: ```powershell @@ -44,24 +50,30 @@ Get-RessourcesFromAM.ps1 -filePath "C:\path\to\Assessment.xlsx" -outputFile "C:\ > Before proceeding, make sure that the output files are successful generated in the `1-Collect` folder with the name `resources.json` as well as `summary.json`. ### 2. Run 2-AvailabilityCheck (Service Availability) -After collecting inventory, continue with `2-AvailabilityCheck/Get-AvailabilityInformation.ps1`. This script evaluates the availability of Azure services, resources, and SKUs across different regions. When combined with the output `resources.json`, it provides a comprehensive overview of potential migration destinations, identifying feasible regions based on Service Availability. Note that this functionality is not yet complete and is a work in progress. - -It will generate a `services.json` file in the same directory, which contains the availability information for the services in the target region. +After collecting inventory, continue with `Get-AvailabilityInformation.ps1`. 
This script evaluates the availability of Azure services, resources, and SKUs across all regions. When combined with the output from the 1-Collect script, it provides a comprehensive overview of potential migration destinations, identifying feasible regions and the reasons for their suitability or limitations, such as availability constraints per region. -Currently, this script associates every resource with its regional availability. Additionally, it maps the following SKUs to the regions where they are supported: +Note that this functionality is not yet complete and is a work in progress. Currently, this script associates every resource with its regional availability. Additionally, it maps the following SKUs to the regions where they are supported: - microsoft.compute/disks - microsoft.compute/virtualmachines - microsoft.sql/managedinstances - microsoft.sql/servers/databases - microsoft.storage/storageaccounts -1. Navigate to the `2-AvailabilityCheck` folder and run the script using `.\Get-AvailabilityInformation.ps1`. The script will generate report files in the `2-AvailabilityCheck` folder. +The `Get-AvailabilityInformation.ps1` script only needs to be run once to collect the availability information for all regions, which takes a little while. Run the following script: -#### Per region filter script -To check the availability of services in a specific region, it is necessary to first run the `Get-AvailabilityInformation.ps1` script which will collect service availability in all regions. The resulting json files is then used with the `Get-Region.ps1` script to determine specific service availability for one or more regions to be used for reporting eventually. Note that the `Get-AvailabilityInformation.ps1` script only needs to be run once to collect the availability information for all regions, which takes a little while. 
+```powershell
+Get-AvailabilityInformation.ps1
+```
+It will generate a number of JSON files in the same directory; the important one is `Availability_Mapping.json`.

-After that, you can use the`Get-Region.ps1` script to check the availability of services in specific regions. Availability information is available in the `Availability_Mapping_.json` file, which is generated in the same directory as the script.
+To check the availability of the resources in scope in a specific region, run the following script:
+
+```powershell
+Get-Region.ps1 -Region
+```
+This will generate `Availability_Mapping_.json` in the same directory.
+Example:
```powershell
Get-AvailabilityInformation.ps1
# Wait for the script to complete, this may take a while.
@@ -72,38 +84,20 @@ Get-Region.ps1 -region
```
### 3. Run 3-CostInformation (Cost Analysis)
-This step contains two scripts: one that retrieves cost information about the resources in scope, and a second that uses a public API to compare cost between the existing resource region and one or more target regions.
-
-For this we use the Microsoft.CostManagement provider of each subscription. It will query the cost information for the resources collected in the previous step, compare cost differences of the regions in scope, and generate a `cost.json` file in the same directory. Note that this is just standard pricing, which means customer discounts are **not** included.
-
-The input file is `resources.json` produced by the `1-Collect` script.
-
-1. Requires Az.CostManagement module version 0.4.2.
-`PS1> Install-Module -Name Az.CostManagement`
-
-2. Navigate to the `3-CostInformation` folder and run the script using `.\Get-CostInformation.ps1`. The script will generate a CSV file in the current folder.
-#### Perform-RegionComparison.ps1
+The Azure public pricing API is used, meaning that prices are **not** customer-specific; they are only used to calculate the relative cost difference between regions for each meter ID.
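The reason retail prices are still useful even though customer discounts are excluded is that a flat-rate discount cancels out of the relative comparison. A quick sketch with made-up numbers:

```powershell
# Made-up numbers: a flat-rate discount does not change the relative
# price difference between two regions.
$retailSource = 100.0   # monthly retail cost in the origin region
$retailTarget = 112.0   # monthly retail cost in the target region
$discount     = 0.15    # hypothetical 15% flat agreement discount

$relativeRetail     = ($retailTarget - $retailSource) / $retailSource
$relativeDiscounted = ($retailTarget * (1 - $discount) - $retailSource * (1 - $discount)) /
                      ($retailSource * (1 - $discount))

# Both expressions evaluate to 0.12 (+12%): the (1 - $discount) factor
# cancels, so retail prices suffice to estimate the relative cost impact.
$relativeRetail
$relativeDiscounted
```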
-This script builds on the collection step by comparing pricing across Azure regions for the meter ID's retrieved earlier.
-The Azure public pricing API is used, meaning that:
-- No login is needed for this step
-- Prices are *not* customer-specific, but are only used to calculate the relative cost difference between regions for each meter
-
-As customer discounts tend to be linear (for example, ACD is a flat rate discount across all PAYG Azure spend), the relative price difference between regions can still be used to make an intelligent estimate of the cost impact of a workload move.
-
-Instructions for use:
-
-1. Prepare a list of target regions for comparison. This can be provided at the command line or stored in a variable before calling the script.
-2. Ensure the `resources.json` file is present (from the running of the collector script).
-2. Run the script using `.\Perform-RegionComparison.ps1`. The script will generate output files in the current folder.
+Navigate to the `3-CostInformation` folder and run the `Perform-RegionComparison.ps1` script to perform a cost comparison against the target region(s). For example:

``` text
$regions = @("eastus", "brazilsouth", "australiaeast")
-.\Perform-RegionComparison.ps1 -regions $regions -outputType json
+.\Perform-RegionComparison.ps1 -regions $regions -outputFormat json -reso
```
+This will generate a `region_comparison_RegionComparison.json` file.
+

From e3aea9a75d31cfb44953e7ee5d81833e112269a8 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 21:46:16 +0200
Subject: [PATCH 13/38] Revise setup instructions for PowerShell and Azure modules

Updated the prerequisites for running the toolkit, including PowerShell version and Azure PowerShell modules.
---
 docs/wiki/Setup-and-Prerequisites.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/docs/wiki/Setup-and-Prerequisites.md b/docs/wiki/Setup-and-Prerequisites.md
index d61a47b..0813054 100644
--- a/docs/wiki/Setup-and-Prerequisites.md
+++ b/docs/wiki/Setup-and-Prerequisites.md
@@ -10,11 +10,12 @@
- **Environment:** Prepare a PowerShell environment to run the toolkit. The toolkit is implemented in PowerShell scripts, so you can run it on Windows, Linux, or in the Azure Cloud Shell. Ensure you have **PowerShell Core 7.5.1** or later installed.

-- **Azure PowerShell Modules:** Install the necessary Azure PowerShell modules (if using Azure Cloud Shell, these are already available).
+- **Azure PowerShell Modules:** Install the necessary Azure PowerShell modules.

 - Azure PowerShell module `Az.ResourceGraph 1.2.0` or later
 - Azure PowerShell module `Az.Accounts 4.1.0` or later
- 
+ - Azure PowerShell module `Az.CostManagement 0.4.2` or later
+ 
 If using Azure Migrate as the input file:
 - Azure PowerShell module `Az.Monitor 5.2.2` or later
 - Azure PowerShell `ImportExcel` module for the Azure Migrate script

From 90529e3ef044546d169d0e4f0d26cf62df556245 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 22:13:29 +0200
Subject: [PATCH 14/38] Refine headings and clarify report generation steps

Updated section headings and improved clarity in instructions regarding output files and report generation.

---
 docs/wiki/Step-by-Step-Guide.md | 39 +++++++++++++++++++++++++++++----
 1 file changed, 35 insertions(+), 4 deletions(-)

diff --git a/docs/wiki/Step-by-Step-Guide.md b/docs/wiki/Step-by-Step-Guide.md
index ebc6e40..2c4b10b 100644
--- a/docs/wiki/Step-by-Step-Guide.md
+++ b/docs/wiki/Step-by-Step-Guide.md
@@ -12,7 +12,7 @@ Open a PowerShell prompt in the toolkit’s directory.
If you’re in Azure Cloud Shell, you can navigate to the folder where you cloned the toolkit.
- **Log in to Azure:** Run `Connect-AzAccount` to authenticate to Azure.
- **Select the target subscription(s):** If you have multiple subscriptions, ensure the correct one is active. Use `Select-AzSubscription` `-SubscriptionId ` to switch the context to the subscription that contains the resources you want to analyse. This ensures all subsequent operations run against the intended subscription.

-### 1. Run 1-Collect (Inventory Collection)
+## 1. Run 1-Collect (Inventory Collection)
Next, gather the inventory of resources that will be evaluated. Run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties, for your relevant scope (resource group, subscription or multiple subscriptions). The script will generate a `resources.json` and a `summary.json` file in the same directory. The `resources.json` file contains the full inventory of resources and their properties, while the `summary.json` file contains a summary of the resources collected.

**If using Azure Resource Graph:** Run the `Get-AzureServices.ps1` script with your target scope. For example:

@@ -47,9 +47,9 @@ Get-AzureServices.ps1 -multiSubscription -workloadFile
Get-RessourcesFromAM.ps1 -filePath "C:\path\to\Assessment.xlsx" -outputFile "C:\path\to\summary.json"
```
> [!NOTE]
-> Before proceeding, make sure that the output files are successfully generated in the `1-Collect` folder with the name `resources.json` as well as `summary.json`.
+> Before proceeding, make sure that the output files are successfully generated in the `1-Collect` folder with the names `resources.json` and `summary.json`, plus a `CSV file` if cost was included.

-### 2. Run 2-AvailabilityCheck (Service Availability)
+## 2. Run 2-AvailabilityCheck (Service Availability)
After collecting inventory, continue with `Get-AvailabilityInformation.ps1`.
When combined with the output from the 1-Collect script, it provides a comprehensive overview of potential migration destinations, identifying feasible regions and the reasons for their suitability or limitations, such as availability constraints per region.
 
 Note that this functionality is not yet complete and is a work in progress. Currently, this script associates every resource with its regional availability. Additionally, it maps the following SKUs to the regions where they are supported:
 - microsoft.compute/disks
 - microsoft.compute/virtualmachines
 - microsoft.sql/managedinstances
 - microsoft.sql/servers/databases
 - microsoft.storage/storageaccounts
@@ -83,7 +83,7 @@ Get-Region.ps1 -region
 # Example3: Get-Region.ps1 -region "sweden central"
 ```
 
-### 3. Run 3-CostInformation (Cost Analysis)
+## 3. Run 3-CostInformation (Cost Analysis)
 
 The Azure public pricing API is used, meaning that, prices are **not** customer-specific, but are only used to calculate the relative cost difference between regions for each meter ID.
@@ -97,7 +97,38 @@ $regions = @("eastus", "brazilsouth", "australiaeast")
 
 This will generate the `region_comparison_RegionComparison.json` file
 
+## 4. 7-Report
+This script generates formatted Excel (`.xlsx`)reports based on the output from the previous check script.
+
+Navigate to the `7-Report` folder and run the `Get-Report.ps1`, also specify the path to the availability information and the cost comparison path. For example:
+
+```powershell
+.\Get-Report.ps1 -availabilityInfoPath ..\2-AvailabilityCheck\Availability_Mapping_.json -costComparisonPath ..\3-CostInformation\region_comparison_RegionComparison.json
+```
+The script generates an `.xlsx` file in the `7-report` folder, named `Availability_Report_CURRENTTIMESTAMP`.
+
+Open the generated Excel file.
The reports provide detailed information for each service, including:
+
+### Service Availability Report
+
+- **Resource type**
+- **Resource count**
+- **Implemented (origin) regions**
+- **Implemented SKUs**
+- **Availability in the Selected (target) regions**
+
+## Cost Comparison Report
+
+- **Meter ID**
+- **Service Name**
+- **Meter Name**
+- **Product Name**
+- **SKU Name**
+- **Retail Price per region**
+- **Price Difference to origin region per region**
+
+These reports help you analyze service compatibility and cost differences across different regions.

From 02d9a1048b6f12607846277ee75a17104f7fa7b5 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 22:22:12 +0200
Subject: [PATCH 15/38] Create documentation for inventory collection

Added documentation for inventory collection process using Get-AzureServices.ps1 and Get-RessourcesFromAM.ps1 scripts.
---
 docs/wiki/1-Collect.md | 37 +++++++++++++++++++++++++++++++++++++
 1 file changed, 37 insertions(+)
 create mode 100644 docs/wiki/1-Collect.md

diff --git a/docs/wiki/1-Collect.md b/docs/wiki/1-Collect.md
new file mode 100644
index 0000000..7d9560d
--- /dev/null
+++ b/docs/wiki/1-Collect.md
@@ -0,0 +1,37 @@
+# 1-Collect (Inventory Collection)
+Gathers the inventory of resources that will be evaluated. Run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties, for your relevant scope (resource group, subscription or multiple subscriptions). The script will generate a `resources.json` and a `summary.json` file in the same directory. The `resources.json` file contains the full inventory of resources and their properties, while the `summary.json` file contains a summary of the resources collected.
+
+## Examples
+### If using Azure Resource Graph:
+Run the `Get-AzureServices.ps1` script with your target scope. For example:
+
+- To include Cost Information, add the parameter `-includeCost $true`.
If you include this parameter, it will also generate a CSV file in the same directory. This CSV file can be used later in `3-CostInformation`. Note: This might take some time depending on how long it takes to download the cost information.
+
+```powershell
+Get-AzureServices.ps1 -includeCost $true
+```
+
+- To collect the inventory for a single resource group, run the script as follows:
+
+```powershell
+Get-AzureServices.ps1 -scopeType resourceGroup -resourceGroupName -subscriptionId
+```
+
+- To collect the inventory for a single subscription, run the script as follows:
+
+```powershell
+Get-AzureServices.ps1 -scopeType subscription -subscriptionId
+```
+
+- To collect the inventory for multiple subscriptions, you will need to create a JSON file containing the subscription IDs in scope. See [here](./subscriptions.json) for a sample JSON file. Once the file is created, run the script as follows:
+
+```powershell
+Get-AzureServices.ps1 -multiSubscription -workloadFile
+```
+
+### If using an Azure Migrate export:
+Run `Get-RessourcesFromAM.ps1` against an Azure Migrate `Assessment.xlsx` file to convert the VM & Disk SKUs into the same output as `Get-AzureServices.ps1`. For example:
+
+```powershell
+Get-RessourcesFromAM.ps1 -filePath "C:\path\to\Assessment.xlsx" -outputFile "C:\path\to\summary.json"
+```

From ecb514ced1523ced8014566b52c579dc8663cb97 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 22:34:09 +0200
Subject: [PATCH 16/38] Revise Step-by-Step Guide for Azure Inventory Collection

Clarify instructions for collecting Azure resource inventory and improve formatting.
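Both input paths above converge on the same `summary.json` shape, so the rest of the toolkit does not care whether the inventory came from Azure Resource Graph or an Azure Migrate export. The toolkit's actual schema is not documented in this wiki; as a purely illustrative sketch (Python rather than the toolkit's PowerShell, and the `type` field name is an assumption), the collect-then-summarize idea looks like this:

```python
import json
from collections import Counter

def summarize(resources):
    """Reduce a full inventory to per-resource-type counts (hypothetical summary shape)."""
    counts = Counter(r["type"].lower() for r in resources)
    return [{"type": t, "count": n} for t, n in sorted(counts.items())]

# Made-up records standing in for resources.json content.
sample = [
    {"type": "Microsoft.Compute/virtualMachines", "location": "westeurope"},
    {"type": "Microsoft.Compute/disks", "location": "westeurope"},
    {"type": "Microsoft.Compute/disks", "location": "swedencentral"},
]

print(json.dumps(summarize(sample), indent=2))
```

Lower-casing the resource type mirrors how the availability step lists types (for example `microsoft.compute/disks`), so the two stages can match on the same key.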
--- docs/wiki/Step-by-Step-Guide.md | 26 +++----------------------- 1 file changed, 3 insertions(+), 23 deletions(-) diff --git a/docs/wiki/Step-by-Step-Guide.md b/docs/wiki/Step-by-Step-Guide.md index 2c4b10b..7e8beca 100644 --- a/docs/wiki/Step-by-Step-Guide.md +++ b/docs/wiki/Step-by-Step-Guide.md @@ -10,44 +10,24 @@ Once your environment is ready and you have determined the input method, follow ### Authenticate and Set Context Open a PowerShell prompt in the toolkit’s directory. If you’re in Azure Cloud Shell, you can navigate to the folder where you cloned the toolkit. - **Log in to Azure:** Run `Connect-AzAccount` to authenticate to Azure. -- **Select the target subscription/s:** If you have multiple subscriptions, ensure the correct one is active. Use `Select-AzSubscription` `-SubscriptionId ` to switch the context to the subscription that contains the resources you want to analyse. This ensures all subsequent operations run against the intended subscription. ## 1. Run 1-Collect (Inventory Collection) -Next, gather the inventory of resources that will be evaluated. Run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties, for yor relevant scope (resource group, subscription or multiple subscriptions). The script will generate a `resources.json` and a `summary.json` file in the same directory. The `resources.json` file contains the full inventory of resources and their properties, while the `summary.json` file contains a summary of the resources collected. -**If using Azure Resource Graph:** Run the `Get-AzureServices.ps1` script with your target scope. For example: +Run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties, for yor relevant scope (resource group, subscription or multiple subscriptions). The script will generate a `resources.json` and a `summary.json` file in the same directory. 
The `resources.json` file contains the full inventory of resources and their properties, while the `summary.json` file contains a summary of the resources collected. For examples on how to run the script for different scopes please see [1-Collect scope examples](1-Collect.md). -- To include Cost Information add parameter `-includeCost $true`. If you include this parameter, it will also generate a CSV file in the same directory. This CSV file can be used later in `3-CostInformation`. Note: This might take some time depending on how long it takes to download the cost information. +**If using Azure Resource Graph:** ```powershell Get-AzureServices.ps1 -includeCost $true ``` -- To collect the inventory for a single resource group, run the script as follows: - -```powershell -Get-AzureServices.ps1 -scopeType resourceGroup -resourceGroupName -subscriptionId -``` - -- To collect the inventory for a single subscription, run the script as follows: - -```powershell -Get-AzureServices.ps1 -scopeType subscription -subscriptionId -``` - -- To collect the inventory for multiple subscriptions, you will need to create a json file containing the subscription ids in scope. See [here](./subscriptions.json) for a sample json file. Once the file is created, run the script as follows: - -```powershell -Get-AzureServices.ps1 -multiSubscription -workloadFile -``` - **If using an Azure Migrate export:** Run `Get-RessourcesFromAM.ps1` against an Azure Migrate `Assessment.xlsx` file to convert the VM & Disk SKUs into the same output as `Get-AzureServices.ps1` For example: ```powershell Get-RessourcesFromAM.ps1 -filePath "C:\path\to\Assessment.xlsx" -outputFile "C:\path\to\summary.json" ``` > [!NOTE] -> Before proceeding, make sure that the output files are successful generated in the `1-Collect` folder with the name `resources.json`, `summary.json` and a `CSV file` if cost was included. 
+> Before proceeding, make sure that the output files (`resources.json`, `summary.json` and a `CSV file`) are generated in the `1-Collect` folder. ## 2. Run 2-AvailabilityCheck (Service Availability) After collecting inventory, continue with `Get-AvailabilityInformation.ps1`. This script evaluates the availability of Azure services, resources, and SKUs across all regions. When combined with the output from the 1-Collect script, it provides a comprehensive overview of potential migration destinations, identifying feasible regions and the reasons for their suitability or limitations, such as availability constraints per region. From 5f5a4b97703bcfff5a25137a49d1460f9f674f3b Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 22:41:07 +0200 Subject: [PATCH 17/38] Add availability check documentation This document outlines the availability check script for Azure services, detailing its functionality, usage, and examples. --- docs/wiki/2-AvailabilityCheck.md | 37 ++++++++++++++++++++++++++++++++ 1 file changed, 37 insertions(+) create mode 100644 docs/wiki/2-AvailabilityCheck.md diff --git a/docs/wiki/2-AvailabilityCheck.md b/docs/wiki/2-AvailabilityCheck.md new file mode 100644 index 0000000..3463d1c --- /dev/null +++ b/docs/wiki/2-AvailabilityCheck.md @@ -0,0 +1,37 @@ +# 2-AvailabilityCheck + +## Availability check script +This script evaluates the availability of Azure services, resources, and SKUs across all regions. When combined with the output from the 1-Collect script, it provides a comprehensive overview of potential migration destinations, identifying feasible regions and the reasons for their suitability or limitations, such as availability constraints per region. + +Note that this functionality is not yet complete and is a work in progress. Currently, this script associates every resource with its regional availability. 
Additionally, it maps the following SKUs to the regions where they are supported:
+- microsoft.compute/disks
+- microsoft.compute/virtualmachines
+- microsoft.sql/managedinstances
+- microsoft.sql/servers/databases
+- microsoft.storage/storageaccounts
+
+The `Get-AvailabilityInformation.ps1` script only needs to be run once to collect the availability information for all regions, which takes a little while. Run the following script:
+
+```powershell
+Get-AvailabilityInformation.ps1
+```
+It will generate a number of JSON files in the same directory; the important one is `Availability_Mapping.json`.
+
+## Filter by Region script
+
+To check the availability of the resources in scope in a specific region, run the following script:
+
+```powershell
+Get-Region.ps1 -Region
+```
+This will generate `Availability_Mapping_.json` in the same directory.
+
+## Example:
+```powershell
+Get-AvailabilityInformation.ps1
+# Wait for the script to complete, this may take a while.
+Get-Region.ps1 -region
+# Example1: Get-Region.ps1 -region "east us"
+# Example2: Get-Region.ps1 -region "west us"
+# Example3: Get-Region.ps1 -region "sweden central"
+```

From d97d16977dc42843447b46003ed4d7b731b7a465 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 22:50:59 +0200
Subject: [PATCH 18/38] Refine headings and correct typos in guide

Updated section headings and fixed minor typos in the Step-by-Step Guide.
---
 docs/wiki/Step-by-Step-Guide.md | 34 ++++++++------------------------
 1 file changed, 8 insertions(+), 26 deletions(-)

diff --git a/docs/wiki/Step-by-Step-Guide.md b/docs/wiki/Step-by-Step-Guide.md
index 7e8beca..3598ee5 100644
--- a/docs/wiki/Step-by-Step-Guide.md
+++ b/docs/wiki/Step-by-Step-Guide.md
@@ -11,9 +11,9 @@ Open a PowerShell prompt in the toolkit’s directory.
If you’re in Azure Cloud Shell, you can navigate to the folder where you cloned the toolkit. - **Log in to Azure:** Run `Connect-AzAccount` to authenticate to Azure. -## 1. Run 1-Collect (Inventory Collection) +## Run 1-Collect (Inventory Collection) -Run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties, for yor relevant scope (resource group, subscription or multiple subscriptions). The script will generate a `resources.json` and a `summary.json` file in the same directory. The `resources.json` file contains the full inventory of resources and their properties, while the `summary.json` file contains a summary of the resources collected. For examples on how to run the script for different scopes please see [1-Collect scope examples](1-Collect.md). +Run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties, for yor relevant scope (resource group, subscription or multiple subscriptions). The script will generate a `resources.json` and a `summary.json` file in the same directory. The `resources.json` file contains the full inventory of resources and their properties, while the `summary.json` file contains a summary of the resources collected. For examples on how to run the script for different scopes please see [1-Collect Examples](1-Collect.md). **If using Azure Resource Graph:** @@ -29,41 +29,23 @@ Get-RessourcesFromAM.ps1 -filePath "C:\path\to\Assessment.xlsx" -outputFile "C:\ > [!NOTE] > Before proceeding, make sure that the output files (`resources.json`, `summary.json` and a `CSV file`) are generated in the `1-Collect` folder. -## 2. Run 2-AvailabilityCheck (Service Availability) -After collecting inventory, continue with `Get-AvailabilityInformation.ps1`. This script evaluates the availability of Azure services, resources, and SKUs across all regions. 
When combined with the output from the 1-Collect script, it provides a comprehensive overview of potential migration destinations, identifying feasible regions and the reasons for their suitability or limitations, such as availability constraints per region.
+## Run 2-AvailabilityCheck (Service Availability)
 
-Note that this functionality is not yet complete and is a work in progress. Currently, this script associates every resource with its regional availability. Additionally, it maps the following SKUs to the regions where they are supported:
-- microsoft.compute/disks
-- microsoft.compute/virtualmachines
-- microsoft.sql/managedinstances
-- microsoft.sql/servers/databases
-- microsoft.storage/storageaccounts
+This script will check the availability of the services in the target region based on the inventory collected in the previous step. Note that this functionality is not yet complete and is a work in progress. For examples on how to run the script please see [2-AvailabilityCheck Examples](2-AvailabilityCheck.md).
 
-The `Get-AvailabilityInformation.ps1` script only needs to be run once to collect the availability information for all regions, which takes a little while. Run the following script:
+After collecting inventory, continue with `Get-AvailabilityInformation.ps1`. It will generate a number of json files in the same directory the important one is the `Availability_Mapping.json` Run the following script:
 
 ```powershell
 Get-AvailabilityInformation.ps1
 ```
 
-It will generate a number of json files in the same directory the important one is the `Availability_Mapping.json`
-To check the availability of the resources in scope in a specific region run following script:
+Check the availability of the resources in scope for a specific region. This will generate a file named `Availability_Mapping_.json` in the same directory. Run the following script:
 
 ```powershell
 Get-Region.ps1 -Region
 ```
-This will generate `Availability_Mapping_.json` in the same directory.
-Example:
-```powershell
-Get-AvailabilityInformation.ps1
-# Wait for the script to complete, this may take a while.
-Get-Region.ps1 -region
-# Example1: Get-Region.ps1 -region "east us"
-# Example2: Get-Region.ps1 -region "west us"
-# Example3: Get-Region.ps1 -region "sweden central"
-```
-
-## 3. Run 3-CostInformation (Cost Analysis)
+## Run 3-CostInformation (Cost Analysis)
 
 The Azure public pricing API is used, meaning that, prices are **not** customer-specific, but are only used to calculate the relative cost difference between regions for each meter ID.
@@ -77,7 +59,7 @@ $regions = @("eastus", "brazilsouth", "australiaeast")
 
 This will generate the `region_comparison_RegionComparison.json` file
 
-## 4. 7-Report
+## Run 7-Report
 
 This script generates formatted Excel (`.xlsx`)reports based on the output from the previous check script.

From 79593a73b9cc3e412e6bc5751562fe03c5104898 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 22:52:06 +0200
Subject: [PATCH 19/38] Change code block to powershell syntax

Updated code block to use powershell syntax highlighting.
---
 docs/wiki/Step-by-Step-Guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/wiki/Step-by-Step-Guide.md b/docs/wiki/Step-by-Step-Guide.md
index 3598ee5..dd52b32 100644
--- a/docs/wiki/Step-by-Step-Guide.md
+++ b/docs/wiki/Step-by-Step-Guide.md
@@ -52,7 +52,7 @@ The Azure public pricing API is used, meaning that, prices are **not** customer-
 
 Navigate to the `3-CostInformation` folder and run the `Perform-RegionComparison.ps1` script to do a cost comparison with target Region(s).
For example: -``` text +```powershell $regions = @("eastus", "brazilsouth", "australiaeast") .\Perform-RegionComparison.ps1 -regions $regions -outputFormat json -reso ``` From 10f71d623c7f1acb8ec1adc6943789a9b913e269 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 22:55:01 +0200 Subject: [PATCH 20/38] Revise title and code block formatting in CostInformation Updated the title and formatting in the CostInformation document. --- docs/wiki/3-CostInformation.md | 103 +++++++++++++++++++++++++++++++++ 1 file changed, 103 insertions(+) create mode 100644 docs/wiki/3-CostInformation.md diff --git a/docs/wiki/3-CostInformation.md b/docs/wiki/3-CostInformation.md new file mode 100644 index 0000000..e31bc44 --- /dev/null +++ b/docs/wiki/3-CostInformation.md @@ -0,0 +1,103 @@ +# 3-CostInformation +Cost data retrieval and region comparison + +### Get-CostInformation.ps1 + +This script is intended to take a collection of given resource IDs and return the cost incurred during previous months, grouped as needed. For this we use the Microsoft.CostManagement provider of each subscription. This means one call of the Cost Management PowerShell module per subscription. + +The input file is produced by the Get-AzureServices.ps1 script. + +Requires Az.CostManagement module version 0.4.2. + +`PS1> Install-Module -Name Az.CostManagement` + +Instructions for use: + +1. Log on to Azure using `Connect-AzAccount`. Ensure that you have Cost Management Reader access to each subscription listed in the resources file (default `resources.json`) +2. Navigate to the 3-CostInformation folder and run the script using `.\Get-CostInformation.ps1`. The script will generate a CSV file in the current folder. + +#### Documentation links - cost retrieval +Documentation regarding the Az.CostManagement module is not always straightforward. 
Helpful links are:
+
+| Documentation | Link |
+| -------- | ------- |
+| Cost Management Query (API) | [Link](https://learn.microsoft.com/en-us/rest/api/cost-management/query/usage) |
+| Az.CostManagement Query (PowerShell) | [Link](https://learn.microsoft.com/en-us/powershell/module/az.costmanagement/invoke-azcostmanagementquery) |
+
+Valid dimensions for grouping are:
+
+``` text
+AccountName
+BenefitId
+BenefitName
+BillingAccountId
+BillingMonth
+BillingPeriod
+ChargeType
+ConsumedService
+CostAllocationRuleName
+DepartmentName
+EnrollmentAccountName
+Frequency
+InvoiceNumber
+MarkupRuleName
+Meter
+MeterCategory
+MeterId
+MeterSubcategory
+PartNumber
+PricingModel
+PublisherType
+ReservationId
+ReservationName
+ResourceGroup
+ResourceGroupName
+ResourceGuid
+ResourceId
+ResourceLocation
+ResourceType
+ServiceName
+ServiceTier
+SubscriptionId
+SubscriptionName
+```
+
+### Perform-RegionComparison.ps1
+
+This script builds on the collection step by comparing pricing across Azure regions for the meter IDs retrieved earlier.
+The Azure public pricing API is used, meaning that:
+* No login is needed for this step
+* Prices are *not* customer-specific, but are only used to calculate the relative cost difference between regions for each meter
+
+As customer discounts tend to be linear (for example, ACD is a flat rate discount across all PAYG Azure spend), the relative price difference between regions can still be used to make an intelligent estimate of the cost impact of a workload move.
+
+Instructions for use:
+
+1. Prepare a list of target regions for comparison. This can be provided at the command line or stored in a variable before calling the script.
+2. Ensure the `resources.json` file is present (from running the collector script).
+3. Run the script using `.\Perform-RegionComparison.ps1`. The script will generate output files in the current folder.
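Because discounts tend to be linear, only the *relative* per-meter difference between regions matters. As an illustrative sketch of that core calculation (Python rather than the toolkit's PowerShell, with made-up prices standing in for the retail pricing API response), the per-meter percentage difference against an origin region can be computed like this:

```python
def relative_differences(prices_by_region, origin):
    """For each non-origin region, price delta vs. the origin region, in percent per meter.

    prices_by_region: {region: {meter_id: unit_price}} -- an assumed shape,
    not the toolkit's actual data model.
    """
    base = prices_by_region[origin]
    out = {}
    for region, prices in prices_by_region.items():
        if region == origin:
            continue
        out[region] = {
            meter: round((prices[meter] - base[meter]) / base[meter] * 100, 1)
            for meter in base
            if meter in prices and base[meter] > 0  # skip missing meters and zero prices
        }
    return out

# Hypothetical unit prices for two meters across three regions.
sample_prices = {
    "westeurope":  {"meter-a": 0.10, "meter-b": 2.00},
    "eastus":      {"meter-a": 0.08, "meter-b": 2.00},
    "brazilsouth": {"meter-a": 0.15, "meter-b": 2.50},
}

print(relative_differences(sample_prices, "westeurope"))
```

A meter priced at 0.08 in `eastus` against 0.10 in the origin comes out as -20.0%, which is the kind of figure the `pricemap` output classifies as "cheaper".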
+
+#### Example
+
+```powershell
+$regions = @("eastus", "brazilsouth", "australiaeast")
+.\Perform-RegionComparison.ps1 -regions $regions -outputType json
+```
+
+#### Outputs
+
+Depending on the chosen output format, the script outputs four sets of data:
+
+| Dataset | Contents |
+| -------- | ------- |
+| `inputs` | The input data used for calling the pricing API (for reference only) |
+| `pricemap` | An overview of which regions are cheaper / similarly-priced / more expensive for each meter ID |
+| `prices` | Prices for each source/target region mapping by meter ID |
+| `uomerrors` | A list of any mismatches of Unit of Measure between regions |
+
+
+#### Documentation links - region comparison
+
+| Documentation | Link |
+| -------- | ------- |
+| Azure pricing API | [Link](https://learn.microsoft.com/en-us/rest/api/cost-management/retail-prices/azure-retail-prices) |

From 227581185a3722455f881c63393138c2218e938a Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 22:57:57 +0200
Subject: [PATCH 21/38] Clarify Cost Analysis section with additional details

Added a reference to the 3-CostInformation details in the Cost Analysis section.
---
 docs/wiki/Step-by-Step-Guide.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/wiki/Step-by-Step-Guide.md b/docs/wiki/Step-by-Step-Guide.md
index dd52b32..c1e52fe 100644
--- a/docs/wiki/Step-by-Step-Guide.md
+++ b/docs/wiki/Step-by-Step-Guide.md
@@ -47,7 +47,7 @@ Get-Region.ps1 -Region
 
 ## Run 3-CostInformation (Cost Analysis)
 
-The Azure public pricing API is used, meaning that, prices are **not** customer-specific, but are only used to calculate the relative cost difference between regions for each meter ID.
+The Azure public pricing API is used, meaning that prices are **not** customer-specific, but are only used to calculate the relative cost difference between regions for each meter ID.
Please see [3-CostInformation](3-CostInformation.md) for more details.
 
 Navigate to the `3-CostInformation` folder and run the `Perform-RegionComparison.ps1` script to do a cost comparison with target Region(s).

From ef76a9d9d94ac9f6399626627189782ad7523b05 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 23:03:55 +0200
Subject: [PATCH 22/38] Add documentation for report generation script

This document outlines the functionality of the report generation script, including the types of reports produced, dependencies, and usage examples.
---
 docs/wiki/7-Report.md | 38 ++++++++++++++++++++++++++++++++++++++
 1 file changed, 38 insertions(+)
 create mode 100644 docs/wiki/7-Report.md

diff --git a/docs/wiki/7-Report.md b/docs/wiki/7-Report.md
new file mode 100644
index 0000000..0d9e4eb
--- /dev/null
+++ b/docs/wiki/7-Report.md
@@ -0,0 +1,38 @@
+# 7-Report
+
+This script generates formatted Excel (`.xlsx`) reports based on the output from the previous check script. The reports provide detailed information for each service, including:
+
+### Service Availability Report
+
+- **Resource type**
+- **Resource count**
+- **Implemented (origin) regions**
+- **Implemented SKUs**
+- **Availability in the Selected (target) regions**
+
+### Cost Comparison Report
+
+- **Meter ID**
+- **Service Name**
+- **Meter Name**
+- **Product Name**
+- **SKU Name**
+- **Retail Price per region**
+- **Price Difference to origin region per region**
+
+These reports help you analyze service compatibility and cost differences across different regions.
+
+## Dependencies
+
+- This script requires the `ImportExcel` PowerShell module.
+- The script requires you to have run `2-AvailabilityCheck/Get-Region.ps1`, `3-CostInformation/Perform-RegionComparison.ps1`, or both, to generate the necessary JSON input files for availability and cost data.
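The report step essentially joins those two JSON inputs into one flat table per service. As a hypothetical sketch of that join (Python rather than the toolkit's PowerShell; field names such as `type`, `count` and `regions` are assumptions, and the real report keys cost rows by meter ID rather than resource type), it could look like this:

```python
import csv, io

def build_report_rows(availability, price_delta_pct):
    """Join availability entries with per-type price deltas (assumed shapes)."""
    rows = []
    for entry in availability:
        rows.append({
            "resourceType": entry["type"],
            "count": entry["count"],
            "availableIn": ", ".join(sorted(entry["regions"])),
            # None when no cost data was collected for this type.
            "priceDeltaPct": price_delta_pct.get(entry["type"]),
        })
    return rows

# Made-up inputs standing in for the availability and cost comparison JSON files.
availability = [
    {"type": "microsoft.compute/disks", "count": 2,
     "regions": ["swedencentral", "westeurope"]},
    {"type": "microsoft.storage/storageaccounts", "count": 1,
     "regions": ["westeurope"]},
]
deltas = {"microsoft.compute/disks": -12.5}

# Emit the joined rows as CSV, mirroring the toolkit's tabular report idea.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["resourceType", "count", "availableIn", "priceDeltaPct"])
writer.writeheader()
writer.writerows(build_report_rows(availability, deltas))
print(buf.getvalue())
```

Keeping the join tolerant of missing cost data (the `None` case) matches the dependency note above: either input file alone is enough to produce a partial report.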
+
+## Example
+
+If you have created one or more availability JSON files using the `2-AvailabilityCheck/Get-Region.ps1` script, run the following commands, replacing the path with your actual file path(s):
+
+```powershell
+.\Get-Report.ps1 -availabilityInfoPath @("..\2-AvailabilityCheck\Availability_Mapping_Asia_Pacific.json", "..\2-AvailabilityCheck\Availability_Mapping_Europe.json") -costComparisonPath "..\3-CostInformation\region_comparison_prices.json"
+
+```
+The script generates `.xlsx` and `.csv` files in the `7-report` folder, named `Availability_Report_CURRENTTIMESTAMP`.

From 0968cad63ab0b43eac9acbe8b8a68b6735611488 Mon Sep 17 00:00:00 2001
From: Linda Petersson <38290892+lvlindv@users.noreply.github.com>
Date: Tue, 21 Oct 2025 23:07:56 +0200
Subject: [PATCH 23/38] Add navigation steps for script execution

Updated instructions to include navigation to relevant folders before running scripts.
---
 docs/wiki/Step-by-Step-Guide.md | 29 +++++------------------------
 1 file changed, 5 insertions(+), 24 deletions(-)

diff --git a/docs/wiki/Step-by-Step-Guide.md b/docs/wiki/Step-by-Step-Guide.md
index c1e52fe..afb1b72 100644
--- a/docs/wiki/Step-by-Step-Guide.md
+++ b/docs/wiki/Step-by-Step-Guide.md
@@ -13,7 +13,7 @@ Open a PowerShell prompt in the toolkit’s directory. If you’re in Azure Clou
 
 ## Run 1-Collect (Inventory Collection)
 
-Run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties, for yor relevant scope (resource group, subscription or multiple subscriptions). The script will generate a `resources.json` and a `summary.json` file in the same directory. The `resources.json` file contains the full inventory of resources and their properties, while the `summary.json` file contains a summary of the resources collected. For examples on how to run the script for different scopes please see [1-Collect Examples](1-Collect.md).
+Navigate to the `1-Collect` folder and run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties, for your relevant scope (resource group, subscription or multiple subscriptions). The script will generate a `resources.json` and a `summary.json` file in the same directory. The `resources.json` file contains the full inventory of resources and their properties, while the `summary.json` file contains a summary of the resources collected. For examples on how to run the script for different scopes please see [1-Collect Examples](1-Collect.md).
 
 **If using Azure Resource Graph:**
 
@@ -33,7 +33,7 @@ Get-RessourcesFromAM.ps1 -filePath "C:\path\to\Assessment.xlsx" -outputFile "C:\
 
 This script will check the availability of the services in the target region based on the inventory collected in the previous step. Note that this functionality is not yet complete and is a work in progress. For examples on how to run the script please see [2-AvailabilityCheck Examples](2-AvailabilityCheck.md).
 
-After collecting inventory, continue with `Get-AvailabilityInformation.ps1`. It will generate a number of json files in the same directory the important one is the `Availability_Mapping.json` Run the following script:
+Navigate to the `2-AvailabilityCheck` folder and run `Get-AvailabilityInformation.ps1`. It will generate a number of JSON files in the same directory; the important one is `Availability_Mapping.json`. Run the following script:
 
 ```powershell
 Get-AvailabilityInformation.ps1
 ```
@@ -61,7 +61,8 @@ $regions = @("eastus", "brazilsouth", "australiaeast")
 
 This will generate the `region_comparison_RegionComparison.json` file
 
 ## Run 7-Report
 
-This script generates formatted Excel (`.xlsx`)reports based on the output from the previous check script.
+This script generates formatted Excel (`.xlsx`) reports based on the output from the previous check script. Please see [7-Report](7-Report.md) for more details.
Navigate to the `7-Report` folder and run the `Get-Report.ps1`, also specify the path to the availability information and the cost comparison path. For example:
@@ -70,27 +71,7 @@ Navigate to the `7-Report` folder and run the `Get-Report.ps1`, also specify the
 ```
 The script generates an `.xlsx` file in the `7-report` folder, named `Availability_Report_CURRENTTIMESTAMP`.
 
-Open the generated Excel file. The reports provide detailed information for each service, including:
-
-### Service Availability Report
-
-- **Resource type**
-- **Resource count**
-- **Implemented (origin) regions**
-- **Implemented SKUs**
-- **Availability in the Selected (target) regions**
-
-## Cost Comparison Report
-
-- **Meter ID**
-- **Service Name**
-- **Meter Name**
-- **Product Name**
-- **SKU Name**
-- **Retail Price per region**
-- **Price Difference to origin region per region**
-
-These reports help you analyze service compatibility and cost differences across different regions.
+Open the generated Excel file. The reports provide detailed information for each service. These reports help you analyze service compatibility and cost differences across different regions.
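The revised guide repeatedly asks you to verify that each step's output files exist before proceeding. That pre-flight check is easy to automate; here is a hypothetical sketch (Python rather than the toolkit's PowerShell; file names are taken from the guide, and the timestamped `7-Report` output is deliberately omitted):

```python
from pathlib import Path

# Output files each step is expected to leave behind, per the guide.
EXPECTED = {
    "1-Collect": ["resources.json", "summary.json"],
    "2-AvailabilityCheck": ["Availability_Mapping.json"],
    "3-CostInformation": ["region_comparison_RegionComparison.json"],
}

def missing_artifacts(root):
    """Map each step folder to the list of expected output files that are absent."""
    missing = {}
    for step, files in EXPECTED.items():
        absent = [f for f in files if not (Path(root) / step / f).is_file()]
        if absent:
            missing[step] = absent
    return missing
```

Run against the toolkit's root folder, an empty result means every step has produced its outputs and the report step can safely be run.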
From 84e22cd92f4e63f5877c31eb2b4f0016a7e2ba99 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:09:47 +0200 Subject: [PATCH 24/38] Create readme.md --- docs/wiki/archive/readme.md | 1 + 1 file changed, 1 insertion(+) create mode 100644 docs/wiki/archive/readme.md diff --git a/docs/wiki/archive/readme.md b/docs/wiki/archive/readme.md new file mode 100644 index 0000000..8b13789 --- /dev/null +++ b/docs/wiki/archive/readme.md @@ -0,0 +1 @@ + From 1ab8600d785d30155d24eb151b2246c19d6cf261 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:10:14 +0200 Subject: [PATCH 25/38] Delete docs/wiki/archive/readme.md --- docs/wiki/archive/readme.md | 1 - 1 file changed, 1 deletion(-) delete mode 100644 docs/wiki/archive/readme.md diff --git a/docs/wiki/archive/readme.md b/docs/wiki/archive/readme.md deleted file mode 100644 index 8b13789..0000000 --- a/docs/wiki/archive/readme.md +++ /dev/null @@ -1 +0,0 @@ - From 6acb1027782e475cb4822fd4efb805343cc13703 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:11:01 +0200 Subject: [PATCH 26/38] Create readme.md --- docs/wiki/archive/readme.md | 1 + 1 file changed, 1 insertion(+) create mode 100644 docs/wiki/archive/readme.md diff --git a/docs/wiki/archive/readme.md b/docs/wiki/archive/readme.md new file mode 100644 index 0000000..8b13789 --- /dev/null +++ b/docs/wiki/archive/readme.md @@ -0,0 +1 @@ + From 86ab02bc3d0485ce02cc6797effbb5bcc2faa679 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:16:11 +0200 Subject: [PATCH 27/38] Rename docs/wiki/ToolkitHowTo.md to docs/wiki/archive/ToolkitHowTo.md --- docs/wiki/{ => archive}/ToolkitHowTo.md | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename docs/wiki/{ => archive}/ToolkitHowTo.md (100%) diff --git 
a/docs/wiki/ToolkitHowTo.md b/docs/wiki/archive/ToolkitHowTo.md similarity index 100% rename from docs/wiki/ToolkitHowTo.md rename to docs/wiki/archive/ToolkitHowTo.md From 68fb6ffbc8f280180491fdf6760747f0326bdb9f Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:16:28 +0200 Subject: [PATCH 28/38] Delete docs/wiki/archive/readme.md --- docs/wiki/archive/readme.md | 1 - 1 file changed, 1 deletion(-) delete mode 100644 docs/wiki/archive/readme.md diff --git a/docs/wiki/archive/readme.md b/docs/wiki/archive/readme.md deleted file mode 100644 index 8b13789..0000000 --- a/docs/wiki/archive/readme.md +++ /dev/null @@ -1 +0,0 @@ - From 564280d38b46bef74dc094c77e5c211d2c5972c6 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:17:11 +0200 Subject: [PATCH 29/38] Rename docs/wiki/scenarios.md to docs/wiki/archive/scenarios.md --- docs/wiki/archive/scenarios.md | 1 + docs/wiki/scenarios.md | 0 2 files changed, 1 insertion(+) create mode 100644 docs/wiki/archive/scenarios.md delete mode 100644 docs/wiki/scenarios.md diff --git a/docs/wiki/archive/scenarios.md b/docs/wiki/archive/scenarios.md new file mode 100644 index 0000000..d3f5a12 --- /dev/null +++ b/docs/wiki/archive/scenarios.md @@ -0,0 +1 @@ + diff --git a/docs/wiki/scenarios.md b/docs/wiki/scenarios.md deleted file mode 100644 index e69de29..0000000 From ba72e213452758daeed5bd1880db1eea59dcfa0e Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:18:44 +0200 Subject: [PATCH 30/38] Rename 1-Collect/readme.md to docs/wiki/archive/1-Collect readme.md --- 1-Collect/readme.md => docs/wiki/archive/1-Collect readme.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) rename 1-Collect/readme.md => docs/wiki/archive/1-Collect readme.md (97%) diff --git a/1-Collect/readme.md b/docs/wiki/archive/1-Collect readme.md similarity index 
97% rename from 1-Collect/readme.md rename to docs/wiki/archive/1-Collect readme.md index 096bb4c..103b363 100644 --- a/1-Collect/readme.md +++ b/docs/wiki/archive/1-Collect readme.md @@ -4,4 +4,4 @@ This script is intended to assess Azure services currently implemented in a give To use the script do the following from a powershell command line: 1. Log on to Azure using `Connect-AzAccount` and select the appropriate subscription using `Select-AzSubscription`. -2. Navigate to the 1-Collect folder and run the script using `.\Get-AzureServices.ps1`. The script will generate a report in the `1-Collect` folder with the name `resources.json` as well as `summary.json`. \ No newline at end of file +2. Navigate to the 1-Collect folder and run the script using `.\Get-AzureServices.ps1`. The script will generate a report in the `1-Collect` folder with the name `resources.json` as well as `summary.json`. From 5ae45dbbbc8c31bc1ec77590cd9b46519ce3f441 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:19:29 +0200 Subject: [PATCH 31/38] Rename 2-AvailabilityCheck/readme.md to docs/wiki/archive/2-AvailabilityCheck readme.md --- .../wiki/archive/2-AvailabilityCheck readme.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) rename 2-AvailabilityCheck/readme.md => docs/wiki/archive/2-AvailabilityCheck readme.md (99%) diff --git a/2-AvailabilityCheck/readme.md b/docs/wiki/archive/2-AvailabilityCheck readme.md similarity index 99% rename from 2-AvailabilityCheck/readme.md rename to docs/wiki/archive/2-AvailabilityCheck readme.md index 2c1a47a..74339ba 100644 --- a/2-AvailabilityCheck/readme.md +++ b/docs/wiki/archive/2-AvailabilityCheck readme.md @@ -23,4 +23,4 @@ This script processes the output from the previous script to extract data for a To use the script do the following from a powershell command line: 1. Execute the `.\Get-AvailabilityInformation.ps1` script first, as previously outlined. -2. 
Run `.\Get-Region.ps1` from `2-AvailabilityCheck` folder. The script will generate report files in the `2-AvailabilityCheck` folder. \ No newline at end of file +2. Run `.\Get-Region.ps1` from `2-AvailabilityCheck` folder. The script will generate report files in the `2-AvailabilityCheck` folder. From 1948521515e728d258d77f9c95e22a4c5eec0599 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:20:19 +0200 Subject: [PATCH 32/38] Rename 3-CostInformation/README.md to docs/wiki/archive/3-CostInformation README.md --- .../README.md => docs/wiki/archive/3-CostInformation README.md | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename 3-CostInformation/README.md => docs/wiki/archive/3-CostInformation README.md (100%) diff --git a/3-CostInformation/README.md b/docs/wiki/archive/3-CostInformation README.md similarity index 100% rename from 3-CostInformation/README.md rename to docs/wiki/archive/3-CostInformation README.md From b4c209a5805817d6777d3434ac74c6d06ffb82ad Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:21:15 +0200 Subject: [PATCH 33/38] Rename 7-Report/readme.md to docs/wiki/archive/7-Report readme.md --- 7-Report/readme.md => docs/wiki/archive/7-Report readme.md | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename 7-Report/readme.md => docs/wiki/archive/7-Report readme.md (100%) diff --git a/7-Report/readme.md b/docs/wiki/archive/7-Report readme.md similarity index 100% rename from 7-Report/readme.md rename to docs/wiki/archive/7-Report readme.md From da844e7067b33e4bb18fa66d5c064449aa4fb306 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:25:12 +0200 Subject: [PATCH 34/38] Rename docs/wiki/Home.md to docs/wiki/archive/Home.md --- docs/wiki/{ => archive}/Home.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) rename docs/wiki/{ => archive}/Home.md 
(98%) diff --git a/docs/wiki/Home.md b/docs/wiki/archive/Home.md similarity index 98% rename from docs/wiki/Home.md rename to docs/wiki/archive/Home.md index b2a473f..1167a9f 100644 --- a/docs/wiki/Home.md +++ b/docs/wiki/archive/Home.md @@ -40,4 +40,4 @@ for each potential region. **Recommendation Report** - After analysis, the toolkit produces a clear report or summary of findings. You’ll get a list of recommended region(s) ranked or filtered based on the defined criteria, along with the reasoning (e.g. “Region A is recommended due to full service availability and lowest cost, with moderate sustainability score”). This report can be used to present options to stakeholders or as a blueprint for the actual migration/deployment. \ No newline at end of file + After analysis, the toolkit produces a clear report or summary of findings. You’ll get a list of recommended region(s) ranked or filtered based on the defined criteria, along with the reasoning (e.g. “Region A is recommended due to full service availability and lowest cost, with moderate sustainability score”). This report can be used to present options to stakeholders or as a blueprint for the actual migration/deployment. 
From 2da3e8603bfc31b45544f33d8b7fc99afd403d8f Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Tue, 21 Oct 2025 23:27:27 +0200 Subject: [PATCH 35/38] Update README with new wiki navigation links --- README.md | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/README.md b/README.md index f845746..7feaf34 100644 --- a/README.md +++ b/README.md @@ -4,8 +4,9 @@ This project is next phase of Azure to Azure Migration toolkit driven by [jfaurs ## Wiki navigation -- [What is Regions selection toolkit](./docs/wiki/Home.md) -- [How to use Region Selection Toolkit](./docs/wiki/ToolkitHowTo.md) +- [What is Regions selection toolkit](./docs/wiki/Introduction-to-Azure-Region-Selection-Toolkit.md) +- [Getting Started](./docs/wiki/Setup-and-Prerequisites.md) +- [How to use Region Selection Toolkit](./docs/wiki/Step-by-Step-Guide.md) - [Frequently Asked Questions](./docs/wiki/FAQ.md) - [Contributing](./docs/wiki/Contribution.md) - [Known Issues](./docs/wiki/KnownIssues.md) From ff464609102c015f60f143ddafbc51c1240b4444 Mon Sep 17 00:00:00 2001 From: Linda Petersson <38290892+lvlindv@users.noreply.github.com> Date: Wed, 22 Oct 2025 11:50:42 +0200 Subject: [PATCH 36/38] Delete docs/wiki/archive directory --- docs/wiki/archive/1-Collect readme.md | 7 -- .../archive/2-AvailabilityCheck readme.md | 26 ----- docs/wiki/archive/3-CostInformation README.md | 104 ------------------ docs/wiki/archive/7-Report readme.md | 40 ------- docs/wiki/archive/Home.md | 43 -------- docs/wiki/archive/ToolkitHowTo.md | 60 ---------- docs/wiki/archive/scenarios.md | 1 - 7 files changed, 281 deletions(-) delete mode 100644 docs/wiki/archive/1-Collect readme.md delete mode 100644 docs/wiki/archive/2-AvailabilityCheck readme.md delete mode 100644 docs/wiki/archive/3-CostInformation README.md delete mode 100644 docs/wiki/archive/7-Report readme.md delete mode 100644 docs/wiki/archive/Home.md delete mode 100644 docs/wiki/archive/ToolkitHowTo.md 
delete mode 100644 docs/wiki/archive/scenarios.md diff --git a/docs/wiki/archive/1-Collect readme.md b/docs/wiki/archive/1-Collect readme.md deleted file mode 100644 index 103b363..0000000 --- a/docs/wiki/archive/1-Collect readme.md +++ /dev/null @@ -1,7 +0,0 @@ -## Assessment script - -This script is intended to assess Azure services currently implemented in a given scope. The script will produce a report containing information about all services in scope as well as summary report detailing the number of individual services, as well as SKUs in use if relevant. - -To use the script do the following from a powershell command line: -1. Log on to Azure using `Connect-AzAccount` and select the appropriate subscription using `Select-AzSubscription`. -2. Navigate to the 1-Collect folder and run the script using `.\Get-AzureServices.ps1`. The script will generate a report in the `1-Collect` folder with the name `resources.json` as well as `summary.json`. diff --git a/docs/wiki/archive/2-AvailabilityCheck readme.md b/docs/wiki/archive/2-AvailabilityCheck readme.md deleted file mode 100644 index 74339ba..0000000 --- a/docs/wiki/archive/2-AvailabilityCheck readme.md +++ /dev/null @@ -1,26 +0,0 @@ -# Current implementation to Azure availabilities mapping scripts - -## Availability check script - -This script evaluates the availability of Azure services, resources, and SKUs across different regions. When combined with the output from the 1-Collect script, it provides a comprehensive overview of potential migration destinations, identifying feasible regions and the reasons for their suitability or limitations, such as availability constraints per region. - -Currently, this script associates every resource with its regional availability. 
Additionally, it maps the following SKUs to the regions where they are supported: -* microsoft.compute/disks -* microsoft.compute/virtualmachines -* microsoft.sql/managedinstances -* microsoft.sql/servers/databases -* microsoft.storage/storageaccounts - -To use the script do the following from a powershell command line: -1. Log on to Azure using `Connect-AzAccount` and select the appropriate subscription using `Select-AzSubscription`. -2. Run `.\Get-AzureServices.ps1` from `1-Collect` folder. -3. Get sure that the output files are successful generated in the `1-Collect` folder with the name `resources.json` as well as `summary.json`. -4. Navigate to the `2-AvailabilityCheck` folder and run the script using `.\Get-AvailabilityInformation.ps1`. The script will generate report files in the `2-AvailabilityCheck` folder. - -## Per region filter script - -This script processes the output from the previous script to extract data for a single, specified region. - -To use the script do the following from a powershell command line: -1. Execute the `.\Get-AvailabilityInformation.ps1` script first, as previously outlined. -2. Run `.\Get-Region.ps1` from `2-AvailabilityCheck` folder. The script will generate report files in the `2-AvailabilityCheck` folder. diff --git a/docs/wiki/archive/3-CostInformation README.md b/docs/wiki/archive/3-CostInformation README.md deleted file mode 100644 index d674b57..0000000 --- a/docs/wiki/archive/3-CostInformation README.md +++ /dev/null @@ -1,104 +0,0 @@ -# Cost data retrieval and region comparison - -## About the scripts - -### Get-CostInformation.ps1 - -This script is intended to take a collection of given resource IDs and return the cost incurred during previous months, grouped as needed. For this we use the Microsoft.CostManagement provider of each subscription. This means one call of the Cost Management PowerShell module per subscription. - -The input file is produced by the Get-AzureServices.ps1 script. 
- -Requires Az.CostManagement module version 0.4.2. - -`PS1> Install-Module -Name Az.CostManagement` - -Instructions for use: - -1. Log on to Azure using `Connect-AzAccount`. Ensure that you have Cost Management Reader access to each subscription listed in the resources file (default `resources.json`) -2. Navigate to the 3-CostInformation folder and run the script using `.\Get-CostInformation.ps1`. The script will generate a CSV file in the current folder. - -#### Documentation links - cost retrieval -Documentation regarding the Az.CostManagement module is not always straightforward. Helpful links are: - -| Documentation | Link | -| -------- | ------- | -| Cost Management Query (API) | [Link](https://learn.microsoft.com/en-us/rest/api/cost-management/query/usage) | -| Az.CostManagement Query (PowerShell) | [Link](https://learn.microsoft.com/en-us/powershell/module/az.costmanagement/invoke-azcostmanagementquery) | - -Valid dimensions for grouping are: - -``` text -AccountName -BenefitId -BenefitName -BillingAccountId -BillingMonth -BillingPeriod -ChargeType -ConsumedService -CostAllocationRuleName -DepartmentName -EnrollmentAccountName -Frequency -InvoiceNumber -MarkupRuleName -Meter -MeterCategory -MeterId -MeterSubcategory -PartNumber -PricingModel -PublisherType -ReservationId -ReservationName -ResourceGroup -ResourceGroupName -ResourceGuid -ResourceId -ResourceLocation -ResourceType -ServiceName -ServiceTier -SubscriptionId -SubscriptionName -``` - -### Perform-RegionComparison.ps1 - -This script builds on the collection step by comparing pricing across Azure regions for the meter ID's retrieved earlier. 
-The Azure public pricing API is used, meaning that: -* No login is needed for this step -* Prices are *not* customer-specific, but are only used to calculate the relative cost difference between regions for each meter - -As customer discounts tend to be linear (for example, ACD is a flat rate discount across all PAYG Azure spend), the relative price difference between regions can still be used to make an intelligent estimate of the cost impact of a workload move. - -Instructions for use: - -1. Prepare a list of target regions for comparison. This can be provided at the command line or stored in a variable before calling the script. -2. Ensure the `resources.json` file is present (from the running of the collector script). -2. Run the script using `.\Perform-RegionComparison.ps1`. The script will generate output files in the current folder. - -#### Example - -``` text -$regions = @("eastus", "brazilsouth", "australiaeast") -.\Perform-RegionComparison.ps1 -regions $regions -outputType json -``` - -#### Outputs - -Depending on the chosen output format, the script outputs four sets of data: - -| Dataset | Contents | -| -------- | ------- | -| `inputs` | The input data used for calling the pricing API (for reference only) | -| `pricemap` | An overview of which regions are cheaper / similarly-priced / more expensive for each meter ID | -| `prices` | Prices for each source/target region mapping by meter ID | -| `uomerrors` | A list of any eventual mismatches of Unit Of Measure between regions | - - -#### Documentation links - region comparison - -| Documentation | Link | -| -------- | ------- | -| Azure pricing API | [Link](https://learn.microsoft.com/en-us/rest/api/cost-management/retail-prices/azure-retail-prices) | diff --git a/docs/wiki/archive/7-Report readme.md b/docs/wiki/archive/7-Report readme.md deleted file mode 100644 index fe764c2..0000000 --- a/docs/wiki/archive/7-Report readme.md +++ /dev/null @@ -1,40 +0,0 @@ -# Export Script - -This script generates 
formatted Excel (`.xlsx`)reports based on the output from the previous check script. The reports provide detailed information for each service, including: - -## Service Availability Report - -- **Resource type** -- **Resource count** -- **Implemented (origin) regions** -- **Implemented SKUs** -- **Selected (target) regions** -- **Availability in the selected regions** - -## Cost Comparison Report - -- **Azure Cost Meter ID** -- **Service Name** -- **Meter Name** -- **Product Name** -- **SKU Name** -- **Retail Price per region** -- **Price Difference to origin region per region** - -These reports help you analyze service compatibility and cost differences across different regions. - -## Dependencies - -- This script requires the `ImportExcel` PowerShell module. -- The script requires you to have run either the `2-AvailabilityCheck/Get-Region.ps1` or `3-CostInformation/Perform-RegionComparison.ps1` or both scripts to generate the necessary JSON input files for availability and cost data. - -## Usage Instructions - -1. Open a PowerShell command line. -2. Navigate to the `7-Report` folder. -3. If you have created one or more availability JSON files using the `2-AvailabilityCheck/Get-Region.ps1` script, run the following commands, replacing the path with your actual file path(s): - - ```powershell - .\Get-Report.ps1 -availabilityInfoPath `@("..\2-AvailabilityCheck\Availability_Mapping_Asia_Pacific.json", "..\2-AvailabilityCheck\Availability_Mapping_Europe.json")` -costComparisonPath "..\3-CostInformation\region_comparison_prices.json" - ``` -The script generates an `.xlsx` and `.csv` files in the `7-report` folder, named `Availability_Report_CURRENTTIMESTAMP`. 
diff --git a/docs/wiki/archive/Home.md b/docs/wiki/archive/Home.md deleted file mode 100644 index 1167a9f..0000000 --- a/docs/wiki/archive/Home.md +++ /dev/null @@ -1,43 +0,0 @@ -# Welcome to the Region Selection toolkit wiki - -This wiki documents the current situation during the development of the Region Selection toolkit. - -# What is Regions selection toolkit - -## Project Description - -**Region Selection Toolkit** is a comprehensive solution for guiding Cloud Solution Architects, Solution Engineers and IT teams in selecting the optimal Microsoft Azure region for their workloads. This toolkit automates the complex analysis required when deciding “Which Azure region should we deploy to?”. It evaluates multiple factors – from service availability and compliance to sustainability and performance – to recommend the best region(s) for a given set of cloud resources. The goal is to streamline regional planning for scenarios such as migrating to a new Azure region, expanding an application into additional regions, or choosing a region for a new deployment. - -This holistic approach ensures you consider all angles (technical, business, and environmental) when comparing cloud regions. - -*Note: The Region Selection Toolkit is designed with extensibility in mind. Its modular architecture means additional factors (e.g. capacity planning data or more detailed latency testing) can be incorporated over time. New Azure regions and services are continually updated to keep recommendations current.* - -## Toolkit Features - -**Inventory collection** -The toolkit can collect an inventory of your existing Azure resources (e.g. via Azure Resource Graph) or accept input from an Azure Migrate assessment. This inventory forms the basis of the region compatibility analysis. - -**Multi-Factor Region Analysis** -Analyses Azure regions against a wide range of criteria crucial for decision-making. 
-It checks: - -* Service Availability & Roadmap - Verifies that all Azure services and features used by your workload are available (or have planned availability) in the target region. The toolkit cross-references your workload’s resource types against Azure’s regional services list, helping avoid deployments in regions where required services are not yet supported. - -* cost differences - Compares estimated costs of running the workload in different regions. Azure service pricing can vary by region; the toolkit retrieves pricing information for your workload’s resource mix in each candidate region, allowing a side-by-side cost comparison. This helps in budgeting and choosing a cost-effective location without manual price research. - -* Compliance and geopolitical factors [V1] - Takes into account data residency requirements and geopolitical considerations. It will flag, for instance, if a region belongs to a specific sovereignty (such as EU, US Gov, or China regions) or if there are legal/regulatory implications in choosing that location. This ensures your region choice aligns with compliance mandates and organisational policies (e.g. GDPR, data sovereignty, or other regional regulations). - -* performance impacts [V2] - Provides insights on performance-related aspects such as network latency and infrastructure resiliency. For example, it notes whether a region offers Availability Zones and identifies the region’s paired region (for disaster recovery purposes). This helps evaluate reliability and potential latency impact on end-users when moving or expanding to that region. - -* and sustainability metrics [V1] - Highlights sustainability considerations of each region. The toolkit surfaces data like regional carbon intensity or renewable energy availability (where available) to help organisations optimise for lower carbon footprint. 
Choosing a greener Azure region can support corporate sustainability goals – the toolkit makes this information readily accessible during planning. - -for each potential region. - -**Recommendation Report** - - After analysis, the toolkit produces a clear report or summary of findings. You’ll get a list of recommended region(s) ranked or filtered based on the defined criteria, along with the reasoning (e.g. “Region A is recommended due to full service availability and lowest cost, with moderate sustainability score”). This report can be used to present options to stakeholders or as a blueprint for the actual migration/deployment. diff --git a/docs/wiki/archive/ToolkitHowTo.md b/docs/wiki/archive/ToolkitHowTo.md deleted file mode 100644 index 136c0cd..0000000 --- a/docs/wiki/archive/ToolkitHowTo.md +++ /dev/null @@ -1,60 +0,0 @@ -# Background - -This guide describes how to leverage the Region selection toolkit toolkit when checking can your current Azure workload be deployed in another region. - -> Note that this is a preview solution intended to encourage feedback for further development which should be tested in a safe environment before using in production to protect against possible failures/unnecessary cost. -> Also note that this repo is public and as such you should never upload or otherwise divulge sensitive information to this repo. If there is any concern, please contact your Microsoft counterparts for detailed advice. - -The repo at present contains code and details for the following: - -- Script and supporting files to collect Azure resource inventory and properties from either an Azure resource group, an Azure subscription (default behavior) or multiple Azure subscriptions. This functionality is contained in the 1-Collect directory. -- Script to convert the output Excel file from a Azure Migrate Assessment to the same format. -- Script to determine service availability in the target region based on the inventory collected in the previous step. 
This functionality is contained in the 2-AvailabilityCheck directory. Note that this functionality is not yet complete and is a work in progress. - -## Prerequisites - -1. Microsoft Entra ID Tenant. -1. Azure RBAC Reader access to minimum one resource group for when collecting inventory. Note that depending on the scope of the inventory collection, you may need to have Reader access to either a single resource group, a subscription or multiple subscriptions. -1. You will need to have the following installed on the platform you are running the scripts from: - - PowerShell Core 7.5.1 or later - - Azure Powershell module Az.Monitor 5.2.2 or later - - Azure Powershell module Az.ResourceGraph 1.2.0 or later - - Azure Powershell module Az.Accounts 4.1.0 or later - - Azure Powershell ImportExcel module for Azure Migrate script - -## High Level Steps - -- Fork this repo to your own GitHub organization, you should not create a direct clone of the repo. Pull requests based off direct clones of the repo will not be allowed. -- Clone the repo from your own GitHub organization to whatever platform you are using to access Azure. -- Open the PowerShell console and navigate to the directory where you cloned the repo. -- Navigate to the `1-Collect` directory. -- Logon to Azure with an account that has the required permissions to collect the inventory using `Connect-AzAccount`. -- Run the script `Get-AzureServices.ps1` to collect the Azure resource inventory and properties, for yor relevant scope (resource group, subscription or multiple subscriptions). The script will generate a resources.json and a summary.json file in the same directory. The resources.json file contains the full inventory of resources and their properties, while the summary.json file contains a summary of the resources collected. For examples on how to run the script for different scopes please see 1-Collect scope examples - [1-Collect Scope Examples](#1-collect-scope-examples) below. 
-- Alternatively you can run `Get-RessourcesFromAM.ps1` against an Azure Migrate `Assessment.xlsx` file to convert the VM & Disk SKUs into the same output as `Get-AzureServices.ps1` to be used further with the `2-AvailabilityCheck/Get-AvailabilityInformation.ps1` script. -- After collecting the inventory, the intent is that you can use the `2-AvailabilityCheck/Get-AvailabilityInformation.ps1` script to check the availability of the services in the target region. This script will generate a services.json file in the same directory, which contains the availability information for the services in the target region. Note that this functionality is not yet complete and is a work in progress. - -## 1-Collect Scope Examples - -- To collect the inventory for a single resource group, run the script as follows: - -```powershell -Get-AzureServices.ps1 -scopeType resourceGroup -resourceGroupName -subscriptionId -``` - -- To collect the inventory for a single subscription, run the script as follows: - -```powershell -Get-AzureServices.ps1 -scopeType subscription -subscriptionId -``` - -- To collect the inventory for multiple subscriptions, you will need to create a json file containing the subscription ids in scope. See [here](./subscriptions.json) for a sample json file. 
Once the file is created, run the script as follows: - -```powershell -Get-AzureServices.ps1 -multiSubscription -workloadFile -``` - -### 1.1-Azure Migrate Script Examples - -```powershell -Get-RessourcesFromAM.ps1 -filePath "C:\path\to\Assessment.xlsx" -outputFile "C:\path\to\summary.json" -``` diff --git a/docs/wiki/archive/scenarios.md b/docs/wiki/archive/scenarios.md deleted file mode 100644 index d3f5a12..0000000 --- a/docs/wiki/archive/scenarios.md +++ /dev/null @@ -1 +0,0 @@ - From 463c3e90a7ba18a107911d0f016c298739c39a90 Mon Sep 17 00:00:00 2001 From: Jan Faurskov <22591930+jfaurskov@users.noreply.github.com> Date: Wed, 22 Oct 2025 12:18:34 +0200 Subject: [PATCH 37/38] Update docs/wiki/1-Collect.md --- docs/wiki/1-Collect.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/wiki/1-Collect.md b/docs/wiki/1-Collect.md index 7d9560d..72ea5c8 100644 --- a/docs/wiki/1-Collect.md +++ b/docs/wiki/1-Collect.md @@ -17,7 +17,7 @@ Get-AzureServices.ps1 -includeCost $true Get-AzureServices.ps1 -scopeType resourceGroup -resourceGroupName -subscriptionId ``` -- To collect the inventory for a single subscription, run the script as follows: +- To collect the inventory for a single subscription, cost not included, run the script as follows: ```powershell Get-AzureServices.ps1 -scopeType subscription -subscriptionId From 30df463716abbb86fb97e2e80d811baab94a80f8 Mon Sep 17 00:00:00 2001 From: Jan Faurskov <22591930+jfaurskov@users.noreply.github.com> Date: Wed, 22 Oct 2025 12:19:20 +0200 Subject: [PATCH 38/38] Update docs/wiki/1-Collect.md --- docs/wiki/1-Collect.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/wiki/1-Collect.md b/docs/wiki/1-Collect.md index 72ea5c8..9a04946 100644 --- a/docs/wiki/1-Collect.md +++ b/docs/wiki/1-Collect.md @@ -11,7 +11,7 @@ Run the `Get-AzureServices.ps1` script with your target scope. 
For example: Get-AzureServices.ps1 -includeCost $true ``` -- To collect the inventory for a single resource group, run the script as follows: +- To collect the inventory for a single resource group, cost not included, run the script as follows: ```powershell Get-AzureServices.ps1 -scopeType resourceGroup -resourceGroupName -subscriptionId