feat!: ama #968

Merged: 65 commits into main on Jun 17, 2024

Conversation

@matt-FFFFFF (Member) commented Jun 4, 2024

Overview/Summary

This will be in the next major release, following the Azure Landing Zones update with its major policy refresh and the move from the Microsoft Monitoring Agent (MMA) to the Azure Monitor Agent (AMA).

Incorporates the following changes from upstream

  1. Policy refresh H2 FY24
  2. AMA Updates

Changes from our awesome community

  1. Connectivity: Add OpenAI and Databricks Private Link DNS #918 (thanks @chrsundermann!)
  2. Network gateway default parameters #925 (thanks @nyanhp!)
  3. Avoid continuous destroy and create of azapi_resource diag_settings #952 (thanks @Keetika-Yogendra!)

‼️ Breaking Changes

  1. Minimum AzureRM provider version now 3.107.0
  2. Minimum AzAPI provider version now 1.13.1
  3. Minimum Terraform version now 1.7.0 (see the version constraints sketch after this list)
  4. var.configure_management_resources schema change, removing legacy components and adding support for AMA resources
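
The provider and Terraform minimums above can be expressed as version constraints. A minimal sketch (the registry sources shown are the standard ones for these providers; the rest of the configuration is omitted):

terraform {
  required_version = ">= 1.7.0"

  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = ">= 3.107.0"
    }
    azapi = {
      source  = "azure/azapi"
      version = ">= 1.13.1"
    }
  }
}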

Acknowledgements

Thanks to:

Testing Evidence

Please provide any testing evidence to show that your Pull Request works/fixes as described and planned (include screenshots, if appropriate).

As part of this Pull Request I have

  • Checked for duplicate Pull Requests
  • Associated it with relevant issues for tracking and closure.
  • Ensured my code/branch is up-to-date with the latest changes in the main branch
  • Performed testing and provided evidence.
  • Updated relevant and associated documentation.

@JamesDLD commented Jun 6, 2024

Hi @matt-FFFFFF, thank you for this initiative!
If it helps, I am sharing the code I used to create the three Data Collection Rules.

We could have something like this in the file resources.management.tf:

resource "azurerm_monitor_data_collection_rule" "management" {
  for_each                    = local.azurerm_monitor_data_collection_rule_management
  name                        = each.value.user_given_dcr_name
  location                    = each.value.workspace_location
  resource_group_name         = each.value.resource_group_name
  description                 = each.value.description

  dynamic "data_sources" {
    for_each = lookup(each.value, "data_sources", [])
    content {
      dynamic "data_import" {
        for_each = lookup(data_sources.value, "data_import", [])
        content {
          dynamic "event_hub_data_source" {
            for_each = lookup(data_import.value, "event_hub_data_source", [])
            content {
              name           = event_hub_data_source.value.name
              stream         = event_hub_data_source.value.stream
              consumer_group = lookup(event_hub_data_source.value, "consumer_group", null)
            }
          }
        }
      }
      dynamic "extension" {
        for_each = lookup(data_sources.value, "extension", [])
        content {
          extension_name     = extension.value.extension_name
          name               = extension.value.name
          streams            = extension.value.streams                             #(Required) Specifies a list of streams that this data source will be sent to. A stream indicates what schema will be used for this data and usually what table in Log Analytics the data will be sent to. Possible values include but not limited to Microsoft-Event, Microsoft-InsightsMetrics, Microsoft-Perf, Microsoft-Syslog, Microsoft-WindowsEvent.
          extension_json     = lookup(extension.value, "extension_json", null)     #(Optional) A JSON String which specifies the extension setting.
          input_data_sources = lookup(extension.value, "input_data_sources", null) #(Optional) Specifies a list of data sources this extension needs data from. An item should be a name of a supported data source which produces only one stream. Supported data sources type: performance_counter, windows_event_log,and syslog.
        }
      }
      dynamic "iis_log" {
        for_each = lookup(data_sources.value, "iis_log", [])
        content {
          name            = iis_log.value.name
          streams         = iis_log.value.streams
          log_directories = lookup(iis_log.value, "log_directories", null)
        }
      }
      dynamic "log_file" {
        for_each = lookup(data_sources.value, "log_file", [])
        content {
          name          = log_file.value.name
          streams       = log_file.value.streams
          file_patterns = log_file.value.file_patterns
          format        = log_file.value.format
          dynamic "settings" {
            for_each = lookup(log_file.value, "settings", [])
            content {
              dynamic "text" {
                for_each = lookup(settings.value, "text", [])
                content {
                  record_start_timestamp_format = text.value.record_start_timestamp_format #(Required) The timestamp format of the text log files. Possible values are ISO 8601, YYYY-MM-DD HH:MM:SS, M/D/YYYY HH:MM:SS AM/PM, Mon DD, YYYY HH:MM:SS, yyMMdd HH:mm:ss, ddMMyy HH:mm:ss, MMM d hh:mm:ss, dd/MMM/yyyy:HH:mm:ss zzz,and yyyy-MM-ddTHH:mm:ssK.
                }
              }
            }
          }
        }
      }
      dynamic "performance_counter" {
        for_each = lookup(data_sources.value, "performance_counter", [])
        content {
          counter_specifiers            = performance_counter.value.counter_specifiers
          name                          = performance_counter.value.name
          sampling_frequency_in_seconds = performance_counter.value.sampling_frequency_in_seconds
          streams                       = performance_counter.value.streams
        }
      }
      dynamic "platform_telemetry" {
        for_each = lookup(data_sources.value, "platform_telemetry", [])
        content {
          name    = platform_telemetry.value.name
          streams = platform_telemetry.value.streams
        }
      }
      dynamic "prometheus_forwarder" {
        for_each = lookup(data_sources.value, "prometheus_forwarder", [])
        content {
          name    = prometheus_forwarder.value.name
          streams = prometheus_forwarder.value.streams
          dynamic "label_include_filter" {
            for_each = lookup(prometheus_forwarder.value, "label_include_filter", [])
            content {
              label = label_include_filter.value.label
              value = label_include_filter.value.value
            }
          }
        }
      }
      dynamic "syslog" {
        for_each = lookup(data_sources.value, "syslog", [])
        content {
          facility_names = syslog.value.facility_names
          log_levels     = syslog.value.log_levels
          name           = syslog.value.name
          streams        = lookup(syslog.value, "streams", null)
        }
      }
      dynamic "windows_event_log" {
        for_each = lookup(data_sources.value, "windows_event_log", [])
        content {
          name           = windows_event_log.value.name
          streams        = windows_event_log.value.streams
          x_path_queries = windows_event_log.value.x_path_queries
        }
      }
      dynamic "windows_firewall_log" {
        for_each = lookup(data_sources.value, "windows_firewall_log", [])
        content {
          name    = windows_firewall_log.value.name
          streams = windows_firewall_log.value.streams
        }
      }
    }
  }

  dynamic "destinations" {
    for_each = lookup(each.value, "destinations", [])
    content {
      dynamic "azure_monitor_metrics" {
        for_each = lookup(destinations.value, "azure_monitor_metrics", [])
        content {
          name = azure_monitor_metrics.value.name
        }
      }
      dynamic "event_hub" {
        for_each = lookup(destinations.value, "event_hub", [])
        content {
          event_hub_id = event_hub.value.event_hub_id #(Required) The resource ID of the Event Hub.
          name         = event_hub.value.name         #(Required) The name which should be used for this destination. This name should be unique across all destinations regardless of type within the Data Collection Rule.
        }
      }
      dynamic "event_hub_direct" {
        for_each = lookup(destinations.value, "event_hub_direct", [])
        content {
          event_hub_id = event_hub_direct.value.event_hub_id
          name         = event_hub_direct.value.name
        }
      }
      dynamic "log_analytics" {
        for_each = lookup(destinations.value, "log_analytics", [])
        content {
          workspace_resource_id = log_analytics.value.workspace_resource_id
          name                  = log_analytics.value.name
        }
      }
      dynamic "monitor_account" {
        for_each = lookup(destinations.value, "monitor_account", [])
        content {
          monitor_account_id = monitor_account.value.monitor_account_id
          name               = monitor_account.value.name
        }
      }
      dynamic "storage_blob" {
        for_each = lookup(destinations.value, "storage_blob", [])
        content {
          container_name     = storage_blob.value.container_name
          name               = storage_blob.value.name
          storage_account_id = storage_blob.value.storage_account_id
        }
      }
      dynamic "storage_blob_direct" {
        for_each = lookup(destinations.value, "storage_blob_direct", [])
        content {
          container_name     = storage_blob_direct.value.container_name
          name               = storage_blob_direct.value.name
          storage_account_id = storage_blob_direct.value.storage_account_id
        }
      }
      dynamic "storage_table_direct" {
        for_each = lookup(destinations.value, "storage_table_direct", [])
        content {
          table_name         = storage_table_direct.value.table_name
          name               = storage_table_direct.value.name
          storage_account_id = storage_table_direct.value.storage_account_id
        }
      }
    }
  }

  dynamic "data_flow" {
    for_each = lookup(each.value, "data_flow", [])
    content {
      streams            = data_flow.value.streams                             #(Required) Specifies a list of streams. Possible values include but not limited to Microsoft-Event, Microsoft-InsightsMetrics, Microsoft-Perf, Microsoft-Syslog,and Microsoft-WindowsEvent.
      destinations       = data_flow.value.destinations                        #(Required) Specifies a list of destination names. A azure_monitor_metrics data source only allows for stream of kind Microsoft-InsightsMetrics.
      built_in_transform = lookup(data_flow.value, "built_in_transform", null) #(Optional) The built-in transform to transform stream data.
      output_stream      = lookup(data_flow.value, "output_stream", null)      #(Optional) The output stream of the transform. Only required if the data flow changes data to a different stream.
      transform_kql      = lookup(data_flow.value, "transform_kql", null)      #(Optional) The KQL query to transform stream data.
    }
  }

  dynamic "identity" {
    for_each = lookup(each.value, "identity", [])
    content {
      type         = lookup(identity.value, "type", null)
      identity_ids = lookup(identity.value, "identity_id_key", null) == null ? null : [azurerm_user_assigned_identity.id[identity.value.identity_id_key].id]
    }
  }

  dynamic "stream_declaration" {
    for_each = lookup(each.value, "stream_declaration", [])
    content {
      stream_name = stream_declaration.value.stream_name
      dynamic "column" {
        for_each = lookup(stream_declaration.value, "column", [])
        content {
          name = column.value.name
          type = column.value.type
        }
      }
    }
  }

  kind = lookup(each.value, "kind", null) #(Optional) The kind of the Data Collection Rule. Possible values are Linux, Windows,and AgentDirectToStore. A rule of kind Linux does not allow for windows_event_log data sources. And a rule of kind Windows does not allow for syslog data sources. If kind is not specified, all kinds of data sources are allowed.
  tags = lookup(each.value, "tags", null)
}

And something like this in the file locals.management.tf:

# The following locals are used to build the map of Data
# Collection Rules to deploy.
locals {
  management_workspace_resource_id = "/subscriptions/${var.subscription_id_management}/resourceGroups/${local.archetypes.configure_management_resources.advanced.custom_settings_by_resource_type.azurerm_resource_group.management.name}/providers/Microsoft.OperationalInsights/workspaces/${local.archetypes.configure_management_resources.advanced.custom_settings_by_resource_type.azurerm_log_analytics_workspace.management.name}"

  azurerm_monitor_data_collection_rule_management = {
    VmInsights = {
      user_given_dcr_name = "dcr-vminsights-001"
      workspace_location  = "francecentral"
      resource_group_name = module.enterprise_scale[0].azurerm_log_analytics_workspace.management[local.management_workspace_resource_id].resource_group_name
      description         = "Data collection rule for VM Insights."
      tags                = local.archetypes.default_tags

      data_sources = [
        {
          performance_counter = [
            {
              streams                       = ["Microsoft-InsightsMetrics"]
              sampling_frequency_in_seconds = 60
              counter_specifiers = [
                "\\VmInsights\\DetailedMetrics"
              ]
              name = "VMInsightsPerfCounters"
            }
          ]
          extension = [
            {
              streams        = ["Microsoft-ServiceMap"]
              extension_name = "DependencyAgent"
              name           = "DependencyAgentDataSource"
            }
          ]
        }
      ]

      destinations = [
        {
          log_analytics = [
            {
              workspace_resource_id = module.enterprise_scale[0].azurerm_log_analytics_workspace.management[local.management_workspace_resource_id].id
              name                  = "VMInsightsPerf-Logs-Dest"
            }
          ]
        }
      ]

      data_flow = [
        {
          streams      = ["Microsoft-InsightsMetrics"]
          destinations = ["VMInsightsPerf-Logs-Dest"]
        },
        {
          streams      = ["Microsoft-ServiceMap"]
          destinations = ["VMInsightsPerf-Logs-Dest"]
        }
      ]
    }

    ChangeTracking = {
      user_given_dcr_name = "dcr-changetracking-001"
      workspace_location  = "francecentral"
      resource_group_name = module.enterprise_scale[0].azurerm_log_analytics_workspace.management[local.management_workspace_resource_id].resource_group_name
      description         = "Data collection rule for CT."
      tags                = local.archetypes.default_tags

      data_sources = [
        {
          extension = [
            {
              streams = [
                "Microsoft-ConfigurationChange",
                "Microsoft-ConfigurationChangeV2",
                "Microsoft-ConfigurationData"
              ]
              extension_name = "ChangeTracking-Windows"
              extension_json = jsonencode({
                enableFiles     = true,
                enableSoftware  = true,
                enableRegistry  = false,
                enableServices  = true,
                enableInventory = true,
                fileSettings = {
                  fileCollectionFrequency = 900,
                  fileInfo = [
                    {
                      name                  = "ChangeTrackingLinuxPath_default",
                      enabled               = true,
                      destinationPath       = "/etc/.*.conf",
                      useSudo               = true,
                      recurse               = true,
                      maxContentsReturnable = 5000000,
                      pathType              = "File",
                      type                  = "File",
                      links                 = "Follow",
                      maxOutputSize         = 500000,
                      groupTag              = "Recommended"
                    }
                  ]
                },
                softwareSettings = {
                  softwareCollectionFrequency = 300
                },
                inventorySettings = {
                  inventoryCollectionFrequency = 36000
                },
                servicesSettings = {
                  serviceCollectionFrequency = 300
                }
              })
              name = "CTDataSource-Windows"
            },
            {
              streams = [
                "Microsoft-ConfigurationChange",
                "Microsoft-ConfigurationChangeV2",
                "Microsoft-ConfigurationData"
              ]
              extension_name = "ChangeTracking-Linux"
              extension_json = jsonencode({
                enableFiles     = true,
                enableSoftware  = true,
                enableRegistry  = false,
                enableServices  = true,
                enableInventory = true,
                fileSettings = {
                  fileCollectionFrequency = 900,
                  fileInfo = [
                    {
                      name                  = "ChangeTrackingLinuxPath_default",
                      enabled               = true,
                      destinationPath       = "/etc/.*.conf",
                      useSudo               = true,
                      recurse               = true,
                      maxContentsReturnable = 5000000,
                      pathType              = "File",
                      type                  = "File",
                      links                 = "Follow",
                      maxOutputSize         = 500000,
                      groupTag              = "Recommended"
                    }
                  ]
                },
                softwareSettings = {
                  softwareCollectionFrequency = 300
                },
                inventorySettings = {
                  inventoryCollectionFrequency = 36000
                },
                servicesSettings = {
                  serviceCollectionFrequency = 300
                }
              })
              name = "CTDataSource-Linux"
            }
          ]
        }
      ]

      destinations = [
        {
          log_analytics = [
            {
              workspace_resource_id = module.enterprise_scale[0].azurerm_log_analytics_workspace.management[local.management_workspace_resource_id].id
              name                  = "Microsoft-CT-Dest"
            }
          ]
        }
      ]

      data_flow = [
        {
          streams = [
            "Microsoft-ConfigurationChange",
            "Microsoft-ConfigurationChangeV2",
            "Microsoft-ConfigurationData"
          ]
          destinations = ["Microsoft-CT-Dest"]
        }
      ]
    }

    DefenderSQL = {
      user_given_dcr_name = "dcr-defendersql-001"
      workspace_location  = "francecentral"
      resource_group_name = module.enterprise_scale[0].azurerm_log_analytics_workspace.management[local.management_workspace_resource_id].resource_group_name
      description         = "Data collection rule for Defender for SQL."
      tags                = local.archetypes.default_tags

      data_sources = [
        {
          extension = [
            {
              streams = [
                "Microsoft-DefenderForSqlAlerts",
                "Microsoft-DefenderForSqlLogins",
                "Microsoft-DefenderForSqlTelemetry",
                "Microsoft-DefenderForSqlScanEvents",
                "Microsoft-DefenderForSqlScanResults"
              ]
              extension_name = "MicrosoftDefenderForSQL"
              extension_json = jsonencode({
                enableCollectionOfSqlQueriesForSecurityResearch = false
              })
              name = "MicrosoftDefenderForSQL"
            }
          ]
        }
      ]

      destinations = [
        {
          log_analytics = [
            {
              workspace_resource_id = module.enterprise_scale[0].azurerm_log_analytics_workspace.management[local.management_workspace_resource_id].id
              name                  = "LogAnalyticsDest"
            }
          ]
        }
      ]

      data_flow = [
        {
          streams = [
            "Microsoft-DefenderForSqlAlerts",
            "Microsoft-DefenderForSqlLogins",
            "Microsoft-DefenderForSqlTelemetry",
            "Microsoft-DefenderForSqlScanEvents",
            "Microsoft-DefenderForSqlScanResults"
          ]
          destinations = ["LogAnalyticsDest"]
        }
      ]
    }
  }
}

@matt-FFFFFF (Member, Author)

Hi @JamesDLD

Thank you for your thorough work here!

For the initial release we are implementing DCRs that are identical to the ones published by the product teams.

You will be able to override the policy parameters and supply your own DCRs if you wish; however, the DCRs should be created outside the module.

We may incorporate your changes in a future release to add more flexibility; however, I am mindful of the complexity of the input schema required for the variables.
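
A minimal sketch of what "create the DCRs outside the module" could look like, for readers following along. The rule name, location, resource group, and workspace resource ID below are placeholders, and the exact policy assignment parameter that would consume the resulting DCR ID is not shown here:

resource "azurerm_monitor_data_collection_rule" "custom_vminsights" {
  name                = "dcr-vminsights-custom"    # placeholder name
  location            = "francecentral"            # placeholder location
  resource_group_name = "rg-management-monitoring" # placeholder resource group

  data_sources {
    performance_counter {
      name                          = "VMInsightsPerfCounters"
      streams                       = ["Microsoft-InsightsMetrics"]
      sampling_frequency_in_seconds = 60
      counter_specifiers            = ["\\VmInsights\\DetailedMetrics"]
    }
  }

  destinations {
    log_analytics {
      name = "VMInsightsPerf-Logs-Dest"
      # Placeholder workspace resource ID
      workspace_resource_id = "/subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/rg-management/providers/Microsoft.OperationalInsights/workspaces/law-management"
    }
  }

  data_flow {
    streams      = ["Microsoft-InsightsMetrics"]
    destinations = ["VMInsightsPerf-Logs-Dest"]
  }
}

# The rule's ID (azurerm_monitor_data_collection_rule.custom_vminsights.id) would then be
# supplied to the relevant policy assignment parameter when overriding the module defaults.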

@matt-FFFFFF (Member, Author)

Apologies @JamesDLD I misread this - I will incorporate what you have done into this PR

Thanks again!

@JamesDLD commented Jun 6, 2024

> Apologies @JamesDLD I misread this - I will incorporate what you have done into this PR
>
> Thanks again!

Thanks!!

@matt-FFFFFF (Member, Author)

/azp run unit


Azure Pipelines successfully started running 1 pipeline(s).

@matt-FFFFFF (Member, Author)

@jaredfholgate this is ready to go now I think

@jaredfholgate (Member) previously approved these changes Jun 14, 2024 and left a comment:


Nothing that would stop a release here, all just minor comments. LGTM :)

Resolved review threads on:
  • _README_header.md (2 threads, outdated)
  • docs/wiki/[User-Guide]-Upgrade-from-v5.2.1-to-v6.0.0.md (3 threads, outdated)
  • modules/management/locals.tf (2 threads)
  • resources.management.tf (2 threads)
  • terraform.tf (1 thread)
@matt-FFFFFF (Member, Author)

/azp run unit


Azure Pipelines successfully started running 1 pipeline(s).

@matt-FFFFFF (Member, Author)

/azp run unit

1 similar comment

Azure Pipelines successfully started running 1 pipeline(s).

1 similar comment

@jaredfholgate (Member) previously approved these changes Jun 16, 2024 and left a comment:


LGTM, just a couple of comments.

Resolved review threads on:
  • docs/wiki/[User-Guide]-Getting-Started.md (outdated)
  • resources.management.tf
  • tests/scripts/azp-strategy.ps1 (outdated)
@matt-FFFFFF (Member, Author)

/azp run unit


Azure Pipelines successfully started running 1 pipeline(s).

@jaredfholgate (Member) left a comment:


LGTM

@matt-FFFFFF merged commit 4d983f7 into main on Jun 17, 2024
15 checks passed
@matt-FFFFFF deleted the feat/ama branch on June 17, 2024 at 12:01
anmolnagpal pushed a commit to clouddrove/terraform-azure-landingzone that referenced this pull request Jul 8, 2024
* feat(connectivity): Add option to set allow_non_virtual_wan_traffic in express route gateway. (Azure#914)

Co-authored-by: Miltos Tsatsakis <m.tsatsakis@kaizengaming.com>

* updates to resolve issue Azure#794 (Azure#919)

Co-authored-by: github-actions <action@github.com>

* docs: update docs for threat_intelligence_allowlist (Azure#928)

* Update wiki-sync.yml

* chore(deps): bump github/super-linter from 5 to 6 (Azure#931)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* add link to Upgrade guide from v4.2.0 to v5.0.0 (Azure#934)

Co-authored-by: Matt White <16320656+matt-FFFFFF@users.noreply.github.com>

* Added hub_routing_preference to connectivity advanced configuration (Azure#930)

* Policy sync updates (Azure#959)

* Update Library Templates (automated) (Azure#966)

Co-authored-by: github-actions <action@github.com>

* Add remote branch option (Azure#970)

* Update Library Templates (automated) (Azure#973)

Co-authored-by: github-actions <action@github.com>

* Update Library Templates (automated) (Azure#976)

Co-authored-by: github-actions <action@github.com>

* Remove redundant assignment file (Azure#977)

* updating threat intelligence allowlist dynamic block (Azure#953)

Co-authored-by: Matt White <16320656+matt-FFFFFF@users.noreply.github.com>

* chore(deps): bump azure/powershell from 1 to 2 (Azure#917)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>

* Add OpenSFF Scorecard (Azure#987)

* chore(deps): bump github/codeql-action from 3.24.9 to 3.25.8 (Azure#990)

* chore(deps): bump github.com/hashicorp/go-getter from 1.7.3 to 1.7.4 in /tests/terratest (Azure#986)

* feat!: ama (Azure#968)

* Fix example uami issue (Azure#1000)

* Update Library Templates (automated) (Azure#1001)

Co-authored-by: github-actions <action@github.com>

* docs: additional v6 upgrade detail (Azure#1002)

* Update Library Templates (automated) (Azure#1006)

Co-authored-by: github-actions <action@github.com>

* docs: update docs with FAQ on roadmap and banner for upcoming breaking changes (Azure#1008)

* naming fixed in module

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: Mtsa <miltsatsakis@gmail.com>
Co-authored-by: Miltos Tsatsakis <m.tsatsakis@kaizengaming.com>
Co-authored-by: Adam Tuckwell <106317528+ATuckwell@users.noreply.github.com>
Co-authored-by: github-actions <action@github.com>
Co-authored-by: Jared Holgate <jaredholgate@microsoft.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Tobias <tobias-m99@gmx.de>
Co-authored-by: Matt White <16320656+matt-FFFFFF@users.noreply.github.com>
Co-authored-by: QBY-MarkusMaring <106068259+QBY-MarkusMaring@users.noreply.github.com>
Co-authored-by: cae-pr-creator[bot] <126156663+cae-pr-creator[bot]@users.noreply.github.com>
Co-authored-by: Daan Toes <112694691+cndaan@users.noreply.github.com>
@@ -1484,7 +1494,8 @@ locals {
   azure_synapse_analytics_dev = ["privatelink.dev.azuresynapse.net"]
   azure_synapse_analytics_sql = ["privatelink.sql.azuresynapse.net"]
   azure_synapse_studio        = ["privatelink.azuresynapse.net"]
-  azure_web_apps_sites        = ["privatelink.azurewebsites.net"]
+  azure_virtual_desktop       = ["privatelink.wvd.microsoft.com"]
+  azure_web_apps_sites        = ["privatelink.azurewebsites.net", "scm.privatelink.azurewebsites.net"]
A Contributor commented:

Hi, does this introduce a breaking change? Checking our current deployment, all scm records are automatically added to the "privatelink.azurewebsites.net" private DNS zone. Introducing the scm subdomain as its own zone would probably break the current record sets? I did not dare test it :D

Checking the docs and several GitHub issues, there is no clear approach on whether or not to deploy "scm" as a separate zone.

A Contributor replied:

Yes, it does break it: SCM stopped resolving to any IP since that zone is empty. I have tested it; please remove it or allow us to override it somehow.
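
One possible way to opt out of the new zone while keeping the rest of the module-managed DNS, sketched under the assumption that the connectivity DNS settings expose an enable_private_link_by_service map with an azure_web_apps_sites key. Only the relevant fragment of the connectivity configuration is shown (merge it into your full configure_connectivity_resources value), and the local name and resource group name below are placeholders:

locals {
  connectivity_dns_overrides = {
    settings = {
      dns = {
        enabled = true
        config = {
          # Opt out of the module-managed Web Apps zones, which now include
          # scm.privatelink.azurewebsites.net as a separate zone.
          enable_private_link_by_service = {
            azure_web_apps_sites = false
          }
        }
      }
    }
  }
}

# ...and manage the single privatelink.azurewebsites.net zone yourself, so the existing
# scm records keep resolving from that zone.
resource "azurerm_private_dns_zone" "web_apps" {
  name                = "privatelink.azurewebsites.net"
  resource_group_name = "rg-connectivity-dns" # placeholder resource group
}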

Labels: none yet
Projects: none yet
9 participants