[Cosmos] Migrate Java Cosmos weekly pipelines to TME #48877
Draft
tvaron3 wants to merge 10 commits into Azure:main from
Conversation
Version bumps and CHANGELOG updates for:
- azure-cosmos-spark_3-3_2-12 4.47.0
- azure-cosmos-spark_3-4_2-12 4.47.0
- azure-cosmos-spark_3-5_2-12 4.47.0
- azure-cosmos-spark_3-5_2-13 4.47.0
- azure-cosmos-spark_4-0_2-13 4.47.0

Features Added:
- Added support for change feed with startFrom point-in-time on merged partitions (PR Azure#48752)

Bugs Fixed:
- Fixed readContainerThroughput unnecessary permission requirement (PR Azure#48800)

Also updated the azure-cosmos CHANGELOG to reclassify the startFrom fix as a feature.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
- Added JVM <clinit> deadlock fix (PR Azure#48689) to all 5 Spark connector CHANGELOGs
- Added Known Issues section to the Spark 4.0 README for Structured Streaming incompatibility with Databricks Runtime 17.3

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Updated with accurate details: MetadataVersionUtil$ class removal, DBR 17.3 includes Spark 4.1 changes while reporting 4.0.0, and recommendation to stay on previous LTS until DBR 18 LTS. Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Point tests.yml, spark.yml, and kafka.yml at the Azure SDK Test Resources - TME tenant/subscription and the new azure-sdk-tests-cosmos-tme service connection. Prefix the long-lived Spark resource group with SSS3PT_ so that local-auth Cosmos keys do not trip S360 alerts (see eng/common/TestResources/New-TestResources.ps1 lines 130/314). Per-run resource groups created by New-TestResources.ps1 are prefixed automatically because the TME tenant id is in $wellKnownTMETenants. Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
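The value-level changes described above can be sketched roughly as follows. The surrounding stage structure is elided and the key layout is an assumption; only the changed names and values come from this PR:

```yaml
# sdk/cosmos/tests.yml — in each of the 7 stages (sketch)
ServiceConnection: azure-sdk-tests-cosmos-tme   # was: azure-sdk-tests-cosmos

# sdk/cosmos/spark.yml — in each of the 6 stages (sketch)
SubscriptionId: <TME subscription id>
TenantId: <TME tenant id>
ResourceGroupName: SSS3PT_oltp-spark-ci         # was: oltp-spark-ci

# sdk/cosmos/kafka.yml (sketch)
ACCOUNT_TENANT_ID: <TME tenant id>
ServiceConnection: azure-sdk-tests-cosmos-tme
```

The SSS3PT_ prefix on the hard-coded resource group mirrors what New-TestResources.ps1 does automatically for per-run groups in well-known TME tenants.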
…grate-live-tests-to-tme

# Conflicts:
#	eng/versioning/version_client.txt
#	sdk/cosmos/azure-cosmos-spark-account-data-resolver-sample/pom.xml
#	sdk/cosmos/azure-cosmos-spark_3-3_2-12/CHANGELOG.md
#	sdk/cosmos/azure-cosmos-spark_3-3_2-12/pom.xml
#	sdk/cosmos/azure-cosmos-spark_3-4_2-12/CHANGELOG.md
#	sdk/cosmos/azure-cosmos-spark_3-4_2-12/pom.xml
#	sdk/cosmos/azure-cosmos-spark_3-5_2-12/CHANGELOG.md
#	sdk/cosmos/azure-cosmos-spark_3-5_2-12/pom.xml
#	sdk/cosmos/azure-cosmos-spark_3-5_2-13/CHANGELOG.md
#	sdk/cosmos/azure-cosmos-spark_3-5_2-13/pom.xml
#	sdk/cosmos/azure-cosmos-spark_4-0_2-13/CHANGELOG.md
#	sdk/cosmos/azure-cosmos-spark_4-0_2-13/pom.xml
#	sdk/cosmos/fabric-cosmos-spark-auth_3/pom.xml
Override CloudConfig.Public.ServiceConnection to azure-sdk-tests-cosmos-tme for the IT_Cosmos and Spring_Data_Cosmos_Integration stages in sdk/spring/tests.yml. Thread a CloudConfig passthrough parameter through tests-supported-spring-versions-template.yml and tests-supported-spring-versions-filter-template.yml so the override reaches archetype-sdk-tests-isolated.yml. Defaults are unchanged so non-cosmos Spring stages (AppConfig, ServiceBus, EventHubs_Storage, KeyVault, AppConfig_IT) continue to use their current service connections. Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
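The passthrough described above can be sketched roughly as follows. The wiring shape is an assumption based on common azure-sdk pipeline templates, not the exact file contents; only the parameter name CloudConfig and the template file names come from this PR:

```yaml
# tests-supported-spring-versions-template.yml (sketch, not the exact file)
parameters:
  # New passthrough parameter; the empty default leaves non-Cosmos stages
  # (AppConfig, ServiceBus, EventHubs_Storage, KeyVault, AppConfig_IT)
  # on their current service connections.
  - name: CloudConfig
    type: object
    default: {}

stages:
  - template: tests-supported-spring-versions-filter-template.yml
    parameters:
      # Forwarded unchanged so the override eventually reaches
      # archetype-sdk-tests-isolated.yml.
      CloudConfig: ${{ parameters.CloudConfig }}
```

sdk/spring/tests.yml would then set CloudConfig.Public.ServiceConnection to azure-sdk-tests-cosmos-tme only for the IT_Cosmos and Spring_Data_Cosmos_Integration stages.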
Description
Migrate the Java Cosmos weekly test pipelines (tests.yml, spark.yml, kafka.yml) from the Microsoft corp tenant to the Azure SDK Test Resources – TME tenant / subscription:
- Tenant: 70a036f6-8e4d-4615-bad6-149c02e7720d
- Subscription: 4d042dc6-fe17-4698-a23f-ec6a8d1e98f4
- Service connection: azure-sdk-tests-cosmos-tme

The TME tenant id is already in $wellKnownTMETenants in eng/common/TestResources/New-TestResources.ps1, so per-run resource groups are automatically prefixed with SSS3PT_ to keep local-auth Cosmos keys from tripping S360. For the one long-lived RG we hard-code (oltp-spark-ci), this PR adds the prefix manually.

Changes
- sdk/cosmos/tests.yml: ServiceConnection: azure-sdk-tests-cosmos → azure-sdk-tests-cosmos-tme across all 7 stages
- sdk/cosmos/spark.yml: replace SubscriptionId / TenantId with TME values; rename ResourceGroupName: oltp-spark-ci → SSS3PT_oltp-spark-ci across all 6 stages
- sdk/cosmos/kafka.yml: point ACCOUNT_TENANT_ID at the TME tenant and switch the service connection to azure-sdk-tests-cosmos-tme

Draft PR — Manual prerequisites (not in this PR)
This PR only covers the in-repo YAML. The following must be completed out-of-band by eng-sys / Cosmos test owners before the pipelines can run green. I'm marking the PR as Draft until they're all done.
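As an illustration of how the weekly pipelines would consume the variable group described in §2 below: the group name and script path here are placeholders, and only the variable names appear in this PR.

```yaml
# Sketch of a ThinClient stage pulling secrets from the TME variable group.
# "cosmos-tme-weekly" is a placeholder group name.
variables:
  - group: cosmos-tme-weekly
steps:
  - script: ./run-thinclient-live-tests.sh   # placeholder script
    env:
      THINCLIENT_ENDPOINT: $(thinclient-test-endpoint)
      THINCLIENT_KEY: $(thinclient-test-key)
```

Keeping the variable names identical to the corp group is what lets the YAML in this PR stay unchanged beyond the service-connection and tenant values.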
1. Azure DevOps service connection
- Requires an identity with Contributor on the Azure SDK Test Resources – TME subscription.
- Create azure-sdk-tests-cosmos-tme pointing at that subscription and grant the cosmos weekly pipelines permission to use it.
- The service connection SPN's object id feeds the kafkaTestApplicationOid follow-up (see §4).

2. TME variable group
Create a pipeline variable group attached to the new service connection. Keep identical variable names to the existing corp variable group so no additional YAML changes are needed. Variables consumed by these pipelines:
- tests.yml (ThinClient stages): thinclient-test-endpoint, thinclient-test-key, thin-client-canary-multi-region-session-endpoint/key, thin-client-canary-multi-writer-session-endpoint/key
- spark.yml: spark-databricks-cosmos-endpoint, spark-databricks-cosmos-endpoint-msi, spark-databricks-cosmos-key, spark-databricks-endpoint-with-msi, spark-databricks-token-with-msi, spark-databricks-cosmos-spn-clientId, spark-databricks-cosmos-spn-clientSecret, spark-databricks-cosmos-spn-clientIdCert, spark-databricks-cosmos-spn-clientCertBase64
- kafka.yml: cosmos-client-telemetry-endpoint, cosmos-client-telemetry-cosmos-account, kafka-mcr-name (also reuses spark-databricks-cosmos-spn-clientId/Secret)

3. Long-lived resources to pre-create in TME
All must live in SSS3PT_-prefixed resource groups (e.g. SSS3PT_rg-cosmos-java-weekly, SSS3PT_oltp-spark-ci) to suppress S360 alerts for local-auth:

- thinclient-test-endpoint/key — tests.yml Cosmos_Live_Test_ThinClient
- thin-client-canary-multi-region-session-endpoint/key — tests.yml ThinClient_MultiRegion
- thin-client-canary-multi-writer-session-endpoint/key — tests.yml ThinClient_MultiMaster
- spark-databricks-cosmos-endpoint/key and -msi — spark.yml (all 6 stages)
- spark-databricks-endpoint-with-msi, spark-databricks-token-with-msi — spark.yml; the existing oltp-spark-ci workspace can't be moved cross-tenant, so a brand-new one is required in TME
- Storage account (oltpsparkcijarstore0326) + cert/SAS — spark-databricks-cosmos-spn-clientIdCert, spark-databricks-cosmos-spn-clientCertBase64 — spark.yml
- SPN credentials — spark-databricks-cosmos-spn-clientId/Secret/clientIdCert/clientCertBase64; its object id also feeds kafkaTestApplicationOid (§4) — spark.yml, kafka.yml
- cosmos-client-telemetry-endpoint, cosmos-client-telemetry-cosmos-account — kafka.yml
- Registry (kafka-mcr-name) reachable from TME pipeline agents — kafka.yml

4. Follow-up YAML change (requires SPN object id from §1)
Update kafkaTestApplicationOid in sdk/cosmos/test-resources/kafka-testcontainer/test-resources.json (line 36) from the current corp-tenant SPN (3b254cc1-3ecc-4d33-9d61-e867badcef16) to the object id of the TME SPN created in §1. Without this, the Cosmos data-plane RBAC role assignment inside New-TestResources.ps1 will fail.

5. Validation (before marking ready for review)
- Run tests.yml against this branch; confirm the deploy step uses the TME subscription, the RG is created with SSS3PT_, and all 7 stages complete.
- Run spark.yml against this branch; confirm the Databricks notebook job succeeds on the new TME workspace + Cosmos account.
- Run kafka.yml against this branch.

Testing
Validated the diff is self-consistent (no leftover corp tenant/subscription ids, SSS3PT_ prefix applied to all 6 spark.yml stages, service connection renamed in all 7 tests.yml stages + kafka.yml).

References