
[BUG] CosmosDBAccountResource.RegenerateKeyAsync throws exception on 202 status code #44079

Open
baryoloraul opened this issue May 16, 2024 · 8 comments
Labels: Cosmos, customer-reported, Mgmt, needs-author-feedback, question, Service Attention

Comments

@baryoloraul
Member

Library name and version

Azure.ResourceManager.CosmosDB 1.3.2

Describe the bug

Calling CosmosDBAccountResource.RegenerateKeyAsync can produce a response with an HTTP status code of 202, which results in a RequestFailedException being thrown. From the Azure portal, the operation appears to succeed; the SDK just incorrectly throws an exception.

2024-05-13 18:21:40.199493: Cosmos DB Account RegenerateKeyAsync exception (Account: REDACTED, Key Type: secondaryReadonly). ExceptionType: RequestFailedException, ExceptionMessage: Service request failed.
2024-05-13 18:21:40.199825: Status: 202 (Accepted)
2024-05-13 18:21:40.200032:
2024-05-13 18:21:40.200114: Content:
2024-05-13 18:21:40.200404: {"status":"Dequeued"}
2024-05-13 18:21:40.200604:
2024-05-13 18:21:40.200678: Headers:
2024-05-13 18:21:40.200852: Cache-Control: no-store, no-cache
2024-05-13 18:21:40.201027: Pragma: no-cache
2024-05-13 18:21:40.201258: Location: REDACTED
2024-05-13 18:21:40.201346: x-ms-request-id: a7d25542-9579-44be-882b-4fc4a265c5be
2024-05-13 18:21:40.201571: Strict-Transport-Security: REDACTED
2024-05-13 18:21:40.201653: x-ms-gatewayversion: REDACTED
2024-05-13 18:21:40.201835: x-ms-ratelimit-remaining-subscription-reads: REDACTED
2024-05-13 18:21:40.201910: x-ms-correlation-request-id: REDACTED
2024-05-13 18:21:40.202407: x-ms-routing-request-id: REDACTED
2024-05-13 18:21:40.202423: X-Content-Type-Options: REDACTED
2024-05-13 18:21:40.202428: X-Cache: REDACTED
2024-05-13 18:21:40.202432: X-MSEdge-Ref: REDACTED
2024-05-13 18:21:40.202435: Date: Mon, 13 May 2024 18:21:39 GMT
2024-05-13 18:21:40.202439: Content-Length: 21
2024-05-13 18:21:40.202442: Content-Type: application/json
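
For anyone trying to capture the same detail, it can be pulled off the caught exception; a minimal sketch (Status, ErrorCode, and GetRawResponse() are standard Azure.Core members of RequestFailedException; the logging and the cosmosDbAccount variable are illustrative):

try
{
    await cosmosDbAccount.RegenerateKeyAsync(
        WaitUntil.Completed,
        new CosmosDBAccountRegenerateKeyContent(CosmosDBAccountKeyKind.SecondaryReadonly));
}
catch (RequestFailedException ex)
{
    // In this bug, Status is 202 even though the service accepted the request.
    Console.WriteLine($"Status: {ex.Status}, ErrorCode: {ex.ErrorCode ?? "<none>"}");
    Response? raw = ex.GetRawResponse();
    Console.WriteLine(raw?.Content.ToString()); // e.g. {"status":"Dequeued"}
}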

Expected behavior

This should be treated as a long-running operation and should not result in a RequestFailedException.
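
For reference, the acceptance and polling phases can be separated to show where the failure occurs; a sketch using the standard ArmOperation pattern (WaitUntil.Started and WaitForCompletionResponseAsync are existing SDK APIs; the surrounding code is illustrative):

// Start the long-running operation without blocking, then poll explicitly.
ArmOperation operation = await cosmosDbAccount.RegenerateKeyAsync(
    WaitUntil.Started,
    new CosmosDBAccountRegenerateKeyContent(CosmosDBAccountKeyKind.SecondaryReadonly));

// A failure on the initial 202 would surface in the call above;
// a polling failure would surface here instead.
Response final = await operation.WaitForCompletionResponseAsync();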

Actual behavior

Per the documentation, this API can return either 200 OK or 202 Accepted. In the cases where it returns 202 Accepted, the SDK operation throws a RequestFailedException.

Reproduction Steps

Use the SDK to regenerate account keys. If the operation is processed asynchronously, this error shows up.

Environment

No response

@github-actions github-actions bot added the needs-triage This is a new issue that needs to be triaged to the appropriate team. label May 16, 2024
@jsquire jsquire added Cosmos Mgmt This issue is related to a management-plane library. needs-team-attention This issue needs attention from Azure service team or SDK team customer-reported Issues that are reported by GitHub users external to the Azure organization. question The issue doesn't require a change to the product in order to be resolved. Most issues start as that and removed needs-triage This is a new issue that needs to be triaged to the appropriate team. labels May 16, 2024
@jsquire
Member

jsquire commented May 16, 2024

Thank you for your feedback. Tagging and routing to the team member best able to assist.

@archerzz
Member

@baryoloraul Could you share the code that invokes RegenerateKeyAsync() and the detailed error messages?

I cannot reproduce this error with Azure.ResourceManager.CosmosDB version 1.3.2 and Azure.Identity version 1.12.0-beta.2.
Here is my code (REDACTED):

using Azure;
using Azure.Core.Diagnostics;
using Azure.Identity;
using Azure.ResourceManager;
using Azure.ResourceManager.CosmosDB;
using Azure.ResourceManager.CosmosDB.Models;
using Azure.ResourceManager.Resources;
using System.Diagnostics.Tracing;

using AzureEventSourceListener consoleListener = AzureEventSourceListener.CreateConsoleLogger(EventLevel.Informational);
var client = new ArmClient(new InteractiveBrowserCredential(), "{your subscription id}", new ArmClientOptions()
{
    Diagnostics = { IsLoggingContentEnabled = true }
});

ResourceGroupResource rg = client.GetDefaultSubscription().GetResourceGroup("{your resource group}").Value;
CosmosDBAccountResource account = rg.GetCosmosDBAccount("{your cosmos account}").Value;
await account.RegenerateKeyAsync(WaitUntil.Completed, new CosmosDBAccountRegenerateKeyContent(CosmosDBAccountKeyKind.Secondary));

Here is some key traffic content:

[Informational] Azure-Core: Request [xxxxxxxx] POST https://management.azure.com/subscriptions/xxx/resourceGroups/xxx/providers/Microsoft.DocumentDB/databaseAccounts/xxxxxx/regenerateKey?api-version=2022-11-15
[Informational] Azure-Core: Response [xxxxxxx] 202 Accepted (01.6s)
......
[Informational] Azure-Core: Request [yyyyy] GET https://management.azure.com/subscriptions/xxx/providers/Microsoft.DocumentDB/locations/eastus2/operationsStatus/xxxxxxxxxxx?api-version=2022-11-15
[Informational] Azure-Core: Response [yyyyy] 200 Ok (00.6s)
......
[Informational] Azure-Core: Request [zzzzz] GET https://management.azure.com/subscriptions/xxx/resourceGroups/xxxx/providers/Microsoft.DocumentDB/databaseAccounts/xxxxxxxx/regenerateKey/operationResults/xxxxxxxxx?api-version=2022-11-15
[Informational] Azure-Core: Response [zzzzz] 200 Ok (00.6s)

As the log shows, the polling succeeds.

@archerzz archerzz removed the needs-team-attention This issue needs attention from Azure service team or SDK team label May 22, 2024
@github-actions github-actions bot added the needs-team-attention This issue needs attention from Azure service team or SDK team label May 22, 2024
@archerzz archerzz added needs-author-feedback More information is needed from author to address the issue. and removed needs-team-attention This issue needs attention from Azure service team or SDK team labels May 22, 2024
github-actions bot

Hi @baryoloraul. Thank you for opening this issue and giving us the opportunity to assist. To help our team better understand your issue and the details of your scenario please provide a response to the question asked above or the information requested above. This will help us more accurately address your issue.

github-actions bot

Hi @baryoloraul, we're sending this friendly reminder because we haven't heard back from you in 7 days. We need more information about this issue to help address it. Please be sure to give us your input. If we don't hear back from you within 14 days of this comment the issue will be automatically closed. Thank you!

@github-actions github-actions bot added the no-recent-activity There has been no recent activity on this issue. label May 29, 2024
@baryoloraul
Member Author

baryoloraul commented May 30, 2024

Hi @archerzz, the error details in the bug description are all I have. Below is the code I am using to rotate the keys:

var cosmosDbAccountId = CosmosDBAccountResource.CreateResourceIdentifier(subscriptionId, resourceGroupName, cosmosDbAccountName);
var cosmosDbAccount = armClient.GetCosmosDBAccountResource(cosmosDbAccountId);
var operationResult = await cosmosDbAccount.RegenerateKeyAsync(WaitUntil.Completed, new CosmosDBAccountRegenerateKeyContent(CosmosDBAccountKeyKind.SecondaryReadonly), cancellationToken).ConfigureAwait(false);

Worth mentioning that this error does not show up every time I try to rotate the keys; I haven't found a pattern yet. Retrying the same failed operation sometimes succeeds.

I am running this on .NET 6 in CBL-Mariner 2.0.
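
Given that retries sometimes succeed, a bounded retry is one possible stopgap; a sketch reusing cosmosDbAccount and cancellationToken from the snippet above (the attempt count and delay are arbitrary choices, not a recommendation):

for (var attempt = 1; ; attempt++)
{
    try
    {
        await cosmosDbAccount.RegenerateKeyAsync(
            WaitUntil.Completed,
            new CosmosDBAccountRegenerateKeyContent(CosmosDBAccountKeyKind.SecondaryReadonly),
            cancellationToken).ConfigureAwait(false);
        break; // success
    }
    catch (RequestFailedException) when (attempt < 3)
    {
        // Appears transient in our runs; back off briefly and retry.
        await Task.Delay(TimeSpan.FromSeconds(30), cancellationToken).ConfigureAwait(false);
    }
}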

@github-actions github-actions bot added needs-team-attention This issue needs attention from Azure service team or SDK team and removed needs-author-feedback More information is needed from author to address the issue. no-recent-activity There has been no recent activity on this issue. labels May 30, 2024
@archerzz archerzz added the Service Attention This issue is responsible by Azure service team. label May 31, 2024
@archerzz
Member

I compared this against our test recordings and found there could be a status problem on the backend.

Hi Cosmos team,
Can you check x-ms-request-id: a7d25542-9579-44be-882b-4fc4a265c5be to see if there is something wrong on the backend? Thanks.

@archerzz
Member

archerzz commented Jun 3, 2024

@baryoloraul Is it possible for you to enable diagnostics in your code, like below?

using Azure.Core.Diagnostics;
using Azure.Identity;
using Azure.ResourceManager;
using System.Diagnostics.Tracing;

// or another listener to collect logs
using AzureEventSourceListener consoleListener = AzureEventSourceListener.CreateConsoleLogger(EventLevel.Informational);

var client = new ArmClient(new DefaultAzureCredential(), "xxxx", new ArmClientOptions()
{
    Diagnostics = { IsLoggingContentEnabled = true }
});

We cannot reproduce this error, and after analyzing the existing error messages we have no clue as to what went wrong. Detailed logs would be greatly appreciated. Thanks.
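
If console output is inconvenient, the same events can be written to a file using the listener's delegate constructor (part of Azure.Core.Diagnostics); the file path and format below are placeholders:

// Write Azure SDK event-source logs to a file so a failing run can be attached here.
// Requires System.IO in addition to the usings above.
using var fileListener = new AzureEventSourceListener(
    (args, message) => File.AppendAllText("azure-sdk.log", $"[{args.Level}] {message}{Environment.NewLine}"),
    EventLevel.Verbose);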

@archerzz archerzz added the needs-author-feedback More information is needed from author to address the issue. label Jun 3, 2024
@github-actions github-actions bot removed the needs-team-attention This issue needs attention from Azure service team or SDK team label Jun 3, 2024

github-actions bot commented Jun 3, 2024

Hi @baryoloraul. Thank you for opening this issue and giving us the opportunity to assist. To help our team better understand your issue and the details of your scenario please provide a response to the question asked above or the information requested above. This will help us more accurately address your issue.
