Adds errorCode and explanation; Replaces pagination with continuationToken #28061
Conversation
Next Steps to Merge ✅ All automated merging requirements have been met! Refer to step 4 in the PR workflow diagram (even if your PR is for data plane, not ARM).
Swagger Validation Report

| compared tags (via openapi-validator v2.2.0) | new version | base version |
|---|---|---|
| package-2024-01-31-preview | package-2024-01-31-preview(837e8a1) | package-2024-01-31-preview(release-healthdataaiservices-deidentification-2024-01-31-preview) |
The following errors/warnings exist before current PR submission:

| Rule | Message |
|---|---|
|  | A PUT operation request body schema should be the same as its 200 response schema, to allow reusing the same entity between GET and PUT. If the schema of the PUT request body is a superset of the GET response body, make sure you have a PATCH operation to make the resource updatable. Operation: 'Jobs_Create' Request Model: 'parameters[3].schema' Response Model: 'responses[200].schema' Location: HealthDataAIServices.Deidentification/preview/2024-01-31-preview/HealthDataAIServices.Deidentification.json#L213 |
|  | OperationId for put method should contain both 'Create' and 'Update' Location: HealthDataAIServices.Deidentification/preview/2024-01-31-preview/HealthDataAIServices.Deidentification.json#L214 |
✔️ Avocado succeeded [Detail] [Expand]
Validation passes for Avocado.

✔️ SwaggerAPIView succeeded [Detail] [Expand]

❌ TypeSpecAPIView: 0 Errors, 1 Warning failed [Detail]

| Rule | Message |
|---|---|
|  | "How to fix": Check the detailed log and verify that the TypeSpec emitter is able to create an API review file for the changes in this PR. |

✔️ ModelValidation succeeded [Detail] [Expand]
Validation passes for ModelValidation.

✔️ SemanticValidation succeeded [Detail] [Expand]
Validation passes for SemanticValidation.

✔️ SpellCheck succeeded [Detail] [Expand]
Validation passes for SpellCheck.

✔️ PR Summary succeeded [Detail] [Expand]
Validation passes for Summary.

✔️ Automated merging requirements met succeeded [Detail] [Expand]
Swagger Generation Artifacts

Generated ApiView
```typespec
@visibility("read")
errorCode?: string;

@doc("Error message for complete job failures.")
```
Is it only available when the job fails?
There is a recommended error type described at https://github.com/microsoft/api-guidelines/blob/vNext/azure/Guidelines.md#handling-errors. Do you want to follow the same data structure?
Ahh, great point. I figured that type could only be used for error responses. I've updated the PR to use it for job statuses as well.
Also, yes, this field will only be available when the job has Failed.
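For reference, the recommended error shape from the linked Azure API guidelines looks roughly like the TypeSpec sketch below. This is only an illustration of the guidelines' data structure; in practice the spec would reuse the existing `Azure.Core.Foundations.Error` type rather than redefine it, and the model names here are not taken from this PR.

```typespec
// Sketch of the standard Azure error structure from the API guidelines.
// Field names follow the guidelines document; this is not the PR's code.
model Error {
  // One of a server-defined set of error codes.
  code: string;

  // A human-readable representation of the error.
  message: string;

  // The target of the error, e.g. the name of the property in error.
  target?: string;

  // Details about specific errors that led to this reported error.
  details?: Error[];

  // More specific information than the containing error.
  innererror?: InnerError;
}

model InnerError {
  code?: string;
  innererror?: InnerError;
}
```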
e64ac2b into Azure:release-healthdataaiservices-deidentification-2024-01-31-preview
…Token (Azure#28061) * Adds errorCode and explanation; Replaces pagination with continuationToken * Updates to use standard Azure Foundation Error --------- Co-authored-by: Graham Thomas <gthomas@microsoft.com>
Data Plane API - Pull Request
The previous spec was only a proposal; during implementation, the team and I identified a couple of changes that are needed. No customers are currently using the existing spec, so backwards compatibility is not an issue.
ErrorCode and Explanation: We've added an errorCode and explanation to jobs so customers can understand why they failed. We added them first at the file level (in case a single file within a job fails), but also added them to the root job (in case there is a prerequisite issue, e.g. the service cannot access the customer's blob storage). This pattern already exists for real-time requests, but job reports aren't error responses.
Pagination with continuationToken: Given that our backend is Cosmos DB, we believe it is much simpler for us and for users to rely on a nextLink containing the continuationToken rather than the cumbersome top/skip pattern.
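A continuationToken surfaced through nextLink would give a paged response shaped roughly like the sketch below. The model and element-type names here are hypothetical (not taken from this PR); only the `value`/`nextLink` pattern follows the standard Azure paging convention.

```typespec
// Hypothetical sketch of a nextLink-style paged response.
// "DocumentDetails" is an assumed element type, not necessarily the
// name used in this PR. The continuation token stays opaque to the
// client: the service embeds it in nextLink, and the client simply
// follows that URL until nextLink is absent.
model PagedDocumentDetails {
  // The items on this page.
  value: DocumentDetails[];

  // Absolute URL of the next page; omitted on the last page.
  nextLink?: url;
}
```

The design choice this reflects: the client never constructs page parameters itself (as it would with top/skip), so the service is free to change the token format without breaking callers.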
API Info: The Basics
Most of the information about your service should be captured in the issue that serves as your API Spec engagement record.
Is this review for (select one):
Change Scope
This section will help us focus on the specific parts of your API that are new or have been modified.
Please share a link to the design document for the new APIs, a link to the previous API Spec document (if applicable), and the root paths that have been updated.
Viewing API changes

For a convenient view of the API changes made by this PR, refer to the URLs provided in the table in the Generated ApiView comment added to this PR. You can use ApiView to show the diff between API versions.

Suppressing failures

If one or more validation error/warning suppressions are detected in your PR, please follow the Swagger-Suppression-Process to get approval.
❔ Got questions? Need additional info? We are here to help!
Contact us!
The Azure API Review Board is dedicated to helping you create amazing APIs. You can read about our mission and learn more about our process on our wiki.
Click here for links to tools, specs, guidelines & other good stuff
Tooling
Guidelines & Specifications
Helpful Links
Checks stuck in `queued` state?
If the PR CI checks appear to be stuck in the `queued` state, please add a comment with the contents `/azp run`. This should result in a new comment denoting that a `PR validation pipeline` has started, and the checks should be updated after a few minutes.