[FEATURE REQ] Expose API responses object model's custom deserialization #43657
Labels
Client · customer-reported · needs-team-attention · OpenAI · question · Service Attention
Library name
Azure.AI.OpenAI
Please describe the feature.
Azure.AI.OpenAI has custom code for deserializing the REST API's responses (for instance, completions and chat completions), e.g.:
internal static Completions DeserializeCompletions(JsonElement element, ModelReaderWriterOptions options = null)
The problem is that if I receive a REST API response by some other means (say, through a proxy LLM API), I cannot leverage this library's object model for completions.
A solution would be to expose this custom deserialization logic so that a raw JSON response string can be converted into Azure.AI.OpenAI's object model.
Note: a further benefit is that if you had, say, a CSV storing thousands of OpenAI API responses, you could load it in a C# program and rehydrate each response into the object model.
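For context, a sketch of what such rehydration could look like. This assumes the response models participate in the Azure SDK's model reader/writer pattern (IPersistableModel<T> via System.ClientModel.Primitives.ModelReaderWriter, as in newer Azure SDK generations); whether Completions supports this in a given Azure.AI.OpenAI version is exactly what this request asks to make public. The file name is illustrative.

```csharp
using System;
using System.IO;
using System.ClientModel.Primitives;
using Azure.AI.OpenAI;

// Raw JSON obtained outside the client, e.g. via a proxy LLM API
// or a stored CSV column (file name is a placeholder).
string json = File.ReadAllText("completions.json");

// Hypothetical rehydration path: ModelReaderWriter can deserialize any
// model implementing IPersistableModel<T> from a BinaryData payload.
Completions completions =
    ModelReaderWriter.Read<Completions>(BinaryData.FromString(json));

Console.WriteLine(completions.Id);
```

If the models already implement IPersistableModel<T> internally, exposing that interface (or a public static Deserialize method) would cover both the proxy and the bulk-rehydration scenarios without new deserialization code.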