Description
What did you find confusing? Please describe.
AutoML.create_model() supports an inference_response_keys parameter that controls what kind of results the created model produces: e.g., the predicted label, the top label's confidence, and/or all class confidences for multiclass classification.
The AutoML.deploy() documentation does not mention the inference_response_keys parameter, nor does it describe how additional kwargs are handled.
However, it is possible to pass inference_response_keys to deploy(), and the created endpoint will respond as specified.
Describe how documentation can be improved
It would be best to explicitly support and document this argument in AutoML.deploy(): it determines how the model's results will be consumed, so it significantly changes the API contract of the endpoint.
The documentation could also be improved to indicate which parent/downstream methods unlisted kwargs are forwarded to.
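To make the forwarding behaviour concrete, here is a minimal, hypothetical sketch of the pattern being described. This is not the SageMaker SDK's actual implementation; the class and method names merely mirror the AutoML API. The point is that deploy() does not list inference_response_keys in its signature, yet extra kwargs are passed straight through to create_model(), so the argument still takes effect:

```python
# Hypothetical sketch of the kwargs-forwarding pattern discussed above.
# Names mirror the SageMaker AutoML API, but this is NOT the real SDK code.

class AutoML:
    def create_model(self, name, inference_response_keys=None, **kwargs):
        # create_model() is the documented home of inference_response_keys.
        return {"name": name, "inference_response_keys": inference_response_keys}

    def deploy(self, name, **kwargs):
        # deploy() does not document inference_response_keys, but any
        # unlisted kwargs are forwarded to create_model(), so the argument
        # still shapes the responses of the resulting endpoint.
        model = self.create_model(name, **kwargs)
        return {"endpoint_for": model}


endpoint = AutoML().deploy(
    "my-automl-model",
    inference_response_keys=["predicted_label", "probability"],
)
# The forwarded argument reaches create_model() even though deploy()
# never mentions it in its signature or docs.
print(endpoint["endpoint_for"]["inference_response_keys"])
```

Documenting this forwarding explicitly (or promoting the argument to a named parameter of deploy()) would make the behaviour discoverable without reading the SDK source.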
Additional context
N/A