Document support of inference_response_keys on AutoML.deploy() #1797

@athewsey

Description

What did you find confusing? Please describe.

AutoML.create_model() supports an inference_response_keys parameter that controls which results the created model produces: e.g. the label, the top label's confidence, and/or all class confidences for multiclass classification.

The AutoML.deploy() documentation does not mention the inference_response_keys parameter, nor describe any default behaviour for additional kwargs...

...However, it is possible to pass inference_response_keys in, and the created endpoint will respond as specified.

Describe how documentation can be improved

Best would be to explicitly support this argument, because it significantly changes the API contract and is therefore important for how the model's results will be consumed.

Documentation could also be improved to indicate which parent/downstream methods unlisted kwargs are forwarded to.
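To illustrate the pass-through behaviour being described: this is a minimal, self-contained sketch using hypothetical stand-in classes (not the real sagemaker SDK), showing how a deploy() method that accepts **kwargs can silently forward them to create_model(), so an argument like inference_response_keys works even though deploy()'s docstring never lists it.

```python
# Hypothetical stand-ins to demonstrate the kwargs-forwarding pattern;
# names and signatures here are illustrative, not the real sagemaker API.

class FakeAutoML:
    def create_model(self, name, inference_response_keys=None):
        # inference_response_keys controls what the model returns, e.g.
        # ["predicted_label", "probability", "probabilities"]
        return {"name": name, "response_keys": inference_response_keys}

    def deploy(self, name, **kwargs):
        # Unlisted kwargs (such as inference_response_keys) are passed
        # straight through to create_model(), so the endpoint honours
        # them even though deploy()'s own docs never mention them.
        return self.create_model(name, **kwargs)


endpoint = FakeAutoML().deploy(
    "demo-model",
    inference_response_keys=["predicted_label", "probability"],
)
print(endpoint["response_keys"])  # → ['predicted_label', 'probability']
```

This is exactly why documenting the forwarding target matters: a caller reading only deploy()'s signature has no way to discover which kwargs are accepted or where they end up.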

Additional context

N/A
