
Support batch prediction requests #523

@deliahu

Description


- Input: a list of chunks (a chunk is an arbitrary object).

- Should the batch API run in the operator or in a separate pod?

- Should we handle the case where the input request is very large? One option is to accept a link to a file containing the input request instead of the inline payload.
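To make the proposal concrete, here is a minimal sketch of what a chunked batch request and handler could look like. The payload shape (`chunks` key), the `handle_batch` function, and the placeholder `predict` are all hypothetical illustrations, not Cortex's actual API:

```python
import json

def predict(chunk):
    # Placeholder predictor (hypothetical): echoes the chunk with a score.
    # In practice this would call the deployed model.
    return {"input": chunk, "score": 0.0}

def handle_batch(request_body: str) -> str:
    # A batch request carries a list of chunks; each chunk is an
    # arbitrary JSON object, per the issue description.
    payload = json.loads(request_body)
    chunks = payload["chunks"]
    # Run the predictor once per chunk, preserving input order.
    results = [predict(c) for c in chunks]
    return json.dumps({"results": results})

body = json.dumps({"chunks": [{"x": 1}, {"x": 2}]})
print(handle_batch(body))
```

For very large inputs, the same handler could instead accept a URL field pointing at a file of chunks and stream them, which sidesteps request-size limits on the API gateway.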

Metadata


Labels

enhancement (New feature or request)
