[ML] Inference endpoint was created even if there is a request validation error #137127

@wwang500

Description

Version:
9.2.0

Steps to reproduce:

In Dev Tools, create an inference endpoint with num_allocations set to 1 and adaptive_allocations enabled:

PUT _inference/rerank/mytest
{
  "service": "elasticsearch",
  "service_settings": {
    "num_threads": 1,
    "num_allocations": 1,
    "model_id": ".rerank-v1",
    "adaptive_allocations": {
      "enabled": true,
      "min_number_of_allocations": 0,
      "max_number_of_allocations": 32
    }
  },
  "task_settings": {
    "return_documents": true
  }
}

You should see an error:

{
  "error": {
    "root_cause": [
      {
        "type": "action_request_validation_exception",
        "reason": "Validation Failed: 1: [number_of_allocations] cannot be set if adaptive allocations is enabled;"
      }
    ],
    "type": "action_request_validation_exception",
    "reason": "Validation Failed: 1: [number_of_allocations] cannot be set if adaptive allocations is enabled;"
  },
  "status": 400
}

However, running GET _inference/rerank/mytest shows that the inference endpoint has been created anyway, as shown below (the trained model is not created).
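
For reference, the check and an illustrative response; the response body below is a sketch rather than captured output (field names are approximate, and the elided service_settings and task_settings simply mirror the PUT request above):

GET _inference/rerank/mytest

{
  "endpoints": [
    {
      "inference_id": "mytest",
      "task_type": "rerank",
      "service": "elasticsearch",
      "service_settings": { ... },
      "task_settings": { ... }
    }
  ]
}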

Metadata

Labels

:ml Machine learning, >bug, Team:ML (Meta label for the ML team)
