
[ML] Adding manage_inference to the kibana_system role #108262

Conversation

jonathan-buttner
Contributor

This PR adds the manage_inference privilege to the kibana_system role so the default user can interact with the inference APIs. This came out of a discussion about the security assistant not being able to interact with the inference API using the internal Elasticsearch user within Kibana.

@jonathan-buttner jonathan-buttner added >non-issue :ml Machine learning Team:ML Meta label for the ML team v8.15.0 labels May 3, 2024
@jonathan-buttner jonathan-buttner marked this pull request as ready for review May 3, 2024 17:33
@jonathan-buttner jonathan-buttner requested a review from a team as a code owner May 3, 2024 17:33
@elasticsearchmachine
Collaborator

Pinging @elastic/ml-core (Team:ML)

@kc13greiner kc13greiner self-requested a review May 6, 2024 14:26
@kc13greiner
Contributor

Heya @jonathan-buttner !

I have a few clarifying questions:

  1. Why do these APIs need to be called as the kibana_system user?
  2. What does the manage_inference privilege allow? I don't see it documented anywhere.

@spong
Member

spong commented May 6, 2024

I can answer 1. -- over in the Security Solution Assistant, we're trying to leverage the new _inference API to automatically set up and deploy ELSER so that we can enable the Knowledge Base functionality by default (as long as the appropriate ML resources exist). To do this we would be calling the code below with an asInternalUser esClient; however, the internal user does not currently have this privilege:

      // Temporarily use esClient for current user until `kibana_system` user has `inference_admin` role
      // See https://github.com/elastic/elasticsearch/pull/108262
      // const esClient = (await context.core).elasticsearch.client.asInternalUser;
      const esClient = (await context.core).elasticsearch.client.asCurrentUser;
      const elserResponse = await esClient.inference.putModel({
        inference_id: 'elser_model_2',
        task_type: 'sparse_embedding',
        model_config: {
          service: 'elser',
          service_settings: {
            model_id: elserId,
            num_allocations: 1,
            num_threads: 1,
          },
          task_settings: {},
        },
      });

We could fall back to using the TrainedModels API, since the internal user already has the manage_ml privilege, which covers that API; however, we were hoping to start trialing the _inference API so we could begin providing feedback and use cases to the platform team.
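A minimal sketch of that fallback idea, assuming a hypothetical `InferenceClient` interface and `deployElser` helper (not the real elasticsearch-js client types): prefer the _inference API, and drop back to a trained-models call if the privilege is missing.

```typescript
// Sketch only: InferenceClient and deployElser are hypothetical names, not
// the actual elasticsearch-js client API.
interface InferenceClient {
  // Creates an inference endpoint; assumed to reject (e.g. with a 403) when
  // the calling user lacks the manage_inference privilege.
  putInference(inferenceId: string): Promise<void>;
  // Covered by manage_ml, which kibana_system already has.
  startTrainedModelDeployment(modelId: string): Promise<void>;
}

// Prefer the _inference API; fall back to the trained models API when the
// privilege is missing. Returns which path was taken.
async function deployElser(client: InferenceClient, elserId: string): Promise<string> {
  try {
    await client.putInference('elser_model_2');
    return 'inference';
  } catch {
    await client.startTrainedModelDeployment(elserId);
    return 'trained_models';
  }
}
```

The return value is only there to make the chosen path observable; in Kibana the result of the PUT call itself would be used.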

@jonathan-buttner
Contributor Author

jonathan-buttner commented May 6, 2024

Hey @kc13greiner 👋

  2. What does the manage_inference privilege allow? I don't see it documented anywhere.

The manage_inference privilege gives access to the APIs below. Here are some docs: https://www.elastic.co/guide/en/elasticsearch/reference/master/inference-apis.html

Generally it allows setting up and deleting inference endpoints that interact with third-party services like Cohere and OpenAI. It also allows interacting with the trained models APIs: https://www.elastic.co/guide/en/elasticsearch/reference/master/ml-df-trained-models-apis.html

    private static final Set<String> MANAGE_INFERENCE_PATTERN = Set.of(
        "cluster:admin/xpack/inference/*",
        "cluster:monitor/xpack/inference*", // no trailing slash to match the POST InferenceAction name
        "cluster:admin/xpack/ml/trained_models/deployment/start",
        "cluster:admin/xpack/ml/trained_models/deployment/stop",
        "cluster:monitor/xpack/ml/trained_models/deployment/infer"
    );
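For illustration only, the wildcard semantics of those action names can be sketched with a naive glob check (`isAllowed` and `escapeRegExp` are hypothetical helpers; Elasticsearch's actual privilege matcher compiles patterns into automatons):

```typescript
// Illustration only: a naive glob check over the action-name patterns above.
const MANAGE_INFERENCE_PATTERN: string[] = [
  'cluster:admin/xpack/inference/*',
  'cluster:monitor/xpack/inference*', // no trailing slash: also matches the bare action name
  'cluster:admin/xpack/ml/trained_models/deployment/start',
  'cluster:admin/xpack/ml/trained_models/deployment/stop',
  'cluster:monitor/xpack/ml/trained_models/deployment/infer',
];

function escapeRegExp(s: string): string {
  return s.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

// True if the action name matches any pattern, treating '*' as "any suffix".
function isAllowed(action: string): boolean {
  return MANAGE_INFERENCE_PATTERN.some((pattern) => {
    const regex = new RegExp('^' + pattern.split('*').map(escapeRegExp).join('.*') + '$');
    return regex.test(action);
  });
}
```

With this check, the bare action name `cluster:monitor/xpack/inference` matches, which is what the "no trailing slash" comment in the pattern set is calling out: the POST InferenceAction name has no sub-path after `inference`.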

Member

@davidkyle davidkyle left a comment


LGTM

@kc13greiner
Contributor

@jonathan-buttner @spong Thanks for the info! Reviewing and discussing with the team 🚀

@jonathan-buttner jonathan-buttner merged commit 4574f2a into elastic:main May 7, 2024
15 checks passed
@jonathan-buttner jonathan-buttner deleted the ml-add-infer-to-kibana-system branch May 7, 2024 13:46
@kc13greiner
Contributor

@jonathan-buttner Sorry, that wasn't an approval yet. I just wanted to provide an update that I was discussing with the team. I apologize for the confusing wording.

@jonathan-buttner
Contributor Author

jonathan-buttner commented May 7, 2024

Accidentally merged this without security's approval. They asked us to revert for now and we'll continue discussing on a new PR.

jonathan-buttner added a commit that referenced this pull request May 7, 2024
elasticsearchmachine pushed a commit that referenced this pull request May 7, 2024
Reverts #108262

Accidentally merged the above PR without security's approval. They asked
us to revert for now and we'll continue discussing on a new PR.