Support model_armor with vertexai disabled #6276

@ehsankf

Description

I would like to pass model_armor_config to the Generative Language API call, and it appears that the code below is currently the only way to invoke the API while including model_armor_config. Is there a way to make this work with the vertexai=False flag, to ensure that the request is sent to the Gemini API platform rather than Vertex AI?

from google.cloud import aiplatform_v1beta1
from google.cloud.aiplatform_v1beta1.types import content

# PROJECT_ID, LOCATION, and TEMPLATE_ID are assumed to be defined.
client_options = {"api_endpoint": "us-east4-aiplatform.googleapis.com"}
client = aiplatform_v1beta1.PredictionServiceClient(client_options=client_options)

request = aiplatform_v1beta1.GenerateContentRequest(
    # The regional Vertex AI endpoint expects the full publisher model resource name.
    model=f"projects/{PROJECT_ID}/locations/{LOCATION}/publishers/google/models/gemini-2.0-flash-001",
    contents=[content.Content(role="user", parts=[content.Part(text="Hello world")])],
    model_armor_config={
        "prompt_template_name": f"projects/{PROJECT_ID}/locations/{LOCATION}/templates/{TEMPLATE_ID}"
    },
)

response = client.generate_content(request=request)

Metadata

    Labels

    api: vertex-ai (Issues related to the googleapis/python-aiplatform API)
