
OLS-2378: Generic Provider Config for LCore supported providers.#1257

Open
sriroopar wants to merge 1 commit intoopenshift:mainfrom
sriroopar:generic_provider_config

Conversation

@sriroopar
Contributor

@sriroopar sriroopar commented Feb 5, 2026

Description

This PR introduces Generic Provider Configuration support for the LCore (Llama Stack) backend, enabling flexible LLM provider configuration beyond the predefined provider types.
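For illustration, a generic provider entry might look like the sketch below. The field names (`providers[].type`, `providerType`, the `config.api_key` env reference) are taken from the shapes discussed in this PR's review thread; the merged schema may differ.

```yaml
# Illustrative sketch only -- not the final schema.
providers:
  - type: generic                  # opts out of the predefined provider types
    providerType: remote::openai   # passed through to the Llama Stack config
    config:
      api_key: ${env.OPENAI_API_KEY}
```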

Type of change

  • Refactor
  • New feature
  • Bug fix
  • CVE fix
  • Optimization
  • Documentation Update
  • Configuration Update
  • Bump-up dependent library

Related Tickets & Documents

Checklist before requesting a review

  • I have performed a self-review of my code.
  • PR has passed all pre-merge test jobs.
  • If it is a core feature, I have added thorough tests.

Testing

  • Please provide detailed steps to perform tests related to this code change.

    • Configured OLS to use LCore as the backend, since these changes do not apply to the appserver backend.
    • Applied a valid config using remote::openai as the provider type.
    • Verified Llama Stack ConfigMap generation and validation rules manually.
    • Checked liveness of the endpoint with the openai provider, submitted a query, and verified that the response is valid.
    • Also verified the failure path with an invalid provider type.
  • How were the fix/results from this change verified? Please provide relevant screenshots or results.

Screenshot From 2026-02-05 15-50-07
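The validation rules mentioned in the testing steps could be sketched as follows. This is a hypothetical illustration, not the PR's actual code: it assumes Llama Stack provider types follow the "<kind>::<name>" convention (e.g. remote::openai) seen in this thread.

```python
import re

# Hypothetical validation sketch: accept "remote::<name>" or "inline::<name>".
# Note that "remote:: openai" (with a stray space) would be rejected.
PROVIDER_TYPE_RE = re.compile(r"^(remote|inline)::[a-z0-9_-]+$")

def is_valid_provider_type(provider_type: str) -> bool:
    """Return True if the string looks like a Llama Stack provider type."""
    return bool(PROVIDER_TYPE_RE.match(provider_type))
```

A validator like this would catch the invalid-provider-type case exercised in the manual testing above.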

@openshift-ci-robot openshift-ci-robot added the jira/valid-reference Indicates that this PR references a valid Jira ticket of any type. label Feb 5, 2026
@openshift-ci-robot

openshift-ci-robot commented Feb 5, 2026

@sriroopar: This pull request references OLS-2378 which is a valid jira issue.



@openshift-ci openshift-ci bot requested review from bparees and xrajesh February 5, 2026 18:19
@openshift-ci

openshift-ci bot commented Feb 5, 2026

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by:
Once this PR has been reviewed and has the lgtm label, please assign raptorsun for approval. For more information see the Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment


@sriroopar sriroopar force-pushed the generic_provider_config branch 2 times, most recently from 55a718f to 3453cd5 Compare February 9, 2026 13:49
@sriroopar sriroopar force-pushed the generic_provider_config branch 2 times, most recently from 803c5c8 to f2d3abf Compare February 16, 2026 23:12
@raptorsun
Copy link
Contributor

The logic looks good :)
I have a doubt about having providers[].providerType beside providers[].type. This could be confusing for users.
If providers[].type is generic, we can put everything, including the Llama Stack provider_type, under config. The structure would look like

providers:
  - type: generic
    config: 
      provider_id: openai
      provider_type: remote::openai
      config:
        api_key: ${env.OPENAI_API_KEY}

Moreover, the generic type here is really more of a llamaStackGeneric.

@sriroopar sriroopar force-pushed the generic_provider_config branch from f2d3abf to 69ed208 Compare February 18, 2026 13:27
@openshift-ci

openshift-ci bot commented Feb 18, 2026

@sriroopar: all tests passed!

Full PR test history. Your PR dashboard.



