[Bug]: "Install the Pipelines Operator" message in GUI when Red Hat OpenShift Pipelines is in fact installed #1379
Comments
The "go install the operator" link at https://github.com/opendatahub-io/odh-dashboard/blob/main/frontend/src/pages/dependencies/PipelinesDependencyMissing.tsx#L39 should be changed: it uses the wrong namespace, startup-pipeline-state, instead of openshift-operators.
i.e. currently
Also, in mirrored environments (on-prem, air-gapped) there are often custom catalog sources. So when linking to the details page of a certain operator, to install it or view its status, the namespace is most likely always openshift-operators, but a custom CatalogSource name is possible. One could fetch the CatalogSources via the OpenShift API (not sure it's worth it, though, since the name can be anything). I.e., in your case, the catalogsource name was
and in our case, our cluster-admins named the catalogsource / mirrored index
You could do a search-link (works):
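A hypothetical sketch of such a namespace-agnostic search link (the console host below is a placeholder; the `keyword` query parameter filters OperatorHub entries regardless of which CatalogSource provides the operator):

```shell
# Placeholder console host; on a real cluster this is the route of the
# OpenShift web console.
CONSOLE_HOST="console-openshift-console.apps.example.com"

# The 'all-namespaces' OperatorHub view with a keyword filter avoids
# depending on any particular namespace or catalog source name.
search_url="https://${CONSOLE_HOST}/operatorhub/all-namespaces?keyword=pipelines"
echo "$search_url"
```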
Maybe including the link is not even necessary or viable, since not all admins of data science projects also have the right to view, much less install or modify, operators in openshift-operators. @andrewballantyne Instead of the "Install Operator" button, a text might suffice, e.g.: "The OpenShift cluster must have OpenShift Pipelines 1.8 or higher installed. We recommend channel pipelines-1.8 on OCP 4.10 and pipelines-1.9 or pipelines-1.10 for OCP 4.11, 4.12, and 4.13. Instructions at https://docs.openshift.com/container-platform/4.12/cicd/pipelines/installing-pipelines.html#op-installing-pipelines-operator-in-web-console_installing-pipelines." Taken from https://github.com/opendatahub-io/data-science-pipelines-operator/tree/main#pre-requisites
Also, regarding how the service account checks whether the operator is installed (https://github.com/opendatahub-io/odh-dashboard/blob/main/frontend/src/types.ts#L68), you'd also have to ensure somehow that the check works for the operator openshift-pipelines-operator-rh regardless of which catalog source it was installed from. Something like
and checking the Subscription object at status.installedCSV
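A minimal sketch of that check, assuming the default subscription name and the global openshift-operators namespace (on a real cluster this would be `oc get subscription openshift-pipelines-operator-rh -n openshift-operators -o jsonpath='{.status.installedCSV}'`; below the logic runs against a sample Subscription JSON so it works without a cluster):

```shell
# Sample of the status stanza a Subscription returns once the operator is
# installed; the CSV version here is illustrative.
sub_json='{"status":{"installedCSV":"openshift-pipelines-operator-rh.v1.8.2"}}'

# Extract status.installedCSV; a non-empty value means the operator is
# installed, independent of which CatalogSource it came from.
installed_csv=$(printf '%s' "$sub_json" | sed -n 's/.*"installedCSV" *: *"\([^"]*\)".*/\1/p')

if [ -n "$installed_csv" ]; then
  echo "Pipelines operator installed: $installed_csv"
else
  echo "Pipelines operator not found"
fi
```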
Thanks @shalberd for the detailed breakdown. Okay, I think the solution for this ticket probably should be:
I think that'll get us to a nicer working effort so the operator can be used. Any objections to this solution @shalberd? We have #1385 to chase down if we can detect the operator name better for the cluster-admin / CSV creator.
Points 1 to 3 sound very good indeed. You are absolutely right, the OperatorHub search GUI is totally namespace-agnostic; it simply does not matter which namespace is used. I just suggest using the openshift-operators namespace because that is the standard place where global operator subscriptions live, more for optics. The keyword search removes the dependency on a certain catalog source / index name.
I added thoughts on ways to detect the operator dependency in a comment in #1385. Basically, looking for the subscription's
Summary Solution
Added to help the assignee know where to go with the solution.
See the summary comment below -- #1379 (comment)
Original issue reported by Sven.
Is there an existing issue for this?
Current Behavior
There is a message when clicking on the Data Science Pipelines section:
To use pipelines, first install the Red Hat OpenShift Pipelines Operator.
https://github.com/opendatahub-io/odh-dashboard/blob/main/frontend/src/pages/dependencies/PipelinesDependencyMissing.tsx#L39
https://github.com/search?q=org%3Aopendatahub-io%20%22startup-pipeline-state%22&type=code
When clicking the button, it leads to a namespace "startup-pipeline-state", without the OpenShift Pipelines operator.
Neither a project startup-pipeline-space nor a namespace startup-pipeline-state exists on the cluster.
Is the dependency check too much downstream / RHOSDS-specific?
However, the OpenShift Pipelines operator is installed in openshift-operators as a global operator.
Is the Red Hat OpenShift Pipelines operator not supposed to be installed that way? I looked at https://github.com/opendatahub-io/data-science-pipelines-operator/tree/main#pre-requisites
OpenShift Pipelines operator 1.8.x was installed while the dashboard pods were running.
Expected Behavior
No error message, or a more specific one.
Steps To Reproduce
1. Use a master branch build of odh-dashboard.
2. Install ODH with data-science-pipelines in the KfDef.
3. Install the Red Hat OpenShift Pipelines operator in openshift-operators in "all namespaces" mode.
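Step 3 can be sketched as a Subscription manifest; an "all namespaces" install goes into the global openshift-operators namespace. The channel and source below are assumptions based on the linked prerequisites (in mirrored environments the source would be the custom CatalogSource name instead); on a real cluster, pipe the manifest to `oc apply -f -`.

```shell
# Generate the Subscription manifest to stdout; applying it is left to a
# real cluster ('oc apply -f -'). Channel pipelines-1.8 matches the
# recommendation for OCP 4.10 quoted above.
manifest=$(cat <<'EOF'
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: openshift-pipelines-operator-rh
  namespace: openshift-operators
spec:
  channel: pipelines-1.8
  name: openshift-pipelines-operator-rh
  source: redhat-operators
  sourceNamespace: openshift-marketplace
EOF
)
echo "$manifest"
```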
Workaround (if any)
No response
What browsers are you seeing the problem on?
No response
Open Data Hub Version
1.6.0
1.6.0 with master branch build of ODH Dashboard
Anything else
data science project exists