
[Bug]: Install the Pipelines Operator message in GUI when in fact Red Hat Openshift Pipelines installed #1379

Closed
1 task done
shalberd opened this issue Jun 14, 2023 · 5 comments · Fixed by #1412
Assignees
Labels
feature/ds-pipelines Data Science Pipelines feature (aka DSP) kind/bug Something isn't working priority/high Important issue that needs to be resolved asap. Releases should not have too many of these.
Milestone

Comments


shalberd commented Jun 14, 2023

Summary Solution

Added to help the assignee know where to go with the solution.

See the summary comment below -- #1379 (comment)


Original issue reported by Sven.

Is there an existing issue for this?

  • I have searched the existing issues

Current Behavior

There is a message when clicking on the Data Science Pipelines section:

To use pipelines, first install the Red Hat OpenShift Pipelines Operator.

[Screenshot from 2023-06-14, 13:01]

https://github.com/opendatahub-io/odh-dashboard/blob/main/frontend/src/pages/dependencies/PipelinesDependencyMissing.tsx#L39

https://github.com/search?q=org%3Aopendatahub-io%20%22startup-pipeline-state%22&type=code

When clicking the button, it leads to a namespace "startup-pipeline-state", which does not contain the OpenShift Pipelines operator.

Neither a project startup-pipeline-space nor a namespace startup-pipeline-state exists on the cluster.

Is the dependency check too downstream / RHODS-specific?

[Screenshot from 2023-06-14, 13:03]

However, the OpenShift Pipelines operator is installed in openshift-operators as a global operator.

[Screenshot from 2023-06-14, 13:04]

Is the Red Hat OpenShift Pipelines operator not supposed to be installed that way? I looked at https://github.com/opendatahub-io/data-science-pipelines-operator/tree/main#pre-requisites

The OpenShift Pipelines operator 1.8.x was installed while the dashboard pods were running.

Expected Behavior

No error message, or a more specific one.

Steps To Reproduce

  1. Use a master branch build of odh-dashboard.
  2. Install ODH with data-science-pipelines in the KfDef.
  3. Install the Red Hat OpenShift Pipelines operator in openshift-operators in "all namespaces" mode.

Workaround (if any)

No response

What browsers are you seeing the problem on?

No response

Open Data Hub Version

1.6.0
1.6.0 with master branch build of ODH Dashboard

Anything else

A data science project exists:

kind: Project
apiVersion: project.openshift.io/v1
metadata:
  name: mydsproject
  labels:
    kubernetes.io/metadata.name: mydsproject
    modelmesh-enabled: 'true'
    opendatahub.io/dashboard: 'true'
...
@shalberd shalberd added kind/bug Something isn't working untriaged Indicates the newly created issue has not been triaged yet labels Jun 14, 2023
@shalberd

[Screenshot from 2023-06-14, 15:07]

@DaoDaoNoCode DaoDaoNoCode added priority/high Important issue that needs to be resolved asap. Releases should not have too many of these. feature/ds-pipelines Data Science Pipelines feature (aka DSP) and removed untriaged Indicates the newly created issue has not been triaged yet labels Jun 14, 2023
@DaoDaoNoCode DaoDaoNoCode added this to the Current Release milestone Jun 14, 2023

shalberd commented Jun 14, 2023

The "go install the operator" link at https://github.com/opendatahub-io/odh-dashboard/blob/main/frontend/src/pages/dependencies/PipelinesDependencyMissing.tsx#L39 should be changed; it has the wrong namespace startup-pipeline-state instead of openshift-operators.

${url}/operatorhub/ns/startup-pipeline-state?details-item=openshift-pipelines-operator-rh-redhat-operators-openshift-marketplace

i.e. currently

${url}/operatorhub/ns/<operatornamespace>?details-item=<operatorname>-<catalogsourcename>

Also, in mirrored environments (on-prem, airgapped) there are often custom catalog sources

https://docs.openshift.com/container-platform/4.10/operators/admin/olm-restricted-networks.html#olm-creating-catalog-from-index_olm-restricted-networks

So for going to the details page of a certain operator, to install it or view its status, the namespace is most likely always openshift-operators, while a custom catalogsource name is possible. One could get the catalog sources via the OpenShift API (oc get catalogsources -n openshift-marketplace) and regex-match or starts-with-match the correct one.

[Screenshot from 2023-06-14, 17:11]

not sure if it's worth it, though, since the name can be anything.

i.e. in your case, the catalogsource name was

rh-redhat-operators-openshift-marketplace

and in our case, our cluster-admins named the catalogsource / mirrored index

redhat-operator-index
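The regex/starts-with matching idea could look roughly like this. This is a hypothetical sketch, not dashboard code; the helper name and the regex are illustrative, chosen so that both catalogsource spellings mentioned above would match.

```typescript
// Hypothetical helper: given the catalogsource names returned by
// `oc get catalogsources -n openshift-marketplace`, pick the one that
// looks like a Red Hat operator catalog. Admins can name mirrored
// indexes anything, so a loose pattern is used here.
const findRedHatCatalogSource = (names: string[]): string | undefined =>
  names.find((n) => /red.?hat.?operator/i.test(n));
```

This would match both "rh-redhat-operators-openshift-marketplace" and "redhat-operator-index", but as noted, the name can be anything, so such matching is best-effort.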

You could do a search-link (works):

/operatorhub/ns/openshift-operators?keyword=red+hat+openshift+pipelines

[Screenshot from 2023-06-14, 17:33]
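Building that keyword link could be sketched as follows. The function name and `consoleUrl` parameter are placeholders, not the dashboard's actual API:

```typescript
// Sketch: build the namespace-agnostic OperatorHub keyword search link.
// `consoleUrl` stands in for the cluster console base URL.
const operatorHubSearchLink = (consoleUrl: string, keyword: string): string =>
  `${consoleUrl}/operatorhub/ns/openshift-operators?keyword=${encodeURIComponent(
    keyword,
  ).replace(/%20/g, '+')}`;
```

For example, `operatorHubSearchLink(url, 'red hat openshift pipelines')` yields the `?keyword=red+hat+openshift+pipelines` link shown above, with no dependency on a catalogsource name.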

Maybe including the link is not even necessary or viable, as not all admins of data science projects have the right to view, let alone install or modify, operators in openshift-operators. @andrewballantyne Instead of the "Install Operator" button, a text might suffice, e.g.

The OpenShift Cluster must have OpenShift Pipelines 1.8 or higher installed. We recommend channel pipelines-1.8 on OCP 4.10 and pipelines-1.9 or pipelines-1.10 for OCP 4.11, 4.12 and 4.13. Instructions at https://docs.openshift.com/container-platform/4.12/cicd/pipelines/installing-pipelines.html#op-installing-pipelines-operator-in-web-console_installing-pipelines.
Ask your cluster-admin for install.

Taken from

https://github.com/opendatahub-io/data-science-pipelines-operator/tree/main#pre-requisites


shalberd commented Jun 14, 2023

Also, regarding how the service account checks whether the operator is installed:

https://github.com/opendatahub-io/odh-dashboard/blob/main/frontend/src/types.ts#L68

you'd also have to ensure somehow that the check works for the operator openshift-pipelines-operator-rh, regardless of which catalog source it was installed from. Something like

oc get subscriptions -n openshift-operators | grep openshift-pipelines-operator-rh

and checking the subscription object at

status.installedCSV

spec:
  channel: pipelines-1.8
  installPlanApproval: Automatic
  name: openshift-pipelines-operator-rh
  source: redhat-operator-index
  sourceNamespace: openshift-marketplace
status:
  installplan:
    apiVersion: operators.coreos.com/v1alpha1
    kind: InstallPlan
    name: install-zkf29
    uuid: 6ee7324a-c5c4-40d6-9d11-bc84b5e3cb57
  lastUpdated: '2023-06-13T13:25:18Z'
  installedCSV: openshift-pipelines-operator-rh.v1.8.2
  currentCSV: openshift-pipelines-operator-rh.v1.8.2
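In the dashboard's TypeScript, that check could be sketched along these lines. The `Subscription` type below is trimmed to just the fields used and is illustrative, not the dashboard's actual type definitions; the field names match the OLM Subscription object shown above.

```typescript
// Minimal slice of an OLM Subscription, as seen in
// `oc get subscriptions -n openshift-operators -o yaml`.
type Subscription = {
  spec: { name: string };
  status?: { installedCSV?: string };
};

// The operator counts as installed only when a subscription for
// openshift-pipelines-operator-rh reports status.installedCSV,
// regardless of which catalog source it came from.
const isPipelinesOperatorInstalled = (subs: Subscription[]): boolean =>
  subs.some(
    (s) =>
      s.spec.name === 'openshift-pipelines-operator-rh' &&
      !!s.status?.installedCSV,
  );
```

Matching on `spec.name` plus `status.installedCSV` sidesteps the catalogsource naming problem entirely, since `spec.source` never enters the check.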

@andrewballantyne (Member) commented

Thanks @shalberd for the detailed breakdown.

Okay, I think the solution for this ticket probably should be:

  1. [Backend] Merge the blankDashboardConfig right after it is pulled from the API -- all code should use our defaults baked in anyway (remove it from the get config call as it will always be merged now)
  2. [Frontend] Update the URL used for redirect
    1. Try to use the "all projects" url; OperatorHub doesn't really work on namespaces per se, so no real reason to do that
    2. Change the search from trying to select the tile to doing a search (e.g. ?keyword=red+hat+openshift+pipelines)
  3. [Frontend] Change the permission understanding for the install button on the dependency page
    1. Using check access, determine if they can create ClusterServiceVersions (CSVs) -- if they can't, they can't install the operator, hide the button away (this is done by isAdmin today, remove that logic)
    2. If we hide the button, we should provide text to lead them to contacting their admin (wording should be vetted by UX)
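The access check in step 3.1 could use a standard Kubernetes SelfSubjectAccessReview payload like the following. This is a sketch of the resource to POST; how the dashboard actually submits access reviews through its k8s client wrapper is not shown here, and the helper name is hypothetical.

```typescript
// Build a SelfSubjectAccessReview asking whether the current user may
// create ClusterServiceVersions, i.e. whether they can install operators.
// The API would answer in `status.allowed` after the review is POSTed.
const csvCreateAccessReview = () => ({
  apiVersion: 'authorization.k8s.io/v1',
  kind: 'SelfSubjectAccessReview',
  spec: {
    resourceAttributes: {
      group: 'operators.coreos.com',
      resource: 'clusterserviceversions',
      verb: 'create',
    },
  },
});
```

If the returned `status.allowed` is false, the install button would be hidden and the "contact your admin" text shown instead.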

I think that'll get us to a nicer working effort so the operator can be used. Any objections to this solution @shalberd?

We have #1385 to chase down if we can detect the operator name better for the cluster-admin / CSV creator.


shalberd commented Jun 14, 2023

Points 1 to 3 sound very good indeed.

You are absolutely right, the OperatorHub search GUI is totally namespace-agnostic; it simply does not matter which namespace is used. I just suggest using the openshift-operators namespace because that is the standard place where global operator subscriptions live, more for optics. The keyword search removes the dependency on a certain catalog source / index name.

https://myopenshift.apps.bla.com/operatorhub/ns/openshift-operators?keyword=red+hat+openshift+pipelines

I added thoughts on ways to detect the operator dependency in a comment in #1385. Basically, looking for the subscription's status.installedCSV field. That field is only really present if the operator was installed successfully and is running without any errors.
