8 changes: 7 additions & 1 deletion README.md
@@ -1 +1,7 @@
# docs
# README

This project contains the documentation for hybrid-cloud-patterns.io

The directory structure is straightforward. The `images` dir is where images used in the documentation are stored and referenced. Each validated pattern receives its own directory that includes an index.md file and any other files needed to explain the pattern.


Binary file modified images/manufacturing-logical.png
Binary file added images/manufacturing-schema-gitops.png
32 changes: 19 additions & 13 deletions industrial-edge/index.md
Original file line number Diff line number Diff line change
Expand Up @@ -35,15 +35,15 @@ industrial setting, but it could easily be applicable to other verticals.

### Red Hat Technologies

- OpenShift
- OpenShift GitOps (ArgoCD)
- OpenShift Pipelines (Tekton)
- OpenShift Container Storage
- Advanced Cluster Management
- Red Hat OpenShift Container Platform (Kubernetes++)
- Red Hat Advanced Cluster Management (Open Cluster Management)
- Red Hat OpenShift GitOps (ArgoCD)
- Red Hat OpenShift Pipelines (Tekton)
- Red Hat Quay (Container image registry)
- Red Hat AMQ (Apache ActiveMQ)
- Red Hat AMQ Streams (Apache Kafka Event Broker)
- Red Hat Integration (Apache Camel-K)
- Open Data Hub
- AMQ (MQTT Message broker)
- AMQ Streams (Kafka Event Broker)
- Camel-K Integrations

## Architecture

Expand Down Expand Up @@ -71,12 +71,16 @@ The diagram below shows the components that are deployed in the datacenter and th

![Industrial Edge Physical Dataflow Architecture](/images/manufacturing-schema-df.png)

While the above diagrams show the components involved on the DevOps side, dealing with the application and the AI/ML models, there are other components to consider when dealing with the operational side using GitOps.

![Industrial Edge Physical GitOps Architecture](/images/manufacturing-schema-gitops.png)

## Recorded Demo

## Prerequisites

1. An OpenShift cluster (go to https://console.redhat.com/openshift/create )
1. (Optional) A second OpenShift cluster
1. (Optional) A second OpenShift cluster for the factory (edge).
1. A GitHub account
1. A Quay account

Expand All @@ -95,18 +99,20 @@ containing the complete configuration.
1. Check to see that all Operators have been deployed

```
UI -> Installed Operators
OpenShift UI -> Installed Operators
```
The entire deployment involves several OpenShift GitOps applications. It takes time to deploy everything. You may have to go back and forth between this step and the next step to make sure
that all the operators are deployed.
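The same check can be scripted from the CLI. The helper below is an illustrative sketch (the function name and input format are our own, not part of the pattern); in practice you would pipe in output such as `oc get csv -A --no-headers -o custom-columns=NAME:.metadata.name,PHASE:.status.phase`:

```shell
# csvs_not_ready: read "NAME PHASE" lines on stdin and print any
# ClusterServiceVersion (operator install) not yet in the Succeeded phase.
csvs_not_ready() {
  awk '$2 != "Succeeded" { print $1 " -> " $2 }'
}
```

If this prints nothing, every operator has finished installing; otherwise wait a few minutes and re-check, since OpenShift GitOps rolls the operators out over time.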

1. Check all ArgoCD applications are synchronised
1. Check all OpenShift GitOps applications are synchronised

a. Obtain the ArgoCD urls and passwords
a. Obtain the ArgoCD console urls and passwords

```
# For each site, print the ArgoCD console hostname, then extract the admin password
for name in openshift datacenter factory; do oc -n $name-gitops get route $name-gitops-server -o jsonpath='{.spec.host}'; echo ; oc -n $name-gitops extract secrets/$name-gitops-cluster --to=-; done
```

a. Log in, and check for green applications
a. Log in using the userid `admin` and the generated password obtained above. There will be a number of applications; check that they are all green.

## What Next

Expand Down
24 changes: 24 additions & 0 deletions industrial-edge/troubleshooting.md
Original file line number Diff line number Diff line change
Expand Up @@ -19,6 +19,30 @@ nav_order: 2

## Installation-phase Failures

The framework deploys the applications and their operators through OpenShift GitOps (Argo CD) for continuous deployment. It takes time to deploy everything.
You may have to go back and forth between the OpenShift cluster console and the OpenShift GitOps console to check that applications and operators are up and in a ready state.

The applications deployed to the main data center are as follows. First, the OpenShift GitOps operator is deployed; use the OpenShift console to verify that it is running. Then OpenShift GitOps takes over the rest of the deployment and deploys the following applications:

- Advanced Cluster Management operator in the application `acm`. This will manage the edge clusters.
- Open Data Hub in the application `odh` for the data science components.
- OpenShift Pipelines is deployed in the application `pipelines`.
- AMQ Streams is deployed to manage data coming from the factories and stored in a data lake.
- The data lake uses S3-based storage and is deployed in the `central-s3` application.
- Testing at the data center is managed by the `manuela-test` application.

Make sure that all these applications are `Healthy` 💚 and `Synced` ✅ in the OpenShift GitOps console. If an application is in a state other than `Healthy` (`Progressing`, `Degraded`, `Missing`, or `Unknown`), then it's time to dive deeper into that application and see what has happened.
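With several applications to watch, a small filter can list the ones needing attention. This is a sketch (the helper name and input format are assumptions of ours); the input could come from the `argocd app list` CLI trimmed down to name, health, and sync columns:

```shell
# apps_needing_attention: read "NAME HEALTH SYNC" lines on stdin and print
# every application that is not both Healthy and Synced.
apps_needing_attention() {
  awk '$2 != "Healthy" || $3 != "Synced" { print $1 ": " $2 "/" $3 }'
}
```

Any line it prints is an application worth opening in the OpenShift GitOps console for a closer look.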


The applications deployed on the factory (edge) cluster are as follows. After successfully importing [1] a factory cluster into the main ACM hub, you should check in the factory cluster's OpenShift UI that the pods in the projects `open-cluster-management-agent` and `open-cluster-management-agent-addon` are running. When these are deployed, the OpenShift GitOps operator will be deployed on the cluster. From there OpenShift GitOps deploys the following applications:

- The `datalake` application sets up streams to the data center.
- `stormshift` sets up application and AMQ integration components.
- `odh` sets up the AI/ML models that have been developed by the data scientists.

[1] ACM has different ways of describing this process depending on which tool you are using: attach, join, and import are all terms for bringing a cluster under the management of a hub cluster.
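The presence of the agent projects mentioned above can also be sanity-checked with a small script. The helper below is hypothetical (name and interface are ours); feed it a list of project names, e.g. from `oc get projects -o name | sed 's|.*/||'`, plus the projects you expect:

```shell
# require_projects: read project names (one per line) on stdin and report any
# name given as an argument that is missing from the list.
require_projects() {
  input=$(cat)
  status=0
  for want in "$@"; do
    if ! printf '%s\n' "$input" | grep -qx "$want"; then
      echo "missing project: $want"
      status=1
    fi
  done
  return $status
}
```

A non-zero return means at least one expected project has not been created yet, which usually indicates the ACM import has not completed.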

### Install loop does not complete

#### Symptom: `make install` does not complete in a timely fashion (~10 minutes from start). Status messages keep scrolling.
Expand Down
29 changes: 0 additions & 29 deletions medical-diagnosis.md

This file was deleted.