OpenShift Image Build Pipeline

A container image automated build pipeline based on OpenShift V3 and Jenkins to build, deploy, test and promote.

CI workflow

OpenShift

OpenShift is available as a hosted service. You may want to host an instance of OpenShift yourself, either because you want a development environment or because you do not have access to a hosted environment. Refer to the Installation methods.

Workflow Requirements

  • access to OpenShift

  • a configured OpenShift registry

  • an OpenShift project

  • access to the oc client. For remote OpenShift client use, the CLI binary can be downloaded or run from a container.

      $ [sudo] docker run -it --name origin --entrypoint bash openshift/origin
    

Setup

  1. Add the edit role to the default service account in the <PROJECT_NAME> project. This is so Jenkins can access the OpenShift environment using a service account token.

     $ oc policy add-role-to-user edit system:serviceaccount:<PROJECT_NAME>:default
    
  2. Upload the Jenkins master template.

     $ oc create -f https://raw.githubusercontent.com/aweiteka/jenkins-ci/http-insecure/openshift/jenkins-master-ephemeral.json
    
  3. Start the Jenkins master. This builds and deploys the server, so it will take several minutes. Replace <YOUR_PASSWORD>.

     $ oc new-app jenkins-master -p JENKINS_PASSWORD=<YOUR_PASSWORD>
    
  4. Create the application. This creates a number of resources (build config, image streams, test deployment, service).

     $ oc new-app https://github.com/example/app.git --context-dir=path/to/dockerfile --name=<YOUR_APPLICATION_NAME>
    
  5. Create a DNS route for your application.

     $ oc expose service <YOUR_APPLICATION_NAME>
    

Jenkins setup

In the OpenShift web UI Overview, click the Jenkins service link and log in with username "admin" and the password you chose at deployment. (The default is "password" if you did not set your own.) Note that a few jobs may already exist; they are not used in this workflow.

Now we're ready to create the jobs in the Jenkins master. We'll use Jenkins Job Builder to define the jobs, then render and upload them using a CLI tool.
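For context, Jenkins Job Builder definitions are plain YAML. A minimal sketch of what one might look like — the job name, git URL and build step below are hypothetical placeholders, not taken from this repository:

```yaml
# Hypothetical minimal job definition; replace the name, repo URL
# and build command with your own values.
- job:
    name: example-image-build
    scm:
      - git:
          url: https://github.com/example/app.git
          branches:
            - master
    builders:
      - shell: |
          oc start-build <YOUR_APPLICATION_NAME> --follow
```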

  1. Copy the Jenkins Job Builder template and config directory from this repository to your source repository. The directory should look something like this.

     ├── config
     │   └── jenkins-jobs.ini
     ├── Dockerfile
     ├── ...
     └── jenkins-jobs.yaml
    
  2. Edit the jenkins-jobs config file config/jenkins-jobs.ini, changing the Jenkins master route address. Do NOT add jenkins-jobs.ini to source control: it contains your credentials. The password must be the admin user's API token. In the Jenkins web UI, open the Jenkins Admin pulldown in the upper-right corner and navigate Configure > Show API Token.

  3. Run the Jenkins Job Builder tool to upload jobs to the Jenkins master. Run the container from the same directory as the jenkins-jobs.yaml file.

     $ [sudo] atomic run aweiteka/jenkins-job-builder
    
  4. Each time you want to make a change to a job, run this tool again to update the changes in the Jenkins master.
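The jenkins-jobs.ini file referenced in step 2 follows the standard Jenkins Job Builder config format. As a sketch, you could generate a skeleton like this — the token and route values are placeholders you must replace with your own:

```shell
# Write a skeleton Jenkins Job Builder config. The [jenkins] keys
# (user, password, url) are the standard JJB fields; the values
# here are placeholders, not working credentials.
mkdir -p config
cat > config/jenkins-jobs.ini <<'EOF'
[job_builder]
ignore_cache=True

[jenkins]
user=admin
password=<ADMIN_API_TOKEN>
url=https://<JENKINS_MASTER_ROUTE>
EOF
```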

Customizing the automation

TBD

Migrating to another OpenShift Instance

If you were working on a local development environment, you can migrate your work to a hosted environment.

  1. Export your template. We're exporting all resources as a template. You may pass the label option to select certain resources, such as -l app=mongodb.

     $ oc export all --all -o json --as-template myproject > myproject.json

  2. Log out of the local environment.

     $ oc logout
    
  3. Try to log in to the hosted environment.

     $ oc login https://<openshift_console_url>
    
  4. You'll get a 404 login error instructing you to obtain an API token first. Visit the URL shown and copy the login command with the token.

     $ oc login --token=<token> --server=https://<openshift_api_url>
    
  5. Import the template on the other OpenShift server.

     $ oc new-app -f myproject.json
    
  6. Update your Jenkins endpoint so you can upload the Jenkins jobs to the new Jenkins server. Get the Jenkins master URL:

     $ oc get route jenkins
    
  7. Update the config/jenkins-jobs.ini file with the URL from the previous step.

  8. Upload the jobs.

     $ [sudo] atomic run aweiteka/jenkins-job-builder
    

Notes

  • Delete resources in bulk

      oc delete all -l <FOO=BAR>
    
  • Trigger OpenShift web hook remotely

      curl -X POST <openshift_webhook_url> [--insecure]
    
  • Image scanning. This assumes the image contents are extracted to /tmp/image-content.

      export OSCAP_PROBE_ROOT=/tmp/image-content
      sudo oscap oval eval --report /tmp/oscap.html --results /tmp/oscap.xml http://www.redhat.com/security/data/oval/Red_Hat_Enterprise_Linux_7.xml
    
  • Inspect image

      oc get istag <imagestream>:<tag> -o yaml
    
  • Get image labels

      oc get istag centos:centos7 -o template -t '{{.image.dockerImageMetadata.ContainerConfig.Labels}}'
    
  • Dockerfile lint remote Dockerfile

      docker run -it --rm projectatomic/dockerfile-lint bash -c 'git clone https://github.com/projectatomic/atomicapp.git && dockerfile_lint -f atomicapp/Dockerfile'
    

Troubleshooting

  1. "My image won't run on OpenShift."

    Is it running as root? By default, OpenShift does not allow containers to run as root, so you may need to update your image. See "Support arbitrary user ids".

  2. Monitoring and debugging tips

     oc get events -w         # tail openshift events
     oc get builds            # list builds
     oc build-logs <build>    # view a build log
     oc get pods              # list pods
     oc logs <pod>            # view a pod log
     oc exec -it <pod> bash   # enter a pod interactively to debug
     oc get dc                # list deployment configurations
     oc edit dc <deploy_conf> # edit a deployment configuration
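The "Support arbitrary user ids" fix mentioned in item 1 usually amounts to a small Dockerfile change. A sketch, assuming the application lives in /opt/app (a hypothetical path):

```dockerfile
# OpenShift runs containers with a random UID that belongs to the
# root group (GID 0), so make the app directory group-writable.
RUN chgrp -R 0 /opt/app && \
    chmod -R g=u /opt/app
# Switch to a numeric, non-root user so OpenShift accepts the image.
USER 1001
```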