Merge pull request #124 from rattermeyer/updated-documentation
update antora documentation
rattermeyer committed Sep 10, 2019
2 parents 896250c + 8c14475 commit 6beb90b
Showing 10 changed files with 99 additions and 281 deletions.
258 changes: 1 addition & 257 deletions README.md
@@ -1,259 +1,3 @@
# Jenkins Shared Library

This library allows you to keep a minimal Jenkinsfile in each repository by providing all language-agnostic build aspects. The goal is to duplicate as little as possible between repositories and to have an easy way to ship updates to all projects.


## Usage

Load the shared library in your `Jenkinsfile` like this:
```groovy
def final projectId = "hugo"
def final componentId = "be-node-express"
def final credentialsId = "${projectId}-cd-cd-user-with-password"
def sharedLibraryRepository
def dockerRegistry
node {
  sharedLibraryRepository = env.SHARED_LIBRARY_REPOSITORY
  dockerRegistry = env.DOCKER_REGISTRY
}
library identifier: 'ods-library@production', retriever: modernSCM(
  [$class: 'GitSCMSource',
   remote: sharedLibraryRepository,
   credentialsId: credentialsId])
odsPipeline(
  image: "${dockerRegistry}/cd/jenkins-slave-maven",
  projectId: projectId,
  componentId: componentId,
  branchToEnvironmentMapping: [
    'master': 'test',
    '*': 'dev'
  ]
) { context ->
  stage('Build') {
    // custom stage
  }
  stageScanForSonarqube(context) // using a provided stage
}
```

## Provided Stages
The following stages are provided (see the `vars` folder for more details):
* stageScanForSonarqube(context)
* stageOWASPDependencyCheck(context)
* stageScanForSnyk(context, snykAuthenticationCode, buildFile, projectId)
* stageUploadToNexus(context)
* stageStartOpenshiftBuild(context)
* stageDeployToOpenshift(context)

## Workflow

The shared library does not impose which Git workflow you use. Whether you use git-flow, GitHub flow or a custom workflow, it is possible to configure the shared library according to your needs. There are just two settings to control everything: `branchToEnvironmentMapping` and `autoCloneEnvironmentsFromSourceMapping`.

### branchToEnvironmentMapping

Example:
```groovy
branchToEnvironmentMapping: [
  "master": "prod",
  "develop": "dev",
  "hotfix/": "hotfix",
  "*": "review"
]
```

Maps a branch to an environment. There are three ways to reference branches:

* Fixed name (e.g. `master`)
* Prefix (ending with a slash, e.g. `hotfix/`)
* Any branch (`*`)

Matches are made top-to-bottom. For prefixes / any branch, a more specific environment might be selected if:

* the branch contains a ticket ID and a corresponding env exists in OCP. E.g. for mapping `"feature/": "dev"` and branch `feature/foo-123-bar`, the env `dev-123` is selected instead of `dev` if it exists.
* the branch name corresponds to an existing env in OCP. E.g. for mapping `"release/": "rel"` and branch `release/1.0.0`, the env `rel-1.0.0` is selected instead of `rel` if it exists.
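As an illustration, the top-to-bottom matching order described above could be sketched like this (this is only a hedged sketch of the documented behaviour, not the library's actual implementation, and it omits the more-specific-environment lookup in OCP):

```groovy
// Illustrative sketch: resolve the target environment for a branch.
// Entries are checked top-to-bottom: fixed name, prefix (ending in '/'), or '*'.
def resolveEnvironment(String branch, Map<String, String> mapping) {
  for (entry in mapping) {
    if (entry.key == branch) {
      return entry.value // fixed name, e.g. 'master'
    }
    if (entry.key.endsWith('/') && branch.startsWith(entry.key)) {
      return entry.value // prefix, e.g. 'hotfix/'
    }
    if (entry.key == '*') {
      return entry.value // catch-all
    }
  }
  return null
}

assert resolveEnvironment('hotfix/foo', ['master': 'prod', 'hotfix/': 'hotfix', '*': 'review']) == 'hotfix'
```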

### autoCloneEnvironmentsFromSourceMapping

Caution! Cloning environments on-the-fly is an advanced feature and should only be used if you understand OCP well, as there are many moving parts and things can go wrong in multiple places.

Example:
```groovy
autoCloneEnvironmentsFromSourceMapping: [
  "hotfix": "prod",
  "review": "dev"
]
```

Instead of deploying multiple branches to the same environment, individual environments can be created on-the-fly. For example, the mapping `"*": "review"` deploys all branches to the `review` environment. To have one environment per branch / ticket ID, you can add the `review` environment to `autoCloneEnvironmentsFromSourceMapping`, e.g. like this: `"review": "dev"`. This will create individual environments (named e.g. `review-123` or `review-foobar`), each cloned from the `dev` environment.

### Examples

If you use [git-flow](https://jeffkreeftmeijer.com/git-flow/), the following config fits well:
```groovy
branchToEnvironmentMapping: [
  'master': 'prod',
  'develop': 'dev',
  'release/': 'rel',
  'hotfix/': 'hotfix',
  '*': 'preview'
]
// Optionally, configure environments on-the-fly:
autoCloneEnvironmentsFromSourceMapping: [
  'rel': 'dev',
  'hotfix': 'prod',
  'preview': 'dev'
]
```

If you use [GitHub Flow](https://guides.github.com/introduction/flow/), the following config fits well:
```groovy
branchToEnvironmentMapping: [
  'master': 'prod',
  '*': 'preview'
]
// Optionally, configure environments on-the-fly:
autoCloneEnvironmentsFromSourceMapping: [
  'preview': 'prod'
]
```

If you use a custom workflow, the config could look like this:
```groovy
branchToEnvironmentMapping: [
  'production': 'prod',
  'master': 'dev',
  'staging': 'uat'
]
// Optionally, configure environments on-the-fly:
autoCloneEnvironmentsFromSourceMapping: [
  'uat': 'prod'
]
```


## Writing stages

Inside the closure passed to `odsPipeline`, you have full control. Write stages just like you would do in a normal `Jenkinsfile`. You have access to the `context`, which is assembled for you on the master node. The `context` can be influenced by changing the config map passed to `odsPipeline`. Please see `vars/odsPipeline.groovy` for possible options.
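For example, a custom stage that uses the `context` could look like this (the shell command is just an illustrative placeholder):

```groovy
odsPipeline(
  image: "${dockerRegistry}/cd/jenkins-slave-maven",
  projectId: projectId,
  componentId: componentId,
  branchToEnvironmentMapping: ['*': 'dev']
) { context ->
  stage('Unit Tests') {
    // any regular pipeline step is available inside a stage
    sh "echo testing ${context.componentId} for ${context.environment}"
  }
}
```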

When you write stages, you have access to both global variables (defined without `def` in the `Jenkinsfile`) and the
`context` object. It contains the following properties:

| Property | Description |
| ------------- |-------------|
| jobName | Value of JOB_NAME, i.e. the name of the job this build belongs to. |
| buildNumber | Value of BUILD_NUMBER, the current build number, e.g. "153". |
| buildUrl | Value of BUILD_URL, the URL where the results of the build can be found (e.g. http://buildserver/jenkins/job/MyJobName/123/). |
| buildTime | Time of the build, collected when the odsPipeline starts. |
| image | Container image to use for the Jenkins agent container. This value is not used when "podContainers" is set. |
| podLabel | Pod label, set by default to a random label to avoid caching issues. Set to a stable label if you want to reuse pods across builds. |
| podContainers | Custom pod containers to use. By default, only one container is used, and it is configured automatically. If you need to run multiple containers (e.g. app and database), you can configure them via this property. |
| podVolumes | Volumes to make available to the pod. |
| podAlwaysPullImage | Determine whether to always pull the container image before each build run. |
| podServiceAccount | Service account to use when running the pod. |
| credentialsId | Credentials identifier (Credentials are created and named automatically by the OpenShift Jenkins plugin). |
| tagversion | The tagversion is made up of the build number and the first 8 chars of the commit SHA. |
| notifyNotGreen | Whether to send notifications if the build is not successful. |
| nexusHost | Nexus host (with scheme). |
| nexusUsername | Nexus username. |
| nexusPassword | Nexus password. |
| nexusHostWithBasicAuth | Nexus host (with scheme), including username and password as BasicAuth. |
| branchToEnvironmentMapping | Define which branches are deployed to which environments. |
| autoCloneEnvironmentsFromSourceMapping | Define which environments are cloned from which source environments. |
| cloneSourceEnv | The environment which was chosen as the clone source. |
| environment | The environment which was chosen as the deployment target, e.g. "dev". |
| targetProject | Target project, based on the environment. E.g. "foo-dev". |
| groupId | Group ID, defaults to `org.opendevstack.<projectID>`. |
| projectId | Project ID, e.g. "foo". |
| componentId | Component ID, e.g. "be-auth-service". |
| gitUrl | Git URL of the repository. |
| gitBranch | Git branch for which the build runs. |
| gitCommit | Git commit SHA to build. |
| gitCommitAuthor | Git commit author. |
| gitCommitMessage | Git commit message. |
| gitCommitTime | Git commit time in RFC 3339 format. |
| sonarQubeBranch | Branch on which to run SonarQube analysis. |
| failOnSnykScanVulnerabilities | Whether to fail the build if the Snyk scan finds vulnerabilities (default: true). |
| dependencyCheckBranch | Branch on which to run dependency checks. |
| environmentLimit | Number of environments to allow. |
| openshiftHost | OpenShift host - value taken from OPENSHIFT_API_URL. |
| odsSharedLibVersion | ODS Jenkins shared library version, taken from reference in Jenkinsfile. |
| bitbucketHost | BitBucket host - value taken from BITBUCKET_HOST. |
| environmentCreated | Whether an environment has been created during the build. |
| openshiftBuildTimeout | Timeout for the OpenShift build of the container image. |
| ciSkip | Whether the build should be skipped, based on the Git commit message. |


## Slave customization

The slave used to build your code can be customized by specifying the image to
use. Further, `podAlwaysPullImage` (defaulting to `true`) controls whether this
image is refreshed before each build. The setting `podVolumes` allows mounting
persistent volume claims into the pod (the value is passed to the `podTemplate`
call as `volumes`). To control the container pods completely, set
`podContainers` (which is passed to the `podTemplate` call as `containers`).
See the [kubernetes-plugin](https://github.com/jenkinsci/kubernetes-plugin)
documentation for possible configuration options.
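As a sketch, a build that needs an app container plus a database sidecar and a persistent Maven cache might be configured like this (image names and the claim name are hypothetical; `containerTemplate` and `persistentVolumeClaim` are helpers from the kubernetes-plugin):

```groovy
odsPipeline(
  projectId: projectId,
  componentId: componentId,
  branchToEnvironmentMapping: ['*': 'dev'],
  podAlwaysPullImage: false, // reuse the locally cached image
  podContainers: [
    containerTemplate(name: 'jnlp', image: "${dockerRegistry}/cd/jenkins-slave-maven", ttyEnabled: true),
    containerTemplate(name: 'db', image: 'postgres:11', ttyEnabled: true)
  ],
  podVolumes: [
    persistentVolumeClaim(claimName: 'maven-cache', mountPath: '/home/jenkins/.m2')
  ]
) { context ->
  // stages here
}
```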


## Versioning

Each `Jenkinsfile` references a Git revision of this library, e.g.
`library identifier: 'ods-library@production'`. The Git revision can be a
branch (e.g. `production` or `0.1.x`), a tag (e.g. `0.1.1`) or a specific commit.
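For example, to pin a `Jenkinsfile` to a fixed tag instead of the moving `production` branch (repository and credentials variables as in the usage example above):

```groovy
library identifier: 'ods-library@0.1.1', retriever: modernSCM(
  [$class: 'GitSCMSource',
   remote: sharedLibraryRepository,
   credentialsId: credentialsId])
```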

By default, each `Jenkinsfile` in `ods-project-quickstarters` on the `master`
branch references the `production` branch of this library. Quickstarters on a
branch point to the corresponding branch of the shared library - for example
a `Jenkinsfile` on branch `0.1.x` points to `0.1.x` of the shared library.

## Git LFS (Git Large File Storage extension)

If you are working with large files (e.g. binary files, media files, or files bigger than 5MB),
follow these steps:
- Read this HOWTO about [Git LFS](https://www.atlassian.com/git/tutorials/git-lfs)
- Track your large files in your local clone, as explained in the HOWTO
- Enable Git LFS in your repository (in Bitbucket, you can enable it on the repository's settings page)

**NOTE**: If you already have a repository containing large files and want to migrate it to Git LFS:

```bash
git lfs migrate
```
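A minimal sketch of such a migration (the `*.png` pattern is only an example; note that `git lfs migrate import` rewrites history, so coordinate with collaborators first):

```bash
git lfs install                           # enable the LFS hooks in the repo
git lfs track "*.png"                     # track new large files of this type
git add .gitattributes
git lfs migrate import --include="*.png"  # rewrite existing history to LFS
```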

### How to add Snyk scanning to your ODS project

1. Set up an organisation in snyk.io:
    1. If you don't have a Snyk account yet, create one at snyk.io.
    2. Once logged into snyk.io, create an organisation for your project in your Snyk group, with exactly the same name as the project.
    3. Create a service account in the settings of the created organisation and keep the displayed token. You will need it later.
2. Add an environment variable to Jenkins in your CD project:
    1. Add the environment variable `SNYK_AUTHENTICATION_CODE` to Jenkins in your OpenShift CD project, with the service account token as its value.
3. Edit your project's `Jenkinsfile`:
    1. Read the authentication code from the environment by adding:
        ```groovy
        node {
          ...
          snykAuthenticationCode = env.SNYK_AUTHENTICATION_CODE
        }
        ```
    2. Add `stageScanForSnyk`:
        ```groovy
        ) { context ->
          ...
          stageScanForSnyk(context, snykAuthenticationCode, 'build.gradle', context.projectId)
          ...
        }
        ```

## Development
* Try to write tests.
* See if you can split things up into classes.
* Keep in mind that you need to access e.g. `sh` via `script.sh`.
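The last point can be sketched as follows (the class and the `oc` command are hypothetical, purely for illustration):

```groovy
// Hypothetical helper class: pipeline steps such as 'sh' are only
// available on the script object passed in from the Jenkinsfile.
class DeployHelper implements Serializable {
  def script

  DeployHelper(def script) {
    this.script = script
  }

  void tagImage(String component, String tag) {
    // 'script.sh', not 'sh' - steps live on the script object
    script.sh "oc tag ${component}:latest ${component}:${tag}"
  }
}

// In the Jenkinsfile / stage closure:
// new DeployHelper(this).tagImage(context.componentId, context.tagversion)
```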


## Background
The implementation is largely based on https://www.relaxdiego.com/2018/02/jenkins-on-jenkins-shared-libraries.html. The scripted pipeline syntax was chosen because it is a better fit for a shared library. The declarative pipeline syntax is targeted for newcomers and/or simple pipelines (see https://jenkins.io/doc/book/pipeline/syntax/#scripted-pipeline). If you try to use it e.g. within a Groovy class you'll end up with lots of `script` blocks.
Documentation can be found at https://www.opendevstack.org/ods-documentation/ods-jenkins-shared-library/latest/index.html and in the antora folder at https://github.com/opendevstack/ods-jenkins-shared-library/tree/master/docs/modules/ROOT/pages.
61 changes: 61 additions & 0 deletions docs/modules/ROOT/pages/index.adoc
@@ -39,6 +39,17 @@ odsPipeline(
}
----

== Provided Stages

The following stages are provided (see the `vars` folder for more details):

* <<_stage_scan_for_sonarqube,stageScanForSonarqube(context)>>
* stageOWASPDependencyCheck(context)
* <<_stage_scan_for_snyk,stageScanForSnyk(context, snykAuthenticationCode, buildFile, projectId)>>
* stageUploadToNexus(context)
* <<_stage_start_openshift_build,stageStartOpenshiftBuild(context)>>
* stageDeployToOpenshift(context)

== Workflow

The shared library does not impose which Git workflow you use. Whether you use git-flow, GitHub flow or a custom workflow, it is possible to configure the shared library according to your needs. There are just two settings to control everything: `branchToEnvironmentMapping` and `autoCloneEnvironmentsFromSourceMapping`.
@@ -235,6 +246,9 @@ When you write stages, you have access to both global variables (defined without
| sonarQubeBranch
| Branch on which to run SonarQube analysis.

| failOnSnykScanVulnerabilities
| Whether to fail the build if the Snyk scan finds vulnerabilities (default: true).

| dependencyCheckBranch
| Branch on which to run dependency checks.

@@ -299,6 +313,53 @@ you can follow the following steps:
git lfs migrate
----

=== How to add Snyk scanning to your ODS project

. Set up an organisation in snyk.io
.. If you don't have a Snyk account yet, create one at snyk.io.
.. Once logged into snyk.io, create an organisation for your project in your Snyk group, with exactly the same name as the project.
.. Create a service account in the settings of the created organisation and keep the displayed token. You will need it later.
. Add an environment variable to Jenkins in your CD project
.. Add the environment variable `SNYK_AUTHENTICATION_CODE` to Jenkins in your OpenShift CD project, with the service account token as its value.
. Edit your project's Jenkinsfile
.. Read the authentication code from the environment by adding:
+
----
node {
  ...
  snykAuthenticationCode = env.SNYK_AUTHENTICATION_CODE
}
----

.. Add stageScanForSnyk:
+
----
) { context ->
  ...
  stageScanForSnyk(context, snykAuthenticationCode, 'build.gradle', context.projectId)
  ...
}
----

== Stages

=== Ods Pipeline

include::./odsPipeline.adoc[]

=== Stage Scan for Snyk

include::./stageScanForSnyk.adoc[]

=== Stage Start OpenShift Build

include::./stageStartOpenshiftBuild.adoc[]

=== Stage Scan For SonarQube

include::./stageScanForSonarQube.adoc[]

== Development

* Try to write tests.
3 changes: 3 additions & 0 deletions docs/modules/ROOT/pages/odsPipeline.adoc
@@ -0,0 +1,3 @@
:page-partial:

The odsPipeline method offers a complete pipeline for any component created with a quickstarter. It takes care of building the code, uploading artifacts to Nexus, analysing the code, starting builds and triggering deployments in OpenShift.
7 changes: 7 additions & 0 deletions docs/modules/ROOT/pages/stageScanForSnyk.adoc
@@ -0,0 +1,7 @@
:page-partial:

The "Snyk Security Scan" stage performs two tasks:

. uploads the list of the project's 3rd party dependencies, including their licenses, for monitoring. Snyk's monitoring feature notifies developers by email about new vulnerabilities once they are reported to the Snyk Vulnerability Database.
. analyses the project's 3rd party dependencies, including their licenses, and breaks the build if vulnerable versions are found in the project. Build failure can be disabled with the property `failOnSnykScanVulnerabilities`.

Note: this stage only runs if `SNYK_AUTHENTICATION_CODE` is found as an environment variable. This variable needs to be defined in the deployment configuration of your project's Jenkins.
10 changes: 10 additions & 0 deletions docs/modules/ROOT/pages/stageScanForSonarQube.adoc
Original file line number Diff line number Diff line change
@@ -0,0 +1,10 @@
:page-partial:

The "SonarQube Analysis" stage scans your source code and reports findings to
SonarQube. The configuration of the scan happens via the
"sonar-project.properties" file in the repository being built.

In debug mode, the sonar-scanner binary is started with the "-X" flag.

If no "sonar.projectVersion" is specified in "sonar-project.properties", it is
set to the shortened Git SHA.
13 changes: 13 additions & 0 deletions docs/modules/ROOT/pages/stageStartOpenshiftBuild.adoc
Original file line number Diff line number Diff line change
@@ -0,0 +1,13 @@
:page-partial:

stageStartOpenshiftBuild triggers the BuildConfig related to the repository
being built.

stageStartOpenshiftBuild takes two optional parameters:

(a) the first, named "buildArgs", is a map allowing you to customise the image build step in OpenShift.
For example:
stageStartOpenshiftBuild(context, ["myArg": "val"])

(b) the second, named "imageLabels", is a map allowing you to customise the image label generation.
For example:
stageStartOpenshiftBuild(context, [:], ["myImageLabel": "valLabel"]). This will end up as a label prefixed with "ext.".
2 changes: 1 addition & 1 deletion vars/odsPipeline.txt
@@ -1 +1 @@
The odsPipeline method offers a complete pipeline for any component created with a quickstarter. It takes care of building the code, uploading artifacts to Nexus, analysing the code and starting builds and triggering deployments in Openshift.
see the documentation in the antora module / documentation at https://github.com/opendevstack/ods-jenkins-shared-library/tree/master/docs/modules/ROOT/pages
5 changes: 1 addition & 4 deletions vars/stageScanForSnyk.txt
@@ -1,4 +1 @@
The "Snyk Security Scan" stage does 2 tasks:
1. uploads the list of project 3rd party dependencies including its licenses for monitoring. Snyk monitoring feature notifies developers about new vulnerabilities per email once this vulnerabilities are reported to the Snyk Vulnerability Database
2. analyses your project 3rd party dependencies including its licenses and break the build if vulnerable versions are found in the project. Build fail can be disable with the property ```failOnSnykScanVulnerabilities```
Note: that if this stage only runs if the SNYK_AUTHENTICATION_CODE is found as environment variable. This variable needs to be defined as environment variable in the deployment configuration of your project jenkins.
see the documentation in the antora module / documentation at https://github.com/opendevstack/ods-jenkins-shared-library/tree/master/docs/modules/ROOT/pages
9 changes: 1 addition & 8 deletions vars/stageScanForSonarQube.txt
@@ -1,8 +1 @@
The "SonarQube Analysis" stage scans your source code and reports findings to
SonarQube. The configuration of the scan happens via the
"sonar-project.properties" file in the repository being built.

In debug mode, the sonar-scanner binary is started with the "-X" flag.

If no "sonar.projectVersion" is specified in "sonar-project.properties", it is
set to the shortened Git SHA.
see the documentation in the antora module / documentation at https://github.com/opendevstack/ods-jenkins-shared-library/tree/master/docs/modules/ROOT/pages