
openshift-ansible Best Practices Guide

The purpose of this guide is to describe the preferred patterns and best practices used in this repository (both in Ansible and Python).

It is important to note that this repository may not currently comply with all best practices, but the intention is that it will.

All new pull requests created against this repository MUST comply with this guide.

The key words MUST, MUST NOT, SHOULD, and SHOULD NOT in this guide are to be interpreted as described in RFC 2119.

Python

Method Signatures


When adding a new parameter to an existing method, a default value SHOULD be used

The purpose of this rule is to make it so that method signatures are backwards compatible.

If this rule isn’t followed, it will be necessary for the person who changed the method to search out all callers and make sure that they’re able to use the new method signature.

Before:
def add_person(first_name, last_name):
After:
def add_person(first_name, last_name, age=None):
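To illustrate why the default matters, existing callers continue to work unchanged after the parameter is added (the function below is a hypothetical sketch, not code from this repository):

```python
def add_person(first_name, last_name, age=None):
    """Build a person record; age is optional to keep the signature backwards compatible."""
    person = {"first_name": first_name, "last_name": last_name}
    if age is not None:
        person["age"] = age
    return person

# Existing two-argument callers keep working...
jane = add_person("Jane", "Doe")

# ...while new callers can pass the extra parameter.
john = add_person("John", "Doe", age=42)
```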

PyLint

PyLint is used in an attempt to keep the Python code as clean and as manageable as possible. The build bot runs each pull request through PyLint and any warnings or errors cause the build bot to fail the pull request.


PyLint rules MUST NOT be disabled on a whole file.

PyLint rules MUST NOT be disabled unless they meet one of the following exceptions:
  1. When PyLint fails because of a dependency that can’t be installed on the build bot

  2. When PyLint fails because of including a module that is outside of our control (like Ansible)

  3. When PyLint fails, but the code makes more sense the way it is formatted (stylistic exception). For this exception, the description of the PyLint disable MUST state why the code is more clear, AND the person reviewing the PR will decide if they agree or not. The reviewer may reject the PR if they disagree with the reason for the disable.


All PyLint rule disables MUST be documented in the code.

The purpose of this rule is to inform future developers about the disable.

Specifically, the following MUST accompany every PyLint disable:
  1. Why is the check being disabled?

  2. Is disabling this check meant to be permanent or temporary?

Example:
# Reason: disable pylint maybe-no-member because overloaded use of
#     the module name causes pylint to not detect that 'results'
#     is an array or hash
# Status: permanently disabled unless a way is found to fix this.
# pylint: disable=maybe-no-member
metadata[line] = results.pop()

Ansible

Yaml Files (Playbooks, Roles, Vars, etc)


Ansible files SHOULD NOT use JSON (use pure YAML instead).

YAML is a superset of JSON, which means that Ansible allows JSON syntax to be interspersed. Even though YAML (and by extension Ansible) allows for this, JSON SHOULD NOT be used.

Reasons:
  • Ansible is able to give clearer error messages when the files are pure YAML

  • YAML reads nicer (preference held by several team members)

  • YAML makes for nicer diffs as YAML tends to be multi-line, whereas JSON tends to be more concise

Exceptions:
  • Ansible static inventory files are INI files. To pass in variables for specific hosts, Ansible allows for these variables to be put inside of the static inventory files. These variables can be in JSON format, but can’t be in YAML format. This is an acceptable use of JSON, as YAML is not allowed in this case.

Every effort should be made to keep our Ansible YAML files in pure YAML.
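For example, the same (hypothetical) task written with interspersed JSON, which is valid but discouraged, versus pure YAML:

```yaml
# Discouraged: JSON syntax interspersed in the playbook
- name: Create config directory
  file: { "path": "/etc/example", "state": "directory", "mode": "0755" }

# Preferred: pure YAML
- name: Create config directory
  file:
    path: /etc/example
    state: directory
    mode: "0755"
```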

Modules


Custom Ansible modules SHOULD be embedded in a role.

The purpose of this rule is to make it easy to include custom modules in our playbooks and share them on Ansible Galaxy.

Custom module openshift_facts.py is embedded in the openshift_facts role.
> ll openshift-ansible/roles/openshift_facts/library/
-rwxrwxr-x. 1 user group 33616 Jul 22 09:36 openshift_facts.py
Custom module openshift_facts can be used after openshift_facts role has been referenced.
- hosts: openshift_hosts
  gather_facts: no
  roles:
  - role: openshift_facts
  post_tasks:
  - openshift_facts:
      role: common
      hostname: host
      public_hostname: host.example.com

Parameters to Ansible modules SHOULD use the Yaml dictionary format when 3 or more parameters are being passed

When a module has several parameters that are being passed in, it’s hard to see exactly what value each parameter is getting. It is preferred to use the Ansible Yaml syntax to pass in parameters so that it’s more clear what values are being passed for each parameter.

Bad:
- file: src=/file/to/link/to dest=/path/to/symlink owner=foo group=foo state=link
Good:
- file:
    src: /file/to/link/to
    dest: /path/to/symlink
    owner: foo
    group: foo
    state: link

Parameters to Ansible modules SHOULD use the Yaml dictionary format when the line length exceeds 120 characters

Lines that are long quickly become a wall of text that isn’t easily parsable. It is preferred to use the Ansible Yaml syntax to pass in parameters so that it’s more clear what values are being passed for each parameter.

Bad:
- get_url: url=http://example.com/path/file.conf dest=/etc/foo.conf sha256sum=b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c
Good:
- get_url:
    url: http://example.com/path/file.conf
    dest: /etc/foo.conf
    sha256sum: b5bb9d8014a0f9b1d61e21e796d78dccdf1352f23cd32812f4850b878ae4944c

The Ansible command module SHOULD be used instead of the Ansible shell module.

The Ansible shell module can run most commands that can be run from a bash CLI. This makes it extremely powerful, but it also opens our playbooks up to being exploited by attackers.

Bad:
- shell: "/bin/echo {{ cli_var }}"
Better:
- command: "/bin/echo {{ cli_var }}"

The Ansible quote filter MUST be used with any variable passed into the shell module.

It is recommended not to use the shell module. However, if it absolutely must be used, all variables passed into the shell module MUST use the quote filter to ensure they are shell safe.

Bad:
- shell: "/bin/echo {{ cli_var }}"
Good:
- shell: "/bin/echo {{ cli_var | quote }}"

Defensive Programming


Ansible playbooks MUST begin with checks for any variables that they require.

If an Ansible playbook requires certain variables to be set, it’s best to check for these up front before any other actions have been performed. In this way, the user knows exactly what needs to be passed into the playbook.

Example:
---
- hosts: localhost
  gather_facts: no
  tasks:
  - fail: msg="This playbook requires g_environment to be set and non empty"
    when: g_environment is not defined or g_environment == ''
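The same up-front check can also be written with Ansible's assert module, which lists each failing condition in its output; this is an alternative sketch, not the pattern mandated above:

```yaml
---
- hosts: localhost
  gather_facts: no
  tasks:
  - assert:
      that:
      - g_environment is defined
      - g_environment != ''
      msg: "This playbook requires g_environment to be set and non empty"
```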

Ansible roles tasks/main.yml file MUST begin with checks for any variables that they require.

If an Ansible role requires certain variables to be set, it’s best to check for these up front before any other actions have been performed. In this way, the user knows exactly what needs to be passed into the role.

Example:
---
# tasks/main.yml
- fail: msg="This role requires arl_environment to be set and non empty"
  when: arl_environment is not defined or arl_environment == ''

Tasks


Ansible tasks SHOULD NOT be used in Ansible playbooks. Instead, use pre_tasks and post_tasks.

An Ansible play is defined as a Yaml dictionary, so the order in which the play's tasks list and roles list appear carries no meaning: Ansible always runs the roles list before the tasks list, regardless of which was specified first.

This can be quite confusing if the tasks list is defined in the playbook before the roles list because people assume in order execution in Ansible.

Therefore, we SHOULD use pre_tasks and post_tasks to make it more clear when the tasks will be run.

Bad:
---
# playbook.yml
- hosts: localhost
  gather_facts: no
  tasks:
  - name: This will execute AFTER the example_role, so it's confusing
    debug: msg="in tasks list"
  roles:
  - role: example_role

# roles/example_role/tasks/main.yml
- debug: msg="in example_role"
Good:
---
# playbook.yml
- hosts: localhost
  gather_facts: no
  pre_tasks:
  - name: This will execute BEFORE the example_role, so it makes sense
    debug: msg="in pre_tasks list"
  roles:
  - role: example_role

# roles/example_role/tasks/main.yml
- debug: msg="in example_role"

Roles


All tasks in a role SHOULD be tagged with the role name.

Ansible tasks can be tagged, and then these tags can be used to either run or skip the tagged tasks using the --tags and --skip-tags ansible-playbook options respectively.

This is very useful when developing and debugging new tasks. It can also significantly speed up playbook runs if the user specifies only the roles that changed.

Example:
---
# roles/example_role/tasks/main.yml
- debug: msg="in example_role"
  tags:
  - example_role

The Ansible roles directory MUST maintain a flat structure.
The purpose of this rule is to:
  • Comply with the upstream best practices

  • Make it familiar for new contributors

  • Make it compatible with Ansible Galaxy


Ansible Roles SHOULD be named like technology_component[_subcomponent].

For consistency, role names SHOULD follow the above naming pattern. It is important to note that this is a recommendation for role naming, and follows the pattern used by upstream.

Often, the technology portion of the pattern will line up with a package name. Whenever possible, the package name SHOULD be used.

Examples:
  • The role to configure a master is called openshift_control_plane

  • The role to configure OpenShift specific yum repositories is called openshift_repos

Filters


The default filter SHOULD replace empty strings, lists, etc.

When using the jinja2 default filter, unless the variable is a boolean, specify true as the second parameter. This will cause the default filter to replace empty strings, lists, etc with the provided default.

This is because it is preferable to either have a sane default set than to have an empty string, list, etc. For example, it is preferable to have a config value set to a sane default than to have it simply set as an empty string.

From the Jinja2 Docs:
If you want to use default with variables that evaluate to false you have to set the second parameter to true
Example:
---
- hosts: localhost
  gather_facts: no
  vars:
    somevar: ''
  tasks:
  - debug: var=somevar

  - name: "Will output 'somevar: []'"
    debug: "msg='somevar: [{{ somevar | default('the string was empty') }}]'"

  - name: "Will output 'somevar: [the string was empty]'"
    debug: "msg='somevar: [{{ somevar | default('the string was empty', true) }}]'"

In other words, the default filter normally replaces the value only if it is undefined. Setting the second parameter to true makes it also replace any value that evaluates as false in Python, such as None, an empty list, or an empty string.

This is almost always more desirable than an empty list, string, etc.

Yum and DNF


Package installation MUST use Ansible package module to abstract away dnf/yum.

The Ansible package module calls the associated package manager for the underlying OS.

Bad:
---
# tasks.yml
- name: Install etcd (for etcdctl)
  yum: name=etcd state=latest
  when: ansible_pkg_mgr == "yum"
  register: install_result

- name: Install etcd (for etcdctl)
  dnf: name=etcd state=latest
  when: ansible_pkg_mgr == "dnf"
  register: install_result
Good:
---
# tasks.yml
- name: Install etcd (for etcdctl)
  package:
    name: etcd
    state: latest
  register: install_result