This repo contains a collection of reusable Bash functions for handling common tasks such as logging, assertions,
string manipulation, and more. It is our attempt to bring a little more sanity, predictability, and code reuse to our
Bash scripts. All the code is packaged into functions and covered by thorough automated tests, so you can safely
import it into your Bash scripts using `source`.
Once you have bash-commons installed (see the install instructions), you use `source` to import the
modules and start calling the functions within them. Before you import any other modules, make sure you
`source` the `bootstrap.sh` file, which sets some important defaults to encourage good code:
```bash
source /opt/gruntwork/bash-commons/bootstrap.sh
source /opt/gruntwork/bash-commons/log.sh
source /opt/gruntwork/bash-commons/assert.sh
source /opt/gruntwork/bash-commons/os.sh

log_info "Hello, World!"

assert_not_empty "--foo" "$foo" "You must provide a value for the --foo parameter."

if os_is_ubuntu "16.04"; then
  log_info "This script is running on Ubuntu 16.04!"
elif os_is_centos; then
  log_info "This script is running on CentOS!"
fi
```
The first step is to download the code onto your computer. The easiest way to do that is with the Gruntwork Installer:

```bash
gruntwork-install \
  --repo https://github.com/gruntwork-io/bash-commons \
  --module-name bash-commons \
  --tag <VERSION>
```
The default install location is `/opt/gruntwork/bash-commons`, but you can override that using the `dir` param, and
override the owner and group of the install dir using the `owner` and `group` params:

```bash
gruntwork-install \
  --repo https://github.com/gruntwork-io/bash-commons \
  --module-name bash-commons \
  --tag <VERSION> \
  --module-param dir=/foo/bar \
  --module-param owner=my-os-username \
  --module-param group=my-os-group
```
If you don't want to use the Gruntwork Installer, you can use `git clone` to get the code onto your computer and then
copy it to its final destination manually:

```bash
git clone --branch <VERSION> https://github.com/gruntwork-io/bash-commons.git
sudo mkdir -p /opt/gruntwork
sudo cp -r bash-commons/modules/bash-commons/src /opt/gruntwork/bash-commons
sudo chown -R "my-os-username:my-os-group" /opt/gruntwork/bash-commons
```
You can use the `source` command to "import" the modules you need (e.g., `source /opt/gruntwork/bash-commons/log.sh`)
and use them in your code. This will make all the functions within that module available:

```bash
log_info "Hello, World!"
```
Here's an overview of the modules available in bash-commons:
- `array.sh`: Helpers for working with Bash arrays, such as checking whether an array contains an element, or joining an array into a string with a delimiter between elements.
- `assert.sh`: Assertions that check a condition and exit if the condition is not met, such as asserting that a variable is not empty or that an expected app is installed. Useful for defensive programming.
- `aws.sh`: A collection of thin wrappers for direct calls to the AWS CLI and EC2 Instance Metadata. These thin wrappers give you a shorthand way to fetch certain information (e.g., information about an EC2 Instance, such as its private IP, public IP, Instance ID, and region). Moreover, you can swap out `aws.sh` with a version that returns mock data to make it easy to run your code locally (e.g., in Docker) and to run unit tests.
- `aws-wrapper.sh`: A collection of "high-level" wrappers for the AWS CLI and EC2 Instance Metadata that simplify common tasks such as looking up tags or IPs for EC2 Instances. Note that these wrappers handle all the data processing and logic, whereas all the direct calls to the AWS CLI and EC2 metadata endpoints are delegated to `aws.sh` to make unit testing easier.
- `file.sh`: A collection of helpers for working with files, such as checking whether a file exists or contains certain text.
- `log.sh`: A collection of logging helpers that write logs to `stderr` with log levels (INFO, WARN, ERROR) and timestamps.
- `os.sh`: A collection of Operating System helpers, such as checking which flavor of Linux (e.g., Ubuntu, CentOS) is running and validating checksums.
- `string.sh`: A collection of string manipulation functions, such as checking whether a string contains specific text, stripping prefixes, and stripping suffixes.
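To illustrate the defensive-programming style these modules encourage, here is a simplified, self-contained sketch of an `assert_not_empty`-style check. This is a hypothetical stand-in (note the `my_` prefix), not the actual `assert.sh` implementation:

```bash
#!/usr/bin/env bash
# Hypothetical, simplified stand-in for an assert_not_empty-style helper.
# The real assert.sh may differ; this only illustrates the pattern.

my_assert_not_empty() {
  local -r arg_name="$1"
  local -r arg_value="$2"
  local -r message="$3"

  if [[ -z "$arg_value" ]]; then
    echo "ERROR: The value for '$arg_name' cannot be empty. $message" >&2
    return 1
  fi
}

my_assert_not_empty "--foo" "bar" "You must provide a value for --foo."
echo "Assertion passed"
```

The helper succeeds silently when the value is non-empty and writes an error to `stderr` (returning non-zero) when it is empty, so callers can fail fast.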
The code in bash-commons adheres to the following principles:
The code in this repo aims to be compatible with:
- Bash 3
- Most major Linux distributions (e.g., Ubuntu, CentOS)
All the code should generally follow the Google Shell Style Guide. In particular:
- The first line of every script should be `#!/usr/bin/env bash`.
- All code should be defined in functions.
- Functions should exit or return 0 on success and non-zero on error.
- Functions should return output by writing it to `stdout`.
- Functions should log to `stderr`.
- All variables should be `local`. No global variables are allowed.
- Make as many variables `readonly` as possible.
- If a variable is both local and readonly, use `local -r`.
- If calling a subshell and storing the output in a variable (`foo=$( ... )`), do NOT use `local -r` in the same statement, or the exit code will be lost. Instead, declare the variable as `local` on one line and then call the subshell on the next line.
- Quote all strings.
- Use `[[ ... ]]` instead of `[ ... ]`.
- Use snake_case for function and variable names. Use UPPER_SNAKE_CASE for constants.
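The subshell rule above can be demonstrated with a short, self-contained script (the function names here are made up for illustration):

```bash
#!/usr/bin/env bash
# Why `local -r foo=$(...)` loses the subshell's exit code: `local` is itself
# a command, so $? reflects the exit code of `local` (0), not the subshell.

broken() {
  local -r result=$(false)  # exit code of `false` is masked by `local`
  echo "$?"                 # always prints 0
}

correct() {
  local result
  result=$(false)           # a plain assignment preserves the exit code
  echo "$?"                 # prints 1
}
```

Calling `broken` prints `0` even though `false` failed, while `correct` prints `1`, so the caller can actually detect the failure.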
**Everything in a function**

It's essential that ALL code is defined in a function. That allows you to use `source` to "import" that code without
anything actually being executed.
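As a sketch of what this looks like in practice (the file and function names here are hypothetical):

```bash
#!/usr/bin/env bash
# greeting.sh -- hypothetical example: all code lives in functions, so
# `source greeting.sh` defines them without executing anything.

greeting_hello() {
  local -r name="$1"
  echo "Hello, $name!"
}

# A common Bash guard (not specific to bash-commons): only run when the file
# is executed directly, never when it is sourced.
if [[ "${BASH_SOURCE[0]}" == "$0" ]]; then
  greeting_hello "World"
fi
```

Sourcing this file defines `greeting_hello` but produces no output; executing it directly runs the guarded call at the bottom.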
Bash does not support namespacing, so we fake it using a convention on the function names: if you create a file
`foo.sh`, all the functions in it should start with `foo_`. For example, all the functions in `log.sh` start with
`log_` (e.g., `log_error`) and all the functions in `string.sh` start with `string_` (e.g., `string_strip_prefix`).
That makes it easier to tell which functions came from which modules.

For readability, that means you should typically give files a name that is a singular noun. For example,
`string.sh` instead of `strings.sh`.
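For example, a hypothetical module `math.sh` would look like this (illustrative only; bash-commons does not ship such a module):

```bash
# math.sh -- hypothetical module: every function carries the math_ prefix,
# so callers can tell at a glance which module a function came from.

math_add() {
  local -r a="$1"
  local -r b="$2"
  echo "$(( a + b ))"
}

math_max() {
  local -r a="$1"
  local -r b="$2"
  if (( a > b )); then
    echo "$a"
  else
    echo "$b"
  fi
}
```

A script that does `source math.sh` can then call `math_add 2 3`, and the prefix makes the function's origin obvious.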
Every function should be tested. Automated tests are in the `test` folder.
We run all tests in the `gruntwork/bash-commons-circleci-tests` Docker image so that (a) it's consistent with how the CI server runs them, (b) the tests always run on Linux, (c) any changes the tests make, such as writing files or creating OS users, won't affect the host OS, and (d) we can replace some of the modules, such as `aws.sh`, with mocks at test time. There is a `docker-compose.yml` file in the `test` folder to make it easy to run the tests.
To run all the tests: `docker-compose up`.
To run one test file: `docker-compose run tests bats test/array.bats`.

To leave the Docker container running so you can debug, explore, and interactively run bats: `docker-compose run tests bash`.
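For reference, a bats test file looks roughly like this (a hypothetical sketch; the test name, function, and paths are illustrative, so see the actual files in the `test` folder for the real thing):

```bash
#!/usr/bin/env bats
# Hypothetical sketch of a bats test file (e.g., test/string.bats).

source "$BATS_TEST_DIRNAME/../modules/bash-commons/src/string.sh"

@test "string contains a substring" {
  run string_contains "hello world" "world"
  [ "$status" -eq 0 ]
}
```

Each `@test` block runs in isolation; `run` captures the command's exit code in `$status` (and its output in `$output`) so the assertions can inspect them.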
If you ever need to build a new Docker image, the `Dockerfile` is in the `.circleci` folder:

```bash
cd .circleci
docker build -t gruntwork/bash-commons-circleci-tests .
docker push gruntwork/bash-commons-circleci-tests
```
- Add automated tests for `aws-wrapper.sh`. We have not tested these functions, as doing so requires either running an EC2 Instance or running something like LocalStack.
Copyright © 2018 Gruntwork, Inc.