Cache Buildkite Plugin

A Buildkite plugin to restore and save directories by cache key. For example, use the checksum of a .resolved or .lock file to restore and save built dependencies between independent builds, not just between jobs in the same build.

Restore & Save Caches

steps:
  - plugins:
    - danthorpe/cache#v1.0.0:
        cache_key: "v1-cache-{{ checksum 'Podfile.lock' }}"
        paths: [ "Pods/", "Rome/" ]

Cache Key Templates

The cache key is a string which supports a crude template system. Currently, checksum is the only supported command, and it is used as in the example above. In this case the cache key is determined by running a checksum (actually shasum) over the Podfile.lock file and prefixing the result with v1-cache-.
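
To illustrate the idea, the expansion could be implemented roughly as in the shell sketch below. This is an assumption about how such a template might be resolved, not the plugin's actual hook code, and the expand_cache_key function name is hypothetical.

# Hypothetical sketch of checksum template expansion.
expand_cache_key() {
  local template="$1"
  local file digest
  # Pull the file name out of a "{{ checksum 'FILE' }}" expression.
  file=$(echo "$template" | sed -e "s/.*{{ checksum '\(.*\)' }}.*/\1/")
  # Compute the digest with shasum and keep only the hash column.
  digest=$(shasum "$file" | awk '{ print $1 }')
  # Substitute the digest back into the template.
  echo "$template" | sed -e "s/{{ checksum '.*' }}/$digest/"
}

expand_cache_key "v1-cache-{{ checksum 'Podfile.lock' }}"
# => v1-cache-<sha1 of Podfile.lock>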

S3 Storage

This plugin uses aws s3 sync to cache the configured paths into a bucket, specified by environment variables defined on your agent.

export BUILDKITE_PLUGIN_CACHE_S3_BUCKET_NAME="my-unique-s3-bucket-name"
export BUILDKITE_PLUGIN_CACHE_S3_PROFILE="my-s3-profile"

The paths are synced to your S3 bucket under a key structure of organization-slug/pipeline-slug/cache_key, as determined by the Buildkite environment variables.
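
As a rough illustration of that layout (an assumption about the hook behaviour, not a verbatim copy of the hook scripts), restoring the "Pods/" path in the pre-command hook might run something along these lines, where CACHE_KEY stands in for the expanded cache key:

# Hypothetical sketch of the restore-side sync for one path.
aws s3 sync \
  --profile "${BUILDKITE_PLUGIN_CACHE_S3_PROFILE}" \
  "s3://${BUILDKITE_PLUGIN_CACHE_S3_BUCKET_NAME}/${BUILDKITE_ORGANIZATION_SLUG}/${BUILDKITE_PIPELINE_SLUG}/${CACHE_KEY}/Pods" \
  "Pods"

# Saving the cache would reverse the source and destination arguments.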
