  _________       .__  _____  __                      .__                
 /   _____/_  _  _|__|/ ____\/  |______    ____  __ __|  | _____ _______ 
 \_____  \\ \/ \/ /  \   __\\   __\__  \ _/ ___\|  |  \  | \__  \\_  __ \
 /        \\     /|  ||  |   |  |  / __ \\  \___|  |  /  |__/ __ \|  | \/
/_______  / \/\_/ |__||__|   |__| (____  /\___  >____/|____(____  /__|   
        \/                             \/     \/                \/       

OpenStack Swift and Ansible

This repository creates a virtualized OpenStack Swift cluster using Vagrant, VirtualBox, and Ansible.

Table of Contents

  1. Too long; didn't read
  2. Features
  3. Requirements
  4. Networking setup
  5. Starting over
  6. Development environment
  7. Modules
  8. Future work
  9. Issues
  10. Notes


Too long; didn't read

Note: this will start seven virtual machines on your computer.

$ git clone
$ cd swiftacular
# Check out some modules to help with managing OpenStack
$ git clone library/openstack
$ vagrant up
$ cp group_vars/all.example group_vars/all # and edit if desired
$ ansible-playbook site.yml

Supported Operating Systems and OpenStack Releases

  • CentOS 6.5 with OpenStack Havana packages
  • Ubuntu 12.04 with OpenStack Havana packages
  • Ubuntu 14.04 with OpenStack Icehouse packages

Ubuntu 14.04 is probably the most tested version right now, followed by Ubuntu 12.04 and then Red Hat/CentOS 6.5+.

The Vagrantfile includes the boxes above, with Ubuntu 12.04 as the default uncommented box. To use one of the other operating systems as the basis for Swiftacular, simply uncomment the OS you would like to use in the Vagrantfile and make sure the other boxes are commented out.
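
The switch can be done by hand in an editor, or scripted. The following is a minimal sketch against a sample file; the box names here are assumptions, so check the actual Vagrantfile for the real ones:

```shell
# Sketch: switch the active box by (un)commenting config.vm.box lines.
# Box names below are placeholders; use the ones in your Vagrantfile.
cat > /tmp/Vagrantfile.sample <<'EOF'
config.vm.box = "precise64"
#config.vm.box = "trusty64"
#config.vm.box = "centos65"
EOF

# Comment out every box line, then uncomment the one we want (trusty64 here)
sed -i -e 's/^config.vm.box/#config.vm.box/' \
       -e 's/^#\(config.vm.box = "trusty64"\)/\1/' /tmp/Vagrantfile.sample

cat /tmp/Vagrantfile.sample
```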


Features

  • Run OpenStack Swift in VMs on your local computer, but with multiple servers
  • Replication network is used, which means this could be a basis for a geo-replication system
  • SSL - Keystone is configured to use SSL and the Swift Proxy is proxied by an SSL server
  • Sparse files to back Swift disks
  • Tests for uploading files into Swift
  • Use of gauntlt attacks to verify installation
  • Supports Ubuntu Precise 12.04, Trusty 14.04 and CentOS 6.5
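
As an illustration of the sparse-file approach used for the Swift disks, a loopback disk can be created along these lines (a sketch; the path and size are assumptions, not the playbook's actual values):

```shell
# Create a 1 GiB sparse file: it reports 1G of apparent size but uses
# almost no real disk space until Swift writes data into it.
truncate -s 1G /tmp/swift-disk.img

# Compare apparent size vs. actual blocks used
ls -lh /tmp/swift-disk.img   # apparent size: 1G
du -h  /tmp/swift-disk.img   # actual usage: ~0

# On a real storage node the file would then get a filesystem and be
# loop-mounted, e.g.: mkfs.xfs /tmp/swift-disk.img && mount -o loop ...
```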


Requirements

  • Vagrant and VirtualBox
  • For Ubuntu I am using the official Vagrant Precise64 images
  • For CentOS 6 I am using the Vagrant box provided by Puppet Labs
  • Enough resources on your computer to run seven VMs

Virtual machines created

Seven Vagrant-based virtual machines are used for this playbook:

  • package_cache - One apt-cacher-ng server, so that packages are downloaded from the Internet only once
  • authentication - One Keystone server for authentication
  • lbssl - One SSL termination server that will be used to proxy connections to the Swift Proxy server
  • swift-proxy - One Swift proxy server
  • swift-storage - Three Swift storage nodes
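
For reference, pointing an apt client at the cache is a one-line proxy setting (a sketch; "package-cache" is a placeholder hostname, and 3142 is apt-cacher-ng's default port):

```shell
# Tell apt to fetch packages through the apt-cacher-ng host.
# "package-cache" is a placeholder; adjust to your inventory/hosts file.
echo 'Acquire::http::Proxy "http://package-cache:3142";' \
  | tee /tmp/01proxy

# On a real node this file would live at /etc/apt/apt.conf.d/01proxy
```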

Networking setup

Each VM has four networks (technically five, including the Vagrant network). In a real production system not every server would need to be attached to every network, and in fact you would want to avoid that. In this case, they are all attached to every network.

  • eth0 - Used by Vagrant
  • eth1 - The "public" network that users would connect to
  • eth2 - The network between the SSL terminator and the Swift proxy
  • eth3 - The local Swift internal network
  • eth4 - The replication network, a feature of OpenStack Swift starting with the Havana release
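
A quick way to check which interfaces a VM actually ended up with is to list them from sysfs (a sketch; run inside any of the nodes, where you should see eth0 through eth4 plus lo):

```shell
# List the network interfaces the kernel knows about.
for iface in /sys/class/net/*; do
  basename "$iface"
done
```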

Self-signed certificates

Because this playbook configures self-signed SSL certificates, the swift client will by default complain about them; either use the --insecure option or set the SWIFTCLIENT_INSECURE environment variable to true.
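
For example, either of the following works (the swift command itself is commented out here, since it needs a running cluster):

```shell
# Option 1: pass the flag on each invocation
# swift --insecure list

# Option 2: set it once for the whole session
export SWIFTCLIENT_INSECURE=true
echo "SWIFTCLIENT_INSECURE=$SWIFTCLIENT_INSECURE"
```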

Using the swift command line client

You can install the swift client anywhere that has access to the SSL termination point and Keystone, so you could put it on your local laptop as well, probably with:

$ pip install python-swiftclient

However, I usually log in to the package_cache server and use swift from there.

$ vagrant ssh swift-package-cache-01
vagrant@swift-package-cache-01:~$ . testrc 
vagrant@swift-package-cache-01:~$ swift list
vagrant@swift-package-cache-01:~$ echo "swift is cool" > swift.txt
vagrant@swift-package-cache-01:~$ swift upload swifty swift.txt 
vagrant@swift-package-cache-01:~$ swift list
vagrant@swift-package-cache-01:~$ swift list swifty

Starting over

If you want to redo the installation, there are a few options.

To restart completely:

$ vagrant destroy -f
$ vagrant up
# wait...
$ ansible-playbook site.yml

There is a script to destroy and rebuild everything but the package cache:

$ ./bin/redo
$ ansible -m ping all # just to check if networking is up
$ ansible-playbook site.yml

To remove and redo only the rings and fake/sparse disks without destroying any virtual machines:

$ ansible-playbook playbooks/remove_rings.yml
$ ansible-playbook site.yml

To remove the Keystone database and redo the endpoints, users, regions, etc.:

$ ansible-playbook playbooks/remove_keystone.yml
$ ansible-playbook site.yml

Development environment

This playbook was developed in the following environment:

  • OS X 10.8.2
  • Ansible 1.4
  • VirtualBox 4.2.6
  • Vagrant 1.3.5


Modules

There is a swift-ansible-modules directory inside the library directory that contains a couple of modules taken from the official Ansible modules as well as from openstack-ansible-modules; for now, both have been modified to allow the "insecure" option, i.e. self-signed certificates. I hope to get those changes into their respective repositories soon.

Future work

See the issues in the tracking system on GitHub for Swiftacular with the enhancement label.


Issues

See the issues in the tracking system on GitHub for Swiftacular.


Notes

  • I know that Vagrant can automatically start Ansible playbooks on the creation of a VM, but I prefer to run the playbook manually
  • LXC is likely a better fit than VirtualBox, given that all the VMs run the same OS and we don't need to boot any VMs within VMs, inception style
  • Starting the VMs is a bit slow, I believe because of the extra networks

