S3 data store for the Dragonfly ruby gem


Amazon AWS S3 data store for use with the Dragonfly gem.


Gemfile

gem 'dragonfly-s3_data_store'


Configuration (remember the require)

require 'dragonfly/s3_data_store'

Dragonfly.app.configure do
  # ...

  datastore :s3,
    bucket_name: 'my-bucket',
    access_key_id: 'blahblahblah',
    secret_access_key: 'blublublublu'

  # ...
end

Available configuration options

:region               # default 'us-east-1', see the AWS S3 region list for options
:storage_headers      # defaults to {'x-amz-acl' => 'public-read'}, can be overridden per-write - see below
:url_scheme           # defaults to "http"
:url_host             # defaults to "<bucket-name>.s3.amazonaws.com", or "s3.amazonaws.com/<bucket-name>" if the bucket name is not a valid subdomain
:use_iam_profile      # boolean - if true, no need for access_key_id or secret_access_key
:root_path            # store all content under a subdirectory - uids will be relative to this - defaults to nil
:fog_storage_options  # hash for passing any extra options to Fog::Storage.new, e.g. {path_style: true}
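Putting several of these together, a configuration might look like the following sketch (the bucket name, region, and root path here are placeholder values, not defaults):

```ruby
# Hypothetical configuration combining several of the options above.
Dragonfly.app.configure do
  datastore :s3,
    bucket_name: 'my-bucket',
    use_iam_profile: true,                    # skip explicit credentials, e.g. on EC2
    region: 'eu-west-1',
    url_scheme: 'https',
    root_path: 'production/assets',           # stored uids become relative to this
    fog_storage_options: { path_style: true } # passed straight through to Fog::Storage.new
end
```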

Per-storage options

Dragonfly.app.store(some_file, {'some' => 'metadata'}, path: 'some/path.txt', headers: {'x-amz-acl' => 'public-read-write'})

or


class MyModel
  dragonfly_accessor :photo do
    storage_options do |attachment|
      {
        path: "some/path/#{some_instance_method}/#{rand(100)}",
        headers: {"x-amz-acl" => "public-read-write"}
      }
    end
  end
end

BEWARE!!!! You must make sure the path (which becomes the uid for the content) is unique and changes each time the content changes, otherwise you could have caching problems, as the generated urls will be the same for the same uid.
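One plain-Ruby way to satisfy that requirement is to bake a random component into the path on every write. The `unique_path` helper below is illustrative, not part of Dragonfly:

```ruby
require 'securerandom'

# Build a storage path that changes on every write, so a stale browser or
# CDN cache can never serve old content under a reused uid.
def unique_path(prefix, filename)
  "#{prefix}/#{SecureRandom.hex(8)}/#{filename}"
end

path = unique_path('photos/42', 'avatar.jpg')
# e.g. "photos/42/2f1a9c0b8d3e4f67/avatar.jpg" - different on each call
```

You would use such a helper inside the `storage_options` block shown above.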

Serving directly from S3

You can get the S3 url using

Dragonfly.app.remote_url_for('some/uid')

or

my_model.attachment.remote_url
or with an expiring url:

my_model.attachment.remote_url(expires: 3.days.from_now)
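Note that `3.days.from_now` comes from ActiveSupport; outside Rails you can pass any `Time`. A plain-Ruby equivalent:

```ruby
# Three days from now, without ActiveSupport's 3.days.from_now.
expires = Time.now + 3 * 24 * 60 * 60

# Usage sketch (assuming a Dragonfly attachment):
# my_model.attachment.remote_url(expires: expires)
```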

or with an https url:

my_model.attachment.remote_url(scheme: 'https')   # also configurable for all urls with 'url_scheme'

or with a custom host:

my_model.attachment.remote_url(host: 'custom.domain')   # also configurable for all urls with 'url_host'

or with other query parameters (needs an expiry):

my_model.attachment.remote_url(expires: 3.days.from_now, query: {'response-content-disposition' => 'attachment'})  # URL that downloads the file
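The `response-content-disposition` parameter is a standard S3 request override, and its value can also carry a filename. A small sketch of building such a value in plain Ruby (the `disposition_for` helper is illustrative, not part of Dragonfly):

```ruby
# Ask S3 to serve the object as a download with a suggested filename.
def disposition_for(filename)
  "attachment; filename=\"#{filename}\""
end

query = { 'response-content-disposition' => disposition_for('report.pdf') }
# my_model.attachment.remote_url(expires: Time.now + 3600, query: query)
```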