Fog Adapter #88

Closed · wants to merge 7 commits
42 changes: 39 additions & 3 deletions README.md
@@ -263,9 +263,45 @@ end

### Upload Sitemaps to a Remote Host

> SitemapGenerator::S3Adapter is a simple S3 adapter, added in v3.2, which
> uses Fog and doesn't require CarrierWave. You can find more information
> about it [on the wiki page][remote_hosts].
**Upload to Remote Host using Fog**

You can use the **SitemapGenerator::FogAdapter** class to upload your sitemaps to a remote host (such as Amazon S3 or Rackspace Cloud Files).

Just include the following code in your **config/sitemap.rb** file:

```ruby
# config/sitemap.rb
SitemapGenerator::Sitemap.adapter = SitemapGenerator::FogAdapter.configure do |config|
  config.credentials = {
    # Amazon S3
    #
    # :provider => "AWS",
    # :aws_access_key_id => YOUR_AWS_ACCESS_KEY_ID,
    # :aws_secret_access_key => YOUR_AWS_SECRET_ACCESS_KEY

    # Rackspace Cloud Files
    #
    # :provider => "Rackspace",
    # :rackspace_username => RACKSPACE_USERNAME,
    # :rackspace_api_key => RACKSPACE_API_KEY
    #
    # Rackspace European Cloud:
    # :rackspace_auth_url => "lon.auth.api.rackspacecloud.com"

    # Google Storage for Developers
    #
    # :provider => "Google",
    # :google_storage_access_key_id => YOUR_GOOGLE_STORAGE_ACCESS_KEY_ID,
    # :google_storage_secret_access_key => YOUR_GOOGLE_STORAGE_SECRET_ACCESS_KEY
  }

  # Name of your bucket / container
  config.fog_directory = "mysite_sitemaps"
end
```
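When the sitemaps are served from the remote host rather than from your application, you will likely also want to tell the gem where they live so that search-engine pings reference the remote URLs. A minimal sketch, assuming the bucket configured above (the exact host URL depends on your provider and bucket, so treat it as an example):

```ruby
# config/sitemap.rb (sketch; the bucket URL is an assumed example)
SitemapGenerator::Sitemap.sitemaps_host = "http://s3.amazonaws.com/mysite_sitemaps/"
```
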
**Upload to Remote Host using CarrierWave**

Sometimes it is desirable to host your sitemap files on a remote server and point robots
and search engines to the remote files. For example, if you are using a host like Heroku
2 changes: 2 additions & 0 deletions lib/sitemap_generator.rb
@@ -10,6 +10,8 @@
module SitemapGenerator
  autoload(:Interpreter, 'sitemap_generator/interpreter')
  autoload(:FileAdapter, 'sitemap_generator/adapters/file_adapter')
  autoload(:Config, 'sitemap_generator/adapters/config')
  autoload(:FogAdapter, 'sitemap_generator/adapters/fog_adapter')
  autoload(:S3Adapter, 'sitemap_generator/adapters/s3_adapter')
  autoload(:WaveAdapter, 'sitemap_generator/adapters/wave_adapter')
  autoload(:BigDecimal, 'sitemap_generator/core_ext/big_decimal')
44 changes: 44 additions & 0 deletions lib/sitemap_generator/adapters/config.rb
@@ -0,0 +1,44 @@
module SitemapGenerator
  # Lightweight configuration object backed by a Hash. Keys can be read and
  # written via [], []= or reader/writer methods (via method_missing); nested
  # Hashes are wrapped in nested Config instances.
  class Config
    def initialize(data = {})
      @data = {}
      update!(data)
    end

    def update!(data)
      data.each do |key, value|
        self[key] = value
      end
    end

    def [](key)
      @data[key.to_sym]
    end

    def []=(key, value)
      @data[key.to_sym] = value.is_a?(Hash) ? Config.new(value) : value
    end

    # Allows e.g. `config.fog_directory = "bucket"` and `config.fog_directory`.
    def method_missing(sym, *args)
      if sym.to_s =~ /(.+)=$/
        self[$1] = args.first
      else
        self[sym]
      end
    end

    # Convert back to a plain Hash, recursively unwrapping nested Configs.
    def to_hash
      data = {}
      @data.each do |k, v|
        data[k] = v.kind_of?(Config) ? v.to_hash : v
      end
      data
    end
  end
end
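For reference, a short sketch of how this Config object behaves (hypothetical values; `credentials` and `fog_directory` are the keys the Fog adapter below reads):

```ruby
config = SitemapGenerator::Config.new(:credentials => { :provider => "AWS" })
config.fog_directory = "mysite_sitemaps"   # writer handled by method_missing
config.credentials.provider                # => "AWS" (nested Hash became a Config)
config.credentials.to_hash                 # => { :provider => "AWS" }
config[:fog_directory]                     # => "mysite_sitemaps"
```
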
35 changes: 35 additions & 0 deletions lib/sitemap_generator/adapters/fog_adapter.rb
@@ -0,0 +1,35 @@
require 'fog'
require 'singleton'

module SitemapGenerator
  # Adapter that writes sitemap files locally and then uploads them to a
  # remote host (Amazon S3, Rackspace Cloud Files, Google Storage, ...) via Fog.
  class FogAdapter
    include Singleton

    @configuration = Config.new
    @storage = nil

    # Yields the Config object so callers can set credentials and fog_directory.
    # Returns the class itself so it can be assigned as the sitemap adapter.
    def self.configure(&block)
      block.call(@configuration)
      self
    end

    # Call with a SitemapLocation and string data
    def self.write(location, raw_data)
      # Write the file to the local filesystem first, then upload it.
      SitemapGenerator::FileAdapter.new.write(location, raw_data)

      connect unless @storage
      directory = @storage.directories.get(@configuration.fog_directory)

      directory.files.create(
        :key    => location.path_in_public,
        :body   => File.open(location.path),
        :public => true
      )
    end

    # Open a Fog storage connection using the configured credentials.
    def self.connect
      @storage = Fog::Storage.new(@configuration.credentials.to_hash)
    end
    private_class_method :connect
  end
end
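For context, the Fog calls the adapter relies on can be exercised on their own; a minimal sketch, assuming AWS credentials in the environment, an existing bucket named "mysite_sitemaps", and a locally generated sitemap file:

```ruby
require 'fog'

# Open a storage connection (same call the adapter makes in .connect).
storage = Fog::Storage.new(
  :provider              => "AWS",
  :aws_access_key_id     => ENV["AWS_ACCESS_KEY_ID"],
  :aws_secret_access_key => ENV["AWS_SECRET_ACCESS_KEY"]
)

directory = storage.directories.get("mysite_sitemaps")   # the bucket / container
directory.files.create(
  :key    => "sitemaps/sitemap.xml.gz",                  # path within the bucket
  :body   => File.open("public/sitemaps/sitemap.xml.gz"),
  :public => true                                        # publicly readable
)
```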