Merge pull request #3 from menkel/master
[feature] add generator for install roboto
dpickett committed Jun 4, 2012
2 parents 3a26b70 + 4eea66a commit dfec298
Showing 8 changed files with 101 additions and 22 deletions.
3 changes: 3 additions & 0 deletions .gitignore
@@ -22,3 +22,6 @@ spec/dummy/db/*.sqlite3
spec/dummy/log/*.log
spec/dummy/tmp/


.DS_Store
spec/dummy_generator/
29 changes: 7 additions & 22 deletions README.md
@@ -7,36 +7,21 @@ Roboto is a Rails Engine that gives you the ability to specify environment speci
Don't let crawlers access your staging environment. This is [bad for SEO](http://www.seomoz.org/learn-seo/duplicate-content).

## Installing

First, remove the default, generated robots.txt in your Rails app.
You can add it to your Gemfile with:

```
#> rm public/robots.txt
gem 'roboto'
```

Next, add roboto to your gemfile:

After that, you need to run the generator:
```
gem 'roboto'
#> rails generate roboto:install
```

Then, add roboto to your routes (config/routes.rb):

```
Rails.application.routes.draw do
  mount_roboto
end
```
If you already have a robots.txt, it will be kept for your production environment in config/robots/production.txt.

You can now specify environment-specific robots.txt files in config/robots.

For staging, it's recommended that you disallow crawlers from accessing your site. Once you've created a separate Rails environment for staging, define a config/robots/staging.txt file like so:

```
#place this in config/robots/staging.txt
User-Agent: *
Disallow: /
```
You can now specify environment-specific robots.txt files in config/robots/.
By default, crawlers are disallowed from accessing your site in all of your environments.
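For reference, while the default template denies everything, a production file that allows crawling of the whole site could look like this (illustrative only; adjust to your needs):

```
# config/robots/production.txt -- allow all crawlers
User-Agent: *
Disallow:
```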

## Contributing

32 changes: 32 additions & 0 deletions lib/generators/roboto/install_generator.rb
@@ -0,0 +1,32 @@
module Roboto
  module Generators
    class InstallGenerator < Rails::Generators::Base
      source_root File.expand_path("../../templates", __FILE__)

      desc "Creates Roboto robots.txt files for your application."

      def copy_locale
        empty_directory "config/robots"
        env_list = Dir.glob("#{destination_root}/config/environments/*")
        env_list.each do |env_file|
          env_name = File.basename(env_file, ".rb")
          unless (env_name == "production" && FileTest.exists?("public/robots.txt"))
            copy_file "robots.txt", "config/robots/#{env_name}.txt"
          end
        end
        if FileTest.exists?("public/robots.txt")
          copy_file File.join(destination_root, "public/robots.txt"), "config/robots/production.txt"
          remove_file "public/robots.txt"
        end
      end

      def add_roboto_route
        route "mount_roboto"
      end

      def show_readme
        readme "README" if behavior == :invoke
      end
    end
  end
end
9 changes: 9 additions & 0 deletions lib/generators/templates/README
@@ -0,0 +1,9 @@
===============================================================================
Crawlers are now disallowed from accessing your site in all
of your environments.

If you had a robots.txt in public/,
it was moved to config/robots/production.txt.

You can now specify environment-specific robots.txt files in config/robots.
===============================================================================
2 changes: 2 additions & 0 deletions lib/generators/templates/robots.txt
@@ -0,0 +1,2 @@
User-Agent: *
Disallow: /
2 changes: 2 additions & 0 deletions roboto.gemspec
@@ -19,11 +19,13 @@ Gem::Specification.new do |gem|
  gem.add_dependency 'actionpack', '>= 3.0.0'

  gem.add_development_dependency 'rspec-rails', '~> 2.9.0'
  gem.add_development_dependency 'ammeter'
  gem.add_development_dependency 'capybara'
  gem.add_development_dependency 'fakefs'
  gem.add_development_dependency 'yard'
  gem.add_development_dependency 'redcarpet'

  # we need this for the dummy app
  gem.add_development_dependency 'sqlite3'
end
45 changes: 45 additions & 0 deletions spec/generators/roboto/install_generator_spec.rb
@@ -0,0 +1,45 @@
require 'spec_helper'
require 'generators/roboto/install_generator'

describe Roboto::Generators::InstallGenerator do
  destination File.expand_path("../../../dummy_generator", __FILE__)

  before { prepare_destination }

  describe 'presence of roboto configuration file' do
    before do
      @env_available = ["roboto_env", "staging", "production"]
      create_fake_env
      create_routes_rb
      run_generator
    end

    ["roboto_env", "staging", "production"].each do |env|
      describe "config/robots/#{env}.txt" do
        subject { file("config/robots/#{env}.txt") }
        it { should exist }
        it { should contain "User-Agent: *" }
        it { should contain "Disallow: /" }
      end
    end

    describe 'config/routes.rb' do
      subject { file('config/routes.rb') }
      it { should exist }
      it { should contain "mount_roboto" }
    end
  end

  def create_routes_rb
    routes = File.expand_path("../../../dummy/config/routes.rb", __FILE__)
    destination = File.join(destination_root, "config")
    FileUtils.mkdir_p(destination)
    FileUtils.cp routes, destination
  end

  def create_fake_env
    destination = File.join(destination_root, "config/environments")
    FileUtils.mkdir_p(destination)
    @env_available.each { |env| FileUtils.touch(destination_root + "/config/environments/#{env}.rb") }
  end
end
1 change: 1 addition & 0 deletions spec/spec_helper.rb
@@ -6,6 +6,7 @@

require 'capybara/rspec'
require 'rspec/rails'
require 'ammeter/init'

Rails.backtrace_cleaner.remove_silencers!

