
Add benchmark generator #37948

Merged · 1 commit · Jan 27, 2020
@@ -1,14 +1,28 @@
* Add benchmark generator

Introduce benchmark generator to benchmark Rails applications.

`rails generate benchmark opt_compare`

This creates a benchmark file that uses [`benchmark-ips`](https://github.com/evanphx/benchmark-ips).
By default, two code blocks can be benchmarked using the `before` and `after` reports.

You can run the generated benchmark file using:
`ruby script/benchmarks/opt_compare.rb`
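For reference, the script this generates (as asserted by the generator tests later in this PR) looks like the following fragment; it only runs inside a Rails application with `benchmark-ips` installed:

```ruby
# frozen_string_literal: true
require_relative "../config/environment"
# Any benchmarking setup goes here...
Benchmark.ips do |x|
  x.report("before") { }
  x.report("after") { }
  x.compare!
end
```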

*Kevin Jalbert*, *Gannon McGibbon*

* Cache compiled view templates when running tests by default

When generating a new app without `--skip-spring`, caching classes is
disabled in `environments/test.rb`. This implicitly disables caching
view templates too. This change will enable view template caching by
adding this to the generated `environments/test.rb`:

```ruby
config.action_view.cache_template_loading = true
```

*Jorge Manrubia*

* Introduce middleware move operations
@@ -3,7 +3,7 @@ git_source(:github) { |repo| "https://github.com/#{repo}.git" }

ruby <%= "'#{RUBY_VERSION}'" -%>

<% unless gemfile_entries.first.comment -%>
<% unless gemfile_entries.first&.comment -%>
This conversation was marked as resolved by kevinjalbert.

kaspth (Member), Jan 5, 2020:

Why was it we needed this again?

gmcgibbon (Member), Jan 6, 2020:

If you specify you want an empty Gemfile, this assumes there's always one entry and crashes.
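The crash described above comes from calling a method on `nil`; the safe-navigation operator (`&.`) short-circuits instead. A minimal standalone sketch (`comment` here stands in for the attribute on Rails' internal gemfile entry objects):

```ruby
# With an empty Gemfile there are no entries, so `.first` is nil.
entries = []
first = entries.first # nil

begin
  first.comment # plain call: NoMethodError on nil
rescue NoMethodError => e
  puts "raised #{e.class}"
end

puts first&.comment.inspect # safe navigation: just yields nil
```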


<% end -%>
<% gemfile_entries.each do |gem| -%>
@@ -0,0 +1,19 @@
Description:
Generate benchmarks to compare performance optimizations.
This conversation was marked as resolved by kevinjalbert.

kaspth (Member), Jan 5, 2020:

Needs an ips description and why we prefer ips benchmarks.

nateberkopec (Contributor), Jan 10, 2020:

For me, the big thing is the simple significance test, automatic warmup, and the fact that you don't need to choose a number of iterations.


Makes use of the `benchmark-ips` gem, as it provides a number of benefits such as:
- Simple significance test
- Automatic warmup
- No need to specify the number of iterations
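These benefits come from measuring how many iterations fit into a time window rather than timing a fixed iteration count. A rough plain-Ruby sketch of that idea (illustrative only, not the gem's actual implementation, which also adds warmup and a significance comparison):

```ruby
# Count how many times the block runs within a time window, then
# divide by the elapsed seconds to get iterations per second.
def rough_ips(seconds: 0.5)
  iterations = 0
  deadline = Process.clock_gettime(Process::CLOCK_MONOTONIC) + seconds
  while Process.clock_gettime(Process::CLOCK_MONOTONIC) < deadline
    yield
    iterations += 1
  end
  iterations / seconds
end

puts rough_ips(seconds: 0.2) { "hello".upcase } > 0 # always some iterations fit
```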

Example:
`rails generate benchmark opt_compare`

This will create:
script/benchmarks/opt_compare.rb

You can run the generated benchmark file using:
`ruby script/benchmarks/opt_compare.rb`

You can specify different reports:
`rails generate benchmark opt_compare patch1 patch2 patch3`
@@ -0,0 +1,29 @@
# frozen_string_literal: true

require "rails/generators/named_base"

module Rails
  module Generators
    class BenchmarkGenerator < NamedBase
      IPS_GEM_NAME = "benchmark-ips"

      argument :reports, type: :array, default: ["before", "after"]

      def generate_layout
        add_ips_to_gemfile unless ips_installed?
        template("benchmark.rb.tt", "script/benchmarks/#{file_name}.rb")
      end

      private
        def add_ips_to_gemfile
          gem(IPS_GEM_NAME, group: [:development, :test])
        end

        def ips_installed?
          in_root do
            return File.read("Gemfile").match?(/gem.*\b#{IPS_GEM_NAME}\b.*/)

rafaelfranca (Member), Jan 24, 2020:

Should we use `bundle show`?

gmcgibbon (Member), Jan 24, 2020:

We could. I wonder which approach is faster? Probably the current one, but shelling out to `bundle show gem_name` is likely much less error prone.

Perhaps we could keep this approach for now and refactor it in both the benchmark and db system generators later?

          end
        end
    end
  end
end
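The `ips_installed?` check boils down to a regex scan of the Gemfile text, which can be exercised outside the generator. A sketch (the sample Gemfile strings are illustrative; note it would also match a commented-out `gem` line, which is part of why `bundle show` was discussed above):

```ruby
IPS_GEM_NAME = "benchmark-ips"
pattern = /gem.*\b#{IPS_GEM_NAME}\b.*/

with_ips = <<~GEMFILE
  source "https://rubygems.org"
  gem "rails"
  gem "benchmark-ips", group: [:development, :test]
GEMFILE

without_ips = <<~GEMFILE
  source "https://rubygems.org"
  gem "rails"
GEMFILE

puts with_ips.match?(pattern)    # the benchmark-ips line matches
puts without_ips.match?(pattern) # no line matches
```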
@@ -0,0 +1,15 @@
# frozen_string_literal: true

require_relative "../config/environment"

kaspth (Member), Dec 22, 2019:

In my experience, benchmarks tend to do some setup here that they act on in the reports. Wonder if we should at least note that with a comment. Or maybe there's a way to set up some sample data that's not annoying, à la `DATA = 100.times.map { "example" }`. Or:

```ruby
# Override the method under test and optimize it.
class Post
  def optimized_title
    title
  end
end

# Find a representative sample.
post = Post.first

Benchmark.ips do |x|
  x.report("before") { post.title }
  x.report("after")  { post.optimized_title }
end
```

Hell, we could consider adding a `require "active_record/test_case/benchmark_helper"`; then if people ran this in the test environment, we'd automatically wrap the script in a transaction that would be rolled back after the benchmark. Just some more ideas for what we could do since we're automating things here.

kevinjalbert (Author, Contributor), Jan 3, 2020:

🤔 I'm good with leaving a comment in the template and leaving the decision to the end user.

Unless there is a general setup that works best for setting things up?

The benchmark_helper is an interesting idea, although is it maybe too prescriptive?

I'm 100% open to suggestions on how to move forward with this comment.

kaspth (Member), Jan 5, 2020:

> Unless there is a general setup that works best for setting things up?

That's benchmark dependent, I guess. But it's stuff like this that would be good to think about so we can make calls on it. What would people be testing here? Are these one-off scripts, and what's under test? Active Record interactions? Full view tests? (Probably not.) Seems like this is at the model level, so I think it's fine to make model-related decisions while still leaving it possible to remove that stuff for other testing.

nateberkopec (Contributor), Jan 10, 2020:

I like this idea.

A common benchmark mistake:

```ruby
x.report("thing") { MyService.perform(User.first) }
```

(The `User.first` query gets timed along with the code under test.) Reifying a "setup method" or "setup area" where data is prepared sounds good to me.

gmcgibbon (Member), Jan 10, 2020:

I assume we would do this with a comment and some whitespace above the benchmark?

```ruby
# Any benchmarking setup goes here...
```
Benchmark.ips do |x|
<%- reports.each do |report| -%>
  x.report("<%= report %>") { }
<%- end -%>
This conversation was marked as resolved by kevinjalbert.

nateberkopec (Contributor), Jan 10, 2020:

If we're gonna ship with ips as the default, we should `x.compare!` here.


  x.compare!
end
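To see what the ERB template above expands to, it can be rendered with the stdlib `ERB` class directly. A sketch, bypassing the real generator (which renders via Thor); the `"-"` trim mode is what makes the `<%- ... -%>` control lines vanish from the output:

```ruby
require "erb"

# The generator template, reproduced as a heredoc string.
template = <<~ERB
  Benchmark.ips do |x|
  <%- reports.each do |report| -%>
    x.report("<%= report %>") { }
  <%- end -%>
    x.compare!
  end
ERB

# The `reports` argument as passed on the generator command line.
reports = ["with_patch", "without_patch"]
puts ERB.new(template, trim_mode: "-").result(binding)
```

This prints a `Benchmark.ips` block with one `x.report` line per report name, matching the files asserted in the generator tests below.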
@@ -0,0 +1,86 @@
# frozen_string_literal: true

require "generators/generators_test_helper"
require "rails/generators/rails/benchmark/benchmark_generator"

module Rails
  module Generators
    class BenchmarkGeneratorTest < Rails::Generators::TestCase
      include GeneratorsTestHelper

      setup do
        copy_gemfile
      end

      test "generate benchmark" do
        run_generator ["my_benchmark"]

        assert_file("Gemfile") do |content|
          assert_match "gem 'benchmark-ips'", content
        end

        assert_file("script/benchmarks/my_benchmark.rb") do |content|
          assert_equal <<~RUBY, content
            # frozen_string_literal: true
            require_relative "../config/environment"
            # Any benchmarking setup goes here...
            Benchmark.ips do |x|
              x.report("before") { }
              x.report("after") { }
              x.compare!
            end
          RUBY
        end
      end

      test "generate benchmark with no name" do
        output = capture(:stderr) do
          run_generator []
        end

        assert_equal <<~MSG, output
          No value provided for required arguments 'name'
        MSG
      end

      test "generate benchmark with reports" do
        run_generator ["my_benchmark", "with_patch", "without_patch"]

        assert_file("script/benchmarks/my_benchmark.rb") do |content|
          assert_equal <<~RUBY, content
            # frozen_string_literal: true
            require_relative "../config/environment"
            # Any benchmarking setup goes here...
            Benchmark.ips do |x|
              x.report("with_patch") { }
              x.report("without_patch") { }
              x.compare!
            end
          RUBY
        end
      end

      test "generate benchmark twice only adds ips gem once" do
        run_generator ["my_benchmark"]
        run_generator ["my_benchmark"]

        assert_file("Gemfile") do |content|
          occurrences = content.scan("gem 'benchmark-ips'").count
          assert_equal 1, occurrences, "Should only have benchmark-ips present once"
        end
      end
    end
  end
end