--profile fails when a test fails #730

Closed
TigerWolf opened this Issue Nov 16, 2012 · 12 comments


RSpec's --profile output fails to show up when a test fails.

bash-4.2$ rails -v
Rails 3.2.8
bash-4.2$ ruby -v
ruby 1.9.3p194 (2012-04-20 revision 35410) [i686-linux]
bash-4.2$   rails new rspec_testing      
...
bash-4.2$ cd rspec_testing  
bash-4.2$ echo "gem 'rspec-rails'" >> Gemfile
bash-4.2$ bundle
...
bash-4.2$ rails generate rspec:install
...
bash-4.2$ echo "--profile" >> .rspec
bash-4.2$ cat spec/my_spec.rb 

describe "profiling" do
  it "is fast" do
  end

  it "is slow" do
    sleep 3
  end

  it "is pending" do
    pending "is pending"
  end

  it "is failing" do
    throw "this is failing"
  end

end


bash-4.2$  rspec  
..*F

Pending:
  profiling is pending
    # is pending
    # ./spec/my_spec.rb:9

Failures:

  1) profiling is failing
     Failure/Error: throw "this is failing"
     ArgumentError:
       uncaught throw "this is failing"
     # ./spec/my_spec.rb:14:in `throw'
     # ./spec/my_spec.rb:14:in `block (2 levels) in <top (required)>'

Finished in 3 seconds
4 examples, 1 failure, 1 pending

Failed examples:

rspec ./spec/my_spec.rb:13 # profiling is failing

bash-4.2$ cat spec/my_spec.rb 
describe "profiling" do
  it "is fast" do
  end

  it "is slow" do
    sleep 3
  end

  it "is pending" do
    pending "is pending"
  end

  it "is failing" do
#    throw "this is failing"
  end

end
bash-4.2$ rspec spec/my_spec.rb 
..*.

Pending:
  profiling is pending
    # is pending
    # ./spec/my_spec.rb:9

Top 4 slowest examples (3 seconds, 100.0% of total time):
  profiling is slow
    3 seconds ./spec/my_spec.rb:5
  profiling is pending
    0.00022 seconds ./spec/my_spec.rb:9
  profiling is failing
    0.00013 seconds ./spec/my_spec.rb:13
  profiling is fast
    0 seconds ./spec/my_spec.rb:2

Finished in 3 seconds
4 examples, 0 failures, 1 pending
Contributor

Peeja commented Dec 2, 2012

Reproduced.

https://gist.github.com/4191562

Run rake for output.

Contributor

vanstee commented Dec 6, 2012

Is it useful to profile tests that might have failed? Just trying to think of a use case for this.

If you have 1000 examples and want to improve the performance of your tests, but one of them is currently failing, then you either need to filter that test out or fix it first. You could be in a situation where speeding up your tests would help you fix those broken tests more quickly.

This came up for me when upgrading a large app from Rails 3.0 to 3.2.
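
One way to filter out a known-broken example while a fix is discussed is to tag it and exclude the tag, so the rest of the suite stays green and --profile still runs. A sketch using standard RSpec tag filtering (the :broken tag name is just an illustration):

describe "profiling" do
  # Tag the example that is currently failing; the tag name is arbitrary.
  it "is failing", :broken => true do
    throw "this is failing"
  end
end

Running rspec --tag ~broken --profile then skips the tagged examples and still prints the slow-example report.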

Contributor

greggroth commented Dec 12, 2012

It would be an easy thing to change (here's where it happens), but I also think it's not meaningful to see a profile if there are failing specs.

@TigerWolf That feels like an edge case to me. You use --profile to get a report of the slowest tests, but it sounds like you already have an idea of which tests are slowing you down. If your unit tests are passing and you want to speed those up, you can profile just that set of specs: rspec spec/models --profile. Then again, maybe I don't fully understand your situation.

wallace commented Dec 17, 2012

@greggroth, suppose you're trying to fix a failing test, and every time you run the suite for a full regression check you have to wait for some super slow tests to complete, only to find that the test you're focused on is still failing. At that point you might decide to switch gears and focus on the slow tests instead. @TigerWolf, does that sound like what happened to you?

Since --profile currently only runs when your test suite is green, I don't see it being too confusing if it were to display only slow passing tests, but it seems like a documentation update would be helpful too. Thoughts?

Contributor

greggroth commented Dec 17, 2012

It seems like the energy would still be better spent getting the suite passing before worrying about performance. Running focused tests removes the slow tests from the equation.

In any case, maybe it would still make sense to be able to override the default behavior through a config option.
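
Concretely, such an override might look like this in spec/spec_helper.rb (profile_examples is an existing setting; show_profile_on_failure is purely hypothetical, sketched only to show what an opt-in could look like):

RSpec.configure do |config|
  # Existing setting: same effect as putting --profile in .rspec.
  config.profile_examples = true

  # Hypothetical setting (does not exist today): print the slow-example
  # report even when some examples fail.
  config.show_profile_on_failure = true
end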

wallace commented Dec 17, 2012

@greggroth, good call on focused tests.

How opinionated is rspec? Does it encourage users to develop with a certain workflow or is it more inclined to provide users with a plethora of options to work how they please?

Owner

myronmarston commented Dec 24, 2012

FWIW, not including profile output when there are failures was an intentional decision, not a bug:

https://github.com/rspec/rspec-core/blob/v2.12.2/lib/rspec/core/formatters/base_text_formatter.rb#L46-L47
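
For readers following along, the linked lines boil down to a guard in the formatter's dump_summary; paraphrased here (names approximate, see the linked source for the exact code):

# Inside BaseTextFormatter#dump_summary (paraphrased): the slow-example
# report is only printed when nothing failed.
dump_profile if profile_examples? && failure_count == 0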

I traced that back through the git history to the initial import of micronaut into rspec-core as the basis of the spec running for rspec-2...so this behavior has been around a long time. I've never discussed it with @spicycode or @dchelimsky, though, to get their take on why this choice was made.

If I had to guess...the reason the profile info isn't dumped when there are failures is that it clutters up the output. The failures are the thing you really should be concerned about, and it would be problematic if the presence of the extra profile info caused someone to not notice that there were spec failures. Improving spec performance is best done when your suite is green.

That said, I can remember getting annoyed by this behavior on at least one occasion: I got an unexpected failure (due to the --order random option exposing an unintentional ordering dependency I hadn't noticed yet) and was annoyed that the profile output was not included, as it was what I was primarily interested in at the time.

I'm open to the idea of changing the profiler so the info is always dumped when the --profile option is used, even if there are spec failures (after all, there shouldn't be any harm in giving the user more info), as long as the profile output doesn't reduce visibility into the failures--i.e. the failures should come later in the output than the profile info, I think.

@dchelimsky -- are there additional factors I'm not thinking of to take into account here?

Owner

dchelimsky commented Dec 24, 2012

@myronmarston you already know more about this than I do :)

re: motivation, my guess would be that failures are likely to cause the profile information to be incorrect. If the most expensive example errors out on the first line it might not even be in the list.
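
A contrived spec illustrates that point: an example that fails on its first line never reaches its slow work, so its measured time is tiny and the profile would under-report it.

describe "misleading timings when examples fail" do
  it "would be the slowest example, but fails immediately" do
    raise "boom"  # the example stops here...
    sleep 5       # ...so the slow part is never run or timed
  end

  it "is genuinely slow" do
    sleep 1       # this one tops the profile instead
  end
end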

myronmarston added a commit that referenced this issue on Feb 27, 2013:

Add missing changelog entry for #730.
[ci skip]
74ee7a7
Contributor

pjg commented Feb 27, 2013

I'm really against this change. It makes running your specs using guard-rspec with --fail-fast and --profile pretty pointless, as the slow-example listings clutter the output and you have to scroll your console up to read the error 😢 (i.e. using --profile is a hindrance, not a help).

I mean, it's about priorities - what is more important? The failing spec, or info about the slowness of the specs?

How about enabling --profile only when --fail-fast is not used?

Owner

myronmarston commented Feb 27, 2013

How about enabling --profile only when --fail-fast is not used?

That makes a certain amount of sense to me. Want to take a stab at that?

Contributor

pjg commented Feb 28, 2013

Sure! I'll work on it next week and see if I can make it work :)

pjg added commits to pjg/rspec-core that referenced this issue on Mar 9, Mar 13, and Mar 24, 2013 (187c008, 2058871, eb8546a):

Don't --profile with --fail-fast and failures

i.e. don't display slow examples in the output if the `--fail-fast` option
is used and there are any failures, as it clutters the output.

References #730.
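
In other words, the change described in those commits amounts to loosening the guard quoted earlier so the report is suppressed only when --fail-fast is in play. A sketch of the resulting condition (names approximate, not the actual diff):

# Sketch, not the actual patch: keep the profile output even when examples
# fail, unless --fail-fast was used (where it would push the failure off-screen).
dump_profile if profile_examples? &&
                !(fail_fast? && failure_count > 0)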