
OfflineReporter with multiple benchmarks fails to #159

Open
lordgordon opened this issue Feb 12, 2016 · 7 comments

Comments

@lordgordon

Hi there,

I'm trying to set up a benchmarking suite with ScalaMeter 0.7, but the HTML report always shows only one benchmark.

In fact, data.js contains only one result instead of the expected two.

From the output of sbt bench:test I can see that all the tests are executed properly.

Current result:
(screenshot: the HTML report shows a single benchmark)

Content of target/benchmark:
(screenshot: directory listing of target/benchmark)

Environment:

  • JDK: OpenJDK 1.8.0_72;
  • Scala: 2.11.7;
  • sbt: 0.13.9.

src/bench/scala/Benchmark.scala:

import org.scalameter.api._

class Benchmark extends Bench.Group {
  include(new IncrementalTSMemory {})
  include(new IncrementalTSTime {})
}

src/bench/scala/IncrementalTSBenchmark.scala:

import org.scalameter.api._

object IncrementalTSConfig {
  val seqTS: Seq[MyClass] = ??? // here I initialize the objects I want to test (elided)

  // benchmarking settings
  val sizes = Gen.range("states")(0, 3, 1)
}

trait IncrementalTSTime extends Bench.OfflineReport {
  // the performance test
  performance of "IncrementalTS[time]" in {
    measure method "linearize" in {
      using(IncrementalTSConfig.sizes) curve "Test1" in {
        i => IncrementalTSConfig.seqTS(i).linearize()
      }
    }
  }
}

trait IncrementalTSMemory extends Bench.OfflineReport {
  // benchmarking settings
  override def measurer = new Executor.Measurer.MemoryFootprint

  // the performance test
  performance of "IncrementalTS[memory]" in {
    measure method "linearize" config(
      exec.minWarmupRuns -> 2,
      exec.maxWarmupRuns -> 5,
      exec.benchRuns -> 5,
      exec.independentSamples -> 1
    ) in {
      using(IncrementalTSConfig.sizes) curve "Test1" in {
        i => IncrementalTSConfig.seqTS(i).linearize()
      }
    }
  }
}

Thanks a lot for your help.

Best Regards,

Nicholas

Edit: minor changes in the code

@axel22
Copy link
Member

axel22 commented Feb 16, 2016

This might be a bug in HtmlReporter with naming the curves. Could you try renaming one of the curves from Test1 to something else?
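For example, giving the memory trait's curve a distinct name (a minimal sketch of the suggested change; "Test1-memory" is an arbitrary label):

using(IncrementalTSConfig.sizes) curve "Test1-memory" in {
  i => IncrementalTSConfig.seqTS(i).linearize()
}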

@lordgordon (Author)

Actually, I already tried without naming the curves. I also tried with different Bench classes.

@lordgordon (Author)

Is it working for you?

@enlait commented Apr 11, 2017

I have the same issue here with ScalaMeter 0.8.2. My guess is that reporters from different bench traits do not aggregate their results together.

My tests are fairly by-the-book, except that I use LocalExecutor.
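For reference, this is roughly how LocalExecutor gets plugged in (a sketch following the executor-override pattern from the ScalaMeter docs; the warmer and aggregator choices are examples, and exact signatures vary between versions):

import org.scalameter.api._

trait LocalBench extends Bench.OfflineReport {
  // run all measurements in the current JVM instead of forking new ones
  override def executor = LocalExecutor(
    new Executor.Warmer.Default, // default in-process warmup
    Aggregator.min,              // keep the minimum of the repeated runs
    measurer                     // reuse this benchmark's measurer
  )
}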

@mduerig commented Apr 17, 2017

This doesn't even work for the include-statements example from the documentation, which specifically states that it places the reports of individual tests into different directories. In my case I do not get different directories for these tests, just the default target/benchmarks/report one, containing a single index.html with the results of whichever test happened to run last.
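For reference, the documented pattern looks roughly like this (reproduced from memory, so treat it as a sketch; the group names and directories are illustrative):

class RegressionSuite extends Bench.Group {
  performance of "memory" config (
    reports.resultDir -> "target/benchmarks/memory"
  ) in {
    include(new IncrementalTSMemory {})
  }

  performance of "time" config (
    reports.resultDir -> "target/benchmarks/time"
  ) in {
    include(new IncrementalTSTime {})
  }
}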

@mduerig commented Apr 17, 2017

The problem here seems to be that HtmlReporter retrieves the result directory from the global context instead of from the context that was used when running the test. The latter is available from the results passed to HtmlReporter.report(). I wonder whether the include-statements example ever worked like this...
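Concretely, something along these lines (a sketch only; it assumes the Tree[CurveData]-based report signature and the Key.reports.resultDir key, and is not verified against the actual HtmlReporter source):

def report(results: Tree[CurveData[Double]], persistor: Persistor): Boolean = {
  for (curves <- results) {
    // each CurveData carries the context of the test that produced it,
    // so the per-test result directory is available here...
    val resultDir = curves.context(Key.reports.resultDir)
    // ...instead of being read from the global context, as it is now
  }
  true
}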

@axel22 (Member) commented Apr 17, 2017

Good analysis. The HtmlReporter does not read the result directory properly. The correct behavior would probably be to group the benchmark scopes according to their resultDir property, and generate separate reports for the different groups.

Unfortunately, I don't have time to fix this at the moment, but patches are more than welcome.
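A rough sketch of that grouping (hypothetical code; it assumes each CurveData's context carries Key.reports.resultDir, that the results Tree can be flattened to a list, and writeReportTo stands in for the existing report-rendering code):

// group the measured curves by the result directory recorded in their
// own context, then render one report per directory
val byDir = results.toList.groupBy(_.context(Key.reports.resultDir))
for ((dir, curves) <- byDir) {
  writeReportTo(dir, curves) // hypothetical helper: emits index.html and data.js into dir
}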
