
[Suggestion] add API for detecting benchmark run failures. #989

Closed
ig-sinicyn opened this issue Dec 15, 2018 · 4 comments · Fixed by #2148
Comments

@ig-sinicyn
Contributor

There's no way to detect run failures such as an incompatible target platform.

Use case:

    [Fact]
    public void InProcessCannotRun()
    {
        // pick the platform we are NOT currently running on
        var otherPlatform = IntPtr.Size == 8
            ? Platform.X86
            : Platform.X64;

        var otherPlatformConfig = new ManualConfig()
            .With(Job.Dry
                .With(InProcessToolchain.Instance)
                .WithInvocationCount(UnrollFactor)
                .WithUnrollFactor(UnrollFactor)
                .With(otherPlatform))
            .With(new OutputLogger(Output))
            .With(DefaultColumnProviders.Instance);

        var runInfo = BenchmarkConverter.TypeToBenchmarks(typeof(BenchmarkAllCases), otherPlatformConfig);
        var summary = BenchmarkRunner.Run(runInfo);

        // summary is empty; there is no way to detect from code what went wrong.
        // The output is:
        /*
            // Benchmark BenchmarkAllCases.InvokeOnceVoid: Dry(Platform=X86, Toolchain=InProcessToolchain, InvocationCount=16, IterationCount=1, LaunchCount=1, RunStrategy=ColdStart, UnrollFactor=16, WarmupCount=1)
            // cannot be run in-process. Validation errors:
            //    * Job Dry, EnvironmentMode.Platform was run as X64 (X86 expected). Fix your test runner options.
         */

        // Workaround (not reliable)
        Assert.Empty(summary.BenchmarksCases);
    }

Suggestions:

  • call the IToolchain.IsSupported method from a validator and add the result to summary.ValidationErrors
  • add a property that indicates whether the benchmark run completed without errors, e.g. summary.CompletedWithoutErrors or something like this (a usage sketch follows below).
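
A minimal sketch of how the test above could consume such an API. CompletedWithoutErrors is hypothetical here (it is the property this issue asks for); ValidationErrors is the collection referenced later in this thread:

    var summary = BenchmarkRunner.Run(runInfo);

    // Hypothetical property: true only when every benchmark actually produced results
    if (!summary.CompletedWithoutErrors)
    {
        // the validation errors would explain the failure, e.g. the platform mismatch above
        foreach (var error in summary.ValidationErrors)
            Output.WriteLine(error.Message);
    }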
ig-sinicyn changed the title from "[Suggestion] add API for detecting cases where toolchain is not supported." to "[Suggestion] add API for detecting benchmark run failures." on Dec 15, 2018
@AndreyAkinshin
Member

call the IToolchain.IsSupported method from validator and add the result into summary.ValidationErrors

It's a good idea!

add a property that indicates that benchmark completed without errors, summary.CompletedWithoutErrors or something like this.

We already have a benchmark report in the summary. Currently, MarkdownExporter uses the following code to detect broken benchmarks:

var benchmarksWithTroubles = summary.Reports.Where(r => !r.GetResultRuns().Any()).Select(r => r.BenchmarkCase).ToList();

Currently, BenchmarkReport contains GenerateResult, BuildResult, and the list of result runs. I think we can introduce a Status property which holds the final conclusion about the benchmark status (including error messages in case of any problems). We could then implement a CompletedWithoutErrors extension method based on the status values.
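
For illustration, something in that spirit can already be approximated over the existing report data. This is a sketch mirroring the MarkdownExporter check quoted above; CompletedWithoutErrors here is a local helper, not an existing BenchmarkDotNet API:

    using System.Linq;
    using BenchmarkDotNet.Reports;

    public static class SummaryExtensions
    {
        // A summary counts as successful only if every benchmark produced at least one result run.
        public static bool CompletedWithoutErrors(this Summary summary)
            => summary.Reports.Any() && summary.Reports.All(r => r.GetResultRuns().Any());
    }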

@ig-sinicyn
Contributor Author

@AndreyAkinshin
Just checked: summary.Reports is empty too. To repro, copy the code below into InProcessTest.

    [Fact]
    public void InProcessCannotRun()
    {
        var otherPlatform = IntPtr.Size == 8
            ? Platform.X86
            : Platform.X64;

        var otherPlatformConfig = new ManualConfig()
            .With(Job.Dry
                .With(InProcessToolchain.Instance)
                .WithInvocationCount(UnrollFactor)
                .WithUnrollFactor(UnrollFactor)
                .With(otherPlatform))
            .With(new OutputLogger(Output))
            .With(DefaultColumnProviders.Instance);

        var runInfo = BenchmarkConverter.TypeToBenchmarks(typeof(BenchmarkAllCases), otherPlatformConfig);
        var summary = BenchmarkRunner.Run(runInfo);

        // proof:
        Assert.Empty(summary.BenchmarksCases);
        Assert.Empty(summary.Reports);
    }

@adamsitnik
Member

I've marked this issue as up-for-grabs. Whoever picks it up should make sure that the errors end up in summary.ValidationErrors.
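
Once that is in place, a regression test could assert on the failure directly. A sketch only, assuming the repro above and that Summary exposes the ValidationErrors collection:

    var summary = BenchmarkRunner.Run(runInfo);

    // still no reports for the impossible job, but the reason is now discoverable from code
    Assert.Empty(summary.Reports);
    Assert.NotEmpty(summary.ValidationErrors);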

@emanuel-v-r
Contributor

Hi,

Can I grab this one?
