Track perf over time #6326
Closed as not planned
Labels
Area: Infrastructure · Area: MSTest (Issues with MSTest that are not specific to a more refined area, e.g. analyzers or assertions) · Area: MTP (Belongs to the Microsoft.Testing.Platform core library) · Area: Performance
Summary
This issue is about finding the best way to automatically or semi-automatically track MSTest performance over time across different scenarios.
Background and Motivation
It's useful to know how the performance characteristics of MSTest and/or MTP change over time.
Proposed Feature
My general idea is to have a separate pipeline dedicated to measuring the performance of different scenarios. We can initially start with 6 jobs:
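As a rough illustration of what each pipeline job could do, here is a minimal sketch of a timing harness, assuming the pipeline checks out sample test projects and shells out to the CLI. The scenario names and commands shown in the comment are hypothetical, not part of this proposal; only the `measure` helper itself is demonstrated.

```python
import json
import statistics
import subprocess
import time

def measure(command, runs=5):
    """Run `command` several times and return wall-clock stats in seconds.

    Output is captured so console I/O does not skew the timing; the first
    run is not discarded here, though a real harness might treat it as warmup.
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(command, check=True, capture_output=True)
        samples.append(time.perf_counter() - start)
    return {
        "command": " ".join(command),
        "min": min(samples),
        "median": statistics.median(samples),
        "max": max(samples),
    }

# Hypothetical usage inside a pipeline job (scenario names and commands
# are illustrative assumptions, not defined by this issue):
#   results = {
#       "discovery": measure(["dotnet", "test", "--list-tests"]),
#       "run": measure(["dotnet", "test"]),
#   }
#   print(json.dumps(results, indent=2))  # publish as a pipeline artifact
```

The JSON output could then be published as a build artifact or pushed to a dashboard so regressions are visible across commits.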
Detailed implementation
We could either add this pipeline directly in the testfx repo, or create a separate repo (testfx-perf-validation) into which MSTest/MTP is inserted on every build. I'm not sure which would be best.