Attempt to reduce test build cost #74471
Conversation
Tagging subscribers to this area: @hoyosjs
@jkoritzinsky I think this is still running the same set of tests, but I don't actually know how to validate that. Do you?
I don't think I have a script for verifying that today. @trylek might have one. I know that Helix is a little unreliable for recording successful tests in Kusto, but otherwise that might be a good place to look. |
My best guess is the AzDO API for the test run, since that's where Helix reports to and then there's back ingestion into Kusto |
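Comparing the run totals from the AzDO Test Runs REST API against a neighboring baseline build is one way to do that check. The sketch below is a minimal, hypothetical helper: the organization, project, build URI, and PAT are all placeholders you would substitute for the real pipeline's values.

```python
import base64
import json
import urllib.request


def total_tests(runs):
    """Sum the totalTests field across a list of AzDO test-run records."""
    return sum(run.get("totalTests", 0) for run in runs)


def fetch_runs(organization, project, build_uri, pat):
    """List the test runs attached to one build via the AzDO REST API.

    All four arguments are placeholders; the endpoint is the documented
    'Test Runs - List' API filtered by buildUri.
    """
    url = (f"https://dev.azure.com/{organization}/{project}"
           f"/_apis/test/runs?buildUri={build_uri}&api-version=6.0")
    request = urllib.request.Request(url)
    # AzDO accepts a personal access token via Basic auth with an empty user.
    token = base64.b64encode(f":{pat}".encode()).decode()
    request.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(request) as response:
        return json.load(response)["value"]
```

With two build URIs in hand, `total_tests(fetch_runs(...))` for the baseline and the candidate build should match if the change didn't drop any tests.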
<!-- Fast filter to avoid files with Pri1 projects. This isn't completely accurate, but it's close. Any Pri1 tests not caught by this will be skipped,
     and any tests intended as Pri0 will end up running as Pri1, which isn't terrible. This trick allows us to avoid loading each project into MSBuild
     and evaluating it, which is a substantial improvement to the performance of the "Copy native test components to test output folder" task in our test runs -->
<Pri1Projects Include="@(AllProjects)" Condition="$([System.Text.RegularExpressions.Regex]::IsMatch($([System.IO.File]::ReadAllText('%(AllProjects.FullPath)')), '<CLRTestPriority>1</CLRTestPriority>'))" />
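The idea behind that condition, a raw text scan standing in for a full MSBuild evaluation, can be modeled outside MSBuild. This is a rough sketch, not part of the change itself, and the project paths are hypothetical.

```python
import re
from pathlib import Path

# The same literal tag the MSBuild condition above searches for.
PRI1_PATTERN = re.compile(r"<CLRTestPriority>1</CLRTestPriority>")


def is_pri1(project_path):
    """Cheap approximation of a project's test priority: scan the raw
    file text for the Pri1 tag instead of evaluating the project in
    MSBuild. Projects that set the priority indirectly (e.g. via an
    import) are missed, mirroring the 'not completely accurate, but
    close' caveat in the comment above."""
    return bool(PRI1_PATTERN.search(Path(project_path).read_text()))


def partition(projects):
    """Split project paths into (pri0, pri1) lists with the fast filter."""
    pri1 = [p for p in projects if is_pri1(p)]
    pri0 = [p for p in projects if p not in pri1]
    return pri0, pri1
```

Skipping the MSBuild load is the whole win: a regex over the file text costs microseconds, while evaluating each project in MSBuild costs milliseconds to seconds apiece across thousands of projects.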
trylek left a comment
Looks great to me, thanks David! I suspected there might be some low-hanging fruit here. One suggestion down on my quality week list :-).
I don't have any targeted script for evaluating the set of tests run. The AzDO UI does report the total number of tests, so comparing it to neighboring runs without this change should give us a pretty decent indication whether this is basically fine or whether there's some big problem lurking somewhere.
When merging the JIT/Methodical tests, I used to compare the XUnit XML outputs from the non-merged and the merged runs until I hit a match, but I no longer remember whether I had any automation for this.
I've validated that AzDO thinks we run the same number of tests.
An evaluation of the cost of the "Copy native test components to test output folder" task indicates that there are two major performance bottlenecks in that routine. This change addresses both of them.
Results:
macOS test run jobs take ~8 minutes to run this portion of the build, down from ~27 minutes. Performance improvements are visible on all OS/architecture pairings, but they aren't as dramatic.