
new code coverage #5169

Merged: 11 commits merged into dotnet:master on May 30, 2020

Conversation

@frank-dong-ms-zz (Contributor) commented May 27, 2020

Addresses the code coverage pipeline issue, with error messages like "Unable to read beyond the end of the stream.":
https://dev.azure.com/dnceng/public/_build/results?buildId=662753&view=logs&j=12b34f85-96db-5b04-05cf-faf2be278867&t=4efaf9f8-7d26-5ca9-4ea8-c01555af0e7d

This is a known issue with msbuild and coverlet: coverlet collects and writes hit data on process exit, so if for some reason the process is too slow to close, it gets killed and we cannot collect the coverage result.
https://github.com/coverlet-coverage/coverlet/blob/master/Documentation/KnownIssues.md#1-vstest-stops-process-execution-earlydotnet-test

Per the recommendation from coverlet, switch from coverlet.msbuild to coverlet.collector:
https://github.com/coverlet-coverage/coverlet/blob/master/Documentation/VSTestIntegration.md

Also upgrade the VS Test SDK (Microsoft.NET.Test.Sdk) and update the corresponding configuration.
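
For reference, here is a minimal sketch of what the coverlet.collector setup looks like. The package names and the collector usage come from the coverlet docs linked above; the versions and project snippet below are illustrative, not the exact ML.NET configuration:

```xml
<!-- Illustrative test-project snippet (versions are examples only):
     coverlet.collector plugs into VSTest as a data collector. -->
<ItemGroup>
  <PackageReference Include="Microsoft.NET.Test.Sdk" Version="16.6.1" />
  <PackageReference Include="coverlet.collector" Version="1.2.0" />
</ItemGroup>
```

Coverage is then gathered with `dotnet test --collect:"XPlat Code Coverage"` (or the equivalent data-collector settings in the CI test task), instead of the `/p:CollectCoverage=true` MSBuild properties used by coverlet.msbuild.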

codecov bot commented May 27, 2020

Codecov Report

Merging #5169 into master will decrease coverage by 2.58%.
The diff coverage is n/a.

@@            Coverage Diff             @@
##           master    #5169      +/-   ##
==========================================
- Coverage   75.79%   73.20%   -2.59%     
==========================================
  Files         993     1007      +14     
  Lines      181067   187713    +6646     
  Branches    19494    20237     +743     
==========================================
+ Hits       137240   137418     +178     
- Misses      38532    44780    +6248     
- Partials     5295     5515     +220     
Flag         Coverage              Δ
#Debug       73.20% <ø>   (-2.59%) ⬇️
#production  68.89% <ø>   (-2.81%) ⬇️
#test        87.48% <ø>   (-1.44%) ⬇️
Impacted Files Coverage Δ
...ML.Tests/Scenarios/IrisPlantClassificationTests.cs 0.00% <0.00%> (-100.00%) ⬇️
test/Microsoft.ML.AutoML.Tests/Util.cs 50.00% <0.00%> (-50.00%) ⬇️
...st/Microsoft.ML.Functional.Tests/Datasets/Adult.cs 58.82% <0.00%> (-41.18%) ⬇️
...rc/Microsoft.ML.AutoML/API/RunDetails/RunDetail.cs 77.27% <0.00%> (-18.19%) ⬇️
...enerator/CodeGenerator/CSharp/PipelineExtension.cs 67.34% <0.00%> (-16.33%) ⬇️
...osoft.ML.Functional.Tests/SchemaDefinitionTests.cs 83.07% <0.00%> (-15.39%) ⬇️
...est/Microsoft.ML.Tests/Scenarios/GetColumnTests.cs 85.13% <0.00%> (-14.87%) ⬇️
...odeGenerator/CodeGenerator/CSharp/CSharpProject.cs 0.00% <0.00%> (-14.29%) ⬇️
...c/Microsoft.ML.LightGbm/WrappedLightGbmTraining.cs 43.42% <0.00%> (-11.85%) ⬇️
...rosoft.ML.Data/Scorers/QuantileRegressionScorer.cs 88.88% <0.00%> (-11.12%) ⬇️
... and 191 more

@frank-dong-ms-zz changed the title from "test new code coverage" to "new code coverage" May 29, 2020
@frank-dong-ms-zz marked this pull request as ready for review May 29, 2020 03:24
@frank-dong-ms-zz requested a review from a team as a code owner May 29, 2020 03:24
@@ -1181,6 +1181,9 @@ public void OneHotHashEncodingOnnxConversionTest()
var onnxModelPath = GetOutputPath(onnxFileName);
SaveOnnxModel(onnxModel, onnxModelPath, null);

//Free up memory for x86 test
GC.Collect();
Contributor:

Is this necessary? And how does this affect code coverage?

Contributor Author:

I ran into an out-of-memory issue for x86 in my local environment after upgrading the "Microsoft.NET.Test.Sdk" package.
So I'm assuming the x86 tests are close to using up the full 2 GB of memory. We should optimize memory usage, e.g. free memory as soon as possible, but that seems out of scope for this PR.


Contributor:

Are we running code coverage on x86 builds? If not, we should remove this from this PR and address it separately.


Contributor Author:

No, we only run code coverage on x64 / .NET Core App 2.1, but we do run x86 CI builds. This happens not in the code coverage build but in the normal CI build, and only after upgrading Microsoft.NET.Test.Sdk.


Contributor:

breast-cancer.txt is a 15K file. OneHotHashEncoding is just doing a bunch of computation and not allocating memory. It is very suspicious that it would run out of memory during this test. Are you able to reproduce this consistently?


Contributor Author:

Yes, I have several repros locally; I haven't tried on CI. The memory issue may not be in this test: what I observe is that once out-of-memory errors start appearing locally, they show up in several different tests, but they always start with this specific test. That can be caused by earlier code/tests not releasing memory in time.


Contributor:

Then it is not appropriate to add a GC.Collect here. Ideally we shouldn't be adding a GC.Collect anywhere and instead rely on predictable disposal of allocated memory. But if we have to add it, we should add it where the allocation is happening instead of somewhere else.


Contributor Author:

Let me try on CI without this GC.Collect. If CI passes, we can address the memory issue in a separate PR; otherwise I would like to optimize memory usage first, or disable some tests on x86.


Contributor:

You can try adding memory statistics to BaseTestClass.Cleanup so that you can get a log of the memory usage at the end of each test to try and find the offending test.
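
A minimal sketch of what such logging could look like (hypothetical; this is not the actual ML.NET BaseTestClass, and its Cleanup hook and output mechanism may differ):

```csharp
using System;
using Xunit.Abstractions;

// Hypothetical sketch of per-test memory logging, not the actual ML.NET BaseTestClass:
// log managed-heap size at the end of each test so the offending test shows up in the logs.
public abstract class MemoryLoggingTestBase : IDisposable
{
    private readonly ITestOutputHelper _output;

    protected MemoryLoggingTestBase(ITestOutputHelper output) => _output = output;

    // xUnit calls Dispose after each test; a BaseTestClass.Cleanup override would go here instead.
    public void Dispose()
    {
        long beforeGC = GC.GetTotalMemory(forceFullCollection: false);
        long afterGC = GC.GetTotalMemory(forceFullCollection: true);
        _output.WriteLine($"Managed heap: {beforeGC / (1024 * 1024)} MB before GC, {afterGC / (1024 * 1024)} MB after GC");
    }
}
```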


Contributor Author:

That is a great idea. I will do that in a separate PR, since the x86 tests run fine on CI per my testing.



<!-- Delete testhost.dll and testhost.exe from output folder,
start test from dotnet.exe to keep consistent behavior with older version of Microsoft.NET.Test.Sdk -->
<Target Name="DeleteTestHost">
<Message Importance="High" Text="Delete testhost.dll and testhost.exe from output folder..." />
Contributor:

question: Why is this necessary?

Contributor Author:

This is also related to upgrading the "Microsoft.NET.Test.Sdk" package. Before the upgrade, testhost.dll and testhost.exe were not copied to the output path and all tests started from dotnet.exe. After the upgrade, tests start from testhost, which causes 2 issues:

  1. RemoteExecutor will not be able to start a new process, because the line below resolves testhost instead of dotnet: https://github.com/dotnet/machinelearning/blob/master/test/Microsoft.ML.TestFramework/RemoteExecutor.cs#L29
  2. For x86 tests, we would have to use testhost.x86 instead of testhost directly; that would require setting /p:Configuration instead of buildArch (which ML.NET currently uses), but setting Configuration would change the build behavior significantly and break the ML.NET build system.

For these 2 reasons, I'm adding this target to remove testhost after the build, keeping behavior consistent with the pre-upgrade test SDK (see the sketch below).
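
For illustration, a sketch of what such a post-build target could look like (the hook point, file list, and property names are assumptions, not necessarily the exact target in this PR):

```xml
<!-- Illustrative sketch: remove the copied test host after build so tests keep
     launching via dotnet.exe, as they did with the older Microsoft.NET.Test.Sdk. -->
<Target Name="DeleteTestHost" AfterTargets="Build">
  <Message Importance="High" Text="Delete testhost.dll and testhost.exe from output folder..." />
  <Delete Files="$(OutputPath)testhost.dll;$(OutputPath)testhost.exe" />
</Target>
```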



@harishsk (Contributor) left a comment:

:shipit:

@frank-dong-ms-zz merged commit c8cace2 into dotnet:master May 30, 2020
antoniovs1029 added 4 commits to antoniovs1029/machinelearning that referenced this pull request Jun 3, 2020
@ghost locked as resolved and limited conversation to collaborators Mar 18, 2022