
Microsoft CodeAnalysis OOM Exception #24055

Closed
mavasani opened this issue Jan 4, 2018 · 19 comments
Assignees
Labels
Area-IDE Bug Resolution-Fixed The bug has been fixed and/or the requested behavior has been implemented Tenet-Performance Regression in measured performance of the product from goals.
Milestone

Comments

@mavasani
Member

mavasani commented Jan 4, 2018

Ported from dotnet/roslyn-analyzers#1503


@chillryan

I've reported this issue through the Visual Studio feedback tool (see link) but figured I'd try my chances posting here.

I'm using the Code Analysis 2017 extension for VS 2017 15.5. Looking at the stack trace, I can only assume that since the extension is running inside the devenv process, it hits the OOM while it's analyzing a loaded solution.

Since I've hit this situation on a very frequent basis, I've disabled solution analysis for C# projects hoping that will help. I'll report back here if there are any changes, or comment on the Visual Studio feedback link posted prior.

@mavasani mavasani added Area-IDE Bug Tenet-Performance Regression in measured performance of the product from goals. labels Jan 4, 2018
@mavasani
Member Author

mavasani commented Jan 4, 2018

@sharwell @heejaechang - Do we have a tracking issue for our memory usage drastically increasing when switching git branches? The feedback link mentions switching git branches as the major culprit. I have seen it very often while working on Roslyn as well.

@heejaechang
Contributor

I think it's probably the same tagger issue: switching git branches causes a lot of documents to change, which causes a lot of files to be re-analyzed, raising a lot of diagnostic-changed events, all pushed into the UI queue until it OOMs.
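The failure mode described above, where every re-analysis pass enqueues its own notification with nothing bounding the queue, can be sketched conceptually. This is illustrative Python, not Roslyn's actual ForegroundNotificationService; the class and field names are invented for the example:

```python
from collections import OrderedDict

class NaiveNotificationQueue:
    """One entry per event: a branch switch that touches N files,
    each re-analyzed M times, enqueues N * M items."""
    def __init__(self):
        self.pending = []

    def enqueue(self, doc_id, diagnostics):
        self.pending.append((doc_id, diagnostics))

class CoalescingNotificationQueue:
    """One entry per document: a newer event for the same document
    replaces the stale one, so the queue is bounded by document count."""
    def __init__(self):
        self.pending = OrderedDict()

    def enqueue(self, doc_id, diagnostics):
        self.pending.pop(doc_id, None)      # drop the stale notification
        self.pending[doc_id] = diagnostics  # keep only the latest

# A branch switch re-analyzes 1,000 documents 50 times each.
naive, coalescing = NaiveNotificationQueue(), CoalescingNotificationQueue()
for round_ in range(50):
    for doc in range(1000):
        naive.enqueue(doc, f"diags-{round_}")
        coalescing.enqueue(doc, f"diags-{round_}")

print(len(naive.pending))       # 50000 -- grows without bound
print(len(coalescing.pending))  # 1000  -- bounded by document count
```

The naive queue's size is proportional to the total number of events ever raised, which is what lets a branch switch balloon into millions of pending items.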

@jinujoseph jinujoseph added this to the 15.6 milestone Jan 8, 2018
@sharwell
Member

sharwell commented Jan 8, 2018

@heejaechang I reviewed the heap dump and can confirm that this is the PendingWork bug.

@sharwell sharwell added the 4 - In Review A fix for the issue is submitted for review. label Jan 8, 2018
@sharwell
Member

sharwell commented Jan 8, 2018

@chillryan I just reviewed your feedback history. Several items have comments linking back to an internal bug we refer to as the "PendingWork" bug. When this is fixed, you should see many of your feedback items close afterwards and the improvement should be substantial.

@chillryan

chillryan commented Jan 9, 2018

@sharwell thanks for update. Is there a feedback link for PendingWork bug?

@sharwell
Member

sharwell commented Jan 9, 2018

@chillryan Not specifically; if there were, most of them would have been filed by you. 😆 There were multiple proposed solutions, which are in the final evaluation stage. I'll link whichever one we settle on back to this issue.

@udlose

udlose commented Jan 10, 2018

@sharwell - Can you confirm if the attached file is related to the same issue? I'd like to make sure before creating a bug report for VS. I can provide a full memory dump (uploaded privately) if needed but it sounds like your team is aware of the cause. TIA

devenv.exe__PID__62124__Date__01_10_2018__Time_10_51_38AM__41__Manual Dump_MultipleRules.zip

@sharwell
Member

@udlose I would need a full heap dump to confirm. Better to just submit a new feedback ticket. Feel free to mention that it should be directed to me for initial investigation. 😄

@udlose
Copy link

udlose commented Jan 10, 2018

@sharwell - new defect is at VS Crash with OutOfMemoryException. There are 11,299,087 instances of Microsoft.CodeAnalysis.Editor.Implementation.ForegroundNotification.ForegroundNotificationService.PendingWork taking 452MB of heap memory - but I also see 2.5GB of native memory taken so possibly a native leak caused by these PendingWork instances? Just a guess. I didn't go as far as looking for roots, etc.

0:000> !address -summary
--- Usage Summary ---------------- RgnCount ----------- Total Size -------- %ofBusy %ofTotal
&lt;unknown&gt; 8924 9ec9e000 ( 2.481 GB) 67.63% 62.03%
Image 6349 2cecd000 ( 718.801 MB) 19.13% 17.55%
Heap 1425 16d2c000 ( 365.172 MB) 9.72% 8.92%
Free 1655 15309000 ( 339.035 MB) 8.28%
Stack 429 8220000 ( 130.125 MB) 3.46% 3.18%
TEB 141 141000 ( 1.254 MB) 0.03% 0.03%
Other 25 ec000 ( 944.000 kB) 0.02% 0.02%
PEB 1 3000 ( 12.000 kB) 0.00% 0.00%

--- Type Summary (for busy) ------ RgnCount ----------- Total Size -------- %ofBusy %ofTotal
MEM_PRIVATE 6073 a0281000 ( 2.502 GB) 68.21% 62.56%
MEM_IMAGE 10999 3fbf1000 (1019.941 MB) 27.15% 24.90%
MEM_MAPPED 222 ae75000 ( 174.457 MB) 4.64% 4.26%

--- State Summary ---------------- RgnCount ----------- Total Size -------- %ofBusy %ofTotal
MEM_COMMIT 13675 d370d000 ( 3.304 GB) 90.05% 82.60%
MEM_RESERVE 3619 175da000 ( 373.852 MB) 9.95% 9.13%
MEM_FREE 1655 15309000 ( 339.035 MB) 8.28%

--- Protect Summary (for commit) - RgnCount ----------- Total Size -------- %ofBusy %ofTotal
PAGE_READWRITE 5658 8ae70000 ( 2.170 GB) 59.16% 54.26%
PAGE_READONLY 3576 2270e000 ( 551.055 MB) 14.67% 13.45%
PAGE_EXECUTE_READ 505 1f57f000 ( 501.496 MB) 13.35% 12.24%
PAGE_EXECUTE_READWRITE 1170 2f2d000 ( 47.176 MB) 1.26% 1.15%
PAGE_WRITECOPY 1745 2e32000 ( 46.195 MB) 1.23% 1.13%
PAGE_EXECUTE_WRITECOPY 480 79f000 ( 7.621 MB) 0.20% 0.19%
PAGE_READWRITE|PAGE_GUARD 282 60f000 ( 6.059 MB) 0.16% 0.15%
PAGE_NOACCESS 257 101000 ( 1.004 MB) 0.03% 0.02%
PAGE_EXECUTE 2 2000 ( 8.000 kB) 0.00% 0.00%

--- Largest Region by Usage ----------- Base Address -------- Region Size ----------
&lt;unknown&gt; 3ea0000 1ffc000 ( 31.984 MB)
Image 5e12a000 1983000 ( 25.512 MB)
Heap f4b41000 fae000 ( 15.680 MB)
Free fcf40000 2e20000 ( 46.125 MB)
Stack 3da0000 f6000 ( 984.000 kB)
TEB c14000 3000 ( 12.000 kB)
Other f00000 86000 ( 536.000 kB)
PEB d25000 3000 ( 12.000 kB)
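As a sanity check on those numbers, 452 MB across 11,299,087 instances works out to roughly 40 bytes per PendingWork instance, which is plausible for a small object (header plus a few fields) in a 32-bit process; the count, not the per-object size, is the problem:

```python
# Back-of-envelope check on the heap figures reported above.
instances = 11_299_087
heap_bytes = 452 * 1024 * 1024   # 452 MB attributed to PendingWork

per_instance = heap_bytes / instances
print(round(per_instance, 1))    # ~41.9 bytes per instance
```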

@sharwell
Member

There are 11,299,087 instances of Microsoft.CodeAnalysis.Editor.Implementation.ForegroundNotification.ForegroundNotificationService.PendingWork

That's definitely the same bug. This situation results in a large amount of unnecessary computation, with ramifications vastly exceeding the size of the list itself.

@udlose

udlose commented Jan 10, 2018

Is there native code used by those PendingWork classes? It's curious to see the native heap at 2.5GB...

@jinujoseph
Contributor

fixed via #23448

@udlose

udlose commented Jan 16, 2018

@jinujoseph what release number of VS will this be included in?

@sharwell sharwell added Resolution-Fixed The bug has been fixed and/or the requested behavior has been implemented and removed 4 - In Review A fix for the issue is submitted for review. labels Jan 16, 2018
@jinujoseph
Contributor

jinujoseph commented Jan 16, 2018

15.7 onward

@Panoone

Panoone commented Jan 30, 2018

I only started experiencing this problem today - with all projects - despite having no problem with VS Pro 2017 for months. I tried upgrading to 15.5.6 with no success.

Application: devenv.exe
Framework Version: v4.0.30319
Description: The application requested process termination through System.Environment.FailFast(string message).
Message: System.TypeLoadException: Could not load type 'System.Security.Cryptography.IncrementalHash' from assembly 'System.Security.Cryptography.Algorithms, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a'.
at Microsoft.CodeAnalysis.Checksum.Create(Stream stream)
at Microsoft.CodeAnalysis.Execution.AbstractReferenceSerializationService.CreateChecksum(AnalyzerReference reference, CancellationToken cancellationToken)
at Microsoft.CodeAnalysis.Serialization.Serializer.CreateChecksum(Object value, CancellationToken cancellationToken)
at Microsoft.VisualStudio.LanguageServices.Remote.RemoteHostClientServiceFactory.RemoteHostClientService.AddGlobalAssets(CancellationToken cancellationToken)
at Microsoft.VisualStudio.LanguageServices.Remote.RemoteHostClientServiceFactory.RemoteHostClientService.d__18.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.CodeAnalysis.Remote.RemoteHostClientExtensions.d__20.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at Microsoft.CodeAnalysis.SolutionCrawler.SolutionCrawlerRegistrationService.WorkCoordinator.IncrementalAnalyzerProcessor.NormalPriorityProcessor.<ResetStatesAsync>d__37.MoveNext()
Stack:
at System.Environment.FailFast(System.String, System.Exception)
at Microsoft.CodeAnalysis.FailFast.OnFatalException(System.Exception)
at Microsoft.CodeAnalysis.ErrorReporting.FatalError.Report(System.Exception, System.Action`1<System.Exception>)
at Microsoft.CodeAnalysis.ErrorReporting.FatalError.ReportUnlessCanceled(System.Exception)
at Microsoft.CodeAnalysis.SolutionCrawler.SolutionCrawlerRegistrationService+WorkCoordinator+IncrementalAnalyzerProcessor+NormalPriorityProcessor+<ResetStatesAsync>d__37.MoveNext()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(System.Threading.Tasks.Task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(System.Threading.Tasks.Task)
at Microsoft.CodeAnalysis.SolutionCrawler.SolutionCrawlerRegistrationService+WorkCoordinator+IncrementalAnalyzerProcessor+NormalPriorityProcessor+<ResetStatesAsync>d__37.MoveNext()
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.CodeAnalysis.SolutionCrawler.SolutionCrawlerRegistrationService+WorkCoordinator+IncrementalAnalyzerProcessor+NormalPriorityProcessor+<ResetStatesAsync>d__37, Microsoft.CodeAnalysis.Features, Version=2.6.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]](<ResetStatesAsync>d__37 ByRef)
at Microsoft.CodeAnalysis.SolutionCrawler.SolutionCrawlerRegistrationService+WorkCoordinator+IncrementalAnalyzerProcessor+NormalPriorityProcessor.ResetStatesAsync()
at Microsoft.CodeAnalysis.SolutionCrawler.SolutionCrawlerRegistrationService+WorkCoordinator+IncrementalAnalyzerProcessor+NormalPriorityProcessor+<ExecuteAsync>d__17.MoveNext()
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder.Start[[Microsoft.CodeAnalysis.SolutionCrawler.SolutionCrawlerRegistrationService+WorkCoordinator+IncrementalAnalyzerProcessor+NormalPriorityProcessor+<ExecuteAsync>d__17, Microsoft.CodeAnalysis.Features, Version=2.6.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]](<ExecuteAsync>d__17 ByRef)
at Microsoft.CodeAnalysis.SolutionCrawler.SolutionCrawlerRegistrationService+WorkCoordinator+IncrementalAnalyzerProcessor+NormalPriorityProcessor.ExecuteAsync()
at Microsoft.CodeAnalysis.SolutionCrawler.IdleProcessor+<ProcessAsync>d__12.MoveNext()
at System.Runtime.CompilerServices.AsyncMethodBuilderCore+MoveNextRunner.InvokeMoveNext(System.Object)
at System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Runtime.CompilerServices.AsyncMethodBuilderCore+MoveNextRunner.Run()
at System.Threading.Tasks.AwaitTaskContinuation.RunOrScheduleAction(System.Action, Boolean, System.Threading.Tasks.Task ByRef)
at System.Threading.Tasks.Task.FinishContinuations()
at System.Threading.Tasks.Task.FinishStageThree()
at System.Threading.Tasks.Task`1[[System.Threading.Tasks.VoidTaskResult, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]].TrySetResult(System.Threading.Tasks.VoidTaskResult)
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1[[System.Threading.Tasks.VoidTaskResult, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]].SetResult(System.Threading.Tasks.VoidTaskResult)
at System.Runtime.CompilerServices.AsyncTaskMethodBuilder`1[[System.Threading.Tasks.VoidTaskResult, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]].SetResult(System.Threading.Tasks.Task`1<System.Threading.Tasks.VoidTaskResult>)
at Microsoft.CodeAnalysis.SolutionCrawler.IdleProcessor+<WaitForIdleAsync>d__11.MoveNext()
at System.Runtime.CompilerServices.AsyncMethodBuilderCore+MoveNextRunner.InvokeMoveNext(System.Object)
at System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Runtime.CompilerServices.AsyncMethodBuilderCore+MoveNextRunner.Run()
at System.Threading.Tasks.AwaitTaskContinuation.RunOrScheduleAction(System.Action, Boolean, System.Threading.Tasks.Task ByRef)
at System.Threading.Tasks.Task.FinishContinuations()
at System.Threading.Tasks.Task.FinishStageThree()
at System.Threading.Tasks.Task`1[[System.Threading.Tasks.VoidTaskResult, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]].TrySetResult(System.Threading.Tasks.VoidTaskResult)
at System.Threading.Tasks.Task+DelayPromise.Complete()
at System.Threading.Tasks.Task+<>c.b__274_1(System.Object)
at System.Threading.TimerQueueTimer.CallCallbackInContext(System.Object)
at System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.TimerQueueTimer.CallCallback()
at System.Threading.TimerQueueTimer.Fire()
at System.Threading.TimerQueue.FireNextTimers()
at System.Threading.TimerQueue.AppDomainTimerCallback()

Followed by:

Faulting application name: devenv.exe, version: 15.0.27130.2010, time stamp: 0x5a31e4ea
Faulting module name: unknown, version: 0.0.0.0, time stamp: 0x00000000
Exception code: 0x80131623
Fault offset: 0x2d6b5fba
Faulting process ID: 0x3668
Faulting application start time: 0x01d399754ad12c19
Faulting application path: C:\Program Files (x86)\Microsoft Visual Studio\2017\Professional\Common7\IDE\devenv.exe
Faulting module path: unknown
Report ID: c9e71bbf-bb95-4e6d-b46a-c1e27109804e
Faulting package full name:
Faulting package-relative application ID:

@CyrusNajmabadi
Member

@Panoone That appears to be a different issue. It's not an OOM, but rather a type-load failure.

@udlose

udlose commented Jan 30, 2018

@Panoone I had a fellow dev hit that same problem. The fix was to install .NET 4.7.1. For some reason it's looking for those assemblies; once it was installed, the problem was solved.
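If you want to confirm which .NET Framework 4.x version is installed before upgrading, Windows records it as a `Release` DWORD under `HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full`. The helper below is a small illustrative sketch of the documented Release-value mapping (double-check the thresholds against Microsoft's "determine which .NET Framework versions are installed" docs for your OS):

```python
# Minimum "Release" registry values for .NET Framework 4.x versions, read from
# HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full on a real machine.
RELEASE_THRESHOLDS = [
    (528040, "4.8"),
    (461808, "4.7.2"),
    (461308, "4.7.1"),
    (460798, "4.7"),
    (394802, "4.6.2"),
    (394254, "4.6.1"),
    (393295, "4.6"),
    (379893, "4.5.2"),
    (378675, "4.5.1"),
    (378389, "4.5"),
]

def framework_version(release: int) -> str:
    """Map a Release registry value to the installed 4.x version."""
    for threshold, version in RELEASE_THRESHOLDS:
        if release >= threshold:
            return version
    return "pre-4.5"

print(framework_version(461310))  # "4.7.1" -- new enough for this fix
print(framework_version(460805))  # "4.7"   -- too old; IncrementalHash load fails
```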

@Panoone

Panoone commented Jan 30, 2018

@udlose Thanks, Dave. I'll give it a go. There was a significant Windows 10 update last night and I'm assuming something in that was the culprit.

@Panoone

Panoone commented Jan 30, 2018

@udlose Yep, upgrading .NET Framework resolved the issue. Thanks again.
