
System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests.RentBufferFiresRentedDiagnosticEvent failure #42899

Closed
janvorli opened this issue Sep 30, 2020 · 21 comments · Fixed by #45469
Labels: area-Diagnostics-coreclr, disabled-test (The test is disabled in source code against the issue)

@janvorli (Member) commented Sep 30, 2020

The test failed in CI on my PR (with a change unrelated to this issue):

    System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests.RentBufferFiresRentedDiagnosticEvent [FAIL]
      Half-way through waiting for remote process.
      Timed out at 9/30/2020 1:52:32 AM after 60000ms waiting for remote process.
      	Process ID: 23479
      	Handle: 904
      	Name: dotnet
      	MainModule: /home/helixbot/work/A64809A4/p/dotnet
      	StartTime: 9/30/2020 1:51:32 AM
      	TotalProcessorTime: 00:00:00.0100000
      
      Stack Trace:
        /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs(224,0): at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose(Boolean disposing)
        /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs(57,0): at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose()
        /_/src/libraries/System.Buffers/tests/ArrayPool/ArrayPoolTest.cs(44,0): at System.Buffers.ArrayPool.Tests.ArrayPoolTest.RemoteInvokeWithTrimming(Action action, Boolean trim)
        /_/src/libraries/System.Buffers/tests/ArrayPool/UnitTests.cs(404,0): at System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests.RentBufferFiresRentedDiagnosticEvent()

https://dev.azure.com/dnceng/public/_build/results?buildId=836124&view=ms.vss-test-web.build-test-results-tab&runId=26640246&resultId=172262&paneView=history

Runfo Tracking Issue: system.buffers.arraypool.tests.arraypoolunittests.rentbufferfiresrenteddiagnosticevent

Build Result Summary:

| Day Hit Count | Week Hit Count | Month Hit Count |
| --- | --- | --- |
| 0 | 0 | 0 |
@ghost commented Sep 30, 2020

Tagging subscribers to this area: @tannergooding, @pgovind, @jeffhandley
See info in area-owners.md if you want to be subscribed.

@Dotnet-GitSync-Bot added the "untriaged" (New issue has not been triaged by the area owner) label on Sep 30, 2020
@ghost commented Sep 30, 2020

Tagging subscribers to this area: @tommcdon
See info in area-owners.md if you want to be subscribed.

@jkotas (Member) commented Sep 30, 2020

Given that this is a diagnostic event test and that the process spent just 10ms of processor time, this is most likely an event pipe race condition that results in a hang during startup.

@tommcdon added this to the 6.0.0 milestone on Oct 1, 2020
@tommcdon removed the "untriaged" (New issue has not been triaged by the area owner) label on Oct 1, 2020
@tommcdon added this to Needs Triage in .NET Core Diagnostics via automation on Oct 1, 2020
@josalem (Contributor) commented Oct 6, 2020

I looked at the test:

```csharp
public static void RentBufferFiresRentedDiagnosticEvent()
{
    RemoteInvokeWithTrimming(() =>
    {
        ArrayPool<byte> pool = ArrayPool<byte>.Create(maxArrayLength: 16, maxArraysPerBucket: 1);
        byte[] buffer = pool.Rent(16);
        pool.Return(buffer);

        Assert.Equal(1, RunWithListener(() => pool.Rent(16), EventLevel.Verbose, e =>
        {
            Assert.Equal(1, e.EventId);
            Assert.Equal(buffer.GetHashCode(), e.Payload[0]);
            Assert.Equal(buffer.Length, e.Payload[1]);
            Assert.Equal(pool.GetHashCode(), e.Payload[2]);
        }));
    });
}
```

I don't think this would go through EventPipe. If I recall correctly, because this is a managed EventSource (System.Buffers.ArrayPoolEventSource), and it's being listened to by an EventListener, events should be written directly to the listener.
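
For reference, a minimal sketch of that in-process path (illustrative code only, not the repo's TestEventListener; the event source name is taken from the discussion above): a plain EventListener enabled on the managed ArrayPool event source receives the rent events directly via OnEventWritten, with no EventPipe session involved.

```csharp
using System;
using System.Buffers;
using System.Diagnostics.Tracing;

sealed class ArrayPoolListener : EventListener
{
    protected override void OnEventSourceCreated(EventSource source)
    {
        // Event source name as referenced in the comments above.
        if (source.Name == "System.Buffers.ArrayPoolEventSource")
            EnableEvents(source, EventLevel.Verbose);
    }

    protected override void OnEventWritten(EventWrittenEventArgs e)
    {
        // EventId 1 is the BufferRented event the test asserts on.
        Console.WriteLine($"EventId={e.EventId}, payload count={e.Payload?.Count}");
    }
}

static class Demo
{
    static void Main()
    {
        using var listener = new ArrayPoolListener();
        ArrayPool<byte> pool = ArrayPool<byte>.Create(maxArrayLength: 16, maxArraysPerBucket: 1);
        byte[] buffer = pool.Rent(16);   // first rent forces the event source to be created
        pool.Return(buffer);
        pool.Rent(16);                   // this rent should be reported to the listener
    }
}
```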

Looking at the history, this appears to be the first failure of this test in recent history. There isn't a dump of the remote executor process in AzDO/Helix, so we'll have to see if this happens again to get more details.

@jkotas (Member) commented Oct 6, 2020

According to the log, the process spent 10ms of processor time. 10ms of processor time is not enough to start executing managed code. I think this got stuck in the unmanaged runtime startup, most likely in the unmanaged eventpipe or tracing initialization - since it happened while executing tests that stress these components.

@josalem (Contributor) commented Oct 6, 2020

EventPipe should be dormant for this test. Since the event producer is a managed EventSource and the consumer is an in-process EventListener, EventPipe shouldn't be doing anything out of the ordinary, especially before managed code runs. All the eventing in this test happens in managed code. I agree that something must have happened before managed code executed, though; I'm just not sure it was EventPipe.

Have we seen any similar failures where a RemoteExecutor based test fails on timeout with the remote process only executing for a few milliseconds?

@v-haren commented Nov 3, 2020

failed again in job: runtime 20201102.59

failed test: System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests.RentBufferFiresRentedDiagnosticEvent

net6.0-OSX-Release-x64-CoreCLR_checked-OSX.1013.Amd64.Open

Error message

Half-way through waiting for remote process.
Timed out at 11/2/2020 2:52:28 PM after 60000ms waiting for remote process.
Process ID: 25505
Handle: 1020
Name: dotnet
MainModule: /private/tmp/helix/working/B0640910/p/dotnet
StartTime: 11/2/2020 2:51:28 PM
TotalProcessorTime: 00:00:00.1841435


Stack trace
   at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose(Boolean disposing) in /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs:line 224
   at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose() in /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs:line 57
   at System.Buffers.ArrayPool.Tests.ArrayPoolTest.RemoteInvokeWithTrimming(Action action, Boolean trim) in /_/src/libraries/System.Buffers/tests/ArrayPool/ArrayPoolTest.cs:line 44
   at System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests.RentBufferFiresRentedDiagnosticEvent() in /_/src/libraries/System.Buffers/tests/ArrayPool/UnitTests.cs:line 404

@ViktorHofer (Member) commented:
This has happened again recently (e.g. in https://dev.azure.com/dnceng/public/_build/results?buildId=871598&view=ms.vss-test-web.build-test-results-tab&runId=27884432&resultId=158379&paneView=debug). I've updated the issue to be live-tracked via runfo.

@stephentoub (Member) commented:
We also saw #44037 failing very frequently recently; I disabled the test as it was taking out too many PRs, but it had a similar symptom.

@v-haren commented Nov 23, 2020

failed again in job: runtime 20201122.5

failed test: System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests.RentBufferFiresRentedDiagnosticEvent

net6.0-OSX-Release-x64-CoreCLR_checked-OSX.1013.Amd64.Open

Error message

Half-way through waiting for remote process.
Timed out at 11/22/2020 8:03:08 AM after 60000ms waiting for remote process.
Process ID: 35065
Handle: 1048
Name: dotnet
MainModule: /private/tmp/helix/working/A1F70904/p/dotnet
StartTime: 11/22/2020 8:02:08 AM
TotalProcessorTime: 00:00:00.2301308



Stack trace
   at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose(Boolean disposing) in /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs:line 224
   at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose() in /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs:line 57
   at System.Buffers.ArrayPool.Tests.ArrayPoolTest.RemoteInvokeWithTrimming(Action action, Boolean trim) in /_/src/libraries/System.Buffers/tests/ArrayPool/ArrayPoolTest.cs:line 44
   at System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests.RentBufferFiresRentedDiagnosticEvent() in /_/src/libraries/System.Buffers/tests/ArrayPool/UnitTests.cs:line 404

@safern added the "blocking-clean-ci" (Blocking PR or rolling runs of 'runtime' or 'runtime-extra-platforms') label on Dec 2, 2020
@safern (Member) commented Dec 2, 2020

Happened again on: #45430
https://dev.azure.com/dnceng/public/_build/results?buildId=904700&view=ms.vss-test-web.build-test-results-tab&runId=28875012&resultId=145658&paneView=debug

System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests.ReturnBufferFiresDiagnosticEvent [FAIL]
      Half-way through waiting for remote process.
      Memory load: 18
      
      Image Name                     PID Services                                    
      ========================= ======== ============================================
      svchost.exe                    696 BrokerInfrastructure, DcomLaunch, LSM,      
                                         PlugPlay, Power, SystemEventsBroker         
      svchost.exe                    752 RpcEptMapper, RpcSs                         
      svchost.exe                    896 TermService                                 
      svchost.exe                    964 NcbService, PcaSvc, TrkWks, UALSVC,         
                                         UmRdpService, vmickvpexchange,              
                                         vmicshutdown, WdiSystemHost                 
      svchost.exe                    988 Dhcp, EventLog, lmhosts, TimeBrokerSvc,     
                                         vmictimesync                                
      svchost.exe                    424 vmicheartbeat                               
      svchost.exe                    864 BFE, CoreMessagingRegistrar, DPS, MpsSvc,   
                                         pla                                         
      svchost.exe                   1040 Appinfo, CertPropSvc, DsmSvc, gpsvc,        
                                         iphlpsvc, lfsvc, ProfSvc, sacsvr, Schedule, 
                                         SENS, SessionEnv, ShellHWDetection, Themes, 
                                         UserManager, Winmgmt, WpnService, wuauserv  
      svchost.exe                   1048 CDPSvc, EventSystem, FontCache, netprofm,   
                                         nsi, W32Time, WinHttpAutoProxySvc           
      svchost.exe                   1252 CryptSvc, Dnscache, LanmanWorkstation,      
                                         NlaSvc, WinRM                               
      svchost.exe                   1260 Wcmsvc                                      
      svchost.exe                   1832 AppHostSvc                                  
      svchost.exe                   1844 DiagTrack                                   
      svchost.exe                   1936 LanmanServer                                
      svchost.exe                   1944 W3SVC, WAS                                  
      svchost.exe                   1952 StateRepository, tiledatamodelsvc           
      svchost.exe                   1872 CDPUserSvc_48585, OneSyncSvc_48585          
      Timed out at 12/1/2020 9:01:15 PM after 60000ms waiting for remote process.
      Wrote mini dump to: C:\h\w\B3270977\w\A28208FA\uploads\2476.gh1qv4jx.1tm.dmp
      	Process ID: 2476
      	Handle: 1092
      	Name: dotnet
      	MainModule: C:\h\w\B3270977\p\dotnet.exe
      	StartTime: 12/1/2020 9:00:13 PM
      	TotalProcessorTime: 00:00:01.0781250
      	Threads:
      		Thread #1 (OS 0x2C4)   [MTA]
      			[HelperMethodFrame_1OBJ]
      			System.Diagnostics.Tracing.EventSource.SendCommand(System.Diagnostics.Tracing.EventListener, System.Diagnostics.Tracing.EventProviderType, Int32, Int32, System.Diagnostics.Tracing.EventCommand, Boolean, System.Diagnostics.Tracing.EventLevel, System.Diagnostics.Tracing.EventKeywords, System.Collections.Generic.IDictionary`2<System.String,System.String>)
      			System.Diagnostics.Tracing.EventListener.EnableEvents(System.Diagnostics.Tracing.EventSource, System.Diagnostics.Tracing.EventLevel, System.Diagnostics.Tracing.EventKeywords, System.Collections.Generic.IDictionary`2<System.String,System.String>)
      			System.Diagnostics.Tracing.TestEventListener.EnableEventSource(System.Diagnostics.Tracing.EventSource, System.Diagnostics.Tracing.EventLevel, System.Diagnostics.Tracing.EventKeywords)
      			System.Diagnostics.Tracing.TestEventListener.AddSource(System.String, System.Nullable`1<System.Guid>, System.Diagnostics.Tracing.EventLevel, System.Diagnostics.Tracing.EventKeywords)
      			System.Diagnostics.Tracing.TestEventListener.AddSource(System.String, System.Diagnostics.Tracing.EventLevel, System.Diagnostics.Tracing.EventKeywords)
      			System.Diagnostics.Tracing.TestEventListener..ctor(System.String, System.Diagnostics.Tracing.EventLevel, System.Nullable`1<Double>)
      			System.Buffers.ArrayPool.Tests.ArrayPoolTest.RunWithListener(System.Action, System.Diagnostics.Tracing.EventLevel, System.Action`1<System.Diagnostics.Tracing.EventWrittenEventArgs>)
      			System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests+<>c.<ReturnBufferFiresDiagnosticEvent>b__35_0()
      			[DebuggerU2MCatchHandlerFrame]
      			[HelperMethodFrame_PROTECTOBJ]
      			System.Reflection.RuntimeMethodInfo.Invoke(System.Object, System.Reflection.BindingFlags, System.Reflection.Binder, System.Object[], System.Globalization.CultureInfo)
      			System.Reflection.MethodBase.Invoke(System.Object, System.Object[])
      			Microsoft.DotNet.RemoteExecutor.Program.Main(System.String[])
      		Thread #2 (OS 0x1740) [Finalizer] [Background] [MTA]
      			[DebuggerU2MCatchHandlerFrame]
      		Thread #3 (OS 0xEF4) [Thread pool worker] [Background] [MTA]
      		Thread #4 (OS 0x1340) [Thread pool worker] [Background] [MTA]
      			[HelperMethodFrame_1OBJ]
      			System.Threading.Monitor.Enter(System.Object, Boolean ByRef)
      			System.Diagnostics.Tracing.TestEventListener.OnEventSourceCreated(System.Diagnostics.Tracing.EventSource)
      			System.Diagnostics.Tracing.EventSource.AddListener(System.Diagnostics.Tracing.EventListener)
      			System.Diagnostics.Tracing.EventListener.AddEventSource(System.Diagnostics.Tracing.EventSource)
      			System.Diagnostics.Tracing.EventSource.Initialize(System.Guid, System.String, System.String[])
      			System.Threading.PortableThreadPoolEventSource..ctor()
      			System.Threading.PortableThreadPoolEventSource..cctor()
      			[DynamicHelperFrame]
      			System.Threading.PortableThreadPool+WorkerThread.WorkerThreadStart()
      			System.Threading.ThreadHelper.ThreadStart_Context(System.Object)
      			System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
      			System.Threading.ThreadHelper.ThreadStart()
      			[DebuggerU2MCatchHandlerFrame]
      		Thread #5 (OS 0xCA4) [Thread pool worker] [Background] [MTA]
      			[InlinedCallFrame]
      			Interop+Kernel32.GetSystemTimes(Int64 ByRef, Int64 ByRef, Int64 ByRef)
      			System.Threading.PortableThreadPool+CpuUtilizationReader.get_CurrentUtilization()
      			System.Threading.PortableThreadPool+GateThread.GateThreadStart()
      			System.Threading.ThreadHelper.ThreadStart_Context(System.Object)
      			System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
      			System.Threading.ThreadHelper.ThreadStart()
      			[DebuggerU2MCatchHandlerFrame]
      
      Stack Trace:
        /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs(224,0): at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose(Boolean disposing)
        /_/src/Microsoft.DotNet.RemoteExecutor/src/RemoteInvokeHandle.cs(57,0): at Microsoft.DotNet.RemoteExecutor.RemoteInvokeHandle.Dispose()
        /_/src/libraries/System.Buffers/tests/ArrayPool/ArrayPoolTest.cs(44,0): at System.Buffers.ArrayPool.Tests.ArrayPoolTest.RemoteInvokeWithTrimming(Action action, Boolean trim)
        /_/src/libraries/System.Buffers/tests/ArrayPool/UnitTests.cs(424,0): at System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests.ReturnBufferFiresDiagnosticEvent()

I'm going to disable the test.


@safern (Member) commented Dec 2, 2020

@jkotas do you think this could be because a checked coreclr is slow enough that the remote executor kills it before it finishes? If so, we could just add SkipOnCoreClr for the checked runtime and close this issue.

@stephentoub (Member) commented:
Just based on the remote process's stacks above, I think it's more likely this is a deadlock.

@safern (Member) commented Dec 2, 2020

OK, thanks for looking. I'll disable it with an active issue then.

@sywhang (Contributor) commented Dec 2, 2020

I can take a look at fixing this @safern. Please tag/assign me in the issue you open.

@safern (Member) commented Dec 2, 2020

> I can take a look at fixing this @safern. Please tag me in the issue you open.

Sounds good. I will just open a PR to disable the test while this issue is fixed, thanks for looking into it.

@jkotas (Member) commented Dec 2, 2020

It is a classic A-B B-A deadlock:

Main thread is waiting for this lock:

0a 0117dafc 72c37cda coreclr!AwareLock::Enter+0x167 [F:\workspace\_work\1\s\src\coreclr\src\vm\syncblk.cpp @ 2404] 
0b (Inline) -------- coreclr!Object::EnterObjMonitor+0x8 [F:\workspace\_work\1\s\src\coreclr\src\vm\object.h @ 277] 
0c 0117dbd4 72c3d3fd coreclr!JIT_MonEnter_Helper+0xed [F:\workspace\_work\1\s\src\coreclr\src\vm\jithelpers.cpp @ 3665] 
0d 0117dd10 725d660a coreclr!JIT_MonReliableEnter_Portable+0x3fd [F:\workspace\_work\1\s\src\coreclr\src\vm\jithelpers.cpp @ 3713] 
0e 0117dd40 725db014 System_Private_CoreLib!System.Diagnostics.Tracing.EventSource.SendCommand(System.Diagnostics.Tracing.EventListener, System.Diagnostics.Tracing.EventProviderType, Int32, Int32, System.Diagnostics.Tracing.EventCommand, Boolean, System.Diagnostics.Tracing.EventLevel, System.Diagnostics.Tracing.EventKeywords, System.Collections.Generic.IDictionary`2<System.String,System.String>)+0xa6 [/_/src/libraries/System.Private.CoreLib/src/System/Diagnostics/Tracing/EventSource.cs @ 2385] 
0f 0117dd78 08a41d7b System_Private_CoreLib!System.Diagnostics.Tracing.EventListener.EnableEvents(System.Diagnostics.Tracing.EventSource, System.Diagnostics.Tracing.EventLevel, System.Diagnostics.Tracing.EventKeywords, System.Collections.Generic.IDictionary`2<System.String,System.String>)+0x30 [/_/src/libraries/System.Private.CoreLib/src/System/Diagnostics/Tracing/EventSource.cs @ 3876] 
10 0117ddc4 08a41785 System_Buffers_Tests!System.Diagnostics.Tracing.TestEventListener.EnableEventSource(System.Diagnostics.Tracing.EventSource, System.Diagnostics.Tracing.EventLevel, System.Diagnostics.Tracing.EventKeywords)+0xcb
11 0117dec4 08a414b4 System_Buffers_Tests!System.Diagnostics.Tracing.TestEventListener.AddSource(System.String, System.Nullable`1<System.Guid>, System.Diagnostics.Tracing.EventLevel, System.Diagnostics.Tracing.EventKeywords)+0x2ad
12 0117df08 08a406b8 System_Buffers_Tests!System.Diagnostics.Tracing.TestEventListener.AddSource(System.String, System.Diagnostics.Tracing.EventLevel, System.Diagnostics.Tracing.EventKeywords)+0x64
13 0117df34 08a404ec System_Buffers_Tests!System.Diagnostics.Tracing.TestEventListener..ctor(System.String, System.Diagnostics.Tracing.EventLevel, System.Nullable`1<Double>)+0xc8
14 0117df9c 08bfff4f System_Buffers_Tests!System.Buffers.ArrayPool.Tests.ArrayPoolTest.RunWithListener(System.Action, System.Diagnostics.Tracing.EventLevel, System.Action`1<System.Diagnostics.Tracing.EventWrittenEventArgs>)+0x9c
15 0117dfd8 72e54161 System_Buffers_Tests!System.Buffers.ArrayPool.Tests.ArrayPoolUnitTests+<>c.<ReturnBufferFiresDiagnosticEvent>b__35_0()+0xf7
16 0117dfe4 72b58c9d coreclr!CallDescrWorkerInternal+0x34

That lock is held by this thread, which is in turn waiting on a lock held by the main thread:

09 08e9db20 72af662c coreclr!AwareLock::EnterEpilog+0x41 [F:\workspace\_work\1\s\src\coreclr\src\vm\syncblk.cpp @ 2501] 
0a 08e9dba8 72c37cda coreclr!AwareLock::Enter+0x167 [F:\workspace\_work\1\s\src\coreclr\src\vm\syncblk.cpp @ 2404] 
0b (Inline) -------- coreclr!Object::EnterObjMonitor+0x8 [F:\workspace\_work\1\s\src\coreclr\src\vm\object.h @ 277] 
0c 08e9dc80 72c3d3fd coreclr!JIT_MonEnter_Helper+0xed [F:\workspace\_work\1\s\src\coreclr\src\vm\jithelpers.cpp @ 3665] 
0d 08e9ddbc 724cf577 coreclr!JIT_MonReliableEnter_Portable+0x3fd [F:\workspace\_work\1\s\src\coreclr\src\vm\jithelpers.cpp @ 3713] 
0e 08e9ddc8 08a40cb9 System_Private_CoreLib!System.Threading.Monitor.Enter(System.Object, Boolean ByRef)+0x13 [/_/src/coreclr/src/System.Private.CoreLib/src/System/Threading/Monitor.CoreCLR.cs @ 48] 
0f 08e9de28 725d89e8 System_Buffers_Tests!System.Diagnostics.Tracing.TestEventListener.OnEventSourceCreated(System.Diagnostics.Tracing.EventSource)+0x59
10 08e9de58 725db2d6 System_Private_CoreLib!System.Diagnostics.Tracing.EventSource.AddListener(System.Diagnostics.Tracing.EventListener)+0xa4 [/_/src/libraries/System.Private.CoreLib/src/System/Diagnostics/Tracing/EventSource.cs @ 3318] 
11 08e9dea0 725d4255 System_Private_CoreLib!System.Diagnostics.Tracing.EventListener.AddEventSource(System.Diagnostics.Tracing.EventSource)+0x196 [/_/src/libraries/System.Private.CoreLib/src/System/Diagnostics/Tracing/EventSource.cs @ 4010] 
12 08e9defc 724eb41a System_Private_CoreLib!System.Diagnostics.Tracing.EventSource.Initialize(System.Guid, System.String, System.String[])+0x169 [/_/src/libraries/System.Private.CoreLib/src/System/Diagnostics/Tracing/EventSource.cs @ 1502] 
13 08e9df40 724eb9a1 System_Private_CoreLib!System.Threading.PortableThreadPoolEventSource..ctor()+0xb2 [/_/src/libraries/System.Private.CoreLib/src/System/Threading/PortableThreadPoolEventSource.cs @ 84] 
14 08e9df54 72e54161 System_Private_CoreLib!System.Threading.PortableThreadPoolEventSource..cctor()+0x11 [/_/src/libraries/System.Private.CoreLib/src/System/Threading/PortableThreadPoolEventSource.cs @ 293] 
15 08e9df54 72b58c9d coreclr!CallDescrWorkerInternal+0x34
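
To make the lock ordering concrete, here is a minimal standalone sketch of the same A-B B-A pattern. The lock names are illustrative, not the runtime's actual fields: roughly, the main thread holds the test listener's lock and then needs the EventSource machinery's lock, while a thread-pool thread initializing a new EventSource holds that lock and calls back into the listener.

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class AbBaDeadlockSketch
{
    static readonly object listenerLock = new object();     // lock A (test listener's lock)
    static readonly object eventSourceLock = new object();  // lock B (EventSource/EventListener machinery)

    static void Main()
    {
        // Thread 1 (like the test's main thread): holds A, then needs B,
        // analogous to TestEventListener.EnableEventSource -> EventSource.SendCommand.
        var t1 = Task.Run(() =>
        {
            lock (listenerLock)
            {
                Thread.Sleep(100);          // widen the race window
                lock (eventSourceLock) { }  // blocks: thread 2 holds B
            }
        });

        // Thread 2 (like the worker running PortableThreadPoolEventSource..cctor):
        // holds B, then calls back into the listener which needs A,
        // analogous to EventSource.Initialize -> OnEventSourceCreated -> Monitor.Enter.
        var t2 = Task.Run(() =>
        {
            lock (eventSourceLock)
            {
                Thread.Sleep(100);
                lock (listenerLock) { }     // blocks: thread 1 holds A
            }
        });

        // Neither task completes; the wait times out, mirroring the 60s
        // RemoteExecutor timeout seen in CI.
        bool finished = Task.WaitAll(new[] { t1, t2 }, TimeSpan.FromSeconds(2));
        Console.WriteLine(finished ? "no deadlock" : "deadlocked (as expected)");
    }
}
```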

@benaadams (Member) commented Dec 2, 2020

Is it this issue: #45059?


Some of the EventSource initializations (e.g. manifest generation) will request an ArrayPool buffer, which then calls ArrayPool.Log to check whether it's enabled, which in turn requires the same lock.
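
As a hedged illustration of that re-entrancy shape (a sketch only, not the runtime's actual manifest-generation code): ArrayPool work performed inside an EventListener callback can re-enter the ArrayPool event source (to check whether its events are enabled) while eventing bookkeeping is already in progress further up the stack.

```csharp
using System.Buffers;
using System.Diagnostics.Tracing;

sealed class ReentrantListener : EventListener
{
    protected override void OnEventSourceCreated(EventSource source)
    {
        // This callback runs while EventSource/EventListener registration is
        // in progress; renting here touches ArrayPoolEventSource again.
        byte[] scratch = ArrayPool<byte>.Shared.Rent(256);
        ArrayPool<byte>.Shared.Return(scratch);
    }
}

static class ReentrancyDemo
{
    static void Main()
    {
        using var listener = new ReentrantListener();
        // Touch the shared pool so event sources get created while the
        // listener is registered.
        ArrayPool<byte>.Shared.Return(ArrayPool<byte>.Shared.Rent(16));
    }
}
```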

@benaadams (Member) commented:
I think PortableThreadPool is triggering more managed EventSource initialization than was happening before, so this didn't exhibit as much previously?

@safern added the "disabled-test" (The test is disabled in source code against the issue) label and removed the "blocking-clean-ci" label on Dec 2, 2020
@sywhang (Contributor) commented Dec 2, 2020

@benaadams This is a different issue from #45059 (deadlock in the EventListener implementation used in the test) but yes, PortableThreadPool is triggering the deadlock more frequently.

@runfoapp bot removed this from the 6.0.0 milestone on Apr 6, 2021
@ghost locked as resolved and limited conversation to collaborators on May 6, 2021