
NUnit 3.4.1 NUnit.Engine.Runners #31

Closed

CharliePoole opened this issue Aug 15, 2016 · 12 comments
@CharliePoole (Collaborator)

@elinato commented on Fri Jul 15 2016

I have run into a problem: NUnit 3.4.1 sometimes doesn't terminate nunit-agent-x86.exe when running under TeamCity.

I get this from TeamCity:

[Step 6/6] Unhandled Exception: System.InvalidOperationException: LocalDataStoreSlot storage has been freed.
   at System.LocalDataStore.SetData(LocalDataStoreSlot slot, Object data)
   at System.Threading.Thread.SetData(LocalDataStoreSlot slot, Object data)
   at System.Windows.Interop.ComponentDispatcher.get_CurrentThreadData()
   at System.Windows.Threading.Dispatcher.WndProcHook(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
   at MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
   at MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)
   at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)
   at System.Windows.Threading.ExceptionWrapper.TryCatchWhen(Object source, Delegate callback, Object args, Int32 numArgs, Delegate catchHandler)
   at System.Windows.Threading.Dispatcher.LegacyInvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Int32 numArgs)
   at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)

System.Net.Sockets.SocketException (0x80004005): An existing connection was forcibly closed by the remote host

Server stack trace:
   at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
   at System.Runtime.Remoting.Channels.SocketStream.Read(Byte[] buffer, Int32 offset, Int32 size)
   at System.Runtime.Remoting.Channels.SocketHandler.ReadFromSocket(Byte[] buffer, Int32 offset, Int32 count)
   at System.Runtime.Remoting.Channels.SocketHandler.Read(Byte[] buffer, Int32 offset, Int32 count)
   at System.Runtime.Remoting.Channels.SocketHandler.ReadAndMatchFourBytes(Byte[] buffer)
   at System.Runtime.Remoting.Channels.Tcp.TcpSocketHandler.ReadAndMatchPreamble()
   at System.Runtime.Remoting.Channels.Tcp.TcpSocketHandler.ReadVersionAndOperation(UInt16& operation)
   at System.Runtime.Remoting.Channels.Tcp.TcpClientSocketHandler.ReadHeaders()
   at System.Runtime.Remoting.Channels.Tcp.TcpClientTransportSink.ProcessMessage(IMessage msg, ITransportHeaders requestHeaders, Stream requestStream, ITransportHeaders& responseHeaders, Stream& responseStream)
   at System.Runtime.Remoting.Channels.BinaryClientFormatterSink.SyncProcessMessage(IMessage msg)

Exception rethrown at [0]:
   at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
   at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
   at NUnit.Engine.ITestEngineRunner.Unload()
   at NUnit.Engine.Runners.ProcessRunner.UnloadPackage()
   at NUnit.Engine.Runners.AbstractTestRunner.Unload()
   at NUnit.Engine.Runners.AggregatingTestRunner.UnloadPackage()
   at NUnit.Engine.Runners.AbstractTestRunner.Unload()
   at NUnit.Engine.Runners.MasterTestRunner.UnloadPackage()
   at NUnit.Engine.Runners.AbstractTestRunner.Dispose(Boolean disposing)
   at NUnit.Engine.Runners.MasterTestRunner.Dispose(Boolean disposing)
   at NUnit.Engine.Runners.AbstractTestRunner.Dispose()
   at NUnit.ConsoleRunner.ConsoleRunner.RunTests(TestPackage package, TestFilter filter)
   at NUnit.ConsoleRunner.Program.Main(String[] args)

Process exited with code -100
Step Run tests (NUnit) failed


@CharliePoole commented on Fri Jul 15 2016

Based on the stack trace, it does not seem that the agent process fails to terminate. In fact, there is a SocketException with the message "An existing connection was forcibly closed by the remote host", which would usually indicate that the agent had already terminated.

That's a normal situation after a run has completed, so I'm uncertain why it causes a problem with TeamCity. Are you absolutely sure that TC is using NUnit 3.4.1?


@elinato commented on Sun Jul 17 2016

In TeamCity I use this command line: packages\NUnit.ConsoleRunner.3.4.1\tools\nunit3-console.exe
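For reference, a full invocation would also name the test assembly to run; the assembly path below is a hypothetical example, not taken from this thread:

packages\NUnit.ConsoleRunner.3.4.1\tools\nunit3-console.exe Tests\bin\Release\MyTests.dll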

[screenshot]

@twoudsma

I have the same issue in our testing framework with nunit-agent-x86.exe when using version 3.5.0 for both the framework and the console runner:

System.Net.Sockets.SocketException (0x80004005): An existing connection was forcibly closed by the remote host

Server stack trace:
   at System.Net.Sockets.Socket.Receive(Byte[] buffer, Int32 offset, Int32 size, SocketFlags socketFlags)
   at System.Runtime.Remoting.Channels.SocketStream.Read(Byte[] buffer, Int32 offset, Int32 size)
   at System.Runtime.Remoting.Channels.SocketHandler.ReadFromSocket(Byte[] buffer, Int32 offset, Int32 count)
   at System.Runtime.Remoting.Channels.SocketHandler.Read(Byte[] buffer, Int32 offset, Int32 count)
   at System.Runtime.Remoting.Channels.Tcp.TcpFixedLengthReadingStream.Read(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.BinaryReader.ReadBytes(Int32 count)
   at System.Runtime.Serialization.Formatters.Binary.SerializationHeaderRecord.Read(__BinaryParser input)
   at System.Runtime.Serialization.Formatters.Binary.__BinaryParser.ReadSerializationHeaderRecord()
   at System.Runtime.Serialization.Formatters.Binary.__BinaryParser.Run()
   at System.Runtime.Serialization.Formatters.Binary.ObjectReader.Deserialize(HeaderHandler handler, __BinaryParser serParser, Boolean fCheck, Boolean isCrossAppDomain, IMethodCallMessage methodCallMessage)
   at System.Runtime.Remoting.Channels.CoreChannel.DeserializeBinaryResponseMessage(Stream inputStream, IMethodCallMessage reqMsg, Boolean bStrictBinding)
   at System.Runtime.Remoting.Channels.BinaryClientFormatterSink.SyncProcessMessage(IMessage msg)

Exception rethrown at [0]:
   at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
   at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
   at NUnit.Engine.ITestAgent.Stop()
   at NUnit.Engine.Runners.ProcessRunner.Dispose(Boolean disposing)
   at NUnit.Engine.Runners.AbstractTestRunner.Dispose()
   at NUnit.Engine.Runners.MasterTestRunner.Dispose(Boolean disposing)
   at NUnit.Engine.Runners.MasterTestRunner.Dispose()
   at NUnit.ConsoleRunner.ConsoleRunner.RunTests(TestPackage package, TestFilter filter)
   at NUnit.ConsoleRunner.Program.Main(String[] args)

We are currently migrating to 64-bit. Does this issue also occur with nunit-agent.exe?

@CharliePoole (Collaborator, Author)

@l0bd0n The nunit-agent.exe and nunit-agent-x86.exe programs are identical, with the exception of the platform that each targets.

@twoudsma

@CharliePoole So is it fine, then, to catch this exception and verify that the runner has terminated?

@elinato

elinato commented Dec 14, 2016

I used this as the first step in TeamCity; I'm not sure whether the problem still exists.

echo off
rem When no matching process exists, tasklist prints an INFO line that contains
rem a colon, so find succeeds; if the agent is running, find sets errorlevel 1.
tasklist /fi "imagename eq nunit-agent-x86.exe" | find ":" > nul
rem errorlevel 1 = agent still running: force-kill it and any child processes (/T).
if errorlevel 1 taskkill /F /IM nunit-agent-x86.exe /T
exit

@eberlid

eberlid commented Dec 14, 2016

We are currently migrating from MbUnit to NUnit. We use TeamCity 10.0.3 and now have the same issue as described above:
System.Net.Sockets.SocketException (0x80004005): An existing connection was forcibly closed by the remote host

We also had similar issues using Gallio Testrunner while using MbUnit:
[Step 2/2] Forcibly killing the host process!
[Step 2/2] Watchdog timer expired!
[Step 2/2] Host process exited with code: -1
[Step 2/2] A fatal exception occurred while running tests. Possible causes include invalid test runner parameters and stack overflows.
[Step 2/2] Gallio.Model.ModelException: An exception occurred while invoking a test driver. ---> Gallio.Runtime.Hosting.HostException: An exception occurred while connecting to the host service. ---> Gallio.Runtime.Hosting.HostException: Error attaching to the host process. ---> System.Runtime.Remoting.RemotingException: Failed to read from an IPC Port: The pipe has been ended.

HResult: -2146233077

Server stack trace:
   at System.Runtime.Remoting.Channels.Ipc.IpcPort.Read(Byte[] data, Int32 offset, Int32 length)
   at System.Runtime.Remoting.Channels.SocketHandler.ReadFromSocket(Byte[] buffer, Int32 offset, Int32 count)
   at System.Runtime.Remoting.Channels.SocketHandler.Read(Byte[] buffer, Int32 offset, Int32 count)
   at System.Runtime.Remoting.Channels.SocketHandler.ReadAndMatchFourBytes(Byte[] buffer)
   at System.Runtime.Remoting.Channels.Tcp.TcpSocketHandler.ReadVersionAndOperation(UInt16& operation)
   at System.Runtime.Remoting.Channels.Ipc.IpcClientTransportSink.ProcessMessage(IMessage msg, ITransportHeaders requestHeaders, Stream requestStream, ITransportHeaders& responseHeaders, Stream& responseStream)
   at System.Runtime.Remoting.Channels.BinaryClientFormatterSink.SyncProcessMessage(IMessage msg)

Exception rethrown at [0]:
   at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
   at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
   at Gallio.Runtime.Hosting.IHostService.Ping()
   at Gallio.Runtime.Hosting.IsolatedProcessHost.WaitUntilReady(IHostService hostService) in c:\Server\Projects\MbUnit v3\Work\src\Gallio\Gallio\Runtime\Hosting\IsolatedProcessHost.cs:line 383
   at Gallio.Runtime.Hosting.IsolatedProcessHost.AcquireRemoteHostService() in c:\Server\Projects\MbUnit v3\Work\src\Gallio\Gallio\Runtime\Hosting\IsolatedProcessHost.cs:line 119
   --- End of inner exception stack trace ---

The symptoms look quite similar to me. Could an antivirus program be causing this issue?

@CharliePoole (Collaborator, Author)

@l0bd0n If you are calling the engine programmatically, you could decide to ignore the exception. It's still a problem for us to fix, of course.
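As a minimal sketch of what ignoring the exception might look like when driving the engine programmatically (the assembly name is a placeholder, and catching SocketException directly assumes it surfaces un-wrapped, as in the traces above):

using System;
using System.Net.Sockets;
using NUnit.Engine;

class RunnerExample
{
    static void Main()
    {
        ITestEngine engine = TestEngineActivator.CreateInstance();
        TestPackage package = new TestPackage("MyTests.dll"); // placeholder assembly
        ITestRunner runner = engine.GetRunner(package);
        runner.Run(null, TestFilter.Empty); // null listener: no progress events needed here

        try
        {
            runner.Dispose(); // tears down the remote agent, which may already have exited
        }
        catch (SocketException)
        {
            // The agent's remoting channel was already closed; per the discussion
            // above, this is harmless once the run itself has completed.
        }
    }
}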

@eberlid There is a timeout while waiting for the agent to load, so if your antivirus spends a lot of time checking whether the assembly is OK to load, it could cause that timeout to expire. In most dev environments, you don't want this kind of slowdown anyway.

@rprouse (Member)

rprouse commented Jan 19, 2017

I believe this is fixed by #161. Please reopen if I am wrong.

@rprouse closed this as completed Jan 19, 2017
@rprouse added this to the Closed Without Action milestone Jan 19, 2017
@eberlid

eberlid commented Jan 23, 2017

Unfortunately, the problem still exists. Am I right that the mentioned fix #161 only affects the console runner? We are currently running the tests with console runner 3.6.0 but still get these socket exceptions (the test assemblies are compiled against 3.5.0).

The machine normally runs two TeamCity agents; sometimes only one is active. I cannot say for sure, but the socket exceptions seem to be more frequent when the machine is under high load, i.e., when both agents are running rather than only one. The machine has 8 GB RAM, an SSD, and a Xeon E5-1620 v2 CPU. We suspect a timing issue may be the cause, e.g., starting to wait for a process to finish when it has already finished?
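To illustrate the suspected race with plain System.Diagnostics.Process (this is a sketch of the failure mode, not the engine's actual shutdown code):

using System;
using System.Diagnostics;

static class AgentShutdown
{
    public static void EnsureStopped(Process agent)
    {
        try
        {
            if (!agent.HasExited)
                agent.Kill(); // can still throw if the agent exits between the check and the call
            agent.WaitForExit();
        }
        catch (InvalidOperationException)
        {
            // The process was already gone: benign here, but code that instead
            // talks to the agent over a socket would see the connection
            // forcibly closed, as in the traces above.
        }
    }
}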

Shall I create a new issue for that?

@CharliePoole (Collaborator, Author)

@rprouse #161 fixes #128, which is unrelated to this issue. The fixed issue has to do with the agent failing to terminate.

As my comments show, I had the impression that the OP was calling the engine from a program and was simply doing it wrong. But on rereading, I think I may have been wrong. We probably need a new description of what the problem is.

@heldersepu

@CharliePoole I see you said, "We probably need a new description of what the problem is."
You can see that same error (InvalidOperationException) here:
https://ci.appveyor.com/project/icsharpcode/avalonedit/build/1.0.57

@CharliePoole (Collaborator, Author)

@heldersepu The link you gave shows an exception being thrown in a run of nunit-console V2, which is quite different from any issue in this repository. I'm not sure whether you need a new issue or whether you meant to link to a different example.

@jnm2 (Collaborator)

jnm2 commented May 29, 2017

We believe this was fixed by #223 which will be released as part of 3.7. If you'd like to confirm sooner that the issue is fixed, please try 3.7.0-dev-03641 from the https://ci.appveyor.com/nuget/nunit-console NuGet feed.
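If it helps, the pre-release package can be fetched from that feed with nuget.exe along these lines (standard nuget.exe install options; adjust to your local setup):

nuget install NUnit.ConsoleRunner -Version 3.7.0-dev-03641 -Prerelease -Source https://ci.appveyor.com/nuget/nunit-console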

If it turns out that this does not fix your issue, please reopen and comment.

@jnm2 modified the milestones: 3.7, Closed Without Action May 29, 2017