Issues with NEST 7. “The stream with Id and Tag is disposed." #4168

Closed
andrx opened this issue Oct 22, 2019 · 10 comments


andrx commented Oct 22, 2019

I initially asked this question here, but as suggested it's better to track it on GitHub.

NEST/Elasticsearch.Net version:
7.3.1, .NET Core 2.0 on CentOS 7

Elasticsearch version:
7.3.2 (basic) on CentOS 7

Description of the problem including expected versus actual behavior:

After updating to NEST 7 (basically just the NuGet bump, no other changes), we occasionally get this error in a process that usually runs for about 15 minutes.
Sometimes it happens while updating documents by id, sometimes during bulk operations.
The problem disappeared when I turned off parallel processing with ParallelOptions { MaxDegreeOfParallelism = 1 }, so basically by lowering the load.
We never had this issue with NEST 6. IElasticClient is registered as a singleton via DI.
The client talks directly to ES using IPs set in a StaticConnectionPool.
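
Roughly, the client is wired up like this (a minimal sketch; the node URIs are placeholders, and the real settings also include the ConnectionSettings options listed further down):

using System;
using Elasticsearch.Net;
using Microsoft.Extensions.DependencyInjection;
using Nest;

public static class ElasticClientRegistration
{
    public static IServiceCollection AddElasticClient(this IServiceCollection services)
    {
        // Nodes are addressed directly by IP, no sniffing.
        var nodes = new[]
        {
            new Uri("http://10.0.0.1:9200"),
            new Uri("http://10.0.0.2:9200")
        };
        var pool = new StaticConnectionPool(nodes);
        var settings = new ConnectionSettings(pool);

        // Single IElasticClient instance for the whole application.
        return services.AddSingleton<IElasticClient>(new ElasticClient(settings));
    }
}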

System.AggregateException: One or more errors occurred. (The read operation failed, see inner exception.) ---> Elasticsearch.Net.UnexpectedElasticsearchClientException: The read operation failed, see inner exception. ---> System.IO.IOException: The read operation failed, see inner exception. ---> System.ObjectDisposedException: Cannot access a disposed object.

Object name: 'The stream with Id f434e91e-8fcd-4051-83fd-56ca445e5e53 and Tag is disposed.'.
at Elasticsearch.Net.RecyclableMemoryStream.CheckDisposed()
at Elasticsearch.Net.RecyclableMemoryStream.SafeRead(Byte[] buffer, Int32 offset, Int32 count, Int32& streamPosition)
at Elasticsearch.Net.RecyclableMemoryStream.Read(Byte[] buffer, Int32 offset, Int32 count)
at System.IO.MemoryStream.ReadAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)

---> (Inner Exception #0) Elasticsearch.Net.UnexpectedElasticsearchClientException: The read operation failed, see inner exception. ---> System.IO.IOException: The read operation failed, see inner exception. ---> System.ObjectDisposedException: Cannot access a disposed object.

--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
at System.Net.Http.CurlHandler.MultiAgent.TransferDataFromRequestStream(IntPtr buffer, Int32 length, EasyRequest easy)
at System.Net.Http.CurlHandler.MultiAgent.CurlSendCallback(IntPtr buffer, UInt64 size, UInt64 nitems, IntPtr context)
--- End of inner exception stack trace ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
at System.Net.Http.CurlHandler.CurlResponseStream.Read(Byte[] buffer, Int32 offset, Int32 count)
at System.IO.Stream.CopyTo(Stream destination, Int32 bufferSize)
at Elasticsearch.Net.ResponseBuilder.SetBody[TResponse](ApiCallDetails details, RequestData requestData, Stream responseStream, String mimeType)
at Elasticsearch.Net.ResponseBuilder.ToResponse[TResponse](RequestData requestData, Exception ex, Nullable`1 statusCode, IEnumerable`1 warnings, Stream responseStream, String mimeType)
at Elasticsearch.Net.HttpConnection.Request[TResponse](RequestData requestData)
at Elasticsearch.Net.RequestPipeline.CallElasticsearch[TResponse](RequestData requestData)
at Elasticsearch.Net.Transport`1.Request[TResponse](HttpMethod method, String path, PostData data, IRequestParameters requestParameters)
--- End of inner exception stack trace ---
at Elasticsearch.Net.Transport`1.Request[TResponse](HttpMethod method, String path, PostData data, IRequestParameters requestParameters)
at Elasticsearch.Net.ElasticLowLevelClient.DoRequest[TResponse](HttpMethod method, String path, PostData data, IRequestParameters requestParameters)
at Nest.ElasticClient.DoRequest[TRequest,TResponse](TRequest p, IRequestParameters parameters, Action`1 forceConfiguration)
at Nest.ElasticClient.Search[TDocument](ISearchRequest request)
at Nest.ElasticClient.Search[TDocument](Func`2 selector)
at Loader.Writers.CustomerRepository.SearchCustomer(Func`2 query)
at Loader.Loaders.EmailSentLoader.GetUpdatedCollection(DealerDto dealer, EmailSent[] batch, ConcurrentDictionary`2 bulkCollection)
at Loader.Loaders.EmailSentLoader.<>c__DisplayClass9_1.<Load>b__1(Int32 page)
at System.Threading.Tasks.Parallel.<>c__DisplayClass44_0`2.<PartitionerForEachWorker>b__1(IEnumerator& partitionState, Int32 timeout, Boolean& replicationDelegateYieldedBeforeCompletion)

Without DisableDirectStreaming(), the stack trace looks a bit different:

--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
at System.Net.Http.CurlHandler.MultiAgent.TransferDataFromRequestStream(IntPtr buffer, Int32 length, EasyRequest easy)
at System.Net.Http.CurlHandler.MultiAgent.CurlSendCallback(IntPtr buffer, UInt64 size, UInt64 nitems, IntPtr context)
--- End of inner exception stack trace ---
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
at System.Net.Http.CurlHandler.CurlResponseStream.Read(Byte[] buffer, Int32 offset, Int32 count)
at Elasticsearch.Net.Utf8Json.JsonSerializer.FillFromStream(Stream input, Byte[]& buffer)
at Elasticsearch.Net.Utf8Json.JsonSerializer.Deserialize[T](Stream stream, IJsonFormatterResolver resolver)
at Elasticsearch.Net.DiagnosticsSerializerProxy.Deserialize[T](Stream stream)
at Elasticsearch.Net.ResponseBuilder.SetBody[TResponse](ApiCallDetails details, RequestData requestData, Stream responseStream, String mimeType)
at Elasticsearch.Net.ResponseBuilder.ToResponse[TResponse](RequestData requestData, Exception ex, Nullable`1 statusCode, IEnumerable`1 warnings, Stream responseStream, String mimeType)
at Elasticsearch.Net.HttpConnection.Request[TResponse](RequestData requestData)
at Elasticsearch.Net.RequestPipeline.CallElasticsearch[TResponse](RequestData requestData)
at Elasticsearch.Net.Transport`1.Request[TResponse](HttpMethod method, String path, PostData data, IRequestParameters requestParameters)
--- End of inner exception stack trace ---
at Elasticsearch.Net.Transport`1.Request[TResponse](HttpMethod method, String path, PostData data, IRequestParameters requestParameters)
at Loader.Writers.CustomerRepository.SearchCustomer(Func`2 query)
...

Both exceptions led me here and here.

This seems to be a problem related to CurlResponseStream and libcurl.

That's what I see on the box:

# yum list installed
...
libcurl.x86_64    7.29.0-46.el7    @base
dotnet-host.x86_64    2.1.5-1    @packages-microsoft-com-prod
dotnet-hostfxr-2.0.5.x86_64	2.0.5-1    @packages-microsoft-com-prod
dotnet-runtime-2.0.5.x86_64    2.0.5-1    @packages-microsoft-com-prod
dotnet-sdk-2.1.4.x86_64    2.1.4-1    @packages-microsoft-com-prod
...
# yum list available libcurl
libcurl.x86_64    7.29.0-54.el7    @base

As @russcam mentioned, NEST 6 as well as NEST 7 (on .NET Core >= 2.1) uses SocketsHttpHandler.
I checked whether UseSocketsHttpHandler is set to false; it's not.

I'm currently checking whether converting the app to .NET Core 2.2 and updating the libs on the AMI will fix the problem.

Will let you know.

Steps to reproduce:
A short/simplified example looks roughly like the sketch below.
Let me know if you need a better example; I'll try to get something closer if this is not enough.
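
The shape of the loader is roughly this (an illustrative sketch only; the document type, index name, paging and query here are placeholders inferred from the stack trace, not the real code):

using System.Collections.Concurrent;
using System.Linq;
using System.Threading.Tasks;
using Nest;

public class Customer
{
    public string Id { get; set; }
}

public class EmailSentLoaderSketch
{
    private readonly IElasticClient _client;

    public EmailSentLoaderSketch(IElasticClient client) => _client = client;

    public void Load(int totalPages, int pageSize)
    {
        var bulkCollection = new ConcurrentDictionary<string, Customer>();

        Parallel.ForEach(
            Enumerable.Range(0, totalPages),
            new ParallelOptions { MaxDegreeOfParallelism = 4 }, // setting this to 1 makes the error go away
            page =>
            {
                // Each page searches customers: ~300 docs, up to ~6 MB per response.
                var response = _client.Search<Customer>(s => s
                    .Index("customers")
                    .From(page * pageSize)
                    .Size(pageSize)
                    .Query(q => q.MatchAll()));

                foreach (var hit in response.Hits)
                    bulkCollection.TryAdd(hit.Id, hit.Source);
            });

        // Around 95% of the fetched documents are then sent back in a bulk update.
        _client.Bulk(b => b
            .Index("customers")
            .UpdateMany(bulkCollection.Values, (u, c) => u.Id(c.Id).Doc(c)));
    }
}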
Some numbers from the logs during stress tests:

If this process runs for two hours, it can fetch 3,712,029 documents from ES, and around 95% of them are sent for bulk update.
Max response size (300 docs) for this loader is about 6 MB.

Two other loaders with the same structure also get the exception on Search, but there is one where it happens on Update.

Provide ConnectionSettings (if relevant):

connectionSettings.EnableHttpCompression();
#if DEBUG
//connectionSettings.EnableDebugMode();
#endif
connectionSettings.DisableDirectStreaming();
connectionSettings.EnableHttpPipelining();
connectionSettings.PrettyJson(false);
connectionSettings.IncludeServerStackTraceOnError();

andrx commented Oct 23, 2019

After switching to .net core 2.2 I can't reproduce the issue.


russcam commented Oct 23, 2019

That's good to hear @andrx. I'd still like to dig into this, but it's unlikely to be this week.


russcam commented Dec 10, 2019

If switching to .NET Core 2.2 has solved this, I'm going to close this issue. You may want to move to .NET Core 3.0 or 3.1, now that 2.2 reaches EOL on December 23, 2019 👍

russcam closed this as completed on Dec 10, 2019

andrx commented Dec 10, 2019

We did update those to .net core 3. No issues. Thank you!

@kuleshov-aleksei

Hello @andrx, did the issue go away after you updated to .NET Core 3, and has it not appeared again?

I'm hitting exactly what you described here, but I'm using .NET Core 3.1. In my case the stream Id is a bit odd:

The stream with Id 00000000-0000-0000-0000-000000000000 and Tag Elasticsearch.Net is disposed

This is my setup:

NEST/Elasticsearch.Net version: NEST 7.7.0 / Elasticsearch.Net 7.7.0, running on CentOS 7

Elasticsearch version: 7.7.0 (running in Docker)


andrx commented Oct 16, 2020

Hey @kuleshov-aleksei, I checked what we have currently.

It's NEST 7.5.1 running against ES 7.9.1, all on CentOS 7 VMs.

Fortunately or unfortunately :) we haven't had enough time to refactor these apps, and decided to switch to .NET 5 once it's ready, so they are still on .NET Core 3.0. They've been running without any issue so far. I think your case is unrelated to libcurl, because .NET Core 3+ doesn't use it.

The newer apps we've built handle less traffic than the use case that started this issue, but they run on the latest .NET Core 3.1 on Debian on k8s. Also no issues there.

Share your stack trace; it should lead to the answer.

@kuleshov-aleksei

Hmm, this is strange, because I have CurlResponseStream too.

In my case I do not execute a search request, only an update.

Elasticsearch.Net.UnexpectedElasticsearchClientException: The read operation failed, see inner exception.
 ---> System.IO.IOException: The read operation failed, see inner exception.
 ---> System.ObjectDisposedException: Cannot access a disposed object.
Object name: 'The stream with Id 00000000-0000-0000-0000-000000000000 and Tag Elasticsearch.Net is disposed.'.
   at Elasticsearch.Net.RecyclableMemoryStream.CheckDisposed()
   at Elasticsearch.Net.RecyclableMemoryStream.SafeRead(Byte[] buffer, Int32 offset, Int32 count, Int32& streamPosition)
   at System.IO.MemoryStream.ReadAsync(Memory`1 buffer, CancellationToken cancellationToken)
--- End of stack trace from previous location where exception was thrown ---
   at System.Net.Http.CurlHandler.MultiAgent.TransferDataFromRequestStream(IntPtr buffer, Int32 length, EasyRequest easy)
   at System.Net.Http.CurlHandler.MultiAgent.CurlSendCallback(IntPtr buffer, UInt64 size, UInt64 nitems, IntPtr context)
   --- End of inner exception stack trace ---
   at System.Net.Http.CurlHandler.CurlResponseStream.Read(Byte[] buffer, Int32 offset, Int32 count)
   at Elasticsearch.Net.Utf8Json.JsonSerializer.FillFromStream(Stream input, Byte[]& buffer)
   at Elasticsearch.Net.Utf8Json.JsonSerializer.Deserialize[T](Stream stream, IJsonFormatterResolver resolver)
   at Elasticsearch.Net.DiagnosticsSerializerProxy.Deserialize[T](Stream stream)
   at Elasticsearch.Net.ResponseBuilder.SetBody[TResponse](ApiCallDetails details, RequestData requestData, Stream responseStream, String mimeType)
   at Elasticsearch.Net.ResponseBuilder.ToResponse[TResponse](RequestData requestData, Exception ex, Nullable`1 statusCode, IEnumerable`1 warnings, Stream responseStream, String mimeType)
   at Elasticsearch.Net.HttpConnection.Request[TResponse](RequestData requestData)
   at Elasticsearch.Net.RequestPipeline.CallElasticsearch[TResponse](RequestData requestData)
   at Elasticsearch.Net.Transport`1.Request[TResponse](HttpMethod method, String path, PostData data, IRequestParameters requestParameters)
   --- End of inner exception stack trace ---
   at Elasticsearch.Net.Transport`1.Request[TResponse](HttpMethod method, String path, PostData data, IRequestParameters requestParameters)
at (internal stack of application with UpdateRequest)

Maybe this case is not related to Elasticsearch, but is some problem with libcurl on CentOS 7?
Unfortunately, I can't reproduce this error. I found your issue and was curious whether you had fixed this problem somehow.


andrx commented Oct 16, 2020

Maybe you have the UseSocketsHttpHandler switch set to false? The three ways it can be set are discussed here (see the sketch below).

Or maybe it's built or run with the wrong dotnet version? Check the dotnet version inside the container.
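
For reference, these are the usual places where that switch ends up being set (the values below are just examples):

using System;

// 1) In code, before any HttpClient/ElasticClient is created:
AppContext.SetSwitch("System.Net.Http.UseSocketsHttpHandler", true);

// Check whether the switch has been explicitly set (and to what):
AppContext.TryGetSwitch("System.Net.Http.UseSocketsHttpHandler", out var useSocketsHandler);

// 2) In runtimeconfig.json (or via <RuntimeHostConfigurationOption> in the csproj):
//    "configProperties": { "System.Net.Http.UseSocketsHttpHandler": false }

// 3) Via an environment variable:
//    DOTNET_SYSTEM_NET_HTTP_USESOCKETSHTTPHANDLER=false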

@kuleshov-aleksei

Oh yeah, UseSocketsHttpHandler is set to false in one of the dependent packages. I have already encountered some strange side effects from that; setting UseSocketsHttpHandler to true fixed an issue with TLS networking.

I will try to set UseSocketsHttpHandler to true.

Thank you!


andrx commented Oct 17, 2020

@kuleshov-aleksei good luck
