
Increased query memory usage on EFCore 8 RC2 #32052

Closed
gabynevada opened this issue Oct 14, 2023 · 8 comments · Fixed by #32069
Labels: area-sqlserver, closed-fixed (the issue has been fixed and is/will be included in the release indicated by the issue milestone), customer-reported, type-bug
Milestone: 8.0.0

Comments

@gabynevada

I'm getting a memory buildup problem from RC2 onwards while streaming from an IAsyncEnumerable (tested with the latest daily build).

I'm using IAsyncEnumerable to stream data from the database and save it to an Azure Storage account. From RC2 onwards the process keeps consuming memory until it crashes. If I revert to RC1, it works as expected and no significant memory buildup occurs.

Expected memory allocations when using RC1:
[image]

Memory allocations with the latest daily build:
[image]

Include your code

Sample code for the IAsyncEnumerable query:

// No-tracking projection streamed as an IAsyncEnumerable (consumed with await foreach below)
var testAsyncEnumerable = _context.MyModel
      .AsNoTracking()
      .OrderBy(e => e.Field2)
      .Select(
        e =>
          new MyImportDto(
            e.Field1,
            e.Field2,
            e.Field3
          )
      )
      .AsAsyncEnumerable();

Sample code that consumes the IAsyncEnumerable to stream the query results from the database to the storage account:

    private async Task<ImportResultDto> UploadDataToSource<T>(
        string fileName,
        IAsyncEnumerable<T> dataList
    )
    {
        var blobHttpHeaders = new BlobHttpHeaders { ContentType = "application/protobuf", };
        var blobServiceClient = new BlobServiceClient(shouldBeAConnectionString);
        var containerClient = blobServiceClient.GetBlobContainerClient(containerName);
        var blobClient = containerClient.GetBlockBlobClient(fileName);
        await using var stream = await blobClient.OpenWriteAsync(
            true,
            new BlockBlobOpenWriteOptions { HttpHeaders = blobHttpHeaders, }
        );
        await using var compressor = new GZipStream(stream, CompressionLevel.SmallestSize);
        var totalRecordsImported = 0;
        await foreach (var dataItem in dataList)
        {
            Serializer.SerializeWithLengthPrefix(compressor, dataItem, PrefixStyle.Base128, 1);
            totalRecordsImported++;
        }

        return new ImportResultDto(totalRecordsImported);
    }
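
For context, a minimal sketch of how the two pieces are wired together; the file name and the result property name below are illustrative rather than taken from the original project:

    // Hypothetical call site: the streaming query is passed straight into the upload
    // method, so rows are serialized to blob storage as they are read from SQL Server.
    var result = await UploadDataToSource("my-import.bin.gz", testAsyncEnumerable);
    Console.WriteLine($"Imported {result.TotalRecordsImported} records");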

Include stack traces

Stack trace for the OutOfMemoryException:

fail: Microsoft.EntityFrameworkCore.Query[10100]
      An exception occurred while iterating over the results of a query for context type 'MyContext'.
      System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
         at Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader.BufferedDataRecord.DoubleBufferCapacity()
         at Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader.BufferedDataRecord.ReadRow()
         at Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader.BufferedDataRecord.InitializeAsync(DbDataReader reader, IReadOnlyList`1 columns, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader.InitializeAsync(IReadOnlyList`1 columns, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader.InitializeAsync(IReadOnlyList`1 columns, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Storage.RelationalCommand.ExecuteReaderAsync(RelationalCommandParameterObject parameterObject, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Storage.RelationalCommand.ExecuteReaderAsync(RelationalCommandParameterObject parameterObject, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Query.Internal.SingleQueryingEnumerable`1.AsyncEnumerator.InitializeReaderAsync(AsyncEnumerator enumerator, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Storage.ExecutionStrategy.<>c__DisplayClass30_0`2.<<ExecuteAsync>b__0>d.MoveNext()
      --- End of stack trace from previous location ---
         at Microsoft.EntityFrameworkCore.Storage.ExecutionStrategy.ExecuteImplementationAsync[TState,TResult](Func`4 operation, Func`4 verifySucceeded, TState state, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Storage.ExecutionStrategy.ExecuteImplementationAsync[TState,TResult](Func`4 operation, Func`4 verifySucceeded, TState state, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Storage.ExecutionStrategy.ExecuteAsync[TState,TResult](TState state, Func`4 operation, Func`4 verifySucceeded, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Query.Internal.SingleQueryingEnumerable`1.AsyncEnumerator.MoveNextAsync()
      System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
         at Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader.BufferedDataRecord.DoubleBufferCapacity()
         at Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader.BufferedDataRecord.ReadRow()
         at Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader.BufferedDataRecord.InitializeAsync(DbDataReader reader, IReadOnlyList`1 columns, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader.InitializeAsync(IReadOnlyList`1 columns, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader.InitializeAsync(IReadOnlyList`1 columns, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Storage.RelationalCommand.ExecuteReaderAsync(RelationalCommandParameterObject parameterObject, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Storage.RelationalCommand.ExecuteReaderAsync(RelationalCommandParameterObject parameterObject, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Query.Internal.SingleQueryingEnumerable`1.AsyncEnumerator.InitializeReaderAsync(AsyncEnumerator enumerator, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Storage.ExecutionStrategy.<>c__DisplayClass30_0`2.<<ExecuteAsync>b__0>d.MoveNext()
      --- End of stack trace from previous location ---
         at Microsoft.EntityFrameworkCore.Storage.ExecutionStrategy.ExecuteImplementationAsync[TState,TResult](Func`4 operation, Func`4 verifySucceeded, TState state, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Storage.ExecutionStrategy.ExecuteImplementationAsync[TState,TResult](Func`4 operation, Func`4 verifySucceeded, TState state, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Storage.ExecutionStrategy.ExecuteAsync[TState,TResult](TState state, Func`4 operation, Func`4 verifySucceeded, CancellationToken cancellationToken)
         at Microsoft.EntityFrameworkCore.Query.Internal.SingleQueryingEnumerable`1.AsyncEnumerator.MoveNextAsync()

Stack trace for suspicious object allocations

Allocated type : System.Object[]
  Objects : n/a
  Bytes   : 671088664

Allocated by
   100%  DoubleBufferCapacity • 640.00 MB / 640.00 MB • Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader+BufferedDataRecord.DoubleBufferCapacity()
     100%  ReadRow • 640.00 MB / - • Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader+BufferedDataRecord.ReadRow()
       100%  MoveNext • 640.00 MB / - • Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader+BufferedDataRecord+<InitializeAsync>d__90.MoveNext()
         100%  RunInternal • 640.00 MB / - • System.Threading.ExecutionContext.RunInternal(ExecutionContext, ContextCallback, Object)
           100%  MoveNext • 640.00 MB / - • System.Runtime.CompilerServices.AsyncTaskMethodBuilder+AsyncStateMachineBox<TResult, TStateMachine>.MoveNext(Thread)
             100%  RunOrScheduleAction • 640.00 MB / - • System.Threading.Tasks.AwaitTaskContinuation.RunOrScheduleAction(IAsyncStateMachineBox, Boolean)
               100%  RunContinuations • 640.00 MB / - • System.Threading.Tasks.Task.RunContinuations(Object)
                 100%  CompleteAsyncCall • 640.00 MB / - • Microsoft.Data.SqlClient.SqlDataReader.CompleteAsyncCall<T>(Task<TResult>, SqlDataReader+SqlDataReaderBaseAsyncCallContext<T>)
                   100%  RunFromThreadPoolDispatchLoop • 640.00 MB / - • System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread, ExecutionContext, ContextCallback, Object)
                     100%  ExecuteWithThreadLocal • 640.00 MB / - • System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task, Thread)
                       100%  Dispatch • 640.00 MB / - • System.Threading.ThreadPoolWorkQueue.Dispatch()
                         100%  WorkerDoWork • 640.00 MB / - • System.Threading.PortableThreadPool+WorkerThread.WorkerDoWork(PortableThreadPool, Boolean)
                           100%  WorkerThreadStart • 640.00 MB / - • System.Threading.PortableThreadPool+WorkerThread.WorkerThreadStart()
                             100%  RunWorker • 640.00 MB / - • System.Threading.Thread+StartHelper.RunWorker()
                               100%  Run • 640.00 MB / - • System.Threading.Thread+StartHelper.Run()
                                 100%  StartCallback • 640.00 MB / - • System.Threading.Thread.StartCallback()
                                  ►  100%  [AllThreadsRoot] • 640.00 MB / - • [AllThreadsRoot]


Stack trace for BufferedDataReader from memory-allocation profiling:

Allocating method : Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader+BufferedDataRecord.DoubleBufferCapacity()
  Objects : n/a
  Bytes   : 1593835616

Top allocated types:
 640.00 MB Object[]
 384.00 MB DateTime[]
 384.00 MB Guid[]
 112.00 MB Boolean[]

Stacktrace:
   100%  DoubleBufferCapacity • 1.48 GB / 1.48 GB • Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader+BufferedDataRecord.DoubleBufferCapacity()
     100%  ReadRow • 1.48 GB / - • Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader+BufferedDataRecord.ReadRow()
       100%  MoveNext • 1.48 GB / - • Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader+BufferedDataRecord+<InitializeAsync>d__90.MoveNext()
         100%  RunInternal • 1.48 GB / - • System.Threading.ExecutionContext.RunInternal(ExecutionContext, ContextCallback, Object)
           100%  MoveNext • 1.48 GB / - • System.Runtime.CompilerServices.AsyncTaskMethodBuilder+AsyncStateMachineBox<TResult, TStateMachine>.MoveNext(Thread)
             100%  RunOrScheduleAction • 1.48 GB / - • System.Threading.Tasks.AwaitTaskContinuation.RunOrScheduleAction(IAsyncStateMachineBox, Boolean)
               100%  RunContinuations • 1.48 GB / - • System.Threading.Tasks.Task.RunContinuations(Object)
                 100%  CompleteAsyncCall • 1.48 GB / - • Microsoft.Data.SqlClient.SqlDataReader.CompleteAsyncCall<T>(Task<TResult>, SqlDataReader+SqlDataReaderBaseAsyncCallContext<T>)
                   100%  RunFromThreadPoolDispatchLoop • 1.48 GB / - • System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread, ExecutionContext, ContextCallback, Object)
                     100%  ExecuteWithThreadLocal • 1.48 GB / - • System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task, Thread)
                       100%  Dispatch • 1.48 GB / - • System.Threading.ThreadPoolWorkQueue.Dispatch()
                         100%  WorkerDoWork • 1.48 GB / - • System.Threading.PortableThreadPool+WorkerThread.WorkerDoWork(PortableThreadPool, Boolean)
                           100%  WorkerThreadStart • 1.48 GB / - • System.Threading.PortableThreadPool+WorkerThread.WorkerThreadStart()
                             100%  RunWorker • 1.48 GB / - • System.Threading.Thread+StartHelper.RunWorker()
                               100%  Run • 1.48 GB / - • System.Threading.Thread+StartHelper.Run()
                                 100%  StartCallback • 1.48 GB / - • System.Threading.Thread.StartCallback()
                                  ►  100%  [AllThreadsRoot] • 1.48 GB / - • [AllThreadsRoot]


Include provider and version information

EF Core version: 8.0.0-rtm.23513.13
Database provider: Microsoft.EntityFrameworkCore.SqlServer
Target framework: .NET 8.0
Operating system: macOS Sonoma
IDE: JetBrains Rider 2023.3 EAP 2

roji (Member) commented Oct 14, 2023

@gabynevada thank you for submitting this; if there is indeed a regression between RC1 and RC2, we'll do whatever we can to fix it before GA.

To help with this, is it possible for you to isolate this in a minimal, runnable console program which reproduces the problematic behavior? Just a minimal program with a minimal model that we can use to see this effect.

gabynevada (Author) commented Oct 14, 2023

@roji Sure, I tried for a few hours but had no luck replicating it in a minimal console app yet. In the main project I can see that DoubleBufferCapacity is responsible for most allocations when I use the RC2 library version.

On RC1, or in the console app project, I don't see the DoubleBufferCapacity method executing at all when profiling.

I'll try again over the weekend to see if I can replicate it in the console app.

Allocated type : System.Guid[]
  Objects : n/a
  Bytes   : 805306416

 100%  DoubleBufferCapacity • 768.00 MB / 768.00 MB • Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader+BufferedDataRecord.DoubleBufferCapacity()
   100%  ReadRow • 768.00 MB / - • Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader+BufferedDataRecord.ReadRow()
     100%  MoveNext • 768.00 MB / - • Microsoft.EntityFrameworkCore.Query.Internal.BufferedDataReader+BufferedDataRecord+<InitializeAsync>d__90.MoveNext()
       100%  RunInternal • 768.00 MB / - • System.Threading.ExecutionContext.RunInternal(ExecutionContext, ContextCallback, Object)
         100%  MoveNext • 768.00 MB / - • System.Runtime.CompilerServices.AsyncTaskMethodBuilder+AsyncStateMachineBox<TResult, TStateMachine>.MoveNext(Thread)
           100%  RunOrScheduleAction • 768.00 MB / - • System.Threading.Tasks.AwaitTaskContinuation.RunOrScheduleAction(IAsyncStateMachineBox, Boolean)
             100%  RunContinuations • 768.00 MB / - • System.Threading.Tasks.Task.RunContinuations(Object)
               100%  CompleteAsyncCall • 768.00 MB / - • Microsoft.Data.SqlClient.SqlDataReader.CompleteAsyncCall<T>(Task<TResult>, SqlDataReader+SqlDataReaderBaseAsyncCallContext<T>)
                 100%  RunFromThreadPoolDispatchLoop • 768.00 MB / - • System.Threading.ExecutionContext.RunFromThreadPoolDispatchLoop(Thread, ExecutionContext, ContextCallback, Object)
                   100%  ExecuteWithThreadLocal • 768.00 MB / - • System.Threading.Tasks.Task.ExecuteWithThreadLocal(Task, Thread)
                     100%  Dispatch • 768.00 MB / - • System.Threading.ThreadPoolWorkQueue.Dispatch()
                       100%  WorkerDoWork • 768.00 MB / - • System.Threading.PortableThreadPool+WorkerThread.WorkerDoWork(PortableThreadPool, Boolean)
                         100%  WorkerThreadStart • 768.00 MB / - • System.Threading.PortableThreadPool+WorkerThread.WorkerThreadStart()
                           100%  RunWorker • 768.00 MB / - • System.Threading.Thread+StartHelper.RunWorker()
                             100%  Run • 768.00 MB / - • System.Threading.Thread+StartHelper.Run()
                               100%  StartCallback • 768.00 MB / - • System.Threading.Thread.StartCallback()
                                ►  100%  [AllThreadsRoot] • 768.00 MB / - • [AllThreadsRoot]


gabynevada (Author) commented Oct 15, 2023

@roji It was tricky, but I think I finally managed to reproduce it. I could only do it when using a remote database (an Azure SQL Managed Instance); I couldn't reproduce it with a local database.

Here is the minimal reproduction: https://github.com/gabynevada/efcore-rc2-memory-regression, tested with about 17 million records.

ajcvickers (Member) commented

@gabynevada Does it still repro if you change UseSqlServer to this:

options.UseSqlServer(connectionString, b => b.UseAzureSql(false));

Also, does it repro with a local database with this:

options.UseSqlServer(connectionString, b => b.UseAzureSql(true));

gabynevada (Author) commented Oct 15, 2023

@ajcvickers Yes!

b.UseAzureSql(false) fixes the problem on RC2 and b.UseAzureSql(true) repros while using a local database.
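
For reference, a minimal sketch of the workaround in a typical DbContext registration, assuming the RC2-era UseAzureSql option discussed above; the context and connection-string names are placeholders:

    // Disable the Azure SQL auto-detection on RC2 so the retrying execution strategy
    // (and its result-set buffering) is not turned on implicitly.
    services.AddDbContext<MyContext>(options =>
        options.UseSqlServer(
            connectionString,
            sqlOptions => sqlOptions.UseAzureSql(false)));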

ajcvickers (Member) commented

@AndriySvyryd We should consider reverting the auto-detection of this based on connection string.

AndriySvyryd (Member) commented Oct 16, 2023

EF Triage: We will revert the breaking change that enabled a retrying execution strategy for Azure SQL connection strings, but we will keep the part that configures the execution strategy behavior based on the connection string if a retrying execution strategy was explicitly configured.

Post 8.0 we will warn if an Azure SQL connection string is used without a retrying execution strategy: #32065
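
For context: when a retrying execution strategy is active, EF buffers the entire result set through BufferedDataReader so the query can be retried, which is what produced the DoubleBufferCapacity allocations shown above. A minimal sketch of the explicit opt-in the triage note refers to (context and connection-string names are placeholders):

    // Explicitly opt in to the retrying execution strategy; with the RC2 auto-detection
    // reverted, this is how a retrying strategy (and its buffering) gets enabled.
    services.AddDbContext<MyContext>(options =>
        options.UseSqlServer(
            connectionString,
            sqlOptions => sqlOptions.EnableRetryOnFailure(
                maxRetryCount: 5,
                maxRetryDelay: TimeSpan.FromSeconds(30),
                errorNumbersToAdd: null)));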

@AndriySvyryd AndriySvyryd self-assigned this Oct 16, 2023
@AndriySvyryd AndriySvyryd added this to the 8.0.0 milestone Oct 16, 2023
@AndriySvyryd AndriySvyryd changed the title from "IAsyncEnumerable query memory buildup from EfCore 8 RC2" to "Increased query memory usage on EFCore 8 RC2" Oct 16, 2023
AndriySvyryd added a commit that referenced this issue Oct 17, 2023
@AndriySvyryd AndriySvyryd added the closed-fixed The issue has been fixed and is/will be included in the release indicated by the issue milestone. label Oct 17, 2023
@AndriySvyryd AndriySvyryd removed their assignment Oct 17, 2023
stevendarby (Contributor) commented

Does the EF Triage note match what was implemented?

If using Azure SQL + EnableRetryOnFailure, will EF 8 configure the specialised retry behaviour based on the connection string or will it need to be opted into with UseAzureSql?
