
High memory allocation of TDSPacket objects, leading to OOME (table with binary data column) #2664

Open
@jubax

Description


Driver version

12.10.0.jre11

SQL Server version

Microsoft SQL Server 2019 (RTM) - 15.0.2000.5 (X64) Sep 24 2019 13:48:23 Copyright (C) 2019 Microsoft Corporation Developer Edition (64-bit) on Windows Server 2019 Standard 10.0 (Build 17763: )

Client Operating System

Windows Server 2019 Standard

JAVA/JVM version

OpenJDK 64-Bit Server VM Temurin-21.0.5+11 (build 21.0.5+11-LTS, mixed mode, sharing)

Table schema

CREATE TABLE #MY_STAGING_TABLE
(
	HASH binary(32) NOT NULL,
	ID bigint NOT NULL
);

Problem description

We are inserting rows into a staging table. We call PreparedStatement.addBatch() for 64 batches of 1024 rows each and then call PreparedStatement.executeBatch(). This code structure works fine for essentially any other data type.

For each row we call PreparedStatement.setLong() and PreparedStatement.setBytes(), where the byte array is always a byte[32].
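A sketch of the insert loop as I understand it (table and column names are taken from the schema above; makeHashes and insertAll are illustrative names I chose, and the Connection setup is omitted):

```java
import java.security.SecureRandom;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class StagingInsertRepro {

    // Each row carries a 32-byte payload, matching the binary(32) column above.
    static byte[][] makeHashes(int n) {
        SecureRandom rnd = new SecureRandom();
        byte[][] hashes = new byte[n][];
        for (int i = 0; i < n; i++) {
            hashes[i] = new byte[32];
            rnd.nextBytes(hashes[i]);
        }
        return hashes;
    }

    // 64 batches of 1024 rows are added, then executed in one call;
    // the allocation spike is observed during executeBatch().
    static void insertAll(Connection con) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO #MY_STAGING_TABLE (HASH, ID) VALUES (?, ?)")) {
            long id = 0;
            for (int batch = 0; batch < 64; batch++) {
                for (byte[] hash : makeHashes(1024)) {
                    ps.setBytes(1, hash);   // binary(32) parameter
                    ps.setLong(2, id++);    // bigint parameter
                    ps.addBatch();
                }
            }
            ps.executeBatch(); // OOME occurs here
        }
    }
}
```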

The problem is that executeBatch() allocates far too much memory.

[screenshot: allocation profile during executeBatch()]

Note that the memory allocated before the executeBatch() call is only a few hundred MB.
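For scale, a back-of-the-envelope estimate (my arithmetic, ignoring TDS framing and metadata overhead) of the raw parameter data carried by the whole batch:

```java
public class PayloadEstimate {
    // Rough lower bound on the raw parameter data sent in one executeBatch()
    // for the schema above (binary(32) + bigint per row).
    static long rawPayloadBytes(int batches, int rowsPerBatch) {
        long rows = (long) batches * rowsPerBatch; // 64 * 1024 = 65,536 rows
        long bytesPerRow = 32 /* binary(32) */ + 8 /* bigint */;
        return rows * bytesPerRow;
    }

    public static void main(String[] args) {
        // 65,536 rows * 40 bytes = 2,621,440 bytes, i.e. about 2.5 MiB
        System.out.println(rawPayloadBytes(64, 1024));
    }
}
```

So the raw data is on the order of a few MiB, which makes the multi-GB allocations below look disproportionate.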

Expected behavior

A reasonable amount of memory allocation by the JDBC driver

Actual behavior

Very high allocation. The screenshot below is from the Eclipse Memory Analyzer (MAT); it is from a run where we limited the heap to 12 GB:

[screenshot: heap analyzer output]

The heap analyzer also shows a histogram:

[screenshot: class histogram]

Error message/stack trace

OutOfMemoryError

Any other details that can be helpful

n/a

JDBC trace logs

n/a

I wanted to file this bug report to get some initial feedback before spending the time on a complete Java example; perhaps there is already a known problem with binary data?

Thanks
