IBatch add method fails silently when a memory limit is exceeded #6900

@MWASoftware

Description

I have been testing the Firebird 4 IBatch interface, and ran a stress test to see how quickly a large dataset could be created. The test table is defined as:

Create Table LotsOfData (
RowID integer not null,
theDate TimeStamp,
MyText VarChar(1024),
Primary Key (RowID)
);

I used IBatch to add rows one at a time, and the test was set up to add 100K rows. At the end of the test, the batch completion interface was checked and the transaction committed. The table was then read back in order to verify read/write. An MD5 checksum on data in and data out was used to verify no data loss or corruption.
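For illustration, here is a minimal C++ sketch of the kind of loop the test performs against the Firebird OO API. This is not the actual test code; the database path, transaction options and buffer handling are assumptions made for the sketch.

#include <vector>
#include <string>
#include <cstring>
#include <firebird/Interface.h>

using namespace Firebird;

int main()
{
    IMaster* master = fb_get_master_interface();
    ThrowStatusWrapper status(master->getStatus());
    IProvider* prov = master->getDispatcher();

    IAttachment* att = prov->attachDatabase(&status, "test.fdb", 0, nullptr);
    ITransaction* tra = att->startTransaction(&status, 0, nullptr);

    const char* sql =
        "insert into LotsOfData (RowID, theDate, MyText) values (?, current_timestamp, ?)";
    IStatement* stmt = att->prepare(&status, tra, 0, sql, 3 /* dialect */,
                                    IStatement::PREPARE_PREFETCH_METADATA);
    IMessageMetadata* meta = stmt->getInputMetadata(&status);

    // Batch created with default parameters - the truncation is seen with defaults.
    IBatch* batch = stmt->createBatch(&status, meta, 0, nullptr);

    std::vector<unsigned char> buf(meta->getMessageLength(&status));
    const unsigned idOff   = meta->getOffset(&status, 0);
    const unsigned idNull  = meta->getNullOffset(&status, 0);
    const unsigned txtOff  = meta->getOffset(&status, 1);
    const unsigned txtNull = meta->getNullOffset(&status, 1);
    const std::string text(1024, 'x');      // stand-in for the test data

    for (int row = 1; row <= 100000; ++row)
    {
        *reinterpret_cast<int*>(&buf[idOff]) = row;
        *reinterpret_cast<short*>(&buf[idNull]) = 0;
        // VARCHAR in the message buffer: 2-byte length followed by the data
        *reinterpret_cast<short*>(&buf[txtOff]) = static_cast<short>(text.size());
        std::memcpy(&buf[txtOff + 2], text.data(), text.size());
        *reinterpret_cast<short*>(&buf[txtNull]) = 0;

        // ThrowStatusWrapper raises an exception on any error in the status
        // vector - none is ever raised here, even after the batch stops
        // accepting messages.
        batch->add(&status, 1, buf.data());
    }

    IBatchCompletionState* cs = batch->execute(&status, tra);   // also error-free
    // ... inspect cs (see the sketch further below), then:
    tra->commit(&status);
    return 0;
}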

The test result showed that only 4061 records were written to the table. This figure was reported both by the batch completion interface and on read back. However, no errors were reported at any time: the status vector was checked on every call to IBatch::add, and again on the execute, and in both cases no error was reported. The batch completion state reported "no more errors" for every row up to and including row 4061.

This appears to be a silent failure. An error should be reported either in the status vector (on IBatch::add, which would be the ideal solution) or in the batch completion state.
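For reference, this is roughly how the completion state was interrogated, as a sketch continuing the fragment above; the constants are those published by IBatchCompletionState.

// Continues the sketch above: 'cs' is the IBatchCompletionState returned by execute().
unsigned processed = cs->getSize(&status);   // messages processed by the server - 4061 in this test
for (unsigned pos = 0; pos < processed; ++pos)
{
    int state = cs->getState(&status, pos);
    if (state == IBatchCompletionState::EXECUTE_FAILED)
    {
        // a per-message failure - never seen in this test
    }
}

// findError() walks the per-message status vectors starting at 'pos';
// here it returns NO_MORE_ERRORS straight away, i.e. the completion state
// also reports a clean run.
unsigned errPos = cs->findError(&status, 0);
if (errPos == IBatchCompletionState::NO_MORE_ERRORS)
{
    // "no more errors" despite ~96,000 submitted rows never reaching the table
}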

Note that changing the test table to

Create Table LotsOfData (
RowID integer not null,
theDate TimeStamp,
MyText VarChar(512),
Primary Key (RowID)
);

resulted in 8083 rows being successfully written.
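The row count roughly doubles when the message size is roughly halved, which is consistent with a byte limit rather than a row limit. If the truncation point is indeed tied to the batch buffer limit, the relevant knob is the parameters block passed to createBatch(). A hedged sketch, continuing the fragment above; the 32 MB value is an arbitrary example, and raising it only moves the limit rather than fixing the missing error report.

// Continues the sketch above. TAG_BUFFER_BYTES_SIZE and TAG_MULTIERROR are
// published IBatch tags; the chosen buffer size is an arbitrary example value.
IUtil* util = master->getUtilInterface();
IXpbBuilder* pb = util->getXpbBuilder(&status, IXpbBuilder::BATCH, nullptr, 0);
pb->insertInt(&status, IBatch::TAG_BUFFER_BYTES_SIZE, 32 * 1024 * 1024);
pb->insertInt(&status, IBatch::TAG_MULTIERROR, 1);   // report every failed message, not just the first

IBatch* batch = stmt->createBatch(&status, meta,
                                  pb->getBufferLength(&status), pb->getBuffer(&status));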
