We are using HDBC-mysql for our project and ran into a snag. When getting
data from a column of type LONGBLOB or LONGTEXT (maximum of 4GB in
MySQL), the program crashes:
malloc: resource exhausted (out of memory)
The data actually in the columns is of course much less than 4GB (more like 10kB
to 1MB), and we have worked around this by using MEDIUMTEXT instead.
However, it seems that the mysql driver is mallocing enough bytes to hold the
largest possible value of the column's type rather than the data actually
present, which is bad even for the 16MB maximum of MEDIUMTEXT.
I have tried to look at the driver code and, simply by grepping for "mallocBytes",
identified two places that appear to be allocating buffers with malloc. However,
I am not familiar enough with the MySQL C API or with the Haskell FFI to debug this.
Someone in #haskell on freenode suggested that only enough space should be
allocated for the data, but you might not be able to know how much that is until
the data is retrieved. Hopefully that's not the case and this is fixable.
Please respond with additional questions or directives for things that I can do to help.
Just ran into this issue too... a Linux box with 3GB of memory complains about low memory even when I'm loading a single record with 20kB of content in a LONGTEXT column... well...
Maybe it's just an issue with libmysqlclient; perhaps it doesn't report the required column size, or something like that?
I'm afraid that this project is only very lightly maintained, as I don't use the HDBC bindings myself. Patches to fix this issue are welcome.