
Importing SQL Server image type into bytea max size problem #81

Closed
ghost opened this issue Jun 28, 2016 · 10 comments · Fixed by #294

Comments

@ghost

ghost commented Jun 28, 2016

Hi,

I'm trying to import binary data held in a SQL Server column of the image data type. I map this to bytea and it doesn't error; however, the size of my data seems to be capped at 64512 bytes. Is there a simple reason for this?
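A quick way to confirm that the data really is larger on the SQL Server side is `DATALENGTH` (table and column names below are placeholders):

```sql
-- DATALENGTH reports the stored byte count on the server, which can be
-- compared against the length of the bytea that arrives through the FDW.
SELECT TOP 5 id, DATALENGTH(image_col) AS true_size
FROM my_table
ORDER BY DATALENGTH(image_col) DESC;
```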

Thanks!

@GeoffMontee
Collaborator

This is probably related to issue #40. I haven't determined why FreeTDS is capping the length of some types yet.

@nextstopsun

Any plans for fixing this?
This issue really ruins importing geometries, since they quite often exceed the cap.

@kevinob11

I'm running into this as well. Any workarounds here?

@manio
Contributor

manio commented Oct 14, 2021

Guys,
I've got the following problem: I am using a tds_fdw foreign table with a blob (PDF) field. It is properly converted to bytea in Postgres. The problem is with the bytea size, so I hope I am writing to the proper issue here on GitHub... :)

All my blobs are retrieved as 4096 bytes length, while in remote mssql db these are larger (and differ per row).

I was digging into this problem today and found a workaround.

First, I can see that the problematic function call is:
dbdatlen(festate->dbproc, ncol + 1);
It returns 4096 for my blobs, which is not the real length.

Next I found this thread:
https://lists.ibiblio.org/pipermail/freetds/2014q1/028781.html
According to that thread, I executed another query right before fetching the data in tdsIterateForeignScan(). The query is:
'SET TEXTSIZE 2147483647'

Guess what? It is working!

Sample debug before:
DEBUG: tds_fdw: Data length is 4096
and after:
DEBUG: tds_fdw: Data length is 102265
The resulting bytea in postgres contains the valid-length pdf data :)
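Put together, the workaround amounts to raising TEXTSIZE on the MSSQL connection before the query that fetches the blobs (2147483647 = 2^31 − 1 is the maximum TEXTSIZE allows; the table and column names here are illustrative):

```sql
-- Issued on the same connection, before the data-fetching SELECT:
SET TEXTSIZE 2147483647;

-- After this, text/ntext/image columns are no longer truncated at the
-- previous TEXTSIZE (4096 in my case):
SELECT id, pdf_blob FROM documents;
```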

I can see in freetds NEWS file that in dblib there is:
' support DBTEXTSIZE option for dbsetopt;'
and also:
'- Added 'text size' config option which changes textsize on connect'
So maybe we can even pass it as a tds_fdw server parameter?
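For reference, the connect-time option from the NEWS entry above goes in freetds.conf; a sketch (the section and value here are illustrative — it can also be set per data source):

```ini
# freetds.conf -- 'text size' sets TEXTSIZE at connect time,
# avoiding the need to issue SET TEXTSIZE manually per session.
[global]
        text size = 2147483647
```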

I can prepare a pull-request with this change but I want to know if this is a good solution for this and if it will be accepted...

@GeoffMontee
Collaborator

Hi @manio,

Thanks for the excellent debugging!

I can prepare a pull-request with this change but I want to know if this is a good solution for this and if it will be accepted...

Yes, it would be accepted. Thanks for the offer!

manio added a commit to manio/tds_fdw that referenced this issue Oct 14, 2021
Blob fields were truncated to 4096 bytes because dbdatlen() was returning
an incorrect size without the TEXTSIZE server option set properly.
This commit fixes the problem, so the blobs are properly fetched
and converted to the PostgreSQL 'bytea' column type.

Fixes tds-fdw#81
@manio
Contributor

manio commented Oct 14, 2021

Thanks @GeoffMontee,
The PR is ready. I am throwing a warning instead of an error when this call does not execute successfully...

@kevinob11

I'm irrationally excited about this. Thanks, @manio!

@GeoffMontee
Collaborator

Thanks for the PR, @manio! It has been merged.

@kevinob11

kevinob11 commented Oct 27, 2021

@GeoffMontee any plans for a release? @manio any way to manually do this for now until a release?

@manio
Contributor

manio commented Oct 28, 2021

@kevinob11 just apply the patch and build it manually; IMHO there is no other way around it before a release. Maybe Jenkins produced some binaries, but I am not sure...
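For reference, building from the patched source roughly looks like this (a sketch; it assumes the FreeTDS headers and the PostgreSQL server development files are already installed, and uses the standard PGXS build the project documents):

```shell
# Clone the repository (the fix is merged on master) and build via PGXS:
git clone https://github.com/tds-fdw/tds_fdw.git
cd tds_fdw
make USE_PGXS=1
sudo make USE_PGXS=1 install
# Then create or update the extension in psql, e.g.:
#   CREATE EXTENSION tds_fdw;
```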
