
Large columns support (SQL_NO_TOTAL) #41

Closed
jlgale opened this issue Apr 27, 2017 · 8 comments

Comments


jlgale commented Apr 27, 2017

https://github.com/Koka/odbc-rs/blob/master/src/statement/output.rs#L115

It looks like this function doesn't handle the SQL_NO_TOTAL case (-4), and it panics when that value comes back.

I don't know what SQL_NO_TOTAL means, but I can report that this happened when querying a column with large text values.

Koka added the bug label on Apr 28, 2017
donhcd (Contributor) commented May 1, 2017

it seems like this was at least partially fixed in fe6f5ba

jlgale (Author) commented May 1, 2017

Maybe we can try to repro it with the latest?

Koka (Owner) commented May 2, 2017

Looks like it's reproduced on large string columns of specific types. Code for handling such a case is still missing, though. @jlgale could you please share additional details on your database vendor and db schema?

pacman82 (Contributor) commented May 2, 2017

Hi, I'm back from vacation ;-) SQL_NO_TOTAL means that ODBC is not able to tell you how much memory you have to allocate to hold the field of the column. Usually this happens for database columns which are intended to hold large amounts of data. You are supposed to call get_data repeatedly on them to process the data chunk-wise. The best fit on the Rust side of things could be something implementing the io::Read trait. Of course we can also parse it into a String, if memory is large enough.
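The chunk-wise approach described above can be sketched in Rust. This is a minimal illustration, not odbc-rs code: `read_in_chunks` is an invented helper name, and an in-memory `io::Read` source stands in for repeated `SQLGetData` calls that each return a bounded chunk.

```rust
use std::io::Read;

/// Hypothetical sketch of the SQL_NO_TOTAL strategy: since the driver
/// cannot report the total length up front, keep fetching fixed-size
/// chunks until the source is exhausted. Any `io::Read` stands in here
/// for repeated `SQLGetData` calls.
fn read_in_chunks<R: Read>(mut source: R, chunk_size: usize) -> std::io::Result<Vec<u8>> {
    let mut out = Vec::new();
    let mut buf = vec![0u8; chunk_size];
    loop {
        let n = source.read(&mut buf)?;
        if n == 0 {
            break; // no more chunks; the total length was never needed
        }
        out.extend_from_slice(&buf[..n]);
    }
    Ok(out)
}

fn main() {
    // Simulate a large column value with an in-memory reader.
    let data = vec![b'x'; 100_000];
    let fetched = read_in_chunks(std::io::Cursor::new(data.as_slice()), 4096).unwrap();
    assert_eq!(fetched, data);
    println!("read {} bytes without knowing the total up front", fetched.len());
}
```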

jlgale (Author) commented May 2, 2017

@Koka we are using a database called "snowflake" (snowflake.net). It's similar to Amazon's Redshift.

The value in question comes from a VARIANT type column that has been cast to text. So basically a JSON string. These can be up to 16MB in size.

It seems reasonable that the default would be to read in everything but allow for OdbcType implementations that do something else.
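That default-plus-override idea could look roughly like the following. The trait and names here are invented for illustration and are not the actual odbc-rs `OdbcType` trait: the default method reads everything into memory, while an implementation could override it to stream chunk-wise instead.

```rust
use std::io::Read;

/// Invented trait sketching the suggestion above; not the real odbc-rs API.
trait FromLargeColumn: Sized {
    /// Build a value once all bytes are available.
    fn from_bytes(bytes: Vec<u8>) -> Self;

    /// Default: read everything into memory, even when the driver reports
    /// SQL_NO_TOTAL. Implementations handling huge values could override
    /// this to process the chunks incrementally instead.
    fn from_column<R: Read>(mut chunks: R) -> std::io::Result<Self> {
        let mut buf = Vec::new();
        chunks.read_to_end(&mut buf)?;
        Ok(Self::from_bytes(buf))
    }
}

impl FromLargeColumn for String {
    fn from_bytes(bytes: Vec<u8>) -> Self {
        String::from_utf8_lossy(&bytes).into_owned()
    }
}

fn main() {
    // An in-memory reader stands in for the chunked column data.
    let source = std::io::Cursor::new(&b"{\"key\": \"value\"}"[..]);
    let json: String = String::from_column(source).unwrap();
    assert_eq!(json, "{\"key\": \"value\"}");
}
```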

jlgale (Author) commented May 2, 2017

With the new release, 0.7.1, we no longer see the panic, but we just get an empty string. An improvement :)

Koka changed the title from "panic in get_data_str()" to "Large columns support (SQL_NO_TOTAL)" on May 10, 2017
Koka added the feature-request label and removed the bug label on May 10, 2017
pacman82 (Contributor) commented

I'll start working on this

pacman82 (Contributor) commented

@jlgale Just looking over old issues — probably not relevant for you any more, but fetching arbitrarily large texts has eventually been implemented in odbc-api: https://docs.rs/odbc-api/0.32.0/odbc_api/struct.CursorRow.html#method.get_text
