_handleTokenStream in tds-client.js #5

Open
shawnb457 opened this Issue · 32 comments
@shawnb457

TypeError: Object #<TdsClient> has no method '_handleTokenStream'
    at TdsClient.<anonymous> (C:\Program Files\nodejs\node_modules\tds\lib\tds-client.js:167:19)

@cretz cretz was assigned
@cretz
Owner

This is caused by token streams spanning packets. I will fix this shortly.
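
For anyone digging into this, the sketch below is not node-tds code, just a minimal illustration of the technique being described; all of the names in it are invented for the example. A TDS response is split into packets, each carrying an 8-byte header (type, status, a 2-byte big-endian length that includes the header, SPID, packet id, window), and the token stream is the concatenation of the packet payloads; the last packet of a message sets the end-of-message bit (0x01) in the status byte. A parser that treats each packet payload as a complete token stream breaks as soon as a token happens to span a packet boundary.

// Minimal sketch (not the module's code): reassemble TDS packet payloads
// into one token stream and only hand it to the token parser once the
// packet carrying the end-of-message status bit has arrived.
var HEADER_LENGTH = 8;
var STATUS_EOM = 0x01;

var pending = new Buffer(0);   // raw socket bytes not yet consumed
var message = [];              // payloads collected for the current message

function onSocketData(data, handleTokenStream) {
  pending = Buffer.concat([pending, data]);
  // A packet can itself arrive split across socket 'data' events, so wait
  // until the full declared length is buffered before peeling it off.
  while (pending.length >= HEADER_LENGTH) {
    var length = pending.readUInt16BE(2);   // includes the 8-byte header
    if (pending.length < length) return;    // partial packet, wait for more
    var status = pending[1];
    message.push(pending.slice(HEADER_LENGTH, length));
    pending = pending.slice(length);
    if (status & STATUS_EOM) {
      // Whole message received; no token can be cut in half anymore.
      handleTokenStream(Buffer.concat(message));
      message = [];
    }
  }
}

The alternative, which the "Stream incomplete, rolling back" log lines later in this thread suggest the client actually does, is to parse tokens incrementally as packets arrive and roll back to the last completed token whenever the current one turns out to be incomplete.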

@kyberias

Is there a workaround and/or do you need help in fixing this?

@cretz
Owner

Actually, I need help replicating it. I was having problems replicating it in my test case. Can you give me the SQL (or JS) that replicates it? I have committed a change that calls the right method, but I want to put in a test case that makes sure your use case works.

@kyberias

I was able to replicate it with a query to a table that has about 10 columns and returns roughly 20+ rows (SELECT * FROM Table). If I take the exact same query and limit it with TOP 1, it works. Somehow related to the result set size, or number of rows, I guess.

@kyberias

Here are some observations... I have a table called PlayerAccount with 1313 rows.

"SELECT COUNT(*) FROM PlayerAccount" works
"SELECT * FROM PlayerAccount" fails before getting any rows
"SELECT Name FROM PlayerAccount" fails after getting roughly 100 rows
"SELECT Id FROM PlayerAccount" fails after getting roughly 1000 rows

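If it helps with a test case, here is a rough, self-contained repro sketch against that table. Treat the API names as assumptions rather than documentation: only statement.execute({...}) and the statement 'done' event appear verbatim elsewhere in this thread, so the connection options and the 'connect'/'row'/createStatement/end names below should be checked against the node-tds README.

// Hypothetical repro sketch; see the note above about which names are assumed.
var tds = require('tds');

var connection = new tds.Connection({
  host: 'localhost',
  port: 1433,
  userName: 'sa',        // placeholder credentials
  password: 'secret',
  database: 'MyDatabase'
});

connection.on('connect', function (error) {
  if (error) throw error;
  var rows = 0;
  var statement = connection.createStatement('SELECT * FROM PlayerAccount');
  statement.on('row', function (row) {
    rows++;              // with the npm release this stops partway through
  });
  statement.on('done', function () {
    console.log('received ' + rows + ' rows out of 1313');
    connection.end();
  });
  statement.execute();
});
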
@cretz
Owner

Ok, I put in a test case for this issue the other day and it was breaking. Basically, the client can't handle multi-packet responses properly (test case here); it fails after ~80 results. I will work on whatever is causing the problem when token streams span packets. I will always accept help too, if the code isn't too ugly.

@nambad

I am facing the same issue!
The output is 24 rows with 6 columns, but after 12 rows the program breaks. Need help.

@jonbuffington

I have a similar symptom to @nambad's. The result set is 35 rows with 6 columns. The break occurs after receiving row #30.

Checking token type:  209
Retrieved token type:  209
Need 103, length 63
Stream incomplete, rolling back
Received 603 bytes at state 4 <Buffer 74 00 20 00 70 00 65 00 72 00 20 00 41 00 63 00 74 00 69 00 76 00 65 00 20 00 53 00 65 00 73 00 73 00 69 00 6f 00 6e 00 0e 00 64 00 65 00 63 00 69 00 6d ...>

/FOO/console/node_modules/tds/lib/tds-client.js:167
      return this._handleTokenStream();

TypeError: Object #<TdsClient> has no method '_handleTokenStream'
@cretz
Owner

I apologize. The method name change is fixed in master, but I am still having problems with streams that span packets. I am working on it...

@jonbuffington

@cretz Thanks. I npm linked to master and the execute succeeds. The only side-effect is the console message:
Need 103, length 63

I am new to CoffeeScript but if you point me to the general area, I can lend another set of eyes.

@cretz
Owner

This means I went past my expected buffer. Somewhere, I tried to read from my BufferStream past the available length. The console.log definitely shouldn't be enabled by default; I'm sorry about that.
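
In other words, the "Need 103, length 63" line is a guard firing: something asked the buffered stream for more bytes than it currently holds, so the parser has to undo the partial read and wait for the next packet. A generic sketch of that pattern (illustrative only, not the module's actual BufferStream API):

// Illustrative read guard, not the module's BufferStream: report
// "Need X, length Y" and let the caller roll back and wait for more data.
function BufferReader(buffer) {
  this.buffer = buffer;
  this.position = 0;
  this.mark = 0;                 // last committed (known-good) position
}

BufferReader.prototype.readBytes = function (count) {
  var remaining = this.buffer.length - this.position;
  if (count > remaining) {
    console.log('Need ' + count + ', length ' + remaining);
    this.position = this.mark;   // "Stream incomplete, rolling back"
    return null;                 // caller should wait for the next packet
  }
  var bytes = this.buffer.slice(this.position, this.position + count);
  this.position += count;
  return bytes;
};

BufferReader.prototype.commit = function () {
  this.mark = this.position;     // a whole token was parsed successfully
};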

@jonbuffington

@cretz No apologies needed. I will step through the debugger and identify where the issue may be located.

@zachaller

We also just came across this issue; wondering if anybody has had any luck with a fix or workaround.

@cretz
Owner

The missing method issue is fixed in master, but I am still struggling to fix the issue where the data exceeds a normal packet's size (see the broken test case in master). Hopefully it won't take too much longer to find the issue.

@zachaller

Yeah, I have been going through it a bit too. I got the test case running, and it's throwing an unrecognized packet type of 60 and going over the buffer just like you mentioned, but I am unfamiliar with both TDS and your code base, so it's going super slowly. Hopefully you get it fixed before I can figure anything out, if I even can :) More eyes never hurt, though.
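
As a side note on the unrecognized packet type of 60: whichever layer that message refers to (the outer packet header's type byte or a token type inside the stream), 60 (0x3C) is not a value the TDS spec defines for either, so seeing it almost certainly means the reader has lost alignment and is interpreting payload bytes as a header, which fits the spanning-packets theory. A quick sanity check to drop in while debugging (hypothetical helper, not part of node-tds):

// Packet-type bytes defined by MS-TDS; a SELECT response should arrive in
// type 0x04 (tabular result) packets. 60 (0x3C) is not in this list.
var KNOWN_PACKET_TYPES = [0x01, 0x02, 0x03, 0x04, 0x06, 0x07, 0x0E, 0x10, 0x11, 0x12];

function looksLikePacketHeader(buffer, offset) {
  return KNOWN_PACKET_TYPES.indexOf(buffer[offset]) !== -1;
}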

@KillWil

Same problem here ;) Thanks for this module

@noregrets

I'm experiencing the same thing whenever the returned dataset is more than a handful of records.

@cretz cretz referenced this issue from a commit
@cretz Work on issue #5 32a1de0
@cretz
Owner

I have committed a fix in master which I think resolves this issue. Once I have other fixes in place, I will release a new version on npm. If anyone wants to test with their specific situation to see if their issue is resolved, I'd appreciate it.

@noregrets

Appears to have fixed the issue I was experiencing. Thank you!

@rickbergfalk

@cretz

I'm running tds from master, and I think I'm experiencing the issue related to large queries. Queries with small result sets return just fine, but larger result sets do not. I'm not receiving an error message from the TDS library at all, so I'm not really sure where things are breaking.

Regardless, thanks for working on this library! It's nice to be able to use node against a more "enterprise" database.

@cretz
Owner

Can you provide me an example, preferably with temp tables? I already have this test.

@cretz
Owner

Ok, I have added another test case which shows an error still exists with multiple varchar rows (see commit above). I am still investigating...
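
For anyone who wants to reproduce the varchar case locally without real data, a batch along these lines (hypothetical, not the actual test case in the repo) builds a temp table whose result set is far larger than one default 4 KB TDS packet:

// Hypothetical repro batch, not the committed test case: enough varchar
// rows in a temp table to push the response well past one 4 KB TDS packet.
var sql =
  "CREATE TABLE #repro (Id INT IDENTITY(1,1), Name VARCHAR(100));" +
  "DECLARE @i INT;" +
  "SET @i = 0;" +
  "WHILE @i < 500 BEGIN" +
  "  INSERT INTO #repro (Name) VALUES (REPLICATE('x', 100));" +
  "  SET @i = @i + 1;" +
  "END;" +
  "SELECT * FROM #repro;";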

@cakedevp

I was having a problem similar to what @redidas describes, but I realized the problem was that my query was returning some NULL values in varchar fields. When I removed the fields containing NULL values from the query, or replaced the NULL values with ISNULL(field, ''), it worked again with no problems.

Also, I don't know if this is the right place to ask for support, but I couldn't find anywhere else, so here it is: I've been trying to insert NULL values into some fields, but when I do something like this:

stmt.execute({ foo: "foo", bar: null });

the library throws an error, apparently because I didn't send a value for the "bar" field. So my question is: how can I insert a NULL value?

Thanks for your support and by the way, thanks for this library.

@philcar

Getting the same issue here. With the npm release, I must limit my query to TOP 40; otherwise I get the _handleTokenStream error. With the latest release, I don't get this error; instead, the query hangs after a certain number of rows. If I try to do another execute, I get an error that a statement is already executing.

@cretz
Owner

Yeah, I have the broken test case in there, just haven't gotten around to digging. It can be very difficult (not made easier by my spaghetti code in tds-client.coffee).

@ldv2001

I'm running master, and I get all the data. However, when a large amount of data is requested, the done event of the statement does not fire. If I limit the query with TOP 10 or something like that, the event fires.

@0dc2

Has this been fixed by anyone? I am getting this error right now...

@vjpr

I'm getting this error too.

Need 1455, length 1440

TypeError: Object #<TdsClient> has no method '_handleTokenStream'
  at TdsClient.exports.TdsClient.TdsClient._socketData (node_modules/tds/lib/tds-client.js:167:19)
  at Socket.<anonymous> (node_modules/tds/lib/tds-client.js:2:59)
  at Socket.EventEmitter.emit (events.js:95:17)
  at Socket.<anonymous> (_stream_readable.js:746:14)
  at Socket.EventEmitter.emit (events.js:92:17)
  at emitReadable_ (_stream_readable.js:408:10)
  at emitReadable (_stream_readable.js:404:5)
  at readableAddChunk (_stream_readable.js:165:9)
  at Socket.Readable.push (_stream_readable.js:127:10)
  at TCP.onread (net.js:526:21)
@Yannici

Got the same issue ... any fixes for this?

@Yannici

I fixed this problem. Go to the tds-client.js file, find the call to "_handleTokenStream", and replace it with "_handleToken" ... I think it's just a typo by the developer ;)
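
If you would rather not hand-edit files under node_modules, the same workaround can be applied as a one-time alias at startup. Whether the require path, the exported TdsClient, and the _handleToken name match your installed version is an assumption (taken from the comments and stack traces above), so treat this as a sketch:

// Hypothetical workaround sketch: alias the missing method name instead of
// editing node_modules/tds/lib/tds-client.js by hand. Assumes the compiled
// module exports TdsClient and that the correctly named method is
// _handleToken, as described in the comment above.
var TdsClient = require('tds/lib/tds-client').TdsClient;

if (typeof TdsClient.prototype._handleTokenStream !== 'function' &&
    typeof TdsClient.prototype._handleToken === 'function') {
  TdsClient.prototype._handleTokenStream = TdsClient.prototype._handleToken;
}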

@girishrd2004

@Yannici Thanks. That change has fixed the "Object #<TdsClient> has no method '_handleTokenStream'" error.
However, I still get the message below, with no data returned.

Need 1429773, length 218

I observed that it happens only for columns of type varchar(MAX) or nvarchar(MAX). I believe it is due to the size of 218 defined for those columns in the tds JS.
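
That pattern would make sense: varchar(MAX) and nvarchar(MAX) columns are not sent with an ordinary fixed length. In TDS they use partially length-prefixed (PLP) encoding: an 8-byte total length (or a marker meaning NULL or unknown length) followed by chunks that each carry their own 4-byte length, terminated by a zero-length chunk. A parser that reads a MAX column as if it had a normal 2-byte length computes a wildly wrong byte count, which is the shape of the "Need 1429773, length 218" message. A rough sketch of reading one PLP value (illustrative only; the reader helpers are assumed, and a real parser also has to cope with values spanning packets as discussed above):

// Illustrative PLP reader sketch, not the module's code. Assumes a `reader`
// with readBytes(n) and readUInt32LE() helpers and enough buffered bytes.
var PLP_NULL = 'ffffffffffffffff'; // 8 bytes of 0xFF means a NULL value
// 0xFFFFFFFFFFFFFFFE means "unknown total length"; chunks still follow and
// the value still ends with a zero-length chunk.

function readPlpValue(reader) {
  var totalLength = reader.readBytes(8);      // ULONGLONG total length
  if (totalLength.toString('hex') === PLP_NULL) return null;

  var chunks = [];
  while (true) {
    var chunkLength = reader.readUInt32LE();  // 0 terminates the value
    if (chunkLength === 0) break;
    chunks.push(reader.readBytes(chunkLength));
  }
  return Buffer.concat(chunks);
}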

@Yannici

@girishrd2004 Sorry, but as far as I can remember, that didn't fully fix it. I think I just switched to another provider or technology for connecting to SQL Server.
