MySqlBulkCopy failed when data is too big (over 80,000 rows) #780
It looks like you might be starting with a CSV file; are you able to load it successfully with
@bgrainger I can successfully import the CSV using mysqlimport.
I was curious if using a different C#-based approach
Oh. I didn't know there was a
Thanks for confirming that. I've opened #781 to improve the docs.
I can't reproduce it yet (with an 82MB file containing 1.2M rows).
I can reproduce it with 30k rows (yes, my table has many columns). I'm using MySQL 5.7.29 with the 0.66.0 driver.
I still haven't been able to reproduce the problem. Is it possible that anyone who can reproduce this could get a Wireshark packet capture (ideally of a large batch that's still small enough to work, and then a batch that's large enough to fail)? CC @MaceWindu |
Finally found time to check it. I've updated to the recent 8.0.20.0 version and the issue still persists, so it is not a version-specific issue. Will see if I can grab netflow logs.

Update: I think it is a MySQL issue, not a provider issue.

Update 2: What I found interesting is that it probably also depends on target table size: at some batch size, it fails not on the first batch but on the second (of the same size).
It does seem like MySQL is closing the connection, but there are reports that |
Wireshark logs for 0.69.2; inserting 20,060 records always works.
The conversation used TLS, so I can't really see anything in the packet capture other than MySQL Server sending

Oddly, the success conversation has 3,096 client packets and 27 server packets; the failing conversation is much shorter: it has only 133 client packets and 15 server packets. So even though the client is going to try to send more data, it doesn't get to send most of it.

Is there somehow malformed data in an early packet that makes the server close the connection? Are there any errors logged in MySQL Server when this happens? Can you send me the server's SSL private key (you can email it to [GitHub user name] at gmail) so I can try to decrypt the conversation? Or can you repeat the packet capture, but with
Whoops, I actually captured both (with SSL and without SSL), but attached the wrong one. Will send the proper logs a bit later when I reach my machine.
Packet dump without SSL. I don't see anything in the MySQL logs, but after a server restart I found that the limit of accepted records changed (it increased, but I didn't check to what number).
Oh, the error is very obvious now:
The failing packet capture is sending a packet that's

(That also explains why

To fix this, the packet size that
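The fragment above is truncated, but the protocol detail it points at is worth spelling out: a MySQL wire packet carries its payload length in a 3-byte little-endian header, so a single packet can hold at most 0xFFFFFF (16,777,215) bytes, and any larger payload must be split across multiple packets. Below is a minimal Python sketch of that framing rule; it illustrates the protocol limit only, not MySqlConnector's actual implementation.

```python
# Sketch of MySQL wire-protocol packet framing (illustration only; this is
# NOT MySqlConnector's code). Each packet = 3-byte little-endian payload
# length + 1-byte sequence number + up to MAX_PAYLOAD bytes of payload.
MAX_PAYLOAD = 0xFFFFFF  # 16,777,215 bytes: largest payload a single packet can carry


def frame_packets(payload: bytes, seq: int = 0) -> list[bytes]:
    """Split a payload into MySQL packets.

    If the payload is an exact multiple of MAX_PAYLOAD, a trailing empty
    packet is emitted so the reader knows the payload has ended.
    """
    packets = []
    offset = 0
    while True:
        chunk = payload[offset:offset + MAX_PAYLOAD]
        header = len(chunk).to_bytes(3, "little") + bytes([seq & 0xFF])
        packets.append(header + chunk)
        seq += 1
        offset += len(chunk)
        if len(chunk) < MAX_PAYLOAD:  # last (possibly empty) chunk written
            break
    return packets
```

A client that writes a length field larger than the server's limit (or a malformed oversized packet) gives the server grounds to drop the connection, which matches the abrupt close seen in the capture.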
my code:
If I uncomment the rd.Read() loop, the log reads "80473 line read", so the reader is all right. I also tried removing part of the data from the file to find out whether some ill-formed data causes the error. It turns out the error occurs whenever the file is too big (I guess about 32,765 rows), no matter which part of the file is removed.
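The reader sanity check described above can be sketched as follows; this is a Python stand-in for the thread's C# rd.Read() loop, and the sample CSV text is hypothetical, not the reporter's actual file.

```python
import csv
import io

# Stand-in for the C# "rd.Read() loop" sanity check described above
# (assumption: the reader skips one header row and counts data rows).
def count_rows(csv_text: str) -> int:
    """Count the data rows the CSV reader yields."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader, None)  # skip the header row, if present
    return sum(1 for _ in reader)


# Hypothetical sample data, standing in for the reporter's ~80k-row file.
sample = "col1,col2\na,1\nb,2\nc,3\n"
print(f"{count_rows(sample)} lines read")
```

If the count matches the file's row count, the failure lies downstream of the reader, which is exactly the conclusion the reporter drew.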