Cannot read all data. Bytes read: xxx. Bytes expected: xxx #23719
Comments
Please, can you show your insert query and the schema of the table? From the stacktrace it seems that you use https://github.com/ClickHouse/clickhouse-jdbc, right?
Yes, I use clickhouse-jdbc with the BinaryRow format for inserting data.
Hey guys, the problem is not related just to JDBC; we have the same problem with the ODBC integration engine. I will send details later today.
sql:
ddl:
ods_htl_accountemployee_all is a Distributed table over multiple local ods_htl_accountemployee tables.
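For context, a Distributed table of this shape is usually defined roughly as follows. This is a minimal sketch only: the cluster name, database, columns, and sharding key are assumptions, since the actual DDL was not posted in the thread.

```sql
-- Local table stored on each shard (hypothetical schema).
CREATE TABLE ods_htl_accountemployee ON CLUSTER my_cluster
(
    id   UInt64,
    date Date
)
ENGINE = MergeTree
ORDER BY id;

-- Distributed table that routes reads and inserts to the local tables.
CREATE TABLE ods_htl_accountemployee_all AS ods_htl_accountemployee
ENGINE = Distributed(my_cluster, default, ods_htl_accountemployee, rand());
```

Inserts into the `_all` table are fanned out to the shards according to the sharding key, which is why an error can surface on the Distributed table even when inserts into the local tables work.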
In my case it was probably caused by exhausted memory: I tried to load a table with 62 million rows, and by the end it had consumed 52 GB of RAM. After adding a few more GB of RAM everything looks fine. The question is: is it usual for it to consume such a large amount of memory?
No. |
For sure the reason is the same as described here: #23778 (comment). As a workaround, you can try to slice the data into smaller chunks with a simple WHERE condition; it may help. Instead of a single

```sql
INSERT INTO clickhouse_table SELECT * FROM odbc(...);
```

try:

```sql
INSERT INTO clickhouse_table SELECT * FROM odbc(...) WHERE date = '2020-01-01';
INSERT INTO clickhouse_table SELECT * FROM odbc(...) WHERE date = '2020-01-02';
INSERT INTO clickhouse_table SELECT * FROM odbc(...) WHERE date = '2020-01-03';
-- ...
-- etc.
```

Closing this issue as a duplicate of #23778 (there is a lot more debug information there).
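The same slicing idea works when the source has no convenient date column: any roughly uniform key can bound each chunk. A minimal sketch (the `id` column and its range boundaries here are assumptions, not from the original report):

```sql
-- Load the source in fixed-size key ranges instead of one huge INSERT.
INSERT INTO clickhouse_table SELECT * FROM odbc(...) WHERE id BETWEEN 1 AND 10000000;
INSERT INTO clickhouse_table SELECT * FROM odbc(...) WHERE id BETWEEN 10000001 AND 20000000;
-- ... continue until the maximum id is covered.
```

Each statement then buffers only one range's rows at a time, which keeps peak memory bounded.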
This will probably work. When I tried to find the cause of this problem, I loaded smaller datasets and they worked without any trouble. Usually the trouble starts around the higher tens of millions of rows; I have never had this issue under ten million rows.
Hi guys. The exception only occurs when I insert data into one particular table; other tables are OK.
Describe the bug
When I insert some data into a distributed table, I get a "Cannot read all data" exception.
How to reproduce
I use 21.3.4.25 LTS.
Error message and/or stacktrace