JDBC version in pom.xml

```xml
<dependency>
    <groupId>com.github.housepower</groupId>
    <artifactId>clickhouse-native-jdbc</artifactId>
    <version>1.6-stable</version>
    <exclusions>
        <exclusion>
            <groupId>net.jpountz.lz4</groupId>
            <artifactId>lz4</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

ClickHouse version

```xml
<dependency>
    <groupId>ru.yandex.clickhouse</groupId>
    <artifactId>clickhouse-jdbc</artifactId>
    <version>0.1.52</version>
    <exclusions>
        <exclusion>
            <groupId>net.jpountz.lz4</groupId>
            <artifactId>lz4</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.google.guava</groupId>
            <artifactId>guava</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

Error logs

no error logs
Steps to reproduce
I use Java.
First, I use Spark to query a Hive table and produce a Dataset with 20,000 records.
Then I call foreachPartition on the Dataset and iterate over each partition, inserting the rows into ClickHouse with a PreparedStatement. I count the rows as I insert them, and that count is also 20,000.
Finally, when the job completes there are no error logs, so everything appears fine (I catch every exception and print its message, but nothing shows up). Yet when I query ClickHouse with the client, the table contains 360,736 records.
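The per-partition insert loop described above can be sketched roughly as follows. This is a minimal, self-contained sketch of the counting pattern only: the batch size is an assumption, and the JDBC `PreparedStatement` calls are simulated with an in-memory sink so the logic runs without a Spark cluster or a ClickHouse connection (the comments note where `addBatch()`/`executeBatch()` would go).

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.stream.IntStream;

// Sketch of the per-partition insert logic described in the steps above.
// In the real job, Spark invokes this once per partition via
// dataset.foreachPartition(...); here the JDBC batch execution is
// simulated so the counting logic can run standalone.
public class PartitionInsertSketch {
    static final int BATCH_SIZE = 10_000; // assumed batch size, not from the issue

    // Returns the number of rows this partition attempted to insert.
    static long processPartition(Iterator<String> rows, List<String> sink) {
        long inserted = 0;
        List<String> batch = new ArrayList<>();
        while (rows.hasNext()) {
            batch.add(rows.next());   // stands in for ps.setXxx(...); ps.addBatch()
            inserted++;
            if (batch.size() >= BATCH_SIZE) {
                sink.addAll(batch);   // stands in for ps.executeBatch()
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {       // flush the final partial batch
            sink.addAll(batch);
            batch.clear();
        }
        return inserted;
    }

    public static void main(String[] args) {
        List<String> sink = new ArrayList<>();
        Iterator<String> rows =
            IntStream.range(0, 20_000).mapToObj(i -> "row-" + i).iterator();
        long inserted = processPartition(rows, sink);
        // The driver-side count matches what was handed to the "database":
        System.out.println(inserted + " rows counted, " + sink.size() + " rows in sink");
    }
}
```

With a counter like this, the number of rows handed to `executeBatch()` equals the Dataset size (20,000), which is why the 360,736 rows visible in ClickHouse cannot be explained by the application-side loop alone.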
Other descriptions
At first I used a Distributed table and assumed that was the cause of the problem, so I changed the target to a normal (local) table, but it made no difference. I have also tried inserting about 17,000,000 records into a Distributed table, and a query from the ClickHouse client returned more than 200,000,000 records.