I first created an Iceberg table with a Hadoop catalog and wrote data to it normally, but the following exception was thrown during my second append to the table. I am using Java 8.
Exception in thread "main" java.lang.NoSuchMethodError: java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;
at seaweedfs.client.SeaweedRead.read(SeaweedRead.java:89)
at seaweedfs.client.SeaweedInputStream.read(SeaweedInputStream.java:126)
at seaweedfs.client.SeaweedInputStream.read(SeaweedInputStream.java:106)
at seaweed.hdfs.SeaweedHadoopInputStream.read(SeaweedHadoopInputStream.java:37)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at java.io.DataInputStream.read(DataInputStream.java:149)
at org.apache.iceberg.hadoop.HadoopStreams$HadoopSeekableInputStream.read(HadoopStreams.java:123)
at org.apache.iceberg.avro.AvroIO$AvroInputStreamAdapter.read(AvroIO.java:117)
at org.apache.avro.file.DataFileReader.openReader(DataFileReader.java:65)
at org.apache.iceberg.avro.AvroIterable.newFileReader(AvroIterable.java:100)
at org.apache.iceberg.avro.AvroIterable.iterator(AvroIterable.java:76)
at org.apache.iceberg.avro.AvroIterable.iterator(AvroIterable.java:36)
at org.apache.iceberg.relocated.com.google.common.collect.Iterables.addAll(Iterables.java:337)
at org.apache.iceberg.relocated.com.google.common.collect.Lists.newLinkedList(Lists.java:241)
at org.apache.iceberg.ManifestLists.read(ManifestLists.java:45)
at org.apache.iceberg.BaseSnapshot.cacheManifests(BaseSnapshot.java:148)
at org.apache.iceberg.BaseSnapshot.dataManifests(BaseSnapshot.java:174)
at org.apache.iceberg.MergingSnapshotProducer.apply(MergingSnapshotProducer.java:848)
at org.apache.iceberg.SnapshotProducer.apply(SnapshotProducer.java:217)
at org.apache.iceberg.SnapshotProducer.lambda$commit$2(SnapshotProducer.java:366)
at org.apache.iceberg.SnapshotProducer$$Lambda$77/7649301.run(Unknown Source)
at org.apache.iceberg.util.Tasks$Builder.runTaskWithRetry(Tasks.java:413)
at org.apache.iceberg.util.Tasks$Builder.runSingleThreaded(Tasks.java:219)
at org.apache.iceberg.util.Tasks$Builder.run(Tasks.java:203)
at org.apache.iceberg.util.Tasks$Builder.run(Tasks.java:196)
at org.apache.iceberg.SnapshotProducer.commit(SnapshotProducer.java:364)
at operator.TestRest.main(TestRest.java:123)
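A likely cause (not confirmed from this trace alone): `NoSuchMethodError: java.nio.ByteBuffer.position(I)Ljava/nio/ByteBuffer;` is the classic symptom of a library (here, the SeaweedFS client) compiled with JDK 9 or later without `--release 8` and then run on a Java 8 JVM. JDK 9 added covariant overrides such as `ByteBuffer.position(int)` returning `ByteBuffer`; the compiler records that descriptor, which does not exist in the Java 8 runtime, where the method is inherited from `Buffer` and returns `Buffer`. The sketch below shows the standard workaround pattern libraries use to stay Java 8 compatible; `setPosition` is an illustrative helper name, not SeaweedFS code:

```java
import java.nio.Buffer;
import java.nio.ByteBuffer;

public class ByteBufferCompat {
    // Casting to Buffer forces javac to emit the Java 8-compatible
    // descriptor Buffer.position(I)Ljava/nio/Buffer; regardless of
    // which JDK compiles this class.
    static ByteBuffer setPosition(ByteBuffer buf, int pos) {
        ((Buffer) buf).position(pos);
        return buf;
    }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocate(16);
        setPosition(buf, 4);
        System.out.println(buf.position()); // prints 4
    }
}
```

Possible fixes on the user side: run the application on Java 9+ instead of Java 8, or rebuild the SeaweedFS client with `-release 8` (for Maven, `<release>8</release>` in maven-compiler-plugin) so such covariant calls are compiled against the Java 8 class files.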
Dependencies used (from pom.xml):

com.github.chrislusf : seaweedfs-hadoop3-client : 3.13
org.apache.parquet : parquet-column : 1.13.1
com.google.guava : guava : 11.0.2
org.apache.hadoop : hadoop-common : 3.3.3