The cluster runs on ARM servers.
The cluster has hive.cache enabled.
The Trino version is 389.
The same data can be accessed successfully on an x86 cluster without hive.cache.
Next we will wait some time, disable hive.cache, and test again.
Error message:
io.trino.spi.TrinoException: Failed to read ORC file: hdfs://xxxxxxxxx/user/hive/warehouse/xxxxx.db/xxxx/xxx=20220815/000037_0
at io.trino.plugin.hive.orc.OrcPageSource.handleException(OrcPageSource.java:208)
at io.trino.plugin.hive.orc.OrcPageSourceFactory.lambda$createOrcPageSource$8(OrcPageSourceFactory.java:393)
at io.trino.orc.OrcBlockFactory$OrcBlockLoader.load(OrcBlockFactory.java:83)
at io.trino.spi.block.LazyBlock$LazyData.load(LazyBlock.java:400)
at io.trino.spi.block.LazyBlock$LazyData.getFullyLoadedBlock(LazyBlock.java:379)
at io.trino.spi.block.LazyBlock.getLoadedBlock(LazyBlock.java:286)
at io.trino.plugin.hive.orc.OrcPageSource$SourceColumn$MaskingBlockLoader.load(OrcPageSource.java:379)
at io.trino.spi.block.LazyBlock$LazyData.load(LazyBlock.java:400)
at io.trino.spi.block.LazyBlock$LazyData.getFullyLoadedBlock(LazyBlock.java:379)
at io.trino.spi.block.LazyBlock.getLoadedBlock(LazyBlock.java:286)
at io.trino.operator.project.DictionaryAwarePageProjection$DictionaryAwarePageProjectionWork.setupDictionaryBlockProjection(DictionaryAwarePageProjection.java:211)
at io.trino.operator.project.DictionaryAwarePageProjection$DictionaryAwarePageProjectionWork.lambda$getResult$0(DictionaryAwarePageProjection.java:197)
at io.trino.spi.block.LazyBlock$LazyData.load(LazyBlock.java:400)
at io.trino.spi.block.LazyBlock$LazyData.getFullyLoadedBlock(LazyBlock.java:379)
at io.trino.spi.block.LazyBlock.getLoadedBlock(LazyBlock.java:286)
at io.trino.operator.project.PageProcessor$ProjectSelectedPositions.processBatch(PageProcessor.java:345)
at io.trino.operator.project.PageProcessor$ProjectSelectedPositions.process(PageProcessor.java:208)
at io.trino.operator.WorkProcessorUtils$ProcessWorkProcessor.process(WorkProcessorUtils.java:391)
at io.trino.operator.WorkProcessorUtils.lambda$flatten$7(WorkProcessorUtils.java:296)
at io.trino.operator.WorkProcessorUtils$3.process(WorkProcessorUtils.java:338)
at io.trino.operator.WorkProcessorUtils$ProcessWorkProcessor.process(WorkProcessorUtils.java:391)
at io.trino.operator.WorkProcessorUtils$3.process(WorkProcessorUtils.java:325)
at io.trino.operator.WorkProcessorUtils$ProcessWorkProcessor.process(WorkProcessorUtils.java:391)
at io.trino.operator.WorkProcessorUtils$3.process(WorkProcessorUtils.java:325)
at io.trino.operator.WorkProcessorUtils$ProcessWorkProcessor.process(WorkProcessorUtils.java:391)
at io.trino.operator.WorkProcessorUtils.lambda$flatten$7(WorkProcessorUtils.java:296)
at io.trino.operator.WorkProcessorUtils$3.process(WorkProcessorUtils.java:338)
at io.trino.operator.WorkProcessorUtils$ProcessWorkProcessor.process(WorkProcessorUtils.java:391)
at io.trino.operator.WorkProcessorUtils$3.process(WorkProcessorUtils.java:325)
at io.trino.operator.WorkProcessorUtils$ProcessWorkProcessor.process(WorkProcessorUtils.java:391)
at io.trino.operator.WorkProcessorUtils.getNextState(WorkProcessorUtils.java:240)
at io.trino.operator.WorkProcessorUtils.lambda$processStateMonitor$3(WorkProcessorUtils.java:219)
at io.trino.operator.WorkProcessorUtils$ProcessWorkProcessor.process(WorkProcessorUtils.java:391)
at io.trino.operator.WorkProcessorUtils.getNextState(WorkProcessorUtils.java:240)
at io.trino.operator.WorkProcessorUtils.lambda$finishWhen$4(WorkProcessorUtils.java:234)
at io.trino.operator.WorkProcessorUtils$ProcessWorkProcessor.process(WorkProcessorUtils.java:391)
at io.trino.operator.WorkProcessorSourceOperatorAdapter.getOutput(WorkProcessorSourceOperatorAdapter.java:150)
at io.trino.operator.Driver.processInternal(Driver.java:410)
at io.trino.operator.Driver.lambda$process$10(Driver.java:313)
at io.trino.operator.Driver.tryWithLock(Driver.java:698)
at io.trino.operator.Driver.process(Driver.java:305)
at io.trino.operator.Driver.processForDuration(Driver.java:276)
at io.trino.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:740)
at io.trino.execution.executor.PrioritizedSplitRunner.process(PrioritizedSplitRunner.java:163)
at io.trino.execution.executor.TaskExecutor$TaskRunner.run(TaskExecutor.java:488)
at io.trino.$gen.Trino_389____20221103_014715_2.run(Unknown Source)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.UnsupportedOperationException: Slice is not backed by a byte array
at io.airlift.slice.Slice.checkHasByteArray(Slice.java:333)
at io.airlift.slice.Slice.byteArray(Slice.java:352)
at io.trino.orc.stream.CompressedOrcChunkLoader.nextChunk(CompressedOrcChunkLoader.java:129)
at io.trino.orc.stream.OrcInputStream.advance(OrcInputStream.java:204)
at io.trino.orc.stream.OrcInputStream.read(OrcInputStream.java:83)
at io.trino.orc.stream.LongInputStreamV2.readValues(LongInputStreamV2.java:64)
at io.trino.orc.stream.LongInputStreamV2.next(LongInputStreamV2.java:366)
at io.trino.orc.reader.SliceDictionaryColumnReader.openRowGroup(SliceDictionaryColumnReader.java:242)
at io.trino.orc.reader.SliceDictionaryColumnReader.readBlock(SliceDictionaryColumnReader.java:119)
at io.trino.orc.reader.SliceColumnReader.readBlock(SliceColumnReader.java:74)
at io.trino.orc.OrcBlockFactory$OrcBlockLoader.load(OrcBlockFactory.java:76)
... 46 more
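The root cause line, `UnsupportedOperationException: Slice is not backed by a byte array`, suggests the ORC reader received a Slice over off-heap memory where it expected one wrapping a heap `byte[]`. The same distinction exists in the plain JDK: a heap `ByteBuffer` exposes its backing array, while a direct (off-heap) buffer does not and throws the same exception type. This sketch illustrates that analogous JDK behavior only; it is not the airlift Slice internals.

```java
import java.nio.ByteBuffer;

public class BufferBackingDemo {
    public static void main(String[] args) {
        // Heap buffer: backed by a byte[], so array() works.
        ByteBuffer heap = ByteBuffer.allocate(16);
        System.out.println("heap hasArray: " + heap.hasArray());

        // Direct buffer: off-heap memory, no backing byte[].
        ByteBuffer direct = ByteBuffer.allocateDirect(16);
        System.out.println("direct hasArray: " + direct.hasArray());
        try {
            direct.array(); // throws, just like Slice.byteArray() in the trace
        } catch (UnsupportedOperationException e) {
            System.out.println("direct array() threw UnsupportedOperationException");
        }
    }
}
```

A cache layer that hands back direct or memory-mapped buffers to a code path assuming heap-backed slices would fail in exactly this way.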
If you are running into issues, I would strongly recommend avoiding usage of the current implementation of hive.cache. It has been unmaintained for a long time and it's unlikely to ever receive any code fixes.
There is an alternative caching mechanism being worked on right now (#16375), which should hopefully replace the current one.
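Following the recommendation above, the workaround is to turn the legacy cache off in the Hive catalog properties file. A minimal sketch, assuming the standard `hive.cache.enabled` property and a catalog file named `hive.properties` (file name and remaining settings will vary per deployment):

```properties
# etc/catalog/hive.properties (example path)
connector.name=hive
# Disable the unmaintained hive.cache implementation
hive.cache.enabled=false
```

All workers and the coordinator need a restart for the catalog change to take effect.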
cc: @electrum @beinan