
[Bug]: Mix-iceberg format with hive metastore not working on spark #1963

Closed
2 tasks done
Tracked by #1930
GavinH1984 opened this issue Sep 12, 2023 · 0 comments · Fixed by #1965
Labels
type:bug Something isn't working

Comments

@GavinH1984
Contributor

What happened?

When using Spark to access a mixed-iceberg format catalog backed by an external Hive metastore, an error is raised.

Affects Versions

master

What engines are you seeing the problem on?

Spark

How to reproduce

  1. Create a mixed-iceberg catalog with a Hive metastore.
  2. Use the Amoro terminal to run a query such as `show tables;`.
  3. The error shows up.

Relevant log output

2023/09/08 00:54:33 show tables
current catalog is null, switch to iceberg_catalog before execution
2023/09/08 00:54:33 meet exception during execution.
2023/09/08 00:54:33 java.lang.RuntimeException: error when execute sql:use `iceberg_catalog`
	at com.netease.arctic.server.terminal.kyuubi.KyuubiSession.execute(KyuubiSession.java:111)
	at com.netease.arctic.server.terminal.kyuubi.KyuubiSession.executeStatement(KyuubiSession.java:62)
	at com.netease.arctic.server.terminal.TerminalSessionContext$ExecutionTask.executeStatement(TerminalSessionContext.java:277)
	at com.netease.arctic.server.terminal.TerminalSessionContext$ExecutionTask.execute(TerminalSessionContext.java:240)
	at com.netease.arctic.server.terminal.TerminalSessionContext$ExecutionTask.lambda$get$0(TerminalSessionContext.java:201)
	at com.netease.arctic.table.TableMetaStore.doAsUgi(TableMetaStore.java:365)
	at com.netease.arctic.table.TableMetaStore.lambda$doAs$0(TableMetaStore.java:345)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:360)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at com.netease.arctic.table.TableMetaStore.doAs(TableMetaStore.java:345)
	at com.netease.arctic.server.terminal.TerminalSessionContext$ExecutionTask.get(TerminalSessionContext.java:193)
	at com.netease.arctic.server.terminal.TerminalSessionContext$ExecutionTask.get(TerminalSessionContext.java:164)
	at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.kyuubi.jdbc.hive.KyuubiSQLException: org.apache.kyuubi.KyuubiSQLException: org.apache.kyuubi.KyuubiSQLException: Error operating ExecuteStatement: java.lang.IllegalStateException: failed when load catalog iceberg_catalog
	at com.netease.arctic.catalog.CatalogLoader.load(CatalogLoader.java:160)
	at com.netease.arctic.catalog.CatalogLoader.loadCatalog(CatalogLoader.java:205)
	at com.netease.arctic.catalog.CatalogLoader.load(CatalogLoader.java:72)
	at com.netease.arctic.spark.ArcticSparkCatalog.initialize(ArcticSparkCatalog.java:424)
	at org.apache.spark.sql.connector.catalog.Catalogs$.load(Catalogs.scala:60)
	at org.apache.spark.sql.connector.catalog.CatalogManager.$anonfun$catalog$1(CatalogManager.scala:53)
	at scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:86)
	at org.apache.spark.sql.connector.catalog.CatalogManager.catalog(CatalogManager.scala:53)
	at org.apache.spark.sql.connector.catalog.LookupCatalog$CatalogAndNamespace$.unapply(LookupCatalog.scala:86)
	at org.apache.spark.sql.catalyst.analysis.ResolveCatalogs$$anonfun$apply$1.applyOrElse(ResolveCatalogs.scala:33)
	at org.apache.spark.sql.catalyst.analysis.ResolveCatalogs$$anonfun$apply$1.applyOrElse(ResolveCatalogs.scala:32)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$2(AnalysisHelper.scala:170)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:176)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$1(AnalysisHelper.scala:170)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:323)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning(AnalysisHelper.scala:168)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning$(AnalysisHelper.scala:164)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$4(AnalysisHelper.scala:175)
	at org.apache.spark.sql.catalyst.trees.UnaryLike.mapChildren(TreeNode.scala:1228)
	at org.apache.spark.sql.catalyst.trees.UnaryLike.mapChildren$(TreeNode.scala:1227)
	at org.apache.spark.sql.catalyst.plans.logical.SetCatalogAndNamespace.mapChildren(v2Commands.scala:752)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$1(AnalysisHelper.scala:175)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:323)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning(AnalysisHelper.scala:168)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning$(AnalysisHelper.scala:164)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsWithPruning(AnalysisHelper.scala:99)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsWithPruning$(AnalysisHelper.scala:96)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators(AnalysisHelper.scala:76)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators$(AnalysisHelper.scala:75)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.analysis.ResolveCatalogs.apply(ResolveCatalogs.scala:32)
	at org.apache.spark.sql.catalyst.analysis.ResolveCatalogs.apply(ResolveCatalogs.scala:28)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:211)
	at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
	at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
	at scala.collection.immutable.List.foldLeft(List.scala:91)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:208)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:200)
	at scala.collection.immutable.List.foreach(List.scala:431)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:200)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:231)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$execute$1(Analyzer.scala:227)
	at org.apache.spark.sql.catalyst.analysis.AnalysisContext$.withNewAnalysisContext(Analyzer.scala:173)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:227)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:188)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:179)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:179)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:212)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:330)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:211)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:76)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:185)
	at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:510)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:185)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:184)
	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:76)
	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:74)
	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:66)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:622)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:617)
	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.$anonfun$executeStatement$1(ExecuteStatement.scala:83)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.kyuubi.engine.spark.operation.SparkOperation.$anonfun$withLocalProperties$1(SparkOperation.scala:155)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:169)
	at org.apache.kyuubi.engine.spark.operation.SparkOperation.withLocalProperties(SparkOperation.scala:139)
	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.executeStatement(ExecuteStatement.scala:78)
	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement$$anon$1.run(ExecuteStatement.scala:100)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
	at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.lang.RuntimeException: run with ugi request failed.
	at com.netease.arctic.table.TableMetaStore.doAsUgi(TableMetaStore.java:370)
	at com.netease.arctic.table.TableMetaStore.lambda$doAs$0(TableMetaStore.java:345)
	at java.base/java.security.AccessController.doPrivileged(Native Method)
	at java.base/javax.security.auth.Subject.doAs(Unknown Source)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1822)
	at com.netease.arctic.table.TableMetaStore.doAs(TableMetaStore.java:345)
	at com.netease.arctic.mixed.BasicMixedIcebergCatalog.initialize(BasicMixedIcebergCatalog.java:97)
	at com.netease.arctic.mixed.BasicMixedIcebergCatalog.initialize(BasicMixedIcebergCatalog.java:84)
	at com.netease.arctic.catalog.CatalogLoader.load(CatalogLoader.java:155)
	... 81 more
Caused by: java.lang.VerifyError: Stack map does not match the one at exception handler 20
Exception Details:
  Location:
    com/netease/arctic/shade/org/apache/iceberg/hive/HiveCatalog.alterHiveDataBase(Lcom/netease/arctic/shade/org/apache/iceberg/catalog/Namespace;Lorg/apache/hadoop/hive/metastore/api/Database;)V @20: astore_3
  Reason:
    Type 'org/apache/hadoop/hive/metastore/api/NoSuchObjectException' (current frame, stack[0]) is not assignable to 'com/netease/arctic/shade/org/apache/thrift/TException' (stack map, stack[0])
  Current Frame:
    bci: @0
    flags: { }
    locals: { 'com/netease/arctic/shade/org/apache/iceberg/hive/HiveCatalog', 'com/netease/arctic/shade/org/apache/iceberg/catalog/Namespace', 'org/apache/hadoop/hive/metastore/api/Database' }
    stack: { 'org/apache/hadoop/hive/metastore/api/NoSuchObjectException' }
  Stackmap Frame:
    bci: @20
    flags: { }
    locals: { 'com/netease/arctic/shade/org/apache/iceberg/hive/HiveCatalog', 'com/netease/arctic/shade/org/apache/iceberg/catalog/Namespace', 'org/apache/hadoop/hive/metastore/api/Database' }
    stack: { 'com/netease/arctic/shade/org/apache/thrift/TException' }
  Bytecode:
    0000000: 2ab4 00a9 2b2c ba02 9900 00b9 00dd 0200
    0000010: 57a7 0066 4ebb 0125 592d 1301 2704 bd01
    0000020: 2959 032b 53b7 029a bf4e bb01 2e59 bb01
    0000030: 3059 b701 3113 029c b601 372b b601 3a13
    0000040: 0213 b601 37b6 013e 2db7 0141 bf4e b801
    0000050: 47b6 014a bb01 2e59 bb01 3059 b701 3113
    0000060: 029e b601 372b b601 3a13 0213 b601 37b6
    0000070: 013e 2db7 0141 bfb1                    
  Exception Handler Table:
    bci [0, 17] => handler: 20
    bci [0, 17] => handler: 20
    bci [0, 17] => handler: 41
    bci [0, 17] => handler: 77
  Stackmap Table:
    same_locals_1_stack_item_frame(@20,Object[#179])
    same_locals_1_stack_item_frame(@41,Object[#179])
    same_locals_1_stack_item_frame(@77,Object[#181])
    same_frame(@119)

	at java.base/java.lang.Class.forName0(Native Method)
	at java.base/java.lang.Class.forName(Unknown Source)
	at com.netease.arctic.shade.org.apache.iceberg.common.DynConstructors$Builder.impl(DynConstructors.java:149)
	at com.netease.arctic.shade.org.apache.iceberg.CatalogUtil.loadCatalog(CatalogUtil.java:221)
	at com.netease.arctic.shade.org.apache.iceberg.CatalogUtil.buildIcebergCatalog(CatalogUtil.java:284)
	at com.netease.arctic.mixed.BasicMixedIcebergCatalog.lambda$initialize$0(BasicMixedIcebergCatalog.java:97)
	at com.netease.arctic.table.TableMetaStore.doAsUgi(TableMetaStore.java:365)
	... 89 more

	at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:69)
	at org.apache.kyuubi.engine.spark.operation.SparkOperation$$anonfun$onError$1.$anonfun$applyOrElse$1(SparkOperation.scala:189)
	at org.apache.kyuubi.Utils$.withLockRequired(Utils.scala:395)
	at org.apache.kyuubi.operation.AbstractOperation.withLockRequired(AbstractOperation.scala:51)
	at org.apache.kyuubi.engine.spark.operation.SparkOperation$$anonfun$onError$1.applyOrElse(SparkOperation.scala:177)
	at org.apache.kyuubi.engine.spark.operation.SparkOperation$$anonfun$onError$1.applyOrElse(SparkOperation.scala:172)
	at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:38)
	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.$anonfun$executeStatement$1(ExecuteStatement.scala:88)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.kyuubi.engine.spark.operation.SparkOperation.$anonfun$withLocalProperties$1(SparkOperation.scala:155)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:169)
	at org.apache.kyuubi.engine.spark.operation.SparkOperation.withLocalProperties(SparkOperation.scala:139)
	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.executeStatement(ExecuteStatement.scala:78)
	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement$$anon$1.run(ExecuteStatement.scala:100)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
	at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
	at java.base/java.lang.Thread.run(Unknown Source)
Caused by: java.lang.IllegalStateException: failed when load catalog iceberg_catalog
	at com.netease.arctic.catalog.CatalogLoader.load(CatalogLoader.java:160)
	at com.netease.arctic.catalog.CatalogLoader.loadCatalog(CatalogLoader.java:205)
	at com.netease.arctic.catalog.CatalogLoader.load(CatalogLoader.java:72)
	at com.netease.arctic.spark.ArcticSparkCatalog.initialize(ArcticSparkCatalog.java:424)
	at org.apache.spark.sql.connector.catalog.Catalogs$.load(Catalogs.scala:60)
	at org.apache.spark.sql.connector.catalog.CatalogManager.$anonfun$catalog$1(CatalogManager.scala:53)
	at scala.collection.mutable.HashMap.getOrElseUpdate(HashMap.scala:86)
	at org.apache.spark.sql.connector.catalog.CatalogManager.catalog(CatalogManager.scala:53)
	at org.apache.spark.sql.connector.catalog.LookupCatalog$CatalogAndNamespace$.unapply(LookupCatalog.scala:86)
	at org.apache.spark.sql.catalyst.analysis.ResolveCatalogs$$anonfun$apply$1.applyOrElse(ResolveCatalogs.scala:33)
	at org.apache.spark.sql.catalyst.analysis.ResolveCatalogs$$anonfun$apply$1.applyOrElse(ResolveCatalogs.scala:32)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$2(AnalysisHelper.scala:170)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:176)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$1(AnalysisHelper.scala:170)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:323)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning(AnalysisHelper.scala:168)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning$(AnalysisHelper.scala:164)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$4(AnalysisHelper.scala:175)
	at org.apache.spark.sql.catalyst.trees.UnaryLike.mapChildren(TreeNode.scala:1228)
	at org.apache.spark.sql.catalyst.trees.UnaryLike.mapChildren$(TreeNode.scala:1227)
	at org.apache.spark.sql.catalyst.plans.logical.SetCatalogAndNamespace.mapChildren(v2Commands.scala:752)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$1(AnalysisHelper.scala:175)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:323)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning(AnalysisHelper.scala:168)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsDownWithPruning$(AnalysisHelper.scala:164)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsDownWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsWithPruning(AnalysisHelper.scala:99)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperatorsWithPruning$(AnalysisHelper.scala:96)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsWithPruning(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators(AnalysisHelper.scala:76)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.resolveOperators$(AnalysisHelper.scala:75)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:30)
	at org.apache.spark.sql.catalyst.analysis.ResolveCatalogs.apply(ResolveCatalogs.scala:32)
	at org.apache.spark.sql.catalyst.analysis.ResolveCatalogs.apply(ResolveCatalogs.scala:28)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$2(RuleExecutor.scala:211)
	at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
	at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
	at scala.collection.immutable.List.foldLeft(List.scala:91)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1(RuleExecutor.scala:208)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$execute$1$adapted(RuleExecutor.scala:200)
	at scala.collection.immutable.List.foreach(List.scala:431)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:200)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:231)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$execute$1(Analyzer.scala:227)
	at org.apache.spark.sql.catalyst.analysis.AnalysisContext$.withNewAnalysisContext(Analyzer.scala:173)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:227)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:188)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.$anonfun$executeAndTrack$1(RuleExecutor.scala:179)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:88)
	at org.apache.spark.sql.catalyst.rules.RuleExecutor.executeAndTrack(RuleExecutor.scala:179)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:212)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:330)
	at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:211)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:76)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$2(QueryExecution.scala:185)
	at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:510)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:185)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:184)
	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:76)
	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:74)
	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:66)
	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:97)
	at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:622)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:779)
	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:617)
	at org.apache.kyuubi.engine.spark.operation.ExecuteStatement.$anonfun$executeStatement$1(ExecuteStatement.scala:83)
	... 11 more
Caused by: java.lang.RuntimeException: run with ugi request failed.
	at com.netease.arctic.table.TableMetaStore.doAsUgi(TableMetaStore.java:370)
	at com.netease.arctic.table.TableMetaStore.lambda$doAs$0(TableMetaStore.java:345)
	at java.base/java.security.AccessController.doPrivileged(Native Method)
	at java.base/javax.security.auth.Subject.doAs(Unknown Source)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1822)
	at com.netease.arctic.table.TableMetaStore.doAs(TableMetaStore.java:345)
	at com.netease.arctic.mixed.BasicMixedIcebergCatalog.initialize(BasicMixedIcebergCatalog.java:97)
	at com.netease.arctic.mixed.BasicMixedIcebergCatalog.initialize(BasicMixedIcebergCatalog.java:84)
	at com.netease.arctic.catalog.CatalogLoader.load(CatalogLoader.java:155)
	... 81 more
Caused by: java.lang.VerifyError: Stack map does not match the one at exception handler 20
Exception Details:
  Location:
    com/netease/arctic/shade/org/apache/iceberg/hive/HiveCatalog.alterHiveDataBase(Lcom/netease/arctic/shade/org/apache/iceberg/catalog/Namespace;Lorg/apache/hadoop/hive/metastore/api/Database;)V @20: astore_3
  Reason:
    Type 'org/apache/hadoop/hive/metastore/api/NoSuchObjectException' (current frame, stack[0]) is not assignable to 'com/netease/arctic/shade/org/apache/thrift/TException' (stack map, stack[0])
  Current Frame:
    bci: @0
    flags: { }
    locals: { 'com/netease/arctic/shade/org/apache/iceberg/hive/HiveCatalog', 'com/netease/arctic/shade/org/apache/iceberg/catalog/Namespace', 'org/apache/hadoop/hive/metastore/api/Database' }
    stack: { 'org/apache/hadoop/hive/metastore/api/NoSuchObjectException' }
  Stackmap Frame:
    bci: @20
    flags: { }
    locals: { 'com/netease/arctic/shade/org/apache/iceberg/hive/HiveCatalog', 'com/netease/arctic/shade/org/apache/iceberg/catalog/Namespace', 'org/apache/hadoop/hive/metastore/api/Database' }
    stack: { 'com/netease/arctic/shade/org/apache/thrift/TException' }
  Bytecode:
    0000000: 2ab4 00a9 2b2c ba02 9900 00b9 00dd 0200
    0000010: 57a7 0066 4ebb 0125 592d 1301 2704 bd01
    0000020: 2959 032b 53b7 029a bf4e bb01 2e59 bb01
    0000030: 3059 b701 3113 029c b601 372b b601 3a13
    0000040: 0213 b601 37b6 013e 2db7 0141 bf4e b801
    0000050: 47b6 014a bb01 2e59 bb01 3059 b701 3113
    0000060: 029e b601 372b b601 3a13 0213 b601 37b6
    0000070: 013e 2db7 0141 bfb1                    
  Exception Handler Table:
    bci [0, 17] => handler: 20
    bci [0, 17] => handler: 20
    bci [0, 17] => handler: 41
    bci [0, 17] => handler: 77
  Stackmap Table:
    same_locals_1_stack_item_frame(@20,Object[#179])
    same_locals_1_stack_item_frame(@41,Object[#179])
    same_locals_1_stack_item_frame(@77,Object[#181])
    same_frame(@119)

	at java.base/java.lang.Class.forName0(Native Method)
	at java.base/java.lang.Class.forName(Unknown Source)
	at com.netease.arctic.shade.org.apache.iceberg.common.DynConstructors$Builder.impl(DynConstructors.java:149)
	at com.netease.arctic.shade.org.apache.iceberg.CatalogUtil.loadCatalog(CatalogUtil.java:221)
	at com.netease.arctic.shade.org.apache.iceberg.CatalogUtil.buildIcebergCatalog(CatalogUtil.java:284)
	at com.netease.arctic.mixed.BasicMixedIcebergCatalog.lambda$initialize$0(BasicMixedIcebergCatalog.java:97)
	at com.netease.arctic.table.TableMetaStore.doAsUgi(TableMetaStore.java:365)
	... 89 more

	at org.apache.kyuubi.KyuubiSQLException$.apply(KyuubiSQLException.scala:69)
	at org.apache.kyuubi.operation.ExecuteStatement.waitStatementComplete(ExecuteStatement.scala:129)
	at org.apache.kyuubi.operation.ExecuteStatement.$anonfun$runInternal$1(ExecuteStatement.scala:161)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

	at org.apache.kyuubi.jdbc.hive.KyuubiStatement.waitForOperationToComplete(KyuubiStatement.java:350)
	at org.apache.kyuubi.jdbc.hive.KyuubiStatement.executeWithConfOverlay(KyuubiStatement.java:196)
	at org.apache.kyuubi.jdbc.hive.KyuubiStatement.execute(KyuubiStatement.java:190)
	at com.netease.arctic.server.terminal.kyuubi.KyuubiSession.execute(KyuubiSession.java:109)
	... 16 more

Anything else

This happens every time.
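The `VerifyError` in the log points at a shading problem: the relocated copy of `org.apache.thrift.TException` (under `com.netease.arctic.shade.`) is a different class from the original `TException` that the non-relocated `org.apache.hadoop.hive.metastore.api.NoSuchObjectException` extends, so the bytecode verifier rejects the shaded `HiveCatalog.alterHiveDataBase` exception handler. A minimal sketch with hypothetical stand-in classes (not the real types) shows the hierarchy split:

```java
// Hypothetical stand-ins illustrating the hierarchy split caused by shading.
// In the failing jar, NoSuchObjectException still extends the ORIGINAL TException,
// while the shaded HiveCatalog's catch handler expects the RELOCATED TException.
class TException extends Exception {}             // stands in for org.apache.thrift.TException
class ShadedTException extends Exception {}       // stands in for the relocated copy; an unrelated type
class NoSuchObjectException extends TException {} // extends the original, not the relocated one

public class ShadeMismatchDemo {
    public static void main(String[] args) {
        Exception e = new NoSuchObjectException();
        System.out.println(e instanceof TException);       // true: original hierarchy is intact
        System.out.println(e instanceof ShadedTException); // false: the mismatch the verifier reports
    }
}
```

In the failing jar this mismatch is caught at class verification time, which is why it surfaces as a `VerifyError` when the class is first loaded rather than as a runtime `ClassCastException`.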

Are you willing to submit a PR?

  • Yes, I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct
@GavinH1984 GavinH1984 added the type:bug Something isn't working label Sep 12, 2023
@GavinH1984 GavinH1984 changed the title [Bug]: Mix-iceberg format with external catalog not working on spark3.3 [Bug]: Mix-iceberg format with external catalog not working on spark Sep 12, 2023
@GavinH1984 GavinH1984 changed the title [Bug]: Mix-iceberg format with external catalog not working on spark [Bug]: Mix-iceberg format with hive metastore not working on spark Sep 12, 2023
@baiyangtx baiyangtx mentioned this issue Sep 13, 2023
56 tasks