ERROR Seatunnel: Exception StackTrace:java.lang.RuntimeException: Execute Flink task error
at org.apache.seatunnel.core.flink.command.FlinkTaskExecuteCommand.execute(FlinkTaskExecuteCommand.java:84)
at org.apache.seatunnel.core.base.Seatunnel.run(Seatunnel.java:39)
at org.apache.seatunnel.example.flink.LocalFlinkExample.main(LocalFlinkExample.java:40)
Caused by: org.apache.flink.table.api.TableException: Type is not supported: BigInteger
at org.apache.flink.table.calcite.FlinkTypeFactory$.org$apache$flink$table$calcite$FlinkTypeFactory$$typeInfoToSqlTypeName(FlinkTypeFactory.scala:389)
at org.apache.flink.table.calcite.FlinkTypeFactory.createTypeFromTypeInfo(FlinkTypeFactory.scala:67)
at org.apache.flink.table.calcite.FlinkTypeFactory$$anonfun$buildLogicalRowType$1.apply(FlinkTypeFactory.scala:212)
at org.apache.flink.table.calcite.FlinkTypeFactory$$anonfun$buildLogicalRowType$1.apply(FlinkTypeFactory.scala:202)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at org.apache.flink.table.calcite.FlinkTypeFactory.buildLogicalRowType(FlinkTypeFactory.scala:202)
at org.apache.flink.table.calcite.FlinkTypeFactory.buildLogicalRowType(FlinkTypeFactory.scala:185)
at org.apache.flink.table.catalog.QueryOperationCatalogViewTable.lambda$createCalciteTable$3(QueryOperationCatalogViewTable.java:69)
at org.apache.flink.table.catalog.QueryOperationCatalogViewTable.getRowType(QueryOperationCatalogViewTable.java:114)
at org.apache.calcite.prepare.CalciteCatalogReader.getTable(CalciteCatalogReader.java:130)
at org.apache.calcite.prepare.CalciteCatalogReader.getTableForMember(CalciteCatalogReader.java:229)
at org.apache.calcite.prepare.CalciteCatalogReader.getTableForMember(CalciteCatalogReader.java:83)
at org.apache.calcite.tools.RelBuilder.scan(RelBuilder.java:1094)
at org.apache.calcite.tools.RelBuilder.scan(RelBuilder.java:1123)
at org.apache.flink.table.plan.QueryOperationConverter$SingleRelVisitor.visit(QueryOperationConverter.java:284)
at org.apache.flink.table.plan.QueryOperationConverter$SingleRelVisitor.visit(QueryOperationConverter.java:140)
at org.apache.flink.table.operations.CatalogQueryOperation.accept(CatalogQueryOperation.java:68)
at org.apache.flink.table.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:137)
at org.apache.flink.table.plan.QueryOperationConverter.defaultMethod(QueryOperationConverter.java:118)
at org.apache.flink.table.operations.utils.QueryOperationDefaultVisitor.visit(QueryOperationDefaultVisitor.java:92)
at org.apache.flink.table.operations.CatalogQueryOperation.accept(CatalogQueryOperation.java:68)
at org.apache.flink.table.calcite.FlinkRelBuilder.tableOperation(FlinkRelBuilder.scala:121)
at org.apache.flink.table.api.internal.BatchTableEnvImpl.translate(BatchTableEnvImpl.scala:568)
at org.apache.flink.table.api.internal.BatchTableEnvImpl.translate(BatchTableEnvImpl.scala:552)
at org.apache.flink.table.api.bridge.java.internal.BatchTableEnvironmentImpl.toDataSet(BatchTableEnvironmentImpl.scala:106)
at org.apache.seatunnel.flink.util.TableUtil.tableToDataSet(TableUtil.java:50)
at org.apache.seatunnel.flink.batch.FlinkBatchExecution.fromSourceTable(FlinkBatchExecution.java:105)
at org.apache.seatunnel.flink.batch.FlinkBatchExecution.start(FlinkBatchExecution.java:70)
at org.apache.seatunnel.core.flink.command.FlinkTaskExecuteCommand.execute(FlinkTaskExecuteCommand.java:81)
... 2 more
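Reading the trace: with result_table_name set, SeaTunnel registers the source output as a table and later converts it back to a DataSet (FlinkBatchExecution.fromSourceTable -> TableUtil.tableToDataSet -> BatchTableEnvironmentImpl.toDataSet). That round trip pushes the row type through the legacy planner's FlinkTypeFactory, which has no SQL type mapping for java.math.BigInteger, hence the TableException. This also matches the observation below: without result_table_name the table round trip is skipped and the unsupported field type is never examined.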
Search before asking
I had searched in the issues and found no similar issues.
What happened
A TableException is thrown when running the SeaTunnel config below.
If I remove the result_table_name parameter, the job runs fine.
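A BigInteger field most commonly comes from a JDBC source reading an unsigned 64-bit column (for example MySQL BIGINT UNSIGNED, which Connector/J surfaces as java.math.BigInteger). Assuming that is the case here, a possible workaround besides dropping result_table_name is to cast the column in the source query, e.g. query = "select cast(id as decimal(20,0)) as id from users" (column and table names hypothetical), so the field arrives as BigDecimal, which the legacy planner does support (mapped to DECIMAL).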
SeaTunnel Version
2.1.2
SeaTunnel Config
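The config was not preserved in this report. As an illustration only, here is a minimal sketch of the shape of job that can hit this error, assuming a JdbcSource reading a MySQL BIGINT UNSIGNED column (the connection values and the users table with its id column are all hypothetical):

env {
  execution.parallelism = 1
}

source {
  JdbcSource {
    driver = "com.mysql.cj.jdbc.Driver"
    url = "jdbc:mysql://localhost:3306/test"
    username = "root"
    password = "pass"
    # hypothetical table: id is BIGINT UNSIGNED, which the JDBC driver maps to java.math.BigInteger
    query = "select id, name from users"
    # setting result_table_name triggers the table registration/round trip that fails
    result_table_name = "users"
  }
}

sink {
  ConsoleSink {}
}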
Running Command
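The stack trace shows the job being launched from the bundled example runner, org.apache.seatunnel.example.flink.LocalFlinkExample.main (seatunnel-examples module), rather than from the seatunnel CLI scripts.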
Error Exception
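See the full stack trace reproduced at the top of this report.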
Flink or Spark Version
Flink 1.13
Java or Scala Version
Java 1.8
Screenshots
No response
Are you willing to submit PR?
Code of Conduct
I agree to follow this project's Code of Conduct.