What happened?
The basic problem is:

- The type for `ArrayMaximalIndependentSet` is `TArray(...)`, where `...` is whatever the node type is.
- We choose a PType based on the Type.
- We choose an SType based on the PType.
- `unwrapReturn` makes an incorrect assumption about which SType corresponds to a `TArray(TString)`.

In particular, `Code.invokeScalaObject1[UnsafeIndexedSeq, IndexedSeq[Any]](Graph.getClass, "maximalIndependentSet", jEdges)` returns a Java array of whatever `svalueToJavaValue` returns. In this case, that's a `String[]`, whose SType we call `SJavaArrayString`. However, the SType chosen for `TArray(TString)` is `SIndexablePointer(SBinary)`.
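To make the mismatch concrete, here is a minimal, self-contained Scala sketch (not Hail's actual code; the SType hierarchy is reduced to hypothetical stand-in case classes) of what happens when `unwrapReturn` assumes the value is an `SJavaArrayString` while the emitter actually produced an `SIndexablePointer`:

```scala
// Hypothetical, stripped-down stand-ins for Hail's SType hierarchy.
sealed trait SType
case class SJavaArrayString(elementsRequired: Boolean) extends SType
case class SIndexablePointer(description: String) extends SType

object UnwrapReturnSketch {
  // Models the Type -> PType -> SType chain described above: for TArray(TString)
  // the emitter chooses an SIndexablePointer, not an SJavaArrayString.
  def stypeChosenForTArrayOfString: SType =
    SIndexablePointer("PCanonicalArray(PCanonicalString)")

  // Models unwrapReturn's incorrect assumption: it casts the runtime SType to
  // SJavaArrayString, which throws when the value is actually an SIndexablePointer.
  def unwrapReturn(st: SType): SJavaArrayString =
    st.asInstanceOf[SJavaArrayString] // ClassCastException, as in the stack trace below

  def main(args: Array[String]): Unit =
    unwrapReturn(stypeChosenForTArrayOfString)
}
```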
I think the real fix here is to just pass region pointers into `MaximalIndependentSet`: get an `elementIterator` from the `PCanonicalArray` and use `loadElement`, etc., to populate the `Graph`.
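As a rough illustration of that direction (using simplified stand-ins rather than Hail's real `PCanonicalArray` / `Graph` APIs, so every name below is hypothetical), the idea is to walk the region-backed array element by element and feed the loaded node pairs straight into the graph, instead of round-tripping through a `String[]`:

```scala
import scala.collection.mutable

// Hypothetical stand-in for a region-backed array: elements are loaded on demand
// rather than materialized as a String[] up front.
final class RegionArraySketch(values: IndexedSeq[String]) {
  def length: Int = values.length
  def loadElement(i: Int): String = values(i) // stand-in for PCanonicalArray.loadElement
  def elementIterator: Iterator[String] = values.iterator
}

object MaximalIndependentSetSketch {
  // Populate an adjacency map directly from loaded elements, pairing up
  // consecutive (i, j) node entries (a simplified edge layout for illustration),
  // rather than first converting the whole array to a JVM String[].
  def buildGraph(edges: RegionArraySketch): mutable.Map[String, mutable.Set[String]] = {
    val g = mutable.Map.empty[String, mutable.Set[String]]
    var k = 0
    while (k + 1 < edges.length) {
      val i = edges.loadElement(k)
      val j = edges.loadElement(k + 1)
      g.getOrElseUpdate(i, mutable.Set.empty) += j
      g.getOrElseUpdate(j, mutable.Set.empty) += i
      k += 2
    }
    g
  }

  def main(args: Array[String]): Unit = {
    val edges = new RegionArraySketch(Vector("A", "B", "A", "C", "D", "E"))
    println(buildGraph(edges))
  }
}
```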
In [1]: import hail as hl
...: ht = hl.Table.parallelize([hl.Struct(i='A', j='B', kin=0.25), hl.Struct(i='A', j='C', kin=0.25), hl.Struct(i='D', j='E', kin=0.5)])
...: hl.maximal_independent_set(ht.i, ht.j, False).collect()
---------------------------------------------------------------------------
FatalError Traceback (most recent call last)
Cell In[1], line 3
1 import hail as hl
2 ht = hl.Table.parallelize([hl.Struct(i='A', j='B', kin=0.25), hl.Struct(i='A', j='C', kin=0.25), hl.Struct(i='D', j='E', kin=0.5)])
----> 3 hl.maximal_independent_set(ht.i, ht.j, False).collect()
File <decorator-gen-1148>:2, in collect(self, _localize, _timed)
File ~/miniconda3/lib/python3.10/site-packages/hail/typecheck/check.py:584, in _make_dec.<locals>.wrapper(__original_func, *args, **kwargs)
581 @decorator
582 def wrapper(__original_func, *args, **kwargs):
583 args_, kwargs_ = check_all(__original_func, args, kwargs, checkers, is_method=is_method)
--> 584 return __original_func(*args_, **kwargs_)
File ~/miniconda3/lib/python3.10/site-packages/hail/table.py:2162, in Table.collect(self, _localize, _timed)
2160 e = construct_expr(rows_ir, hl.tarray(t.row.dtype))
2161 if _localize:
-> 2162 return Env.backend().execute(e._ir, timed=_timed)
2163 else:
2164 return e
File ~/miniconda3/lib/python3.10/site-packages/hail/backend/py4j_backend.py:82, in Py4JBackend.execute(self, ir, timed)
80 return (value, timings) if timed else value
81 except FatalError as e:
---> 82 raise e.maybe_user_error(ir) from None
File ~/miniconda3/lib/python3.10/site-packages/hail/backend/py4j_backend.py:76, in Py4JBackend.execute(self, ir, timed)
74 # print(self._hail_package.expr.ir.Pretty.apply(jir, True, -1))
75 try:
---> 76 result_tuple = self._jbackend.executeEncode(jir, stream_codec, timed)
77 (result, timings) = (result_tuple._1(), result_tuple._2())
78 value = ir.typ._from_encoding(result)
File ~/miniconda3/lib/python3.10/site-packages/py4j/java_gateway.py:1321, in JavaMember.__call__(self, *args)
1315 command = proto.CALL_COMMAND_NAME +\
1316 self.command_header +\
1317 args_command +\
1318 proto.END_COMMAND_PART
1320 answer = self.gateway_client.send_command(command)
-> 1321 return_value = get_return_value(
1322 answer, self.gateway_client, self.target_id, self.name)
1324 for temp_arg in temp_args:
1325 temp_arg._detach()
File ~/miniconda3/lib/python3.10/site-packages/hail/backend/py4j_backend.py:35, in handle_java_exception.<locals>.deco(*args, **kwargs)
33 tpl = Env.jutils().handleForPython(e.java_exception)
34 deepest, full, error_id = tpl._1(), tpl._2(), tpl._3()
---> 35 raise fatal_error_from_java_error_triplet(deepest, full, error_id) from None
36 except pyspark.sql.utils.CapturedException as e:
37 raise FatalError('%s\n\nJava stack trace:\n%s\n'
38 'Hail version: %s\n'
39 'Error summary: %s' % (e.desc, e.stackTrace, hail.__version__, e.desc)) from None
FatalError: ClassCastException: class is.hail.types.physical.stypes.concrete.SIndexablePointer cannot be cast to class is.hail.types.physical.stypes.concrete.SJavaArrayString (is.hail.types.physical.stypes.concrete.SIndexablePointer and is.hail.types.physical.stypes.concrete.SJavaArrayString are in unnamed module of loader 'app')
Java stack trace:
java.lang.ClassCastException: class is.hail.types.physical.stypes.concrete.SIndexablePointer cannot be cast to class is.hail.types.physical.stypes.concrete.SJavaArrayString (is.hail.types.physical.stypes.concrete.SIndexablePointer and is.hail.types.physical.stypes.concrete.SJavaArrayString are in unnamed module of loader 'app')
at is.hail.expr.ir.functions.RegistryFunctions.unwrapReturn(Functions.scala:364)
at is.hail.expr.ir.Emit.$anonfun$emitI$85(Emit.scala:1173)
at is.hail.expr.ir.IEmitCodeGen.map(Emit.scala:352)
at is.hail.expr.ir.Emit.emitI(Emit.scala:1153)
at is.hail.expr.ir.streams.EmitStream$.is$hail$expr$ir$streams$EmitStream$$emit$1(EmitStream.scala:148)
at is.hail.expr.ir.streams.EmitStream$.produce(EmitStream.scala:321)
at is.hail.expr.ir.Emit.emitStream$2(Emit.scala:821)
at is.hail.expr.ir.Emit.emitI(Emit.scala:1177)
at is.hail.expr.ir.Emit.$anonfun$emitSplitMethod$1(Emit.scala:607)
at is.hail.expr.ir.Emit.$anonfun$emitSplitMethod$1$adapted(Emit.scala:605)
at is.hail.expr.ir.EmitCodeBuilder$.scoped(EmitCodeBuilder.scala:19)
at is.hail.expr.ir.EmitCodeBuilder$.scopedVoid(EmitCodeBuilder.scala:29)
at is.hail.expr.ir.EmitMethodBuilder.voidWithBuilder(EmitClassBuilder.scala:1086)
at is.hail.expr.ir.Emit.emitSplitMethod(Emit.scala:605)
at is.hail.expr.ir.Emit.emitInSeparateMethod(Emit.scala:622)
at is.hail.expr.ir.Emit.emitI(Emit.scala:809)
at is.hail.expr.ir.Emit.emitInNewBuilder$1(Emit.scala:818)
at is.hail.expr.ir.Emit.$anonfun$emitI$33(Emit.scala:979)
at is.hail.expr.ir.EmitCode$.fromI(Emit.scala:461)
at is.hail.expr.ir.Emit.$anonfun$emitI$32(Emit.scala:979)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.AbstractTraversable.map(Traversable.scala:108)
at is.hail.expr.ir.Emit.$anonfun$emitI$31(Emit.scala:978)
at is.hail.expr.ir.IEmitCodeGen.map(Emit.scala:352)
at is.hail.expr.ir.Emit.emitI(Emit.scala:977)
at is.hail.expr.ir.Emit.emitI$3(Emit.scala:2616)
at is.hail.expr.ir.Emit.$anonfun$emit$22(Emit.scala:2699)
at is.hail.expr.ir.EmitCode$.fromI(Emit.scala:461)
at is.hail.expr.ir.Emit.emit(Emit.scala:2698)
at is.hail.expr.ir.Emit.emit$2(Emit.scala:2613)
at is.hail.expr.ir.Emit.$anonfun$emit$6(Emit.scala:2639)
at is.hail.expr.ir.EmitCode$.fromI(Emit.scala:461)
at is.hail.expr.ir.Emit.emit(Emit.scala:2638)
at is.hail.expr.ir.Emit.emitFallback$1(Emit.scala:827)
at is.hail.expr.ir.Emit.emitI(Emit.scala:2537)
at is.hail.expr.ir.Emit.emitI$3(Emit.scala:2616)
at is.hail.expr.ir.Emit.$anonfun$emit$7(Emit.scala:2641)
at is.hail.expr.ir.EmitCodeBuilder.withScopedMaybeStreamValue(EmitCodeBuilder.scala:183)
at is.hail.expr.ir.Emit.$anonfun$emit$6(Emit.scala:2640)
at is.hail.expr.ir.EmitCode$.fromI(Emit.scala:461)
at is.hail.expr.ir.Emit.emit(Emit.scala:2638)
at is.hail.expr.ir.Emit.emitFallback$1(Emit.scala:827)
at is.hail.expr.ir.Emit.emitI(Emit.scala:2537)
at is.hail.expr.ir.Emit$.$anonfun$apply$5(Emit.scala:94)
at is.hail.expr.ir.EmitCodeBuilder$.scoped(EmitCodeBuilder.scala:19)
at is.hail.expr.ir.EmitCodeBuilder$.scopedCode(EmitCodeBuilder.scala:24)
at is.hail.expr.ir.EmitMethodBuilder.emitWithBuilder(EmitClassBuilder.scala:1084)
at is.hail.expr.ir.WrappedEmitMethodBuilder.emitWithBuilder(EmitClassBuilder.scala:1141)
at is.hail.expr.ir.WrappedEmitMethodBuilder.emitWithBuilder$(EmitClassBuilder.scala:1141)
at is.hail.expr.ir.EmitFunctionBuilder.emitWithBuilder(EmitClassBuilder.scala:1157)
at is.hail.expr.ir.Emit$.apply(Emit.scala:91)
at is.hail.expr.ir.Compile$.$anonfun$apply$4(Compile.scala:74)
at is.hail.backend.BackendWithCodeCache.lookupOrCompileCachedFunction(Backend.scala:125)
at is.hail.backend.BackendWithCodeCache.lookupOrCompileCachedFunction$(Backend.scala:121)
at is.hail.backend.spark.SparkBackend.lookupOrCompileCachedFunction(SparkBackend.scala:273)
at is.hail.expr.ir.Compile$.apply(Compile.scala:40)
at is.hail.expr.ir.lowering.TableStageToRVD$.apply(RVDToTableStage.scala:112)
at is.hail.backend.spark.SparkBackend.lowerDistributedSort(SparkBackend.scala:689)
at is.hail.backend.Backend.lowerDistributedSort(Backend.scala:110)
at is.hail.expr.ir.lowering.LowerAndExecuteShuffles$.$anonfun$apply$1(LowerAndExecuteShuffles.scala:23)
at is.hail.expr.ir.RewriteBottomUp$.$anonfun$apply$2(RewriteBottomUp.scala:11)
at is.hail.utils.StackSafe$More.advance(StackSafe.scala:60)
at is.hail.utils.StackSafe$.run(StackSafe.scala:16)
at is.hail.utils.StackSafe$StackFrame.run(StackSafe.scala:32)
at is.hail.expr.ir.RewriteBottomUp$.apply(RewriteBottomUp.scala:21)
at is.hail.expr.ir.lowering.LowerAndExecuteShuffles$.apply(LowerAndExecuteShuffles.scala:20)
at is.hail.expr.ir.lowering.LowerAndExecuteShufflesPass.transform(LoweringPass.scala:157)
at is.hail.expr.ir.lowering.LoweringPass.$anonfun$apply$3(LoweringPass.scala:16)
at is.hail.utils.ExecutionTimer.time(ExecutionTimer.scala:81)
at is.hail.expr.ir.lowering.LoweringPass.$anonfun$apply$1(LoweringPass.scala:16)
at is.hail.utils.ExecutionTimer.time(ExecutionTimer.scala:81)
at is.hail.expr.ir.lowering.LoweringPass.apply(LoweringPass.scala:14)
at is.hail.expr.ir.lowering.LoweringPass.apply$(LoweringPass.scala:13)
at is.hail.expr.ir.lowering.LowerAndExecuteShufflesPass.apply(LoweringPass.scala:151)
at is.hail.expr.ir.lowering.LoweringPipeline.$anonfun$apply$1(LoweringPipeline.scala:22)
at is.hail.expr.ir.lowering.LoweringPipeline.$anonfun$apply$1$adapted(LoweringPipeline.scala:20)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at is.hail.expr.ir.lowering.LoweringPipeline.apply(LoweringPipeline.scala:20)
at is.hail.expr.ir.lowering.EvalRelationalLets$.execute$1(EvalRelationalLets.scala:10)
at is.hail.expr.ir.lowering.EvalRelationalLets$.lower$1(EvalRelationalLets.scala:18)
at is.hail.expr.ir.lowering.EvalRelationalLets$.apply(EvalRelationalLets.scala:32)
at is.hail.expr.ir.lowering.EvalRelationalLetsPass.transform(LoweringPass.scala:147)
at is.hail.expr.ir.lowering.LoweringPass.$anonfun$apply$3(LoweringPass.scala:16)
at is.hail.utils.ExecutionTimer.time(ExecutionTimer.scala:81)
at is.hail.expr.ir.lowering.LoweringPass.$anonfun$apply$1(LoweringPass.scala:16)
at is.hail.utils.ExecutionTimer.time(ExecutionTimer.scala:81)
at is.hail.expr.ir.lowering.LoweringPass.apply(LoweringPass.scala:14)
at is.hail.expr.ir.lowering.LoweringPass.apply$(LoweringPass.scala:13)
at is.hail.expr.ir.lowering.EvalRelationalLetsPass.apply(LoweringPass.scala:141)
at is.hail.expr.ir.lowering.LoweringPipeline.$anonfun$apply$1(LoweringPipeline.scala:22)
at is.hail.expr.ir.lowering.LoweringPipeline.$anonfun$apply$1$adapted(LoweringPipeline.scala:20)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at is.hail.expr.ir.lowering.LoweringPipeline.apply(LoweringPipeline.scala:20)
at is.hail.expr.ir.CompileAndEvaluate$._apply(CompileAndEvaluate.scala:50)
at is.hail.backend.spark.SparkBackend._execute(SparkBackend.scala:462)
at is.hail.backend.spark.SparkBackend.$anonfun$executeEncode$2(SparkBackend.scala:498)
at is.hail.backend.ExecuteContext$.$anonfun$scoped$3(ExecuteContext.scala:75)
at is.hail.utils.package$.using(package.scala:635)
at is.hail.backend.ExecuteContext$.$anonfun$scoped$2(ExecuteContext.scala:75)
at is.hail.utils.package$.using(package.scala:635)
at is.hail.annotations.RegionPool$.scoped(RegionPool.scala:17)
at is.hail.backend.ExecuteContext$.scoped(ExecuteContext.scala:63)
at is.hail.backend.spark.SparkBackend.withExecuteContext(SparkBackend.scala:350)
at is.hail.backend.spark.SparkBackend.$anonfun$executeEncode$1(SparkBackend.scala:495)
at is.hail.utils.ExecutionTimer$.time(ExecutionTimer.scala:52)
at is.hail.backend.spark.SparkBackend.executeEncode(SparkBackend.scala:494)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
at java.base/java.lang.Thread.run(Thread.java:829)
Hail version: 0.2.120-f00f916faf78
Error summary: ClassCastException: class is.hail.types.physical.stypes.concrete.SIndexablePointer cannot be cast to class is.hail.types.physical.stypes.concrete.SJavaArrayString (is.hail.types.physical.stypes.concrete.SIndexablePointer and is.hail.types.physical.stypes.concrete.SJavaArrayString are in unnamed module of loader 'app')
Version
0.2.122
Relevant log output
No response