Our Spark integration can handle JavaBean-style classes, but a corner case currently keeps serialization from succeeding when the data looks like this:
- A JavaBean class at the root (call this BeanA.class)
- BeanA has a java.util.Map with a String key and a BeanB value type
- BeanB is a simple JavaBean
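For reference, a minimal pair of beans matching the shape described above might look like the following (the class and property names here are illustrative, not taken from the original report):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical beans reproducing the reported shape: BeanA holds a
// Map<String, BeanB> whose values are themselves JavaBeans.
class BeanA {
    private Map<String, BeanB> entries = new HashMap<>();

    public Map<String, BeanB> getEntries() { return entries; }
    public void setEntries(Map<String, BeanB> entries) { this.entries = entries; }
}

// A simple JavaBean used as the map's value type.
class BeanB {
    private String name;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}
```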
How this breaks:
1. The ScalaValueWriter sees BeanA and converts it into a Scala Map, which it passes back to itself.
2. The ScalaValueWriter sees the Map and passes the java.util.Map to itself.
3. The ScalaValueWriter cannot handle the java.util.Map and delegates it to the JavaValueWriter.
4. The JavaValueWriter sees a map and tries to serialize the first entry. It grabs the value from the current entry (a BeanB instance) and passes it to itself.
5. The JavaValueWriter doesn't know how to handle a JavaBean object and throws an error.
Instead, at step 4 the JavaValueWriter should pass the bean instance to the ScalaValueWriter rather than to itself. If the ScalaValueWriter cannot handle a value either, it would then re-delegate to the JavaValueWriter after trying itself. This may mean taking a minor performance hit to guarantee that all values supported by the Spark integration are accounted for.
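To make the proposed delegation concrete, here is a minimal sketch using hypothetical writer types (the real ScalaValueWriter/JavaValueWriter APIs in the codebase differ; this only models the fallback direction being proposed):

```java
import java.util.Map;

// Stand-in for a JavaBean value nested inside a map (the BeanB role).
class DemoBean {}

interface ValueWriter {
    // Returns true if this writer handled the value.
    boolean write(Object value, StringBuilder out);
}

// Hypothetical Java-side writer: knows Strings and Maps. The proposed
// change is that for map *values* it cannot handle, it hands them back
// to the Scala-side writer instead of recursing into itself and failing.
class JavaWriter implements ValueWriter {
    ValueWriter scalaFallback; // wired up after construction to break the cycle

    public boolean write(Object value, StringBuilder out) {
        if (value instanceof String) {
            out.append('"').append(value).append('"');
            return true;
        }
        if (value instanceof Map) {
            out.append('{');
            for (Map.Entry<?, ?> e : ((Map<?, ?>) value).entrySet()) {
                write(e.getKey(), out);
                out.append(':');
                // Proposed fix: delegate unknown values (e.g. a bean)
                // back to the Scala writer rather than to this.write(...).
                if (!scalaFallback.write(e.getValue(), out)) return false;
            }
            out.append('}');
            return true;
        }
        return false; // unknown type: let the caller decide
    }
}

// Hypothetical Scala-side writer: handles beans, and re-delegates anything
// else to the Java writer after trying itself -- the "minor performance
// hit" mentioned above is this extra dispatch.
class ScalaWriter implements ValueWriter {
    JavaWriter javaWriter;

    public boolean write(Object value, StringBuilder out) {
        if (looksLikeBean(value)) {
            out.append("<bean:").append(value.getClass().getSimpleName()).append('>');
            return true;
        }
        return javaWriter.write(value, out);
    }

    // Crude placeholder for real bean detection (introspection, etc.).
    private boolean looksLikeBean(Object v) {
        return !(v instanceof String) && !(v instanceof Map);
    }
}
```

In the current (broken) behavior, the equivalent of `scalaFallback.write(e.getValue(), out)` is a recursive `this.write(...)`, which has no bean handling and fails at step 5; routing the value through the Scala-side writer first is what lets the nested bean serialize.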