Stack overflow using bulk_insert - hoodie.datasource.write.operation #14946

@hudi-bot

Description

Hi

When I try to use bulk_insert with a large number of fields in the schema, I get a StackOverflowError.

This doesn't happen when I use insert or upsert.

I have attached a sample main class to reproduce the error, along with the stack trace.

please let me know if you need more information

Many thanks
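For context, here is a minimal sketch of the write options this report exercises. The table name and key fields below are hypothetical placeholders; the actual reproduction is in the linked StackOverflowErrorTest.java.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the Hudi datasource options involved in this report.
// "hoodie.datasource.write.operation" is the key from the issue title;
// the table/field values are illustrative assumptions, not from the report.
public class BulkInsertOptionsSketch {

    static Map<String, String> bulkInsertOptions() {
        Map<String, String> opts = new HashMap<>();
        // The operation that triggers the error, per the report
        // (insert and upsert reportedly work fine).
        opts.put("hoodie.datasource.write.operation", "bulk_insert");
        // Hypothetical table and key fields, for illustration only.
        opts.put("hoodie.table.name", "wide_schema_table");
        opts.put("hoodie.datasource.write.recordkey.field", "id");
        opts.put("hoodie.datasource.write.precombine.field", "ts");
        return opts;
    }

    public static void main(String[] args) {
        System.out.println(bulkInsertOptions().get("hoodie.datasource.write.operation"));
    }
}
```

These options would be passed via `DataFrameWriter.options(...)` on a dataset whose schema has a large number of fields.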

 

JIRA info


Comments

09/Dec/21 13:11, jrivas: https://github.com/zubipower/StackOverFlowHudiError/blob/main/StackOverflowErrorTest.java

The reproduction code is also in that repository.

I can provide the pom.xml and JVM settings if necessary.

This is running on JDK 8.

<scala.binary.version>2.12</scala.binary.version>
<spark.version>3.0.0</spark.version>
<hudi.version>0.9.0</hudi.version>
<avro.version>1.8.2</avro.version>
<hadoop.version>2.7.3</hadoop.version>
<parquet.version>1.10.1</parquet.version>
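Not suggested in the report itself, but a common mitigation for deep-recursion StackOverflowErrors during Spark/Avro schema handling is to enlarge the JVM thread stack. A sketch with spark-submit follows; the 4m value, the class name, and the jar path are illustrative assumptions, and this is not confirmed as a fix for this issue.

```shell
# Illustrative only: raise the driver and executor thread stack size to 4 MB.
# -Xss4m, the jar path, and the main class are assumptions for this sketch.
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Xss4m" \
  --conf "spark.executor.extraJavaOptions=-Xss4m" \
  --class StackOverflowErrorTest \
  target/app.jar
```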
