
create table writeTo() operation doesn't support "partitionedBy" method #4079

@ekin4l

Description


Hi,

I followed the documentation example and encountered an error when writing a DataFrame to a table.

For example, shopDF is a DataFrame containing some shop info; when writing it to an Iceberg table, the following call throws an error:

shopDF.writeTo("hive_prod.icedb.t_shop").tableProperty("write.format.default","parquet").partitionBy($"city").createOrReplace()

:24: error: value partitionBy is not a member of org.apache.spark.sql.CreateTableWriter[org.apache.spark.sql.Row]
shopDF.writeTo("hive_prod.icedb.t_shop").tableProperty("write.format.default","parquet").partitionBy($"city").createOrReplace()

Thanks.

-------- version information --------
spark : 3.2.1
iceberg : org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.13.0
java : 1.8
