
[VL] cannot write Hive table on HDFS #5879

@RaoZhiRou-Z

Description


After adding the config "spark.gluten.sql.native.writer.enabled true", an error occurred: "The file path is not local when writing data with parquet format in velox runtime!"
Does Velox not support writing Hive tables on HDFS yet?

void VeloxParquetDatasource::init(const std::unordered_map<std::string, std::string>& sparkConfs) {
  if (strncmp(filePath_.c_str(), "file:", 5) == 0) {
    sink_ = dwio::common::FileSink::create(filePath_, {.pool = pool_.get()});
  } else {
    throw std::runtime_error(
        "The file path is not local when writing data with parquet format in velox runtime!");
  }
  // .........
}

Metadata

Labels: enhancement (New feature or request)