[Bug] failed to insert data to hive partition table if using INSERT INTO
for spark-hive-connector
#6078
Closed
Comments
@yikf do you have time to take a look?

Writing with Apache Spark DataSourceV2 uses dynamic partitioning to handle static partitions, so the exception reported in this issue should be a bug in KSHC. I will take a look.
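The distinction the comment draws between static and dynamic partition writes can be sketched in Spark SQL. This is a minimal sketch, assuming an active `SparkSession` named `spark`; the table names `t`, `src` and their columns are hypothetical, not taken from the issue:

```scala
// Sketch only: assumes an existing SparkSession `spark`, a table `t`
// partitioned by `p`, and a source table `src` (all hypothetical).

// Static partition insert: the partition value is fixed in the statement.
spark.sql("INSERT INTO t PARTITION (p = '2024-03-01') SELECT id FROM src")

// Dynamic partition insert: the partition value comes from the query output.
// The DataSourceV2 write path routes static-partition writes through this
// dynamic-partitioning machinery, which is where the KSHC commit bug surfaced.
spark.sql("INSERT INTO t PARTITION (p) SELECT id, dt AS p FROM src")
```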
pan3793
pushed a commit
that referenced
this issue
Mar 7, 2024
… as dynamic partition at write path

# 🔍 Description

## Issue References 🔗

This pull request fixes #6078. KSHC should handle the commit of a partitioned table as a dynamic partition at the write path, because the Apache Spark DataSourceV2 write path uses dynamic partitioning to handle static partitions.

## Types of changes 🔖

- [x] Bugfix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)

# Checklist 📝

- [x] This patch was not authored or co-authored using [Generative Tooling](https://www.apache.org/legal/generative-tooling.html)

**Be nice. Be informative.**

Closes #6082 from Yikf/KYUUBI-6078.

Closes #6078

2ae1836 [yikaifei] KSHC should handle the commit of the partitioned table as dynamic partition at write path

Authored-by: yikaifei <yikaifei@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit 5bee05e)
Signed-off-by: Cheng Pan <chengpan@apache.org>
zhaohehuhu
pushed a commit
to zhaohehuhu/incubator-kyuubi
that referenced
this issue
Mar 21, 2024
(Cherry-pick of the same fix; the commit message is identical to the commit above, with issue references prefixed `apache#`.)
Code of Conduct
Search before asking
Describe the bug
Failed to insert data into a Hive partitioned table when using `INSERT INTO` with the spark-hive-connector. Can be reproduced by replacing `INSERT OVERWRITE` with `INSERT INTO` in HiveQuerySuite.scala.

Affects Version(s)

master
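A standalone reproduction along the lines the report describes might look like the following. This is a hedged sketch, not the actual HiveQuerySuite test: the catalog name and the `spark.sql.catalog.*` registration mirror how a DataSourceV2 catalog such as KSHC is typically wired up, but the exact class name and config key are assumptions here.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical reproduction of the reported failure; requires a Spark
// distribution with the Kyuubi Spark Hive connector (KSHC) on the classpath.
object InsertIntoRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kshc-insert-into-repro")
      .master("local[1]")
      // Register KSHC as a DataSourceV2 catalog; the class name below is
      // an assumption for illustration, not taken from the issue.
      .config("spark.sql.catalog.hive",
        "org.apache.kyuubi.spark.connector.hive.HiveTableCatalog")
      .getOrCreate()

    spark.sql("CREATE TABLE hive.default.t (id INT) PARTITIONED BY (p STRING)")
    // Per the report, INSERT OVERWRITE succeeds on the affected versions,
    // but the INSERT INTO form below fails at the write path's commit step.
    spark.sql("INSERT INTO hive.default.t PARTITION (p = 'a') SELECT 1")
    spark.stop()
  }
}
```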
Kyuubi Server Log Output
No response
Kyuubi Engine Log Output
Kyuubi Server Configurations
No response
Kyuubi Engine Configurations
No response
Additional context
No response
Are you willing to submit PR?