[SPARK-2540] [SQL] Add HiveDecimal & HiveVarchar support in unwrapping data #1436
Conversation
QA tests have started for PR 1436. This patch merges cleanly.
@@ -280,6 +280,16 @@ private[hive] case class HiveGenericUdf(name: String, children: Seq[Expression])

private[hive] trait HiveInspectors {

  def unwrapData(data: Any, oi: ObjectInspector): Any = oi match {
    case hvoi: HiveVarcharObjectInspector => if (data == null) {
Might be a better way to write this:
case hvoi: HiveVarcharObjectInspector =>
if (data == null) null else hvoi.getPrimitiveJavaObject(data).getValue
case hdoi: HiveDecimalObjectInspector =>
if (data == null) null else BigDecimal(hdoi.getPrimitiveJavaObject(data).bigDecimalValue())
Yeah, cool. :)
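Putting the reviewer's suggestion in context, the two new arms of `unwrapData` would look roughly like this. This is a sketch only, assuming Hive's `serde2` object-inspector classes are on the classpath; it is not quoted from the final merged diff:

```scala
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector
import org.apache.hadoop.hive.serde2.objectinspector.primitive.{
  HiveDecimalObjectInspector, HiveVarcharObjectInspector}

def unwrapData(data: Any, oi: ObjectInspector): Any = oi match {
  // HiveVarchar unwraps to its underlying String value; guard against null first
  case hvoi: HiveVarcharObjectInspector =>
    if (data == null) null else hvoi.getPrimitiveJavaObject(data).getValue
  // HiveDecimal unwraps to a Scala BigDecimal via its java.math.BigDecimal value
  case hdoi: HiveDecimalObjectInspector =>
    if (data == null) null
    else BigDecimal(hdoi.getPrimitiveJavaObject(data).bigDecimalValue())
  // ... existing arms for the other object inspectors remain unchanged
}
```

The null guard matters because Hive's `getPrimitiveJavaObject` would otherwise throw on null column values during table scans.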
QA tests have started for PR 1436. This patch merges cleanly.
QA results for PR 1436:
QA results for PR 1436:
Unit test actually failed.
Hey @chenghao-intel -- can you create a JIRA ticket for this?
JIRA ticket created: https://issues.apache.org/jira/browse/SPARK-2540
QA tests have started for PR 1436. This patch merges cleanly.
QA results for PR 1436:
@rxin I think this is ready to be merged.
Thanks! I've merged this into master and 1.0.
…g data Author: Cheng Hao <hao.cheng@intel.com> Closes #1436 from chenghao-intel/unwrapdata and squashes the following commits: 34cc21a [Cheng Hao] update the table scan accordingly since the unwrapData function changed afc39da [Cheng Hao] Polish the code 39d6475 [Cheng Hao] Add HiveDecimal & HiveVarchar support in unwrap data (cherry picked from commit 7f17208) Signed-off-by: Michael Armbrust <michael@databricks.com>
…g data Author: Cheng Hao <hao.cheng@intel.com> Closes apache#1436 from chenghao-intel/unwrapdata and squashes the following commits: 34cc21a [Cheng Hao] update the table scan accordingly since the unwrapData function changed afc39da [Cheng Hao] Polish the code 39d6475 [Cheng Hao] Add HiveDecimal & HiveVarchar support in unwrap data
…rts (apache#1436) Currently, for the `ALTER DATABASE SET LOCATION` command, Spark throws an exception when the Hive version (e.g., specified via `spark.sql.hive.metastore.version`) is not 3.0/3.1. This PR removes the check so that the command works as long as the Hive version used by the Hive metastore (which may differ from the version used by Spark) supports the alter-database-location feature added via [HIVE-8472](https://issues.apache.org/jira/browse/HIVE-8472). If it does not support it, the same exception will still be thrown from the Spark side.

For the `ALTER DATABASE SET LOCATION` command, Spark currently throws an exception like the following: ``` AnalysisException: Hive 2.3.9 does not support altering database location ``` This is not accurate, since it only considers the client version, while the feature support is on the Hive metastore server side. Therefore, the command should succeed if Spark is using Hive 2.3 while the remote Hive metastore is using Hive 3.1. On the other hand, the command will not succeed if Spark is using Hive 3.1 (thus no exception) but the remote Hive metastore is using 2.3.

Yes, previously Spark users using a Hive client with a version other than 3.0/3.1 were unable to run the `ALTER DATABASE SET LOCATION` command against a Hive metastore 3.x. After this PR it should work. Modified the existing test case. Closes apache#36750 from sunchao/SPARK-29260. Lead-authored-by: Chao Sun <sunchao@apple.com> Co-authored-by: Chao Sun <sunchao@apache.org> Signed-off-by: Yuming Wang <yumwang@ebay.com> Co-authored-by: Chao Sun <sunchao@apache.org>
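For context, the command discussed in that referencing commit is issued through the SQL API. A minimal sketch, assuming a running `SparkSession` backed by a Hive metastore whose server-side version supports HIVE-8472 (3.0+); the database name and HDFS path are hypothetical placeholders:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: requires a Hive-enabled Spark build and a reachable metastore.
val spark = SparkSession.builder()
  .appName("alter-db-location-sketch")
  .enableHiveSupport()
  .getOrCreate()

// `mydb` and the warehouse path below are illustrative, not from the PR.
spark.sql("ALTER DATABASE mydb SET LOCATION 'hdfs://namenode:8020/warehouse/mydb.db'")
```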
No description provided.