[SPARK-18169][SQL] Suppress warnings when dropping views on a dropped table #15682
Conversation
Test build #67775 has finished for PR 15682 at commit
Test build #68449 has finished for PR 15682 at commit
Retest this please.
Test build #68453 has finished for PR 15682 at commit
Hmm.
Retest this please.
Test build #68479 has finished for PR 15682 at commit
Retest this please.
Test build #68635 has finished for PR 15682 at commit
Hi, @gatorsmile.
@@ -202,6 +202,7 @@ case class DropTableCommand(
        sparkSession.sharedState.cacheManager.uncacheQuery(
          sparkSession.table(tableName.quotedString))
      } catch {
+       case ae: AnalysisException if ae.getMessage.contains("Table or view not found") =>
Is it NoSuchTableException?
Thank you for the review, @gatorsmile.
It's originally NoSuchTableException, but the Analyzer changes it into this here. So, the catch expression becomes ugly.
May I fix that to throw the original exception? Oops, sorry, we cannot. The current NoSuchTableException extends AnalysisException and has more context, like the database name; AnalysisException has the line number and start position.
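The situation discussed above (the original subtype gets rewrapped by the Analyzer, so only the message survives and the catch must match on a string) can be sketched in plain Scala. The class names mirror Spark's, but the hierarchy below is a standalone, hypothetical mock, not Spark's actual code:

```scala
// Hypothetical stand-ins for Spark's exception hierarchy:
// NoSuchTableException extends AnalysisException, as in Spark.
class AnalysisException(message: String) extends Exception(message)
class NoSuchTableException(db: String, table: String)
  extends AnalysisException(s"Table or view '$table' not found in database '$db'")

def lookUp(exists: Boolean): Unit =
  if (!exists) throw new NoSuchTableException("default", "t1")

// Analyzer-style rewrapping: the subtype is lost, only the message remains.
def analyze(exists: Boolean): Unit =
  try lookUp(exists) catch {
    case e: NoSuchTableException =>
      throw new AnalysisException(s"Table or view not found: ${e.getMessage}")
  }

val caught =
  try { analyze(exists = false); "none" } catch {
    case _: NoSuchTableException => "subtype" // never reached after the rewrap
    case e: AnalysisException
        if e.getMessage.contains("Table or view not found") => "message"
  }
println(caught) // prints "message"
```

This is why the catch clause in the diff matches on `ae.getMessage.contains(...)` rather than on `NoSuchTableException` directly.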
Based on my understanding, when we try to drop a view, we look up and analyze the view and then issue an exception if any involved table has been dropped. We might have a bug here. That means, we will not call
Sorry, but
Ah, you mean the code you gave the pointer to.
Yep.
This case is more complicated. Please try to check whether there is a bug. If yes, please fix it in this PR.
Yep. I'll investigate the behavior here.
Thank you! I think PR #15896 is fixing a related issue in
I see. I'll include that command here, too.
Please follow the ongoing design decision changes in SPARK-18465.
Sure! I'm watching that PR now. I'll wait for a while.
Hi, @gatorsmile.
If the issue (and bugs) around uncacheQuery and uncacheTable is postponed, I'd like to focus on the warning messages in this PR. Is that okay? We can revisit those uncache cases for Apache Spark 2.2. For
We should not suppress the warnings. Instead, we should improve the error message to show that the uncache attempt failed, right?
@dongjoon-hyun @gatorsmile I totally missed this one. Shall we try to get this in 2.1?
The warning message is valid. We might need to issue a better exception here. To resolve the root cause, we might need to add a new API for uncaching a view in which a relevant table has been dropped. I am not confident how to do it in a perfect way. Let me open a JIRA now and cc both of you in the discussion.
Thank you for coming back, @gatorsmile and @hvanhovell!
A JIRA is created: https://issues.apache.org/jira/browse/SPARK-18549
Ah, in fact, that issue includes this one. May I close this PR and the issue SPARK-18169?
Sure, maybe close it now. If we are unable to get it into Spark 2.1, we can continue this work to issue a better error message.
I guess SPARK-18169 will cover the better error message too. That issue will need to revisit all occurrences of
What changes were proposed in this pull request?
Apache Spark 2.x shows an AnalysisException warning message when dropping a view on a dropped table. This does not happen when dropping temporary views, and Spark 1.6.x does not show these warnings either. We had better suppress this warning to be consistent within Spark 2.x and with Spark 1.6.x.
Note that this is a different case from dropping non-existent views. For those cases, Spark raises NoSuchTableException.
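A minimal reproduction of the scenario described above, assuming a Spark 2.x SQL session; the names `t1` and `v1` are illustrative:

```sql
CREATE TABLE t1 (a INT);
CREATE VIEW v1 AS SELECT a FROM t1;
DROP TABLE t1;
-- Dropping the view now logs an AnalysisException warning, because the
-- uncache attempt looks up the already-dropped table t1.
DROP VIEW v1;
```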
How was this patch tested?
Manually, because this is the suppression of a warning message.