[SPARK-28640][SQL] Do not show stack trace when default or session catalog is misconfigured #25372
Conversation
Test build #108733 has finished for PR 25372 at commit
Retest this please.
} catch {
  case _: CatalogNotFoundException =>
    logWarning("Session catalog is not defined")
    None
@jzhuge Could you print out more information of CatalogNotFoundException without the full stack trace?
Without that, the previous error message is better for users because it gives the exact reason: Cannot find catalog plugin class for catalog 'session': xxx.
$ bin/spark-shell --conf spark.sql.catalog.session=xxx
scala> spark.sessionState.analyzer.sessionCatalog
19/08/10 18:17:29 ERROR HiveSessionStateBuilder$$anon$1: Cannot load v2 session catalog
org.apache.spark.SparkException: Cannot find catalog plugin class for catalog 'session': xxx
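A minimal sketch of the suggested warning, assuming the PR's catch block is extended to include the exception's message (the surrounding try/load code here is hypothetical, not the PR's actual diff):

```scala
try {
  Some(Catalogs.load("session", conf))
} catch {
  case e: CatalogNotFoundException =>
    // Keep the misconfiguration reason visible, e.g.
    // "Cannot find catalog plugin class for catalog 'session': xxx",
    // but skip the full stack trace.
    logWarning(s"Session catalog is not defined: ${e.getMessage}")
    None
}
```

This would preserve the exact failure reason in the log while still avoiding the repeated stack traces.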
BTW, @jzhuge, are these all the places that should give a warning instead of the stack trace?
With this PR, I still see the previous behavior.
@dongjoon-hyun Thanks for the review. Your command line is not the case I tried to fix in the PR. In your case, the stack trace is helpful.
It seems that the current master has session catalog defined by default, so here is the command line to reproduce my case:
$ bin/spark-shell --master 'local[*]' --conf spark.sql.catalog.session=
...
Spark context available as 'sc' (master = local[*], app id = local-1565588237201).
Spark session available as 'spark'.
...
scala> spark.sessionState.analyzer.sessionCatalog
...
2019-08-11 22:37:24,216 ERROR [main] hive.HiveSessionStateBuilder$$anon$1 (Logging.scala:logError(94)) - Cannot load v2 session catalog
org.apache.spark.SparkException: Cannot find catalog plugin class for catalog 'session':
at org.apache.spark.sql.catalog.v2.Catalogs.load(Catalogs.java:81)
...
res0: Option[org.apache.spark.sql.catalog.v2.CatalogPlugin] = None
Here the stack trace adds no information. And I am concerned that if any rule uses the session catalog, we will see this long stack trace again and again.
Could you fix both cases in this PR? I believe typo cases will be more frequent than the empty configuration cases.
Test build #108926 has finished for PR 25372 at commit
Handle typo in plugin class name for both v2 session catalog and default catalog. Manual test cases
Test build #109317 has finished for PR 25372 at commit
Test build #109364 has finished for PR 25372 at commit
Sorry for being late. Could you resolve the conflicts, @jzhuge?
…talog is misconfigured

LookupCatalog.sessionCatalog logs an error message and the exception stack upon any nonfatal exception. When the session catalog is just misconfigured, this may alarm the user unnecessarily. It should be enough to give a warning and return None.

- bin/spark-shell --conf spark.sql.catalog.session=noclass
- bin/spark-shell --conf spark.sql.default.catalog=def
- bin/spark-shell --conf spark.sql.default.catalog=def --conf spark.sql.catalog.def=noclass
Test build #109692 has finished for PR 25372 at commit
Is this test failure related to the patch?
retest this please
Test build #110052 has finished for PR 25372 at commit
We're closing this PR because it hasn't been updated in a while. If you'd like to revive this PR, please reopen it!
What changes were proposed in this pull request?
LookupCatalog's sessionCatalog and defaultCatalog log an error message and the exception stack upon any nonfatal exception. When either catalog is misconfigured, this may clutter the console and alarm the user unnecessarily. It should be enough to print a warning and return None.

How was this patch tested?

Manual test cases

Start the Spark shell with any of these configurations:

- bin/spark-shell --conf spark.sql.catalog.session=noclass
- bin/spark-shell --conf spark.sql.default.catalog=def
- bin/spark-shell --conf spark.sql.default.catalog=def --conf spark.sql.catalog.def=noclass

Enter spark.sessionState.analyzer.defaultCatalog or spark.sessionState.analyzer.sessionCatalog at the prompt; expect a warning and no stack trace.