[SPARK-19209] [WIP] JDBC: Fix "No suitable driver" on the first try #16678
Conversation
srowen left a comment:
Change seems reasonable in any event
    DriverManager.getDriver(url).getClass.getCanonicalName
  }
}
val driverClass = parameters.get(JDBC_DRIVER_CLASS)
Might be too late, but should these be private?
Does it solve the issue to make this a def instead?
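A minimal sketch of what that suggestion could look like, assuming the fields and the "driver" option key from the snippet above; this is illustrative only, not the actual patch:

```scala
import java.sql.DriverManager

// Hypothetical sketch of the suggestion: make driverClass lazy (a def or
// lazy val) so the DriverManager lookup is deferred until first use, by
// which point a user-specified driver may already have been registered.
// The class, constructor, and option key are assumptions mirroring the
// snippet above, not Spark's real JDBCOptions.
class JdbcOptionsSketch(parameters: Map[String, String], url: String) {
  val JDBC_DRIVER_CLASS = "driver"

  lazy val driverClass: String = parameters.getOrElse(
    JDBC_DRIVER_CLASS,
    DriverManager.getDriver(url).getClass.getCanonicalName)
}
```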
spark.read.format("jdbc").option("url", "jdbc:sqlite:").option("dbtable", "x").load
In the above code, JDBC_DRIVER_CLASS is not passed, so this change does not help.
Actually, if @darabos passes JDBC_DRIVER_CLASS as a JDBC option, the problem is solved. Thus, I suspect it is caused by the classloader.
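For reference, a sketch of that workaround: passing the driver class explicitly so Spark registers it itself instead of relying on java.sql.DriverManager discovery. The driver class name below is the usual one from sqlite-jdbc and is an assumption about the environment:

```scala
// Workaround sketch, assuming the sqlite-jdbc jar is on the classpath and
// "org.sqlite.JDBC" is its driver class. The explicit "driver" option
// (JDBC_DRIVER_CLASS) makes Spark register the driver itself rather than
// rely on DriverManager discovery.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:sqlite:")
  .option("dbtable", "x")
  .option("driver", "org.sqlite.JDBC")
  .load()
```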
Test build #71836 has finished for PR 16678 at commit
Thanks for the quick pull request!
Unfortunately it does not. I built the code, then ran the commands from the pull request description in a fresh shell. Have you been unable to reproduce this on your machine? Do you think something's wrong with my environment?
(I used
@darabos Thank you! That confirms my guess. Let me think about it. Thanks! Because you will not enable Hive support if you do not use
What changes were proposed in this pull request?
This PR reverts some of the changes made in #15292.
Based on my understanding, the problem is that the java.sql.DriverManager class cannot access drivers loaded by Spark's ClassLoader. The changes made in this PR do not sound like a solution for the reported issue; it could be caused by other code changes in 2.1 that change the current ClassLoader. @darabos Could you please help us try it in your local environment? Thanks!
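To illustrate the class-loader point, here is a rough sketch (not Spark's actual code): DriverManager only hands out drivers whose class is loadable by the caller's class loader, so a driver loaded through a separate class loader has to be exposed through a thin wrapper whose own class is visible to every caller. A similar wrapper-based registration is what Spark's JDBC data source relies on (DriverRegistry/DriverWrapper); the class names below are hypothetical.

```scala
import java.sql.{Connection, Driver, DriverManager, DriverPropertyInfo}
import java.util.Properties
import java.util.logging.Logger

// Illustration only, not Spark's implementation. DriverManager.getDriver and
// getConnection skip any registered driver whose class cannot be loaded by the
// caller's class loader, which is what happens when the driver jar lives only
// on a separate class loader (e.g. one created for --jars). A wrapper whose
// own class is visible to the caller, delegating to the real driver instance,
// avoids the "No suitable driver" error.
class DelegatingDriver(wrapped: Driver) extends Driver {
  override def connect(url: String, info: Properties): Connection = wrapped.connect(url, info)
  override def acceptsURL(url: String): Boolean = wrapped.acceptsURL(url)
  override def getPropertyInfo(url: String, info: Properties): Array[DriverPropertyInfo] =
    wrapped.getPropertyInfo(url, info)
  override def getMajorVersion: Int = wrapped.getMajorVersion
  override def getMinorVersion: Int = wrapped.getMinorVersion
  override def jdbcCompliant(): Boolean = wrapped.jdbcCompliant()
  override def getParentLogger: Logger = wrapped.getParentLogger
}

object DriverRegistrationSketch {
  // Load the real driver through whichever class loader actually has the jar
  // (here, the thread context class loader), then register the wrapper so any
  // caller can obtain it from DriverManager.
  def register(driverClassName: String): Unit = {
    val cl = Thread.currentThread().getContextClassLoader
    val real = cl.loadClass(driverClassName)
      .getDeclaredConstructor()
      .newInstance()
      .asInstanceOf[Driver]
    DriverManager.registerDriver(new DelegatingDriver(real))
  }
}
```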
Below is the error @darabos got in his environment.
How was this patch tested?
@darabos Could you run a manual test and see whether this change can resolve your issue?