
ZEPPELIN-2324. Add property zeppelin.spark.enableSupportedVersionCheck for trying new spark version #2197

Closed
wants to merge 2 commits

Conversation

zjffdu
Contributor

@zjffdu zjffdu commented Mar 28, 2017

What is this PR for?

For now, every time I want to try a new Spark version, I have to change the file SparkVersion.java and rebuild. That is not convenient, so I'd like to add the property zeppelin.spark.enableSupportedVersionCheck to the Spark interpreter, so that I can try a new Spark version by setting this property to false. Of course, it is only for Zeppelin developers.
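As a sketch of the intended usage (property name from this PR; the default keeps the check enabled), the setting would be changed in the Spark interpreter properties:

```
# Spark interpreter setting (sketch; default is true)
zeppelin.spark.enableSupportedVersionCheck = false
```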

What type of PR is it?

[Improvement]

Todos

  • - Task

What is the Jira issue?

ZEPPELIN-2324

How should this be tested?

Verify it against Spark master.

Screenshots (if appropriate)

Questions:

  • Do the license files need an update? No
  • Are there breaking changes for older versions? No
  • Does this need documentation? No

@zjffdu
Contributor Author

zjffdu commented Mar 28, 2017

@Leemoonsoo @jongyoul @felixcheung Please help review.

Member

@jongyoul jongyoul left a comment

Can you add a test case for this config?

Member

@felixcheung felixcheung left a comment

This is kind of messy, having to change it in so many places. Perhaps it should be changed in sparkInterpreter.getSparkVersion().isSupportedVersion() instead?

<tr>
<td>zeppelin.spark.unSupportedVersionCheck</td>
<td>true</td>
<td>Don't change it, It is only for zeppelin developer use, not safe or use in production</td>
Member

could you fix the capitalization:
unSupportedVersionCheck
Don't change it, It is only

Member

I think I'd suggest calling it
zeppelin.spark.disableSupportedVersionCheck
or
zeppelin.spark.debugNoSupportedVersionCheck

Member

And then in the description as Do not change - developer only setting, not for production use or similar

Contributor Author

Changed it to zeppelin.spark.enableSupportedVersionCheck and the description to "Do not change - developer only setting, not for production use".

@zjffdu
Contributor Author

zjffdu commented Mar 30, 2017

perhaps it should be changed in sparkInterpreter.getSparkVersion().isSupportedVersion() instead

I thought about it, but it doesn't feel proper to create a new member field in class SparkVersion, since this is an interpreter-level property; even a static field in SparkVersion doesn't seem right to me. So in the new commit I created a new method isUnSupportedSparkVersion in SparkInterpreter to wrap all the logic.
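The wrapper described above could look roughly like the following self-contained sketch. The property key comes from this PR; the class name and method signature here are stand-ins, not the actual Zeppelin code:

```java
import java.util.Properties;

// Hypothetical sketch of the interpreter-level wrapper discussed in this PR.
// Only the property key is taken from the PR; everything else is a stand-in.
public class VersionCheckSketch {
    static final String VERSION_CHECK_KEY = "zeppelin.spark.enableSupportedVersionCheck";

    private final Properties properties;

    public VersionCheckSketch(Properties properties) {
        this.properties = properties;
    }

    // Mirrors the idea of isUnSupportedSparkVersion(): a version is only
    // rejected while the property is left at its default (true).
    boolean isUnsupportedSparkVersion(boolean versionIsSupported) {
        boolean checkEnabled = Boolean.parseBoolean(
            properties.getProperty(VERSION_CHECK_KEY, "true"));
        return checkEnabled && !versionIsSupported;
    }

    public static void main(String[] args) {
        // With the default settings the check is active, so an unsupported
        // version is flagged.
        VersionCheckSketch defaultCheck = new VersionCheckSketch(new Properties());
        System.out.println(defaultCheck.isUnsupportedSparkVersion(false)); // true

        // With the property set to false, even an unsupported version passes.
        Properties disabled = new Properties();
        disabled.setProperty(VERSION_CHECK_KEY, "false");
        VersionCheckSketch noCheck = new VersionCheckSketch(disabled);
        System.out.println(noCheck.isUnsupportedSparkVersion(false)); // false
    }
}
```

Keeping the property lookup in the interpreter (rather than in SparkVersion) matches the reasoning above: the flag is interpreter-level configuration, not part of the version value itself.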

Can you add testcase for this conf?

I didn't find an easy way to test this, so I just verified it manually.

@@ -140,6 +140,11 @@ You can also set other Spark properties which are not listed in the table. For a
<td>true</td>
<td>Import implicits, UDF collection, and sql if set true.</td>
</tr>
<tr>
<td>zeppelin.spark.enableSupportedVersionCheck</td>
<td>true</td>
Member

I don't feel strongly about it, but since this is a debug config and it turns off something important, I feel it is better for it to default to "off", so that disabling the check has to be done explicitly.

I'll let others comment.

Contributor Author

@zjffdu zjffdu Mar 30, 2017

I changed the config name to enable*, so it is true by default. :)

@@ -1153,12 +1155,16 @@ public Object getLastObject() {
return obj;
}

boolean isUnSupportedSparkVersion() {
Member

I'd totally change S -> s in isUnsupportedSparkVersion :)

Contributor Author

fixed

@zjffdu zjffdu changed the title ZEPPELIN-2324. Add property zeppelin.spark.unSupportedVersionCheck for trying new spark version ZEPPELIN-2324. Add property zeppelin.spark.zeppelin.spark.enableSupportedVersionCheck for trying new spark version Mar 31, 2017
@zjffdu zjffdu changed the title ZEPPELIN-2324. Add property zeppelin.spark.zeppelin.spark.enableSupportedVersionCheck for trying new spark version ZEPPELIN-2324. Add property zeppelin.spark.enableSupportedVersionCheck for trying new spark version Mar 31, 2017
Member

@felixcheung felixcheung left a comment

Right, I think it's fine.

@zjffdu
Contributor Author

zjffdu commented Mar 31, 2017

Will merge it if there is no more discussion.
