
[ZEPPELIN-1258] Add Spark packages support to Livy interpreter #1255

Closed
mfelgamal wants to merge 5 commits

Conversation

mfelgamal (Contributor) commented Jul 31, 2016

What is this PR for?

Add extra libraries to the Livy interpreter that are not available by default.

What type of PR is it?

[ Improvement ]

Todos

  • [ ] Test case

What is the Jira issue?

  • ZEPPELIN-1258

How should this be tested?

  • Create a new Livy interpreter or modify the default one.
  • Set livy.spark.jars.packages to a list of Maven coordinates for the jars to load. Each coordinate should be in the format groupId:artifactId:version. A minimal verification sketch follows below.
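
A minimal sketch of such a verification paragraph, assuming the interpreter property was set to an illustrative coordinate such as com.databricks:spark-csv_2.10:1.4.0 (the package, version, and file path below are examples only, not part of this PR):

```scala
// Zeppelin paragraph run with the Livy interpreter (i.e. a %livy paragraph).
// Sketch only: assumes the interpreter setting contains, for example,
//   livy.spark.jars.packages = com.databricks:spark-csv_2.10:1.4.0
// (an illustrative coordinate; any groupId:artifactId:version should work).

// If the package was resolved for the session, its data source is usable;
// sqlContext is provided by the Livy session.
val df = sqlContext.read
  .format("com.databricks.spark.csv")   // data source from the example package
  .option("header", "true")
  .load("/tmp/sample.csv")               // placeholder path; point it at a real file
df.printSchema()
```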

Screenshots (if appropriate)


Questions:

  • Do the license files need to be updated? No.
  • Are there breaking changes for older versions? No.
  • Does this need documentation? Yes.

zjffdu (Contributor) commented Jul 31, 2016

I would not recommend users add extra libraries this way, because most of the time Zeppelin users don't know cluster details such as which extra libraries exist or where they are located on the cluster machines. I would suggest that Livy support dynamic library loading, as the native Spark interpreter does.

jimdowling commented Aug 1, 2016

@zjffdu - Livy doesn't support dynamic library loading right now, so I don't think there's a ZeppelinContext available in Livy. Would this be acceptable until dynamic library loading is added to Livy and a ZeppelinContext becomes available? As of now, Livy is not much use if you cannot add libraries to your Spark app.

zjffdu (Contributor) commented Aug 1, 2016

@jimdowling Right, there's no ZeppelinContext in Livy right now. Would allowing the user to specify jars/packages when creating the Livy session be more general? (like --jars/--packages in spark-submit)

jimdowling commented

Do you mean modifying the UI on notebook creation, or using a param when selecting the interpreter, like %livy --jars [path]?

zjffdu (Contributor) commented Aug 2, 2016

No, I mean adding an interpreter property that allows the user to specify jars/packages.

mfelgamal (Contributor, Author) commented

I think that Livy itself doesn't support the --jars/--packages options that spark-submit provides.

zjffdu (Contributor) commented Aug 2, 2016

You can use --conf spark.jars=<jars> and --conf spark.jars.packages=<packages> instead.

mfelgamal (Contributor, Author) commented

@zjffdu I added the spark.jars.packages property in the recent commit, and it works correctly.
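
A minimal sketch of how this can be checked from inside a Livy paragraph, assuming a session started with the new property (the exact coordinate doesn't matter for the check):

```scala
// Zeppelin paragraph run with the Livy interpreter (a %livy paragraph).
// Sketch: confirm that livy.spark.jars.packages reached the Spark session.
// Assumes the property was set to some groupId:artifactId:version coordinate
// before the session was created; sc is provided by the Livy session.
val pkgs = sc.getConf.getOption("spark.jars.packages")
println(pkgs.getOrElse("spark.jars.packages is not set"))
```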

mfelgamal (Contributor, Author) commented

@zjffdu do you have any further comments?

<table class="table-configuration">
<tr>
<th>Property</th>
<th>Default</th>
felixcheung (Member) commented on this snippet:

this wouldn't be the "Default" right?

mfelgamal (Contributor, Author) replied:

True, it's just an example.

felixcheung (Member) replied:

Let's change it to Example?

mfelgamal (Contributor, Author) replied:

@felixcheung done.

felixcheung (Member) commented

Could you update the title of this PR and JIRA ZEPPELIN-1258, if we are changing how we would approach this?

mfelgamal (Contributor, Author) commented

@felixcheung I suggest the title "Adding packages to livy interpreter". Is it OK?

felixcheung (Member) commented

how about "Add Spark packages support to Livy interpreter"?

@mfelgamal mfelgamal changed the title Adding extra libraries to livy Add Spark packages support to Livy interpreter Aug 7, 2016
mfelgamal (Contributor, Author) commented

@felixcheung done.

@mfelgamal mfelgamal changed the title Add Spark packages support to Livy interpreter [ZEPPELIN-1258] Add Spark packages support to Livy interpreter Aug 9, 2016
mfelgamal (Contributor, Author) commented

@felixcheung do you have any further comments?

felixcheung (Member) commented

There is a test failure?

mfelgamal (Contributor, Author) commented

@felixcheung: This PR does not change the code; it is just a change in the configuration, so it should not affect the tests.

felixcheung (Member) commented

That's a fair point.

I'll merge if there is no more discussion.

@asfgit asfgit closed this in b619699 Aug 15, 2016