Make the operator work for PySpark in spark master #181
Comments
@liyinan926 If this is not high priority for you, you can assign it to me; I will be experimenting with Spark master soon anyway. (1st task only)
@mrow4a Are you still interested in taking this?
@liyinan926 Yes, on it now - I see
It appears that the flag is not taken into account if specified otherwise in the properties file - it seems to launch in cluster mode, but the printout of the driver log is a bit confusing.
@liyinan926 As I understand from #129, the goal is to reuse the existing API, right (to freeze it for the beta release)?
@mrow4a Can you clarify what you mean by
@liyinan926 Python works out of the box without
or reusing existing
What do you think? (Personally, I would just resolve it with type: Python and mention it in the README)
@mrow4a I think we should keep
@liyinan926 You are totally right, I just remembered that some cases pass both --files and --py-files.
We still need to add support for the new config options |
This is very strange with
And of course it does not work: it adds the remote dependency to the Python path as an s3a URI - this needs investigation.
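The symptom above can be illustrated with a small, self-contained sketch (not the operator's actual code; the module name is hypothetical): CPython's import machinery treats sys.path entries as local filesystem paths, so a remote s3a:// URI placed there is silently skipped and anything shipped inside it cannot be imported.

```python
import importlib
import sys

# Hypothetical illustration: sys.path entries are interpreted as local
# filesystem paths, so a remote URI such as "s3a://bucket/deps.zip" is
# never fetched or opened by the importer - it is simply skipped.
sys.path.append("s3a://bucket/deps.zip")

try:
    # "dep_module" is a made-up name standing in for a module shipped
    # inside the remote zip; the import fails because the s3a entry
    # is ignored rather than downloaded.
    importlib.import_module("dep_module")
except ModuleNotFoundError:
    print("remote s3a path on sys.path was ignored by the importer")
```

This is why remote --py-files dependencies have to be downloaded to the driver and executors (or staged locally) before being added to the Python path.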
Strange, to make
otherwise with just
|
The operator currently does not support PySpark, which is available now in the master branch of Spark. The following changes are needed to make the operator support PySpark in the master branch:
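As a rough sketch of where this could land, a PySpark application spec for the operator might look like the following. The field names here (type, mainApplicationFile, deps.pyFiles) are assumptions drawn from the discussion above - type: Python and --py-files-style dependencies - not a confirmed schema.

```yaml
# Hypothetical sketch of a PySpark SparkApplication spec; field names
# are assumptions based on the discussion, not a confirmed schema.
apiVersion: sparkoperator.k8s.io/v1alpha1
kind: SparkApplication
metadata:
  name: pyspark-example
spec:
  type: Python
  mode: cluster
  mainApplicationFile: "local:///opt/spark/examples/src/main/python/pi.py"
  deps:
    pyFiles:
      - "local:///opt/spark/deps/helpers.zip"
```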