Cleanup Documentation - Spark Core Classes #18
Comments
Thanks for reaching out! You can take this on if you want. Here is what I was thinking; it's mostly just two parts. First, update the existing README and Rust docs so that they roughly match up with the Spark docs that you linked. Second, identify where there are missing parts on the core classes, and we can make a longer issue tracker to start building out those areas. This issue will be a way to create a cleaner roadmap of the work to be completed :) I can also help with anything on this; I have slowly been making a list of gaps as well.
Great! Thanks for the thorough explanation. I'll doubtless have more questions for you as I work on this.
starting now!
@sjrusso8 do you want me to create new issue trackers for every PySpark core class?
Follow-up question: I am seeing unresolved paths in
Let’s update the README with the “done/open” status and add any additional missing sections to the README as well. Then let’s create one issue per core class that should be implemented and can be implemented. There are some things like UDFs that would not be feasible because of how the remote Spark cluster deserializes and runs the UDFs. Things like `toPandas` might be `toPolars` instead. I’m still on the fence about translating an Arrow `RecordBatch` into the Spark `Row` representation when using `collect`. I’m open to suggestions!
Did you refresh the git submodule? That’s probably because of the build step under `/core` that points to the Spark Connect protobuf located in the submodule. Make sure the submodule is checked out at the tag for 3.5.1 and not `main`.
I checked out version 3.5.1. The git submodule is up to date. Taking a look at the build step now.
yep, works now!
quick bump here @sjrusso8
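The submodule fix discussed above can be reproduced end to end. The sketch below builds a throwaway superproject in a temp directory and pins its submodule to a release tag instead of the moving `main` branch; all paths, the `spark` submodule name, and the `v3.5.1` tag name here are stand-ins for illustration, not the actual repo layout.

```shell
set -e
work=$(mktemp -d)

# Create a stand-in "spark" repo with a tagged release commit, then let main move on.
git init -q "$work/spark"
git -C "$work/spark" -c user.email=a@b -c user.name=t commit -q --allow-empty -m "proto for v3.5.1"
git -C "$work/spark" tag v3.5.1
git -C "$work/spark" -c user.email=a@b -c user.name=t commit -q --allow-empty -m "main moves on"

# The superproject vendors it as a submodule (initially at main's tip).
git init -q "$work/client"
git -C "$work/client" -c protocol.file.allow=always submodule add -q "$work/spark" spark
git -C "$work/client" -c user.email=a@b -c user.name=t commit -q -m "add spark submodule"

# The fix: refresh the submodule, then check it out at the release tag.
git -C "$work/client" -c protocol.file.allow=always submodule update --init --recursive
git -C "$work/client/spark" fetch -q --tags
git -C "$work/client/spark" checkout -q v3.5.1

# Confirm the submodule now sits exactly on the tagged commit.
git -C "$work/client/spark" describe --tags   # prints: v3.5.1
```

A build script that compiles protobufs from the submodule will then pick up the release-pinned definitions on the next build.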
Description
The overall documentation needs to be reviewed and matched against the Spark Core Classes and Functions. For instance, the README should accurately reflect which functions and methods are currently implemented compared to the existing Spark API. However, there are probably a few misses that are either currently implemented but marked as `open`, or were accidentally excluded. Might consider adding a few sections for other classes like `StreamingQueryManager`, `DataFrameNaFunctions`, `DataFrameStatFunctions`, etc.