
[SPARK-4051] [SQL] [PySpark] Convert Row into dictionary #2896

Closed
wants to merge 1 commit

Conversation

@davies (Contributor) commented Oct 22, 2014

Added a method to Row to turn a row into a dict:

>>> row = Row(a=1)
>>> row.asDict()
{'a': 1}
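
For readers skimming the thread, here is a minimal, hypothetical sketch of how an asDict() method like this can sit on a tuple-backed Row. The __fields__ attribute and the constructor below are illustrative assumptions, not the actual Spark source from commit 8d97366; only the error message mirrors the QA output further down.

    class Row(tuple):
        """Sketch of a tuple-backed row that remembers its field names."""

        def __new__(cls, **kwargs):
            # Sort field names so positional order is deterministic.
            names = sorted(kwargs)
            row = tuple.__new__(cls, [kwargs[n] for n in names])
            row.__fields__ = names  # field names kept alongside the values
            return row

        def asDict(self):
            # Only instances built with field names can be converted; a bare
            # Row class has no values to map from.
            if not hasattr(self, "__fields__"):
                raise TypeError("Cannot convert a Row class into dict")
            return dict(zip(self.__fields__, self))

    row = Row(a=1)
    print(row.asDict())  # {'a': 1}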

@SparkQA commented Oct 22, 2014

QA tests have started for PR 2896 at commit 8d97366.

  • This patch merges cleanly.

@SparkQA commented Oct 22, 2014

QA tests have finished for PR 2896 at commit 8d97366.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds the following public classes (experimental):
    • raise TypeError("Cannot convert a Row class into dict")

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/22044/

@davies changed the title from "[SPARK-4051] [SQL] [PySQL] Convert Row into dictionary" to "[SPARK-4051] [SQL] [PySpark] Convert Row into dictionary" on Oct 23, 2014
@liancheng (Contributor)

LGTM, thanks :)

@JoshRosen (Contributor)

This looks good to me, too. I guess it doesn't make sense for Row to extend dict, since it already extends tuple and uses tuple's indexing, etc., so this seems like a good approach. I'm going to merge this.
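
As a quick illustration of that design choice (a sketch assuming a PySpark build that includes this patch), a Row keeps its tuple behaviour while asDict() gives an explicit, name-keyed view:

    from pyspark.sql import Row

    row = Row(a=1, b=2)

    a, b = row                                 # tuple behaviour: unpacking
    assert row[0] == 1                         # tuple behaviour: positional indexing
    assert row.asDict() == {'a': 1, 'b': 2}    # explicit conversion to a dict

Subclassing dict as well would force Row to carry two competing protocols (integer indexing vs. key lookup), so an explicit conversion method keeps the two access styles separate.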

@asfgit closed this in d60a9d4 on Oct 24, 2014