
[SPARK-17625] [SQL] set expectedOutputAttributes when converting SimpleCatalogRelation to LogicalRelation #15182

Closed
wants to merge 2 commits

Conversation

@wzhfy (Contributor) commented Sep 21, 2016

What changes were proposed in this pull request?

When converting a SimpleCatalogRelation to a LogicalRelation, we also need to set expectedOutputAttributes so that the output of the LogicalRelation is exactly the attributes given in expectedOutputAttributes. Otherwise, the LogicalRelation generates new attributes (with new exprIds) from its schema, and its output differs from the output of the original SimpleCatalogRelation.

As a result, if we have a table in the InMemoryCatalog, we cannot look it up (obtaining a SimpleCatalogRelation) and then use that relation to compose LogicalPlans.
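
For context, the gist of the fix can be sketched as follows. This is a simplified illustration rather than the exact patch; identifiers such as `dataSource` and `s` are placeholder names.

    // Sketch only (placeholder names): when the analyzer replaces a
    // SimpleCatalogRelation `s` with a LogicalRelation, carry the existing
    // output attributes through so the new node keeps the same exprIds
    // instead of deriving fresh attributes from its schema.
    val converted = LogicalRelation(
      dataSource.resolveRelation(),
      expectedOutputAttributes = Some(s.output))

With the attributes carried over, an expression resolved against the original SimpleCatalogRelation still binds correctly when it is analyzed against the converted LogicalRelation.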

How was this patch tested?

add a test case

@wzhfy changed the title from "[SPARK-17625] set expectedOutputAttributes when converting SimpleCatalogRelation to LogicalRelation" to "[SPARK-17625] [SQL] set expectedOutputAttributes when converting SimpleCatalogRelation to LogicalRelation" on Sep 21, 2016
@wzhfy (Contributor, Author) commented Sep 21, 2016

@cloud-fan @gatorsmile

@wzhfy (Contributor, Author) commented Sep 21, 2016

Maybe the test case is not in the right place; where should I put it?

@SparkQA commented Sep 21, 2016

Test build #65725 has finished for PR 15182 at commit 3e7beb8.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

    val plan = Dataset.ofRows(spark, Project(Seq(expr), relation))
    plan.queryExecution.assertAnalyzed()
      }
    }
A Member commented on the test code above:

Your test case can be simplified to

    val tableName = "tbl"
    withTable(tableName) {
      spark.range(10).select('id as 'i, 'id as 'j).write.saveAsTable(tableName)
      val relation = spark.sessionState.catalog.lookupRelation(TableIdentifier(tableName))
      val expr = relation.resolve("i")
      Dataset.ofRows(spark, Project(Seq(expr), relation))
    }

@gatorsmile (Member) commented

The fix looks ok to me, but you need to improve your PR description. It might not be easy for reviewers to understand the fix from the current description.

I do not know which test suite is the best place, but SQLQuerySuite is definitely not right, since your test case is not related to SQL. Let @cloud-fan make the decision. :)

@cloud-fan (Contributor) commented

Let's put it in DataFrameSuite

@wzhfy (Contributor, Author) commented Sep 22, 2016

@cloud-fan Ok.

@gatorsmile (Member) commented

LGTM pending Jenkins

@cloud-fan (Contributor) commented

LGTM

@SparkQA commented Sep 22, 2016

Test build #65756 has finished for PR 15182 at commit e2c3b9d.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@cloud-fan (Contributor) commented

thanks, merging to master!
