[SPARK-21588][SQL] SQLContext.getConf(key, null) should return null #18852

Closed

vinodkc wants to merge 2 commits into master from vinodkc:br_Fix_SPARK-21588

Conversation

vinodkc
Contributor

@vinodkc vinodkc commented Aug 5, 2017

What changes were proposed in this pull request?

SQLContext.getConf(key, null), for a key that is not set in the conf and has no default value defined, throws an NPE. This happens only when the conf entry has a value converter.
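
For example (this mirrors the pattern used in the unit test added below; the key and call are shown for illustration), a call like the following threw an NPE before this change instead of returning null:

    // The key has a registered conf entry with a value converter but no value set.
    // Before the fix: NPE, because the converter is applied to the null default.
    // After the fix: returns null.
    spark.conf.get("spark.sql.thriftServer.incrementalCollect", null)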

Added a null check on defaultValue inside SQLConf.getConfString to avoid calling entry.valueConverter(defaultValue).
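
A minimal sketch of the guarded lookup (illustrative, simplified names only, not the exact Spark internals; the real change adds the null check inside SQLConf.getConfString):

    // Simplified stand-in for SQLConf's registry of typed conf entries.
    case class ConfEntry(key: String, valueConverter: String => Any)

    def getConfString(
        key: String,
        defaultValue: String,
        entries: Map[String, ConfEntry],
        settings: Map[String, String]): String = {
      // Validate the caller-supplied default only when it is non-null; previously a
      // null default reached the entry's converter (e.g. _.toBoolean) and threw an NPE.
      if (defaultValue != null) {
        entries.get(key).foreach(_.valueConverter(defaultValue))
      }
      settings.getOrElse(key, defaultValue)
    }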

How was this patch tested?

Added a unit test

@@ -808,6 +808,12 @@ class SQLQuerySuite extends QueryTest with SharedSQLContext {
      Row("1"))
  }

  test("SPARK-21588 SQLContext.getConf(key, null) should return null") {
Member

Maybe directly test against SQLConf in SQLConfSuite, instead of via SQLContext?

Contributor Author

Thanks for the review comment.
Moved the test to SQLConfSuite.

@SparkQA

SparkQA commented Aug 5, 2017

Test build #80283 has finished for PR 18852 at commit 53b73ed.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@SparkQA

SparkQA commented Aug 5, 2017

Test build #80287 has finished for PR 18852 at commit b4055e1.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

test("SPARK-21588 SQLContext.getConf(key, null) should return null") {
assert(null == spark.conf.get("spark.sql.thriftServer.incrementalCollect", null))
assert("<undefined>" == spark.conf.get(
"spark.sql.thriftServer.incrementalCollect", "<undefined>"))
Member

The test cases need to be improved.

    withSQLConf(SQLConf.SHUFFLE_PARTITIONS.key -> "1") {
      assert("1" == spark.conf.get(SQLConf.SHUFFLE_PARTITIONS.key, null))
      assert("1" == spark.conf.get(SQLConf.SHUFFLE_PARTITIONS.key, "<undefined>"))
    }

    assert(spark.conf.getOption("spark.sql.nonexistent").isEmpty)
    assert(null == spark.conf.get("spark.sql.nonexistent", null))
    assert("<undefined>" == spark.conf.get("spark.sql.nonexistent", "<undefined>"))

@gatorsmile
Member

LGTM except a comment in test cases.

@gatorsmile
Member

LGTM pending Jenkins.

    }

    assert(spark.conf.getOption("spark.sql.nonexistent").isEmpty)
    assert(null == spark.conf.get("spark.sql.nonexistent", null))
Member

Because the key doesn't exist, this doesn't actually test the issue. This line passes without this change too.

Member

The NPE only happens for an existing entry, like the SHUFFLE_PARTITIONS above or the previous spark.sql.thriftServer.incrementalCollect.

Member

This is to improve the test case coverage.

Member

That's good, but with the current test we don't actually test the NPE case, which is the main issue.

Member

I know, but I just do not want to introduce any regression. Thus, I cover both scenarios.

Member

@viirya viirya Aug 6, 2017


Yeah, those tests are good. We should just add another test for the NPE case too. Otherwise, there's no regression test for the change in this PR.

Member

Doesn't line 276 verify the fix? Why do we still need to add another test case?

Member

Aha, right. Sorry, I missed it.

@viirya
Member

viirya commented Aug 6, 2017

LGTM

@SparkQA

SparkQA commented Aug 6, 2017

Test build #80292 has finished for PR 18852 at commit 4295cd3.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

asfgit pushed a commit that referenced this pull request Aug 6, 2017
## What changes were proposed in this pull request?

SQLContext.getConf(key, null), for a key that is not set in the conf and has no default value defined, throws an NPE. This happens only when the conf entry has a value converter.

Added a null check on defaultValue inside SQLConf.getConfString to avoid calling entry.valueConverter(defaultValue).

## How was this patch tested?
Added a unit test

Author: vinodkc <vinod.kc.in@gmail.com>

Closes #18852 from vinodkc/br_Fix_SPARK-21588.

(cherry picked from commit 1ba967b)
Signed-off-by: gatorsmile <gatorsmile@gmail.com>
asfgit pushed a commit that referenced this pull request Aug 6, 2017
@gatorsmile
Member

Thanks! Merging to master/2.2/2.1

@asfgit asfgit closed this in 1ba967b Aug 6, 2017
MatthewRBruce pushed a commit to Shopify/spark that referenced this pull request Jul 31, 2018
jzhuge pushed a commit to jzhuge/spark that referenced this pull request Aug 20, 2018
@vinodkc vinodkc deleted the br_Fix_SPARK-21588 branch May 25, 2021 07:59