
[SPARK-32810][SQL][TESTS][FOLLOWUP][3.0] Check path globbing in JSON/CSV datasources v1 and v2 #29690

Conversation

MaxGekk
Member

@MaxGekk MaxGekk commented Sep 9, 2020

What changes were proposed in this pull request?

In the PR, I propose to move the test `SPARK-32810: CSV and JSON data sources should be able to read files with escaped glob metacharacter in the paths` from `DataFrameReaderWriterSuite` to `CSVSuite` and `JsonSuite`. This allows running the same test in `CSVv1Suite`/`CSVv2Suite` and in `JsonV1Suite`/`JsonV2Suite`.

Why are the changes needed?

To improve test coverage by checking JSON/CSV datasources v1 and v2.

Does this PR introduce any user-facing change?

No

How was this patch tested?

By running affected test suites:

```
$ build/sbt "sql/test:testOnly org.apache.spark.sql.execution.datasources.csv.*"
$ build/sbt "sql/test:testOnly org.apache.spark.sql.execution.datasources.json.*"
```
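The moved test exercises reading files whose paths contain escaped glob metacharacters. As a minimal illustration of the underlying behavior (Spark itself resolves paths through Hadoop's globbing, not Python's; the directory and file names below are hypothetical), Python's stdlib `glob` module shows the same distinction: unescaped metacharacters in a path are interpreted as a pattern, while escaping them makes the path match literally.

```python
import glob
import os
import tempfile

# Hypothetical layout: a directory whose name contains the glob
# metacharacters "[" and "]", the scenario the moved test covers.
tmp = tempfile.mkdtemp()
weird_dir = os.path.join(tmp, "day=[09]")
os.makedirs(weird_dir)
with open(os.path.join(weird_dir, "part-0000.csv"), "w") as f:
    f.write("a,b\n1,2\n")

# Unescaped, "[09]" is a character class matching a single "0" or "9",
# so the literal directory "day=[09]" is never found.
assert glob.glob(os.path.join(tmp, "day=[09]", "*.csv")) == []

# Escaping the metacharacters makes the pattern match the path literally,
# while the trailing "*.csv" still globs over the files inside it.
matches = glob.glob(os.path.join(glob.escape(weird_dir), "*.csv"))
assert len(matches) == 1 and matches[0].endswith("part-0000.csv")
```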

@SparkQA

SparkQA commented Sep 9, 2020

Test build #128437 has finished for PR 29690 at commit 633e05e.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@HyukjinKwon
Member

Merged to branch-3.0.

HyukjinKwon pushed a commit that referenced this pull request Sep 9, 2020
…CSV datasources v1 and v2

### What changes were proposed in this pull request?
In the PR, I propose to move the test `SPARK-32810: CSV and JSON data sources should be able to read files with escaped glob metacharacter in the paths` from `DataFrameReaderWriterSuite` to `CSVSuite` and `JsonSuite`. This allows running the same test in `CSVv1Suite`/`CSVv2Suite` and in `JsonV1Suite`/`JsonV2Suite`.

### Why are the changes needed?
To improve test coverage by checking JSON/CSV datasources v1 and v2.

### Does this PR introduce _any_ user-facing change?
No

### How was this patch tested?
By running affected test suites:
```
$ build/sbt "sql/test:testOnly org.apache.spark.sql.execution.datasources.csv.*"
$ build/sbt "sql/test:testOnly org.apache.spark.sql.execution.datasources.json.*"
```

Closes #29690 from MaxGekk/globbing-paths-when-inferring-schema-dsv2-3.0.

Authored-by: Max Gekk <max.gekk@gmail.com>
Signed-off-by: HyukjinKwon <gurwls223@apache.org>
@HyukjinKwon HyukjinKwon closed this Sep 9, 2020
holdenk pushed a commit to holdenk/spark that referenced this pull request Oct 27, 2020
@MaxGekk MaxGekk deleted the globbing-paths-when-inferring-schema-dsv2-3.0 branch December 11, 2020 20:28