[SPARK-42273][CONNECT][TESTS] Skip Spark Connect tests if dependencies are not installed#39840

Closed
HyukjinKwon wants to merge 4 commits into apache:master from HyukjinKwon:SPARK-42273

Conversation

@HyukjinKwon
Member

What changes were proposed in this pull request?

This PR proposes to skip the doctests if the dependencies for Spark Connect are not installed.
The skipping logic has to run before the modules are imported, because Python imports the module first when it runs doctests. Otherwise, it fails as below:

```
Traceback (most recent call last):
  File "/usr/local/Cellar/python@3.9/3.9.10/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/Cellar/python@3.9/3.9.10/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/.../spark/python/pyspark/sql/tests/connect/test_connect_basic.py", line 29, in <module>
    from pyspark.sql.connect.client import Retrying
  File "/.../spark/python/pyspark/sql/connect/__init__.py", line 21, in <module>
    from pyspark.sql.connect.dataframe import DataFrame  # noqa: F401
  File "/.../spark/python/pyspark/sql/connect/dataframe.py", line 50, in <module>
    import pyspark.sql.connect.plan as plan
  File "/.../spark/python/pyspark/sql/connect/plan.py", line 26, in <module>
    import pyspark.sql.connect.proto as proto
  File "/.../spark/python/pyspark/sql/connect/proto/__init__.py", line 18, in <module>
    from pyspark.sql.connect.proto.base_pb2_grpc import *
  File "/.../spark/python/pyspark/sql/connect/proto/base_pb2_grpc.py", line 19, in <module>
    import grpc
ModuleNotFoundError: No module named 'grpc'
```
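The guard pattern described above can be sketched as follows. This is a minimal illustration, not the actual PySpark test code, and the names (`have_grpc`, `should_test_connect`, `ConnectBasicTests`) are hypothetical. The key point is that `importlib.util.find_spec` checks for the optional dependency without importing it, so the skip decision is made before any `pyspark.sql.connect` import can raise `ModuleNotFoundError`:

```python
import importlib.util
import unittest

# Detect the optional dependency WITHOUT importing it: find_spec only
# searches for the module's spec, so a missing package cannot raise at
# collection time.
have_grpc = importlib.util.find_spec("grpc") is not None
connect_requirement_message = None if have_grpc else "grpc is not installed"
should_test_connect = connect_requirement_message is None


@unittest.skipIf(not should_test_connect, connect_requirement_message or "")
class ConnectBasicTests(unittest.TestCase):
    def test_uses_grpc(self) -> None:
        # The real import happens lazily, inside the test body, only after
        # the skip decision has already been made for the whole class.
        import grpc  # noqa: F401
```

The same idea applies to doctests: the dependency check must happen at the top of the test runner, before the module under test is loaded, because the doctest machinery imports the module as its first step.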

Why are the changes needed?

To make the tests pass without these optional dependencies.

Does this PR introduce any user-facing change?

No, dev-only.

How was this patch tested?

```
./python/run-tests --module pyspark-connect -p 1
```

@HyukjinKwon
Member Author

cc @zhengruifeng

Member

@dongjoon-hyun dongjoon-hyun left a comment


+1, LGTM. Thank you, @HyukjinKwon !

@HyukjinKwon HyukjinKwon marked this pull request as draft February 2, 2023 00:36
@HyukjinKwon HyukjinKwon marked this pull request as ready for review February 2, 2023 01:32
@HyukjinKwon
Member Author

The last change was just to fix a type hint that doesn't affect the test results.

Merged to master and branch-3.4.

HyukjinKwon added a commit that referenced this pull request Feb 2, 2023
…s are not installed

### What changes were proposed in this pull request?

This PR proposes to skip the doctests if the dependencies for Spark Connect are not installed.
The skipping logic has to run before the modules are imported, because Python imports the module first when it runs doctests. Otherwise, it fails as below:

```
Traceback (most recent call last):
  File "/usr/local/Cellar/python@3.9/3.9.10/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/local/Cellar/python@3.9/3.9.10/Frameworks/Python.framework/Versions/3.9/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/.../spark/python/pyspark/sql/tests/connect/test_connect_basic.py", line 29, in <module>
    from pyspark.sql.connect.client import Retrying
  File "/.../spark/python/pyspark/sql/connect/__init__.py", line 21, in <module>
    from pyspark.sql.connect.dataframe import DataFrame  # noqa: F401
  File "/.../spark/python/pyspark/sql/connect/dataframe.py", line 50, in <module>
    import pyspark.sql.connect.plan as plan
  File "/.../spark/python/pyspark/sql/connect/plan.py", line 26, in <module>
    import pyspark.sql.connect.proto as proto
  File "/.../spark/python/pyspark/sql/connect/proto/__init__.py", line 18, in <module>
    from pyspark.sql.connect.proto.base_pb2_grpc import *
  File "/.../spark/python/pyspark/sql/connect/proto/base_pb2_grpc.py", line 19, in <module>
    import grpc
ModuleNotFoundError: No module named 'grpc'
```

### Why are the changes needed?

To make the tests pass without these optional dependencies.

### Does this PR introduce _any_ user-facing change?

No, dev-only.

### How was this patch tested?

```
./python/run-tests --module pyspark-connect -p 1
```

Closes #39840 from HyukjinKwon/SPARK-42273.

Lead-authored-by: Hyukjin Kwon <gurwls223@apache.org>
Co-authored-by: Hyukjin Kwon <gurwls223@gmail.com>
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
(cherry picked from commit ca20e40)
Signed-off-by: Hyukjin Kwon <gurwls223@apache.org>
snmvaughan pushed a commit to snmvaughan/spark that referenced this pull request Jun 20, 2023
@HyukjinKwon HyukjinKwon deleted the SPARK-42273 branch January 15, 2024 00:48