
[SPARK-46770][K8S][TESTS] Remove legacy docker-for-desktop logic #44796

Closed (merged, 1 commit)

Conversation

@dongjoon-hyun (Member) commented on Jan 19, 2024

What changes were proposed in this pull request?

This PR aims to remove legacy `docker-for-desktop` logic in favor of `docker-desktop`.
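
The legacy logic amounts to treating the pre-2020 context name as an alias for the modern one. A minimal Scala sketch of that kind of mapping, with hypothetical names (this is illustrative, not Spark's actual code):

```scala
// Hypothetical sketch of a legacy deploy-mode alias, as removed by this PR.
// `resolveContext` is an illustrative name, not part of Spark's API.
object DeployModeAlias {
  def resolveContext(deployMode: String): String = deployMode match {
    // Legacy branch: old Docker Desktop context name mapped to the new one.
    case "docker-for-desktop" => "docker-desktop"
    case other                => other
  }
}
```

After the removal, only the literal `docker-desktop` context name would be recognized, which matches what Docker Desktop has reported since 2020.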

Why are the changes needed?

- Docker Desktop switched the underlying node name and context to `docker-desktop` in 2020 (docker/for-win#5089).
- Since Apache Spark 3.2.2, this alias has been hidden from the documentation via SPARK-38272, so it can now be deleted.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Pass the CIs and manually test with Docker Desktop.

```
$ build/sbt -Psparkr -Pkubernetes -Pkubernetes-integration-tests -Dtest.exclude.tags=minikube,local -Dspark.kubernetes.test.deployMode=docker-desktop "kubernetes-integration-tests/test"
...
[info] KubernetesSuite:
[info] - SPARK-42190: Run SparkPi with local[*] (12 seconds, 759 milliseconds)
[info] - Run SparkPi with no resources (13 seconds, 747 milliseconds)
[info] - Run SparkPi with no resources & statefulset allocation (19 seconds, 688 milliseconds)
[info] - Run SparkPi with a very long application name. (12 seconds, 436 milliseconds)
[info] - Use SparkLauncher.NO_RESOURCE (17 seconds, 411 milliseconds)
[info] - Run SparkPi with a master URL without a scheme. (12 seconds, 352 milliseconds)
[info] - Run SparkPi with an argument. (17 seconds, 481 milliseconds)
[info] - Run SparkPi with custom labels, annotations, and environment variables. (12 seconds, 375 milliseconds)
[info] - All pods have the same service account by default (17 seconds, 375 milliseconds)
[info] - Run extraJVMOptions check on driver (9 seconds, 362 milliseconds)
[info] - SPARK-42474: Run extraJVMOptions JVM GC option check - G1GC (12 seconds, 319 milliseconds)
[info] - SPARK-42474: Run extraJVMOptions JVM GC option check - Other GC (9 seconds, 280 milliseconds)
[info] - SPARK-42769: All executor pods have SPARK_DRIVER_POD_IP env variable (12 seconds, 404 milliseconds)
[info] - Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties (18 seconds, 198 milliseconds)
[info] - Run SparkPi with env and mount secrets. (19 seconds, 463 milliseconds)
[info] - Run PySpark on simple pi.py example (18 seconds, 373 milliseconds)
[info] - Run PySpark to test a pyfiles example (14 seconds, 435 milliseconds)
[info] - Run PySpark with memory customization (17 seconds, 334 milliseconds)
[info] - Run in client mode. (5 seconds, 235 milliseconds)
[info] - Start pod creation from template (12 seconds, 447 milliseconds)
[info] - SPARK-38398: Schedule pod creation from template (17 seconds, 351 milliseconds)
[info] - Test basic decommissioning (45 seconds, 365 milliseconds)
[info] - Test basic decommissioning with shuffle cleanup (49 seconds, 679 milliseconds)
[info] - Test decommissioning with dynamic allocation & shuffle cleanups (2 minutes, 52 seconds)
[info] - Test decommissioning timeouts (50 seconds, 379 milliseconds)
[info] - SPARK-37576: Rolling decommissioning (1 minute, 17 seconds)
[info] - Run SparkR on simple dataframe.R example (19 seconds, 453 milliseconds)
[info] YuniKornSuite:
[info] Run completed in 14 minutes, 39 seconds.
[info] Total number of tests run: 27
[info] Suites: completed 2, aborted 0
[info] Tests: succeeded 27, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 1078 s (17:58), completed Jan 19, 2024, 12:12:23 AM
```

Was this patch authored or co-authored using generative AI tooling?

No.

@dongjoon-hyun (Member, Author)

Could you review this K8s test PR, @LuciferYang ?

@LuciferYang (Contributor) left a comment

+1, LGTM

@dongjoon-hyun (Member, Author)

Thank you! I'll merge this since I verified it manually.

@dongjoon-hyun dongjoon-hyun deleted the SPARK-46770 branch January 19, 2024 08:23
szehon-ho pushed a commit to szehon-ho/spark that referenced this pull request Feb 7, 2024
Closes apache#44796 from dongjoon-hyun/SPARK-46770.

Authored-by: Dongjoon Hyun <dhyun@apple.com>
Signed-off-by: Dongjoon Hyun <dhyun@apple.com>