
Revert "[SPARK-40165][BUILD] Update test plugins to latest versions" #37618

Closed

Conversation

Yikun (Member) commented Aug 22, 2022

What changes were proposed in this pull request?

This reverts commit 3ed382f (#37598).

Why are the changes needed?

The reverted patch causes errors like:

[error] /home/runner/work/spark/spark/sql/hive-thriftserver/src/main/java/org/apache/hive/service/auth/HttpAuthUtils.java:36:1:  error: package org.apache.http.protocol does not exist
[error] import org.apache.http.protocol.BasicHttpContext;
[error]                                ^
[error] /home/runner/work/spark/spark/sql/hive-thriftserver/src/main/java/org/apache/hive/service/auth/HttpAuthUtils.java:156:1:  error: cannot find symbol
[error]     private final HttpContext httpContext;
[error]                   ^  symbol:   class HttpContext
[error]   location: class HttpKerberosClientAction
[error] 3 errors

With the old cache plus the new pom, the build passed. Once the cache became outdated, the real error appeared.

After investigation, I found that 3ed382f should be the first bad commit.

I also found that this job is using the cache:
https://github.com/panbingkun/spark/runs/7938361118?check_suite_focus=true

With old cache + new pom: passed (that's why the original PR's CI passed).

Without any cache + new pom: failed (that's why CI broke).

Does this PR introduce any user-facing change?

No

How was this patch tested?

CI passed; in particular, the Scala 2.13 build passed.

LuciferYang (Contributor):

Can we just revert maven-dependency-plugin?

Yikun (Member, Author) commented Aug 22, 2022

@LuciferYang I'm not sure about that, but once a PR has broken CI, we'd better revert the whole change to recover CI first and then re-submit a new one.

Yikun (Member, Author) commented Aug 22, 2022

At least the Scala 2.13 CI has passed without any cache. Let's wait for the other CI jobs to go green, then merge this first.

LuciferYang (Contributor):

Wait a moment, I reverted only maven-dependency-plugin in https://github.com/LuciferYang/spark/runs/7955190045?check_suite_focus=true

LuciferYang (Contributor):

And it seems strange that local sbt builds can also succeed :(

Yikun (Member, Author) commented Aug 22, 2022

> And it seems strange that local sbt builds can also succeed :(

You may want to clean up org.apache.maven.plugins and org.scalacheck in your local Maven repo.
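For reference, a minimal sketch of that cleanup, assuming the default local repository location `~/.m2/repository` (adjust `M2_REPO` if you use a custom path):

```shell
# Remove cached copies of the affected artifacts so Maven/sbt re-resolve
# them from scratch on the next build.
# M2_REPO is assumed to be Maven's default local repository; override as needed.
M2_REPO="${M2_REPO:-$HOME/.m2/repository}"
rm -rf "$M2_REPO/org/apache/maven/plugins" "$M2_REPO/org/scalacheck"
echo "cleaned: $M2_REPO"
```

After this, re-running the build forces a fresh download of the plugin and scalacheck artifacts, which should reproduce the CI failure locally if the pom change is broken.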

LuciferYang (Contributor):

> And it seems strange that local sbt builds can also succeed :(
>
> You may want to clean up org.apache.maven.plugins and org.scalacheck in your local Maven repo.

Thanks, let me try this.

Yikun (Member, Author) commented Aug 22, 2022

key: build-${{ hashFiles('**/pom.xml', 'project/build.properties', 'build/mvn', 'build/sbt', 'build/sbt-launch-lib.bash', 'build/spark-build-info') }}
restore-keys: |
  build-${{ hashFiles('**/pom.xml', 'project/build.properties', 'build/mvn', 'build/sbt', 'build/sbt-launch-lib.bash', 'build/spark-build-info') }}

Emmm... BTW, to avoid this kind of problem happening again, I think restore-keys should match strictly when restoring the cache (which is equivalent to removing restore-keys; see the official scala-sbt example). Otherwise it will fall back to the most recent build- cache, and once the local repo cache contains a broken dependency, CI will not notice and will give a fake green.

Need some inputs from @HyukjinKwon @dongjoon-hyun WDYT?
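A sketch of what the proposed strict matching could look like in the workflow. The step name, action version, and cache path below are illustrative assumptions, not Spark's exact config; only the `key` expression is taken from the snippet above:

```yaml
# Sketch: drop restore-keys so the cache is restored only on an exact key
# match. A key miss then forces a clean rebuild instead of silently reusing
# a stale "build-" prefixed cache that can mask dependency errors.
- name: Cache Maven local repository   # hypothetical step name
  uses: actions/cache@v3               # version is an assumption
  with:
    path: ~/.m2/repository             # illustrative cache path
    key: build-${{ hashFiles('**/pom.xml', 'project/build.properties', 'build/mvn', 'build/sbt', 'build/sbt-launch-lib.bash', 'build/spark-build-info') }}
    # no restore-keys: any pom change now invalidates the cache entirely
```

The trade-off is slower builds on every pom change, in exchange for CI failures surfacing in the PR that introduces them.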

LuciferYang (Contributor):

> @LuciferYang I'm not sure about that, but once a PR has broken CI, we'd better revert the whole change to recover CI first and then re-submit a new one.

+1, Agree

On the other hand, reverting only maven-dependency-plugin also seems to pass, but I haven't found the root cause yet.

https://github.com/LuciferYang/spark/runs/7955190045?check_suite_focus=true

Yikun (Member, Author) commented Aug 22, 2022

It's time to sleep:

  • Let's merge this PR to recover CI once its CI is all green.
  • We might want to remove restore-keys in a separate PR, so the infra catches similar failures in time.
  • After the restore-keys change lands, re-submit 3ed382f, making sure the cache is missed and not restored; we should then see CI fail and can fix the issue.

Yikun marked this pull request as ready for review August 22, 2022 16:03
LuciferYang (Contributor) left a comment:

+1, LGTM

dongjoon-hyun (Member) left a comment:

+1, LGTM (if CI passes)

dongjoon-hyun (Member):

All tests passed including Scala 2.13. Merged to master.
