[Fix] Fix spark/flink starter script error on windows #6435
Conversation
The purpose of this commit is to fix the Spark & Flink job submit issue on the Windows platform.
Add more comments after the last change.
    os_type = "Windows";
} else if (SystemUtils.IS_OS_MAC) {
    os_type = "Mac";
} else if (SystemUtils.IS_OS_LINUX) {
switch?
Do you mean changing the code to a "switch ... case" style? If so, I'll modify it as you suggested.
I don't think "switch ... case" is needed here, because SystemUtils exposes many separate boolean properties (e.g. IS_OS_LINUX, IS_OS_WINDOWS, etc.). If SystemUtils returned a single "OS_TYPE" property, we could easily change to a "switch ... case" style.
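To illustrate the trade-off being discussed, the per-OS boolean checks could be collapsed once into a single label, after which a switch becomes natural. The sketch below is hypothetical: it uses `System.getProperty("os.name")` instead of Commons Lang's `SystemUtils` so it stays self-contained, and the `detect` helper is an invented name, not code from this PR.

```java
public class OsType {
    // Hypothetical helper: derive one OS label from the "os.name" property,
    // replacing the chain of SystemUtils.IS_OS_* boolean checks.
    static String detect(String osName) {
        String name = osName.toLowerCase();
        if (name.contains("windows")) {
            return "Windows";
        } else if (name.contains("mac")) {
            return "Mac";
        } else if (name.contains("linux")) {
            return "Linux";
        }
        return "Unknown";
    }

    public static void main(String[] args) {
        // With a single label, a switch over OS types is straightforward.
        switch (detect(System.getProperty("os.name"))) {
            case "Windows":
                System.out.println("%FLINK_HOME%/bin/flink.cmd");
                break;
            case "Linux":
                System.out.println("${FLINK_HOME}/bin/flink");
                break;
            default:
                System.out.println("error");
        }
    }
}
```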
if (local_os_type.toLowerCase().equals("windows")) {
    cmd_flink = "%FLINK_HOME%/bin/flink.cmd";
} else if (local_os_type.toLowerCase().equals("linux")) {
    cmd_flink = "${FLINK_HOME}/bin/flink";
} else if (local_os_type.toLowerCase().equals("unknown")) {
    cmd_flink = "error";
}
ditto
Fixed.
if (local_os_type.toLowerCase().equals("windows")) {
    cmd_spark = "%SPARK_HOME%/bin/spark-submit.cmd";
} else if (local_os_type.toLowerCase().equals("linux")) {
    cmd_spark = "${SPARK_HOME}/bin/spark-submit";
} else if (local_os_type.toLowerCase().equals("unknown")) {
    cmd_spark = "error";
}
ditto
Fixed.
merge Issue6386 fix with "dev" branch
Please add a test in e2e.
Hi, please tell me the full detailed steps for "add a test in e2e", since I'm new to this. Based on my understanding, it seems unnecessary to add new test cases because the current test cases are enough to cover the bug fix. As I mentioned before, we can verify the fix as below: for the conf file, just use the example conf files from the "seatunnel_home/config" dir.
Oh I see. How about adding some UT to verify whether the output command is right? Our UTs are executed on both Windows and Linux.
What does UT mean? User test case?
Unit test. You can refer to Line 37 in ec533ec.
I checked the source code of "TestFlinkParameter.java" and I'm a bit confused: I can't find these classes in the dev branch of the base repo (apache/seatunnel). About this PR, is it also necessary to introduce some new Java classes to finish a simple unit test case?
Yes, just create a new Java file with a new test case. The
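A minimal sketch of what such a unit test could look like, assuming a hypothetical `sparkCommand` helper that mirrors the OS-to-command mapping from this PR (the class and method names are invented for illustration; the real test would go in its own new Java file):

```java
public class StarterCommandTest {
    // Hypothetical helper mirroring the PR's OS-to-command mapping.
    static String sparkCommand(String osType) {
        if (osType.toLowerCase().equals("windows")) {
            return "%SPARK_HOME%/bin/spark-submit.cmd";
        } else if (osType.toLowerCase().equals("linux")) {
            return "${SPARK_HOME}/bin/spark-submit";
        }
        return "error";
    }

    public static void main(String[] args) {
        // These checks only exercise the mapping, so they pass on both
        // Windows and Linux CI runners (run with java -ea to enable asserts).
        assert sparkCommand("Windows").equals("%SPARK_HOME%/bin/spark-submit.cmd");
        assert sparkCommand("Linux").equals("${SPARK_HOME}/bin/spark-submit");
        assert sparkCommand("Unknown").equals("error");
        System.out.println("all checks passed");
    }
}
```

In the actual project this would be written with a test framework such as JUnit rather than a `main` method, but the assertions would be the same.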
Please merge the dev branch update.
Sorry, I made a mistake in that operation and I'm trying to recover it.
I've finished the Java unit test cases and tested them successfully. BTW, I also fixed Windows-specific starter script errors (e.g. start-seatunnel-flink-15-connector-v2.cmd, start-seatunnel-spark-3-connector-v2.cmd), because in the 2.3.4 release version running these scripts triggers a java.lang.ClassNotFoundException.
seatunnel_issue6386-fix- with_java_unit_test
pom.xml
@@ -55,7 +55,7 @@
     <properties>
         <!--todo The classification is too confusing, reclassify by type-->
-        <revision>2.3.5-SNAPSHOT</revision>
+        <revision>2.3.4</revision>
revert?
Ok, I forgot to modify this file, because on my local Windows environment setting "2.3.5-SNAPSHOT" causes the build to fail.
Fixed.
fix incorrect version issue from [pom.xml]
If the merge conflict cannot be resolved, you can resubmit the PR.
I closed this PR and submitted a new PR: |
This PR is to fix seatunnel issue 6386
Does this PR introduce any user-facing change?
No
How was this patch tested?
This patch can be tested via the general Windows-specific Flink/Spark starter scripts.
Flink: start-seatunnel-flink-13(15)-connector-v2.cmd
Spark: start-seatunnel-spark-2(3)-connector-v2.cmd
For the conf file, just use the example conf files from the "seatunnel_home/config" dir.
v2.batch.config.template, v2.streaming.template
Check list