
[Bug] [engine-server] Failed to detect a valid hadoop home directory #5892

Closed
3 tasks done
SleeperZLX opened this issue Nov 21, 2023 · 4 comments

Comments

@SleeperZLX

Search before asking

  • I had searched in the issues and found no similar issues.

What happened

I followed https://seatunnel.apache.org/docs/2.3.3/contribution/setup to download SeaTunnel and run it on my local machine. I then used the MongoDB connector in seatunnel-examples/seatunnel-engine-examples to read data from one MongoDB collection and sink it into another. However, it failed with the error: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset, even though the setup document does not list Hadoop as a prerequisite. How can I fix this? Do I have to install Hadoop on my machine (Windows 11)?
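
A common workaround for this error on Windows, sketched below (not from the original report; the C:\hadoop location and the wrapper class name are assumptions), is to place winutils.exe under a directory such as C:\hadoop\bin and point hadoop.home.dir (or the HADOOP_HOME environment variable) at C:\hadoop before any Hadoop class is loaded. One way to do that when running the example from IDEA:

    // Hypothetical wrapper class; assumes winutils.exe has been placed in C:\hadoop\bin.
    // hadoop.home.dir must be set before org.apache.hadoop.util.Shell is first loaded,
    // i.e. before the SeaTunnel Zeta engine touches any Hadoop class.
    public class SeaTunnelEngineExampleWindows {
        public static void main(String[] args) throws Exception {
            System.setProperty("hadoop.home.dir", "C:\\hadoop");
            org.apache.seatunnel.example.engine.SeaTunnelEngineExample.main(args);
        }
    }

Adding -Dhadoop.home.dir=C:\hadoop to the VM options of the IDEA run configuration, or setting HADOOP_HOME=C:\hadoop as an environment variable, should have the same effect.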

SeaTunnel Version

2.3.3

SeaTunnel Config

Just downloaded the source code of the SeaTunnel 2.3.3 release from GitHub; no custom config.

Running Command

Just use IDEA 2022.2 and press the "Run" button on the main method of seatunnel-examples\seatunnel-engine-examples\src\main\java\org\apache\seatunnel\example\engine\SeaTunnelEngineExample.java

Error Exception

2023-11-21 21:07:39,433 DEBUG org.apache.hadoop.util.Shell - Failed to find winutils.exe
java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
	at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:549) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:570) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:593) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:690) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:3487) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:3482) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3319) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:227) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.seatunnel.engine.checkpoint.storage.hdfs.HdfsStorage.initStorage(HdfsStorage.java:68) ~[classes/:?]
	at org.apache.seatunnel.engine.checkpoint.storage.hdfs.HdfsStorage.<init>(HdfsStorage.java:57) ~[classes/:?]
	at org.apache.seatunnel.engine.checkpoint.storage.hdfs.common.HdfsFileStorageInstance.getOrCreateStorage(HdfsFileStorageInstance.java:53) ~[classes/:?]
	at org.apache.seatunnel.engine.checkpoint.storage.hdfs.HdfsStorageFactory.create(HdfsStorageFactory.java:75) ~[classes/:?]
	at org.apache.seatunnel.engine.server.checkpoint.CheckpointManager.<init>(CheckpointManager.java:103) ~[classes/:?]
	at org.apache.seatunnel.engine.server.master.JobMaster.initCheckPointManager(JobMaster.java:251) ~[classes/:?]
	at org.apache.seatunnel.engine.server.master.JobMaster.init(JobMaster.java:234) ~[classes/:?]
	at org.apache.seatunnel.engine.server.CoordinatorService.lambda$submitJob$5(CoordinatorService.java:461) ~[classes/:?]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_341]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_341]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_341]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_341]
	at java.lang.Thread.run(Thread.java:750) [?:1.8.0_341]
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
	at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:469) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:440) ~[hadoop-common-3.1.4.jar:?]
	at org.apache.hadoop.util.Shell.<clinit>(Shell.java:517) ~[hadoop-common-3.1.4.jar:?]
	... 19 more
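
Judging from the trace, the Zeta engine's CheckpointManager initializes HdfsStorage, whose FileSystem.get call triggers the static initializer of org.apache.hadoop.util.Shell; on Windows that initializer probes for %HADOOP_HOME%\bin\winutils.exe even though no real HDFS cluster is involved here. A minimal sketch of the same trigger in isolation (class name illustrative, assuming only hadoop-common is on the classpath):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;

    // Illustrative sketch: merely obtaining a Hadoop FileSystem loads
    // org.apache.hadoop.util.Shell, whose static initializer looks for
    // %HADOOP_HOME%\bin\winutils.exe on Windows and logs the
    // FileNotFoundException shown above when neither HADOOP_HOME nor
    // hadoop.home.dir is set.
    public class WinutilsProbe {
        public static void main(String[] args) throws Exception {
            FileSystem fs = FileSystem.get(new Configuration());
            System.out.println("default filesystem: " + fs.getUri());
        }
    }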

Zeta or Flink or Spark Version

No response

Java or Scala Version

Java: 1.8.0_341
Scala: 2.11.12

Screenshots

No response

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Code of Conduct

  • I agree to follow this project's Code of Conduct

SleeperZLX added the bug label Nov 21, 2023

This issue has been automatically marked as stale because it has not had recent activity for 30 days. It will be closed in next 7 days if no further activity occurs.

github-actions bot added the stale label Dec 22, 2023

This issue has been closed because it has not received a response for too long. You could reopen it if you encounter similar problems in the future.

@Hu-WF

Hu-WF commented Jan 9, 2024

Same problem:
(screenshot attached)

@FengSq

FengSq commented Feb 29, 2024

So, has this problem been fixed?
