What kind of an issue is this?
Bug report. If you’ve found a bug, please provide a code snippet or test to reproduce it below.
The easier it is to track down the bug, the faster it is solved.
Feature Request. Start by telling us what problem you’re trying to solve.
Often a solution already exists! Don’t send pull requests to implement new features without first getting our support. Sometimes we leave features out on purpose to keep the project small.
Issue description
Description
I use Spark on YARN. A localized jar file is added to the classpath twice when its path goes through a symbolic link.
For example, if /some/__app__.jar is localized after submitting a Spark job and /some is a symbolic link to /parent/some, the final classpath contains both /parent/some/__app__.jar and /some/__app__.jar.
This happens when I pass jars to spark-submit via the --packages or --jars arguments.
I worked around the problem as follows:
I localized the elasticsearch-hadoop jar using --files.
I added ./* to the classpath manually.
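For reference, the workaround invocation looks roughly like this (a sketch only; the jar path, app jar name, and master setting are placeholders, and the extraClassPath values reflect my setup rather than a recommended configuration):

```shell
# Hypothetical spark-submit invocation for the workaround above:
# ship the elasticsearch-hadoop jar with --files instead of --jars/--packages,
# then put the container's working directory (./*) on the classpath manually.
spark-submit \
  --master yarn \
  --files /path/to/elasticsearch-spark-20_2.10-6.1.1.jar \
  --conf spark.driver.extraClassPath='./*' \
  --conf spark.executor.extraClassPath='./*' \
  my-app.jar
```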
I read a similar issue (#579).
I think it would be more robust to normalize the jar paths, e.g. by resolving symbolic links or comparing inodes, before checking for duplicates.
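A minimal sketch of that idea in shell (the temp-directory layout is invented for the demo): resolve every classpath entry to its canonical path before de-duplicating, so the symlinked path and the real path collapse into one entry.

```shell
# Build a jar reachable through two paths: the real one and one via a symlink.
set -eu
tmp=$(mktemp -d)
mkdir -p "$tmp/parent/some"
touch "$tmp/parent/some/__app__.jar"
ln -s "$tmp/parent/some" "$tmp/some"

classpath="$tmp/some/__app__.jar:$tmp/parent/some/__app__.jar"

# Naive de-duplication by string comparison keeps both entries...
naive=$(echo "$classpath" | tr ':' '\n' | sort -u | wc -l)

# ...but canonicalizing each entry first (readlink -f) collapses them into one.
canonical=$(echo "$classpath" | tr ':' '\n' | xargs -n1 readlink -f | sort -u | wc -l)

echo "naive=$naive canonical=$canonical"
```

The same normalization could be done on the JVM side with `File.getCanonicalPath()` or `Path.toRealPath()` before the duplicate-version check.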
Steps to reproduce
The NodeManager's localization directory path contains a symbolic link.
I run spark-submit and pass jars through the --packages or --jars arguments (e.g. --packages org.elasticsearch:elasticsearch-spark-20_2.10:6.1.1).
I get a Multiple ES-Hadoop versions detected in the classpath ... error message, like:
java.lang.Error: Multiple ES-Hadoop versions detected in the classpath; please use only one
jar:file:/some/__app__.jar
jar:file:/parent/some/__app__.jar
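To make the duplication concrete, here is a small sketch (temp paths invented for the demo) showing that the two classpath entries are string-distinct yet point at the same file on disk, which is why a realpath- or inode-based comparison would catch them:

```shell
set -eu
tmp=$(mktemp -d)
mkdir -p "$tmp/parent/some"
touch "$tmp/parent/some/__app__.jar"
ln -s "$tmp/parent/some" "$tmp/some"

a="$tmp/some/__app__.jar"          # path through the symlink
b="$tmp/parent/some/__app__.jar"   # real path

[ "$a" != "$b" ]                                  # the path strings differ...
[ "$(stat -c %i "$a")" = "$(stat -c %i "$b")" ]   # ...but the inode is identical
echo "one file, two classpath entries"
```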
Version Info
OS: CentOS 7
JVM : jdk-1.8.0_65
Hadoop/Spark: Spark 2.2.1
ES-Hadoop : elasticsearch-spark-20_2.10
ES : 6.1.1