
"Duplicated entry" error when I run assembly twice #13

Closed
mateiz opened this issue Aug 30, 2011 · 2 comments

mateiz commented Aug 30, 2011

I recently set up Spark (http://github.com/mesos/spark) to use the sbt-assembly plugin, and it works great the first time I run sbt assembly, but if I run this again afterwards I get errors like this:

[error] {file:/Users/matei/workspace/spark/}core/*:assembly: java.util.zip.ZipException: duplicate entry: spark/Partitioner.class

If I do sbt clean followed by assembly again though, it works fine. Any idea why this is happening? Am I doing something wrong in my project? Here is my SBT script: https://github.com/mesos/spark/blob/master/project/SparkBuild.scala.

eed3si9n (Member) commented
The way sbt-assembly is implemented, it unzips all the JARs on the classpath and adds their *.class files to the assembled jar, alongside your project's own *.class files. Your build contains this entry:

unmanagedJars in Compile <<= baseDirectory map { base => (base ** "*.jar").classpath }

Could it be picking up other stuff like the previously assembled jar in target/?

mateiz (Author) commented Aug 30, 2011

Thanks, that was the problem! I changed it to only add the JARs in lib.
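For anyone hitting the same error: a change along these lines should work. This is a hypothetical sketch (the actual commit isn't shown here), using the same sbt 0.10-era `<<=` syntax as the setting quoted above, and assuming the extra JARs live under `lib/` at the project root:

```scala
// Only pick up JARs from the lib/ directory, instead of globbing the whole
// project tree with (base ** "*.jar"). The old glob also matched the
// previously assembled jar under target/, which re-added every class and
// caused the "duplicate entry" ZipException on the second run.
unmanagedJars in Compile <<= baseDirectory map { base =>
  ((base / "lib") ** "*.jar").classpath
}
```

The key difference is scoping the `**` glob to `base / "lib"` rather than `base`, so `target/` is never scanned.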

@mateiz mateiz closed this as completed Aug 30, 2011