I recently set up Spark (http://github.com/mesos/spark) to use the sbt-assembly plugin, and it works great the first time I run `sbt assembly`, but if I run it again afterwards I get errors like this:
The way sbt-assembly is implemented, it unzips all the jars on the classpath and adds their *.class files to the assembled jar, in addition to your project's own *.class files. You have the following entry there:
```scala
unmanagedJars in Compile <<= baseDirectory map { base => (base ** "*.jar").classpath }
```
Could it be picking up other stuff like the previously assembled jar in target/?
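If that glob is the culprit, one possible fix is to narrow the pattern so it no longer descends into `target/`. This is just a sketch: it assumes your hand-managed jars live under `lib/`, which may not match your layout.

```scala
// Sketch of a narrower glob (assumes unmanaged jars live in lib/):
// only jars under lib/ are picked up, so the previously assembled jar
// in target/ is never merged back into the next assembly run.
unmanagedJars in Compile <<= baseDirectory map { base =>
  ((base / "lib") ** "*.jar").classpath
}
```

Alternatively, sbt's `PathFinder` supports excluding subtrees, so you could keep the broad glob and subtract `target/` from it with `---`.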
If I do `sbt clean` followed by `assembly` again, though, it works fine. Any idea why this is happening? Am I doing something wrong in my project? Here is my SBT script: https://github.com/mesos/spark/blob/master/project/SparkBuild.scala.