sbt / sbt-assembly

Deploy über-JARs. Restart processes. (port of codahale/assembly-sbt)

publishing fat jar to repository leads to nearly empty jar file

jornfranke opened this issue · comments

Hello,

I use sbt-assembly to publish a jar file to our own enterprise repository, because the jars need to be packaged with a PySpark application. This works for most of the jars, but for some of them it always creates a nearly empty jar file containing only the metadata. If I run

sbt +clean +assembly

the correct jar file with everything included is created. However, if I run

sbt +clean +assembly +publish

then locally an empty jar file (containing only META-INF/MANIFEST.MF) is created, and only this empty jar file is uploaded. Again, for other projects this works perfectly fine. What I noticed is that the projects that work cross-compile to Scala 2.12 and Scala 2.11, while the ones that do not work cross-compile only to Scala 2.11. But this might not be related.
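
For reference, the files that publish uploads can be inspected with sbt's built-in packagedArtifacts task; when the assembly jar is registered via addArtifact (see build.sbt below), it should appear in this list:

sbt "show packagedArtifacts"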

Can you please help?

sbt: 1.5.3

Assembly plugin: 1.0.0
Operating system: reproducible on both Linux and Windows.

build.sbt

import sbt._
import Keys._

val rasterframesVersion = "0.9.1"

lazy val root = (project in file("."))
  .settings(
    organization := "org.locationtech.rasterframes",
    name := "rasterframes-uberjar",
    version := rasterframesVersion + sys.env.get("VERSION_TYPE").getOrElse("-SNAPSHOT")
  )
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)

  
fork := true
crossScalaVersions := Seq("2.11.12")
autoScalaLibrary := false
scalacOptions += "-target:jvm-1.8"
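// Publish the assembly jar with an empty classifier; together with the
// assemblyJarName setting below, the fat jar is uploaded under the
// default artifact file name.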
artifact in (Compile, assembly) := {
  val art = (artifact in (Compile, assembly)).value
  art.withClassifier(Some(""))
}
addArtifact(artifact in (Compile, assembly), assembly)
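// Note: addArtifact above wires the assembly jar into packagedArtifacts,
// so publish triggers the assembly task by itself (see the workaround
// at the end of this issue).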
assemblyJarName in assembly := {
  val newName = s"${name.value}_${scalaBinaryVersion.value}-${version.value}.jar"
  newName
}
publishConfiguration := publishConfiguration.value.withOverwrite(true)
publishLocalConfiguration := publishLocalConfiguration.value.withOverwrite(true)
publishTo := {
  if (isSnapshot.value)
    Some("Local Realm" at "http://localhost:8080/maven-snapshots-local/")
  else
    Some("Local Realm" at "http://localhost:8080/maven-releases-local/")
}

credentials += Credentials("Local Realm", "localhost", "", "")
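// includeScala = false leaves the Scala library out of the fat jar;
// the Spark dependencies below are "provided", so the runtime already
// ships its own Scala.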
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) =>
    (xs map { _.toLowerCase }) match {
      case ("manifest.mf" :: Nil) | ("index.list" :: Nil) | ("dependencies" :: Nil) =>
        MergeStrategy.discard
      case ps @ (_ :: _) if ps.last.endsWith(".sf") || ps.last.endsWith(".dsa") || ps.last.endsWith(".rsa") =>
        MergeStrategy.discard
      case "plexus" :: _ =>
        MergeStrategy.discard
      case "services" :: _ =>
        MergeStrategy.filterDistinctLines
      case ("spring.schemas" :: Nil) | ("spring.handlers" :: Nil) =>
        MergeStrategy.filterDistinctLines
      case _ => MergeStrategy.first
    }
  case "application.conf" | "reference.conf" => MergeStrategy.concat
  case _ => MergeStrategy.first
}
// commons-httpclient is very outdated; it is pulled in via the experimental package in rasterframes. Although we do not use it directly, we see errors if it is not included :(
assemblyShadeRules in assembly := Seq(
   ShadeRule.rename("org.apache.commons.httpclient.**" -> "rasterframes.shade.org.apache.commons.httpclient.@1").inAll
)
libraryDependencies += "commons-httpclient" % "commons-httpclient" % "3.1"
// Raster frames
libraryDependencies += "org.locationtech.rasterframes" %% "rasterframes" % rasterframesVersion % "compile"
libraryDependencies += "org.locationtech.rasterframes" %% "rasterframes-datasource" % rasterframesVersion % "compile"
libraryDependencies += "org.locationtech.rasterframes" %% "pyrasterframes" % rasterframesVersion % "compile"
// Spark
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.4" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.4" % "provided"

Thank you.

Best regards

I found a workaround:
Instead of

sbt +clean +assembly +publish

just run

sbt +clean +publish

and the issue disappears.
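
This fits how addArtifact works: it registers the assembly jar in packagedArtifacts, which publish reads, so publish triggers the assembly task on its own and the explicit +assembly step is redundant. Roughly, addArtifact(artifact in (Compile, assembly), assembly) expands to something like the following (a sketch of the sbt API, not the exact library source):

// Register the fat jar as an additional published artifact ...
artifacts += (artifact in (Compile, assembly)).value
// ... and map it to the file produced by the assembly task, so that
// publish (which consumes packagedArtifacts) builds it itself.
packagedArtifacts := packagedArtifacts.value.updated(
  (artifact in (Compile, assembly)).value,
  assembly.value
)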