Can't run embedded server with kafka-avro-serializer dep
kevomacartney opened this issue · comments
Hello,
I'm trying to write tests for my producer, but I'm not having any luck. I'm simply calling EmbeddedKafka.start(),
but it fails immediately with the exception below:
Exception Stack
A needed class was not found. This could be due to an error in your runpath. Missing class: org/apache/kafka/common/metrics/MetricsContext
java.lang.NoClassDefFoundError: org/apache/kafka/common/metrics/MetricsContext
at net.manub.embeddedkafka.ops.KafkaOps.startKafka(kafkaOps.scala:52)
at net.manub.embeddedkafka.ops.KafkaOps.startKafka$(kafkaOps.scala:26)
at net.manub.embeddedkafka.EmbeddedKafka$.startKafka(EmbeddedKafka.scala:52)
at net.manub.embeddedkafka.ops.RunningKafkaOps.startKafka(kafkaOps.scala:81)
at net.manub.embeddedkafka.ops.RunningKafkaOps.startKafka$(kafkaOps.scala:74)
at net.manub.embeddedkafka.EmbeddedKafka$.startKafka(EmbeddedKafka.scala:52)
at net.manub.embeddedkafka.EmbeddedKafka$.start(EmbeddedKafka.scala:70)
at whoseturn.test.support.kafka.KafkaSupport.beforeAll(KafkaSupport.scala:10)
at whoseturn.test.support.kafka.KafkaSupport.beforeAll$(KafkaSupport.scala:9)
at whoseturn.web.todos.TodoFeedItemProducerSpec.beforeAll(TodoFeedItemProducerSpec.scala:14)
at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
at whoseturn.web.todos.TodoFeedItemProducerSpec.run(TodoFeedItemProducerSpec.scala:14)
at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13(Runner.scala:1320)
at org.scalatest.tools.Runner$.$anonfun$doRunRunRunDaDoRunRun$13$adapted(Runner.scala:1314)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1314)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24(Runner.scala:993)
at org.scalatest.tools.Runner$.$anonfun$runOptionallyWithPassFailReporter$24$adapted(Runner.scala:971)
at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1480)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
at org.scalatest.tools.Runner$.run(Runner.scala:798)
at org.scalatest.tools.Runner.run(Runner.scala)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2or3(ScalaTestRunner.java:40)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:27)
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.metrics.MetricsContext
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 27 more
After some time googling, this looked like a dependency problem, so I removed my dependencies one at a time until the error went away. It turns out I get this error whenever kafka-avro-serializer
is a dependency. I don't really understand why, and I don't know what a possible solution could be. Any suggestions would be appreciated; I'd really like to use both libraries if possible. My build.sbt dependencies are below.
build.sbt
version := "0.1"
scalaVersion := "2.13.3"
resolvers += "Confluent" at "https://packages.confluent.io/maven/"
libraryDependencies ++= List(
"org.typelevel" %% "cats-core" % "2.1.1",
"com.typesafe.akka" %% "akka-stream" % "2.6.8",
"org.typelevel" %% "cats-effect" % "2.1.4",
"com.typesafe.scala-logging" %% "scala-logging" % "3.9.2",
"org.slf4j" % "slf4j-api" % "1.7.5",
"org.slf4j" % "slf4j-simple" % "1.7.5",
"org.apache.commons" % "commons-io" % "1.3.2",
"org.apache.avro" % "avro" % "1.9.2",
"org.apache.kafka" % "kafka-clients" % "2.6.0",
"io.github.embeddedkafka" %% "embedded-kafka" % "2.6.0",
"io.confluent" % "kafka-avro-serializer" % "5.5.1"
)
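To pin down which library pulls in the incompatible kafka-clients, sbt's own dependency reports can help. A sketch of the relevant commands (assuming sbt 1.4+ for `dependencyTree`; older sbt needs the sbt-dependency-graph plugin instead):

```
// In the sbt shell: list dependencies that lost a version conflict.
// A kafka-clients eviction would show up here.
> evicted

// With sbt 1.4+, after adding `addDependencyTreePlugin` to project/plugins.sbt,
// print the full resolved tree to see who pulls in which kafka-clients:
> dependencyTree
```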
Hi @kelvinmac,
I see that you're using kafka-avro-serializer v5.5.1 along with embedded-kafka v2.6.0; maybe v2.5.1 would be more appropriate. In any case, I was wondering whether you're using Confluent Schema Registry to store Avro schemas, in which case I'd recommend using embedded-kafka-schema-registry v5.5.1 instead.
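For reference, an untested sketch of that swap in the build.sbt above (embedded-kafka-schema-registry versions track the Confluent Platform, so 5.5.1 should be built against the same kafka-clients as kafka-avro-serializer 5.5.1):

```scala
// Replace plain embedded-kafka with the schema-registry variant,
// keeping the Confluent serializer at the matching 5.5.1 version.
libraryDependencies ++= List(
  "io.github.embeddedkafka" %% "embedded-kafka-schema-registry" % "5.5.1" % Test,
  "io.confluent"             % "kafka-avro-serializer"           % "5.5.1"
)
```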
BTW, this error is similar to #202; maybe they have something in common.
Hi @kelvinmac @francescopellegrini,
I had a similar issue with kafka-streams-avro-serde and was able to fix it by excluding kafka-clients from it:
libraryDependencies += "io.confluent" % "kafka-streams-avro-serde" % "5.5.1" exclude("org.apache.kafka", "kafka-clients")
Maybe this is a workaround for you as well.
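An untested sketch of the analogous change for the original build.sbt, so that the explicitly declared kafka-clients 2.6.0 is the only copy on the classpath:

```scala
// Exclude the transitive kafka-clients (5.5.1-ccs) that kafka-avro-serializer
// brings in, leaving only the explicitly declared 2.6.0 version.
libraryDependencies ++= List(
  "org.apache.kafka" % "kafka-clients"         % "2.6.0",
  "io.confluent"     % "kafka-avro-serializer" % "5.5.1"
    exclude("org.apache.kafka", "kafka-clients")
)
```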
@phamtrinli out of curiosity, what classes are you using out of kafka-streams-avro-serde? I noticed that almost all of them are annotated as @Unstable.
@francescopellegrini Hmm, that's actually true. AFAICT we only use io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde. I think it's just a convenient wrapper around io.confluent.kafka.serializers.KafkaAvroSerializer / io.confluent.kafka.serializers.KafkaAvroDeserializer.
Yeah, it's the same need I had when I wrote these utils. I wasn't comfortable using unstable classes, so I just wrote my own wrappers (which, by the way, are going to be removed from the library soon).
I fixed this issue by using v5.5.1. As I was also working with the cassandra-all library, I additionally had to add
dependencyOverrides += "com.google.guava" % "guava" % "18.0"
which fixed the problems I was having.