potix2 / spark-google-spreadsheets

Google Spreadsheets datasource for SparkSQL and DataFrames

Outdated Jackson?

antonkulaga opened this issue

When I load it into my Zeppelin notebook (Spark 2.4.3 based) with:

%spark.dep
// Google Spreadsheets connector
z.load("com.github.potix2:spark-google-spreadsheets_2.11:0.6.2")
z.load("org.apache.commons:commons-lang3:3.8.1")

%spark
import org.apache.spark.sql.SQLContext

val sqlContext = new SQLContext(sc)

// Create a DataFrame from the specified worksheet
val df = sqlContext.read.
    format("com.github.potix2.spark.google.spreadsheets").
    option("serviceAccountId", "antonkulaga@gmail.com").
    option("credentialPath", "file:///data/keyscross-species-0dc27a2c3c1c.p12").
    load("1uIOfsrmh_HKpDRZ_ajlwMafO451leuOD0aoHQwn2ljw/samples")

I get:

java.lang.NoSuchMethodError: com.fasterxml.jackson.core.JsonFactory.requiresPropertyOrdering()Z
    at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:537)
    at com.fasterxml.jackson.databind.ObjectMapper.<init>(ObjectMapper.java:448)
    at org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:48)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:200)
    at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:194)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:130)
    at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:194)
    at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:102)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:514)
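JsonFactory.requiresPropertyOrdering() was only added in jackson-core 2.3, and Spark 2.4.3 itself bundles Jackson 2.6.x, so this NoSuchMethodError suggests an older jackson-core, pulled in transitively by the connector, is shadowing Spark's copy on the interpreter classpath. Note that the trace fails inside SparkContext construction (MetricsServlet), before any spreadsheet code runs, which fits a classpath clash introduced by %spark.dep. One possible workaround, untested and sketched from the exclusion syntax in Zeppelin's dependency-loader docs: load the connector but exclude its transitive Jackson artifacts so Spark's bundled version wins.

%spark.dep
z.reset()
// Untested sketch: exclude any transitive com.fasterxml.jackson.core
// artifacts so Spark's own Jackson 2.6.x stays in effect.
z.load("com.github.potix2:spark-google-spreadsheets_2.11:0.6.2").
  exclude("com.fasterxml.jackson.core:*")
z.load("org.apache.commons:commons-lang3:3.8.1")

(%spark.dep paragraphs only take effect if they run before the Spark interpreter starts, so restart the interpreter before retrying.)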