
Spark 2.1.0: incompatible Jackson version 2.7.6

I'm trying to run a simple Spark example in IntelliJ, but I get the following error:

Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
at org.apache.spark.SparkContext.textFile(SparkContext.scala:819)
at spark.test$.main(test.scala:19)
at spark.test.main(test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.7.6
at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:730)
at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)

I tried updating my Jackson dependency, but it doesn't seem to work. I did this:

libraryDependencies += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"

but the same error messages still appear. Can anyone help me fix this error?

Here is the Spark example code:

import org.apache.spark.{SparkConf, SparkContext}

object test {
    def main(args: Array[String]): Unit = {
        if (args.length < 1) {
            System.err.println("Usage: <file>")
            System.exit(1)
        }

        val conf = new SparkConf()
        val sc = new SparkContext("local", "wordcount", conf)
        val line = sc.textFile(args(0))

        line.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _).collect().foreach(println)

        sc.stop()
    }
}

And here is my build.sbt:

name := "testSpark2"

version := "1.0"

scalaVersion := "2.11.8"


libraryDependencies += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-repl_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume_2.10" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-network-shuffle_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-hive_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume-assembly_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mesos_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-catalyst_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-launcher_2.11" % "2.1.0"
29
Yang

Spark 2.1.0 contains com.fasterxml.jackson.core as a transitive dependency, so we do not need to include it in libraryDependencies.
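To see which Jackson version Spark actually pulls in, you can inspect the dependency tree (a sketch: `dependencyTree` is built into sbt 1.4+, while older sbt versions need the sbt-dependency-graph plugin):

```shell
# Print the dependency tree and filter for Jackson artifacts
# (assumes dependencyTree is available: built into sbt 1.4+,
#  older sbt needs the sbt-dependency-graph plugin)
sbt dependencyTree | grep jackson

# "sbt evicted" also lists version conflicts that sbt resolved
sbt evicted
```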

But if you want to use a different version of the com.fasterxml.jackson.core dependencies, you have to override them, like this:

name := "testSpark2"

version := "1.0"

scalaVersion := "2.11.8"


dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.8.7"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.8.7"
dependencyOverrides += "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.8.7"

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-repl_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-network-shuffle_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-hive_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-flume-assembly_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-mesos_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-catalyst_2.11" % "2.1.0"
libraryDependencies += "org.apache.spark" % "spark-launcher_2.11" % "2.1.0"

So, change your build.sbt like the one above and it will work as expected.

Hope this helps!

43
himanshuIIITian

FYI: in my case, I use Spark and kafka-streams in the same application, and kafka-streams pulls in com.fasterxml.jackson.core 2.8.5. Adding an exclude as below solved the problem

(Gradle)

compile (group: "org.apache.kafka", name: "kafka-streams", version: "0.11.0.0"){
    exclude group:"com.fasterxml.jackson.core"
}
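If your build uses sbt instead of Gradle, the same exclusion can be expressed on the module ID (a sketch, reusing the kafka-streams coordinates from the Gradle example above; `excludeAll` with an `ExclusionRule` drops every artifact in the group, mirroring Gradle's group-level exclude):

```scala
// sbt equivalent of the Gradle exclude above: drop Kafka's transitive
// com.fasterxml.jackson.core artifacts so Spark's Jackson version wins
libraryDependencies += ("org.apache.kafka" % "kafka-streams" % "0.11.0.0")
  .excludeAll(ExclusionRule(organization = "com.fasterxml.jackson.core"))
```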
6
Leon

Solution with Gradle, using a resolutionStrategy ( https://docs.gradle.org/current/dsl/org.gradle.api.artifacts.ResolutionStrategy.html ):

configurations {
    all {
        resolutionStrategy {
            force 'com.fasterxml.jackson.core:jackson-core:2.4.4',
                  'com.fasterxml.jackson.core:jackson-databind:2.4.4',
                  'com.fasterxml.jackson.core:jackson-annotations:2.4.4'
        }
    }
}
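To confirm the forced versions actually took effect, you can dump the resolved dependency graph (a sketch: the configuration name depends on your Gradle version and plugins; `runtimeClasspath` applies to recent Gradle, older builds may use `runtime`):

```shell
# List the resolved dependencies for the runtime configuration
# and filter for Jackson artifacts to check the forced versions won
./gradlew dependencies --configuration runtimeClasspath | grep jackson
```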
1
Thomas Decaux