web-dev-qa-db-fra.com

Error initializing SparkContext: A master URL must be set in your configuration.

I used this code.

My error is:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

17/02/03 20:39:24 INFO SparkContext: Running Spark version 2.1.0

17/02/03 20:39:25 WARN NativeCodeLoader: Unable to load native-hadoop 
library for your platform... using builtin-java classes where applicable

17/02/03 20:39:25 WARN SparkConf: Detected deprecated memory fraction 
settings: [spark.storage.memoryFraction]. As of Spark 1.6, execution and  
storage memory management are unified. All memory fractions used in the old 
model are now deprecated and no longer read. If you wish to use the old 
memory management, you may explicitly enable `spark.memory.useLegacyMode` 
(not recommended).

17/02/03 20:39:25 ERROR SparkContext: Error initializing SparkContext.

org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
at PCA$.main(PCA.scala:26)
at PCA.main(PCA.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

17/02/03 20:39:25 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
at PCA$.main(PCA.scala:26)
at PCA.main(PCA.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)

Process finished with exit code 1
8
fakherzad

If you are running standalone Spark, then

val conf = new SparkConf().setMaster("spark://master") //missing 

or you can pass the parameter while submitting the job:

spark-submit --master spark://master

If you are running Spark locally, then

val conf = new SparkConf().setMaster("local[2]") //missing 

or you can pass the parameter while submitting the job:

spark-submit --master local

If you are running Spark on YARN, then

spark-submit --master yarn
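The three cases above can be combined in one minimal sketch. This is only an illustration (Spark 2.1-era Scala API; `local[2]` is an assumed fallback, not from the original post): hard-coding `setMaster()` in code takes precedence over `spark-submit --master`, so one common pattern is to set a local default only when no master was supplied externally:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object PCA {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("PCA")
    // setMaster() in code overrides whatever --master was passed to
    // spark-submit, so only fall back to local mode when nothing was set:
    if (!conf.contains("spark.master")) conf.setMaster("local[2]")
    val sc = new SparkContext(conf)
    try {
      // ... actual job logic goes here ...
    } finally {
      sc.stop()
    }
  }
}
```

With this pattern the same jar runs locally from the IDE and on a cluster via `spark-submit --master spark://... ` or `--master yarn` without code changes.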
7
Hutashan Chandrakar

The error message is quite clear: you have to provide the address of the Spark master node, either through the SparkConf passed to the SparkContext or via spark-submit:

val conf = 
  new SparkConf()
    .setAppName("ClusterScore")
    .setMaster("spark://172.1.1.1:7077") // <--- This is what's missing
    .set("spark.storage.memoryFraction", "1")

val sc = new SparkContext(conf)
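The stack trace also shows the job being launched directly from IntelliJ (`com.intellij.rt.execution.application.AppMain`). When running inside the IDE rather than through `spark-submit`, an alternative to hard-coding the master is to pass it as a JVM system property in the Run Configuration's VM options, since `SparkConf` loads `spark.*` system properties by default (`local[2]` here is just an illustrative value):

```
-Dspark.master=local[2]
```

This keeps the source free of environment-specific settings.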
4
Yuval Itzchakov
 val conf = new SparkConf()
   .setAppName("Your Application Name")
   .setMaster("local")
 val sc = new SparkContext(conf)

It will work...

3
Shyam Gupta