Error message:
Exception in thread "main" org.apache.spark.SparkException:
Only one SparkContext may be running in this JVM (see SPARK-2243).
To ignore this error, set spark.driver.allowMultipleContexts = true.
The currently running SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext
Cause:
This error occurs when more than one SparkContext is created in the same JVM. In the snippet below, the JavaStreamingContext constructor internally creates its own SparkContext from the SparkConf, so the explicitly created JavaSparkContext makes two. Removing the JavaSparkContext (or reusing it, as in Option 2) fixes the problem:
SparkConf conf = new SparkConf()
        .setAppName("myapplication")
        .setMaster("local[4]");
JavaSparkContext jsc = new JavaSparkContext(conf);          // first SparkContext
JavaStreamingContext stream = new JavaStreamingContext(conf, Durations.seconds(10)); // creates a second one
There are two ways to fix this:
Option 1: drop the JavaSparkContext and let JavaStreamingContext create the SparkContext itself:
SparkConf conf = new SparkConf()
        .setAppName("myapplication")
        .setMaster("local[4]");
JavaStreamingContext stream = new JavaStreamingContext(conf, Durations.seconds(10));
Option 2: pass the existing JavaSparkContext to the JavaStreamingContext constructor, so both wrap the same underlying SparkContext:
SparkConf conf = new SparkConf()
        .setAppName("myapplication")
        .setMaster("local[4]");
JavaSparkContext jsc = new JavaSparkContext(conf);
JavaStreamingContext stream = new JavaStreamingContext(jsc, Durations.seconds(10));