Only one SparkContext should be running in this JVM (see SPARK-2243)

Error message:

scala> import org.apache.spark._
import org.apache.spark._

scala> import org.apache.spark.streaming._
import org.apache.spark.streaming._

scala> val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
20/02/09 15:11:31 WARN ProcfsMetricsGetter: Exception when trying to compute pagesize, as a result reporting of ProcessTree metrics is stopped

conf: org.apache.spark.SparkConf = org.apache.spark.SparkConf@78168463

scala> val ssc = new StreamingContext(conf, Seconds(1))
org.apache.spark.SparkException: Only one SparkContext should be running in this JVM (see SPARK-2243). The currently running SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:887)
org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
<init>(<console>:15)
<init>(<console>:42)
<init>(<console>:44)
.<init>(<console>:48)
.<clinit>(<console>)
.$print$lzycompute(<console>:7)
.$print(<console>:6)
$print(<console>)
sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
java.lang.reflect.Method.invoke(Method.java:498)
scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
  at org.apache.spark.SparkContext$.$anonfun$assertNoOtherContextIsRunning$2(SparkContext.scala:2548)
  at scala.Option.foreach(Option.scala:407)
  at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2545)
  at org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2622)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:89)
  at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:850)
  at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
  ... 51 elided

Cause:

Only one SparkContext can be running in a single JVM at a time, and spark-shell already created one at startup (bound to the variable sc). Constructing a StreamingContext from a fresh SparkConf tries to create a second SparkContext, which triggers this exception.
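You can confirm this inside the shell: sc is already bound to a live SparkContext (the exact object address depends on your session, so your output will differ):

scala> sc
res0: org.apache.spark.SparkContext = org.apache.spark.SparkContext@...

scala> sc.appName
res1: String = Spark shell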

Solution:


Reuse the SparkContext that spark-shell has already created instead of passing a new SparkConf, i.e. change

val ssc = new StreamingContext(conf, Seconds(1))

to

val ssc = new StreamingContext(sc, Seconds(1))
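In a standalone application, where no SparkContext exists yet, the original conf-based constructor is fine. If you want one snippet that works in both settings, SparkContext.getOrCreate returns the running context when one exists and only creates a new one otherwise; a minimal sketch (note that the passed conf is ignored when a context is already running):

import org.apache.spark._
import org.apache.spark.streaming._

val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
// Reuses the shell's SparkContext if one is already running; otherwise creates one from conf
val sc = SparkContext.getOrCreate(conf)
val ssc = new StreamingContext(sc, Seconds(1))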

For example, in spark-shell:

scala> import org.apache.spark._
import org.apache.spark._

scala> import org.apache.spark.streaming._
import org.apache.spark.streaming._

scala> import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.streaming.StreamingContext._

scala> val ssc = new StreamingContext(sc, Seconds(1))
ssc: org.apache.spark.streaming.StreamingContext = org.apache.spark.streaming.StreamingContext@1ef50c3b

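From here the usual NetworkWordCount pipeline runs unchanged. A minimal sketch (localhost:9999 is a placeholder for your data source, e.g. one fed by nc -lk 9999):

// Read lines of text from a TCP socket
val lines = ssc.socketTextStream("localhost", 9999)

// Split each line into words and count them per batch
val words = lines.flatMap(_.split(" "))
val wordCounts = words.map(word => (word, 1)).reduceByKey(_ + _)
wordCounts.print()

ssc.start()             // start receiving and processing data
ssc.awaitTermination()  // block until the streaming job is stopped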

Reposted from blog.csdn.net/daqiang012/article/details/104235685