java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200

    groups with modify permissions: Set()
    18/10/07 21:30:59 INFO Utils: Successfully started service 'sparkDriver' on port 53852.
    18/10/07 21:30:59 INFO SparkEnv: Registering MapOutputTracker
    18/10/07 21:30:59 ERROR SparkContext: Error initializing SparkContext.
    java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    	at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:212)
    	at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:194)
    	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:308)
    	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:165)
    	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
    	at org.apache.spark.SparkContext.<init>(SparkContext.scala:420)
    	at cn.itcast.wordcount.WordCount$.main(WordCount.scala:14)
    	at cn.itcast.wordcount.WordCount.main(WordCount.scala)
    18/10/07 21:30:59 INFO SparkContext: Successfully stopped SparkContext
    Exception in thread "main" java.lang.IllegalArgumentException: System memory 259522560 must be at least 471859200. Please increase heap size using the --driver-memory option or spark.driver.memory in Spark configuration.
    	at org.apache.spark.memory.UnifiedMemoryManager$.getMaxMemory(UnifiedMemoryManager.scala:212)
    	at org.apache.spark.memory.UnifiedMemoryManager$.apply(UnifiedMemoryManager.scala:194)
    	at org.apache.spark.SparkEnv$.create(SparkEnv.scala:308)
    	at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:165)
    	at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
    	at org.apache.spark.SparkContext.<init>(SparkContext.scala:420)
    	at cn.itcast.wordcount.WordCount$.main(WordCount.scala:14)
    	at cn.itcast.wordcount.WordCount.main(WordCount.scala)
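The 471859200-byte floor comes from the first frame in the trace, UnifiedMemoryManager.getMaxMemory: Spark 2.x reserves 300 MB of the heap for the system and refuses to start if the driver's maximum heap is below 1.5 times that reservation, i.e. 450 MB. A minimal sketch of the arithmetic (the 300 MB reservation and the 1.5x factor are taken from the Spark 2.x source, not from this log):

```scala
object MemoryFloor {
  def main(args: Array[String]): Unit = {
    // Spark 2.x's UnifiedMemoryManager reserves 300 MB for the system and
    // requires the driver heap to be at least 1.5x that reservation.
    val reservedBytes   = 300L * 1024 * 1024            // 314572800
    val minSystemBytes  = reservedBytes * 3 / 2         // 471859200, as in the error
    println(minSystemBytes)

    // The heap the failing run actually had (Runtime.getRuntime.maxMemory):
    val reportedBytes = 259522560L                      // about 247.5 MB
    println(reportedBytes >= minSystemBytes)            // false -> IllegalArgumentException
  }
}
```

This is why any -Xmx at or above roughly 512 MB makes the check pass.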


This error occurs when running a Spark job locally for testing: the driver JVM's default heap is below the 450 MB (471859200-byte) minimum that Spark checks at startup. Solution:
In the run configuration's VM options, set: -Xms256m -Xmx1024m
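Besides raising the heap with -Xmx, the check can also be satisfied from the code that builds the SparkContext. A minimal sketch, assuming a local-mode WordCount like the one in the stack trace; spark.testing.memory is an internal Spark knob that overrides the detected system memory, so it is only appropriate for local tests, and --driver-memory / -Xmx remain the supported route:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("WordCount")
      .setMaster("local[*]")
      // Internal override of the detected system memory; 512 MB clears the
      // 471859200-byte floor. For real runs, use --driver-memory or -Xmx instead.
      .set("spark.testing.memory", "536870912")
    val sc = new SparkContext(conf)
    // ... word-count logic as in WordCount.scala ...
    sc.stop()
  }
}
```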
(Screenshot: the VM options field in the IDE run configuration.)

Reference blog: https://blog.csdn.net/yizheyouye/article/details/50676022


Reposted from blog.csdn.net/weixin_43322685/article/details/82961748