spark.yarn.archive / spark.yarn.jars

Reference: http://spark.apache.org/docs/latest/running-on-yarn.html#preparations

In Spark on YARN mode, the spark-defaults.conf file under /usr/local/spark-current2.3/conf contains a spark.yarn.archive configuration entry.

1. If spark.yarn.archive is not configured

#spark.yarn.archive         hdfs://ddd/project/spark2.3_test_liuzc/jar

Log output when the job runs:

20/03/25 16:18:10 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
20/03/25 16:18:16 INFO yarn.Client: Uploading resource file:/tmp/spark-8b8a4eb9-365e-403b-9411-abc60c3ea461/__spark_libs__5628865856189033955.zip -> hdfs://ddd/user/pgxl/.sparkStaging/application_1570676918196_86932/__spark_libs__5628865856189033955.zip
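
To avoid this zip-and-upload on every submission, the jars shipped under SPARK_HOME can be packaged once and placed on HDFS in advance. A minimal sketch, assuming SPARK_HOME is /usr/local/spark-current2.3 and reusing the HDFS project directory from this article (the archive name spark-libs.zip is just an example):

# package all runtime jars shipped with the Spark distribution
cd /usr/local/spark-current2.3/jars
zip -q -r /tmp/spark-libs.zip *.jar

# upload the archive once; spark.yarn.archive can then point at it
hdfs dfs -mkdir -p hdfs://ddd/project/spark2.3_test_liuzc
hdfs dfs -put /tmp/spark-libs.zip hdfs://ddd/project/spark2.3_test_liuzc/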

2. If spark.yarn.archive is configured

 spark.yarn.archive               hdfs://ddd/project/spark2.3_test_liuzc/jar

Log output when the job runs:

20/03/25 16:21:36 INFO yarn.Client: Source and destination file systems are the same. Not copying hdfs://ddd/project/spark2.3_test_liuzc/jar
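
The documentation linked at the top also describes spark.yarn.jars as an alternative to spark.yarn.archive: instead of a single archive it lists the individual jars, and HDFS globs are allowed. A hedged example for spark-defaults.conf, assuming the jar directory from this article holds the individual jar files:

# alternative: list the individual jars instead of one archive
spark.yarn.jars               hdfs://ddd/project/spark2.3_test_liuzc/jar/*.jar

Per the Spark documentation, if spark.yarn.archive is set it replaces spark.yarn.jars, and the archive is used in all of the application's containers.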

Summary (personal understanding):

The jars needed on the executor side: if spark.yarn.archive is not configured, Spark zips the jars under the local SPARK_HOME and uploads the archive to a temporary HDFS staging directory on every submission; if it is configured, the jars are read directly from the given HDFS path, which saves some time.
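
The same setting can also be passed per job on the spark-submit command line instead of editing spark-defaults.conf. A brief sketch (the application class and jar names are placeholders):

spark-submit \
  --master yarn \
  --conf spark.yarn.archive=hdfs://ddd/project/spark2.3_test_liuzc/jar \
  --class com.example.MyApp \
  my-app.jar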
