Scenario 1: Running Spark from IntelliJ IDEA on Windows
Open the conf folder under the Spark installation directory (for example D:\soft\spark\conf),
copy log4j.properties.template and rename the copy to log4j.properties. In that file, change INFO to WARN on the log4j.rootCategory line (the second line of the template); after this, only WARN and ERROR messages are printed. Finally, put log4j.properties directly under src/main/resources/ of your project so that it lands on the classpath. That is all that is needed.
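To verify the setting from IDEA, here is a minimal sketch (the object name is illustrative, and it assumes a standard Maven/SBT layout where src/main/resources ends up on the classpath): with the WARN-level log4j.properties in place, running it in local mode should print only WARN and ERROR output from Spark.

```scala
import org.apache.spark.sql.SparkSession

object LogLevelCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("LogLevelCheck")
      .master("local[*]")          // run inside IDEA without a cluster
      .getOrCreate()

    // Programmatic fallback in case the properties file is not picked up:
    // spark.sparkContext.setLogLevel("WARN")

    println(spark.range(10).count()) // expect "10" without surrounding INFO noise
    spark.stop()
  }
}
```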
Scenario 2: Running Spark in a Linux environment
1. cd into the $SPARK_HOME/conf directory and copy log4j.properties.template to a new file named log4j.properties (the resulting file is shown below; a quick verification from spark-shell follows the listing):
- $ cp log4j.properties.template log4j.properties
- # Set everything to be logged to the console
- log4j.rootCategory=WARN, console
- log4j.appender.console=org.apache.log4j.ConsoleAppender
- log4j.appender.console.target=System.err
- log4j.appender.console.layout=org.apache.log4j.PatternLayout
- log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
- # Settings to quiet third party logs that are too verbose
- log4j.logger.org.spark-project.jetty=WARN
- log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
- log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
- log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
- log4j.logger.org.apache.parquet=ERROR
- log4j.logger.parquet=ERROR
- # SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
- log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
- log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
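As a quick sanity check, the effective log level can be inspected from spark-shell once the new conf/log4j.properties is in place. This is a sketch that assumes Spark still uses the log4j 1.x API, which matches the org.apache.log4j appender classes configured above:

```scala
// Paste into spark-shell after restarting it with the new conf/log4j.properties.
import org.apache.log4j.LogManager

val root = LogManager.getRootLogger
println(root.getEffectiveLevel)   // expect WARN
println(root.isInfoEnabled)       // expect false: INFO messages are suppressed
```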