1. After starting Spark, running bin/spark-shell prints the following warning:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2. Solutions:
(1) Method 1: point the Linux shared-library search path at Hadoop's native libraries by setting an environment variable. Run:
vim /etc/profile
# add this line, then save:
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native/:$LD_LIBRARY_PATH
source /etc/profile
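The effect of the export above can be sketched as follows. Note that /opt/hadoop is a hypothetical stand-in for $HADOOP_HOME; substitute your actual installation path:

```shell
# /opt/hadoop is an assumed HADOOP_HOME for illustration only
HADOOP_HOME=/opt/hadoop
# Prepend the native directory so the dynamic loader searches it first;
# the existing LD_LIBRARY_PATH (if any) is preserved after it.
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/:${LD_LIBRARY_PATH:-}"
echo "$LD_LIBRARY_PATH"
```

Because the native directory is prepended rather than appended, it takes precedence over any other copies of libhadoop on the search path.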
(2) Method 2: set an environment variable in /etc/profile and reference it in conf/spark-env.sh:
vim /etc/profile
# add this line, then save:
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native/
source /etc/profile
Then edit the conf/spark-env.sh file under the Spark installation directory:
vim conf/spark-env.sh
# add this line:
export LD_LIBRARY_PATH=$JAVA_LIBRARY_PATH
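Putting method 2 together, the relevant fragment of conf/spark-env.sh would look like the sketch below. It assumes JAVA_LIBRARY_PATH was exported in /etc/profile as shown above:

```shell
# conf/spark-env.sh (fragment) -- a sketch, assuming JAVA_LIBRARY_PATH
# already points at $HADOOP_HOME/lib/native/ via /etc/profile.
# spark-env.sh is sourced by Spark's launch scripts, so exports here
# are visible to the JVM that runs spark-shell.
export LD_LIBRARY_PATH=$JAVA_LIBRARY_PATH
```

This keeps the Hadoop-specific path in one place (/etc/profile), so updating the Hadoop installation only requires changing JAVA_LIBRARY_PATH there.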
Restart bin/spark-shell and the warning should no longer appear. You can also verify that Hadoop's native libraries load with hadoop checknative -a.