Problems that may occur when Flume collects data into a Hadoop HA cluster, and how to fix them

Copyright notice: This is the blogger's original article; reproduction without permission is prohibited. https://blog.csdn.net/haoxiaoyan/article/details/84304736

Copy the Hadoop cluster's hdfs-site.xml and core-site.xml configuration files into the Flume installation's conf directory, and copy hadoop-hdfs-2.7.1.jar into Flume's lib directory.

ERROR - org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:459)] process failed

java.lang.IllegalArgumentException: java.net.UnknownHostException: ha1

Solution

ha1 is the NameService name of the company's Hadoop cluster. This error occurs because the Hadoop cluster's NameService cannot be resolved, so hdfs-site.xml must be copied into the flume/conf directory.

root@:~# cp /etc/hadoop/conf/hdfs-site.xml /home//flume/conf/
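For reference, the HA-related entries in hdfs-site.xml that let an HDFS client resolve the ha1 nameservice look roughly like the sketch below. The hostnames, ports, and NameNode ids (nn1/nn2) are placeholders; only the nameservice name ha1 is taken from the error above.

```xml
<!-- Sketch of the HA entries Flume needs to resolve the "ha1"
     nameservice. Hostnames/ports are placeholders; use your
     cluster's actual values. -->
<property>
  <name>dfs.nameservices</name>
  <value>ha1</value>
</property>
<property>
  <name>dfs.ha.namenodes.ha1</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.ha1.nn1</name>
  <value>namenode1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.ha1.nn2</name>
  <value>namenode2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.ha1</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

With these entries present, the HDFS client can map hdfs://ha1 to whichever NameNode is currently active.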

 

java.io.IOException: Mkdirs failed to create /test/flume/16-09-19 (exists=false, cwd=file:/data/apache-flume-1.6.0-bin)

Solution

root@:~# cp /etc/hadoop/conf/core-site.xml /home/flume/conf/
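The `cwd=file:/...` in the error shows what went wrong: without core-site.xml, `fs.defaultFS` falls back to the local filesystem (`file:///`), so Flume tries to create the directory locally. A minimal core-site.xml sketch (the nameservice name ha1 is taken from the earlier error; adjust to your cluster):

```xml
<!-- Minimal sketch: point the default filesystem at the HA
     nameservice instead of the local filesystem. -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://ha1</value>
</property>
```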

 

java.io.IOException: No FileSystem for scheme: hdfs

Solution: copy hadoop-hdfs-*.jar into the flume/lib directory.

root@:~# cp /opt/cloudera/parcels/CDH-5.13.3-1.cdh5.13.3.p0.2/lib/hadoop/client/hadoop-hdfs.jar /home/worker/flume/lib/

 

java.lang.NullPointerException: Expected timestamp in the Flume event headers, but it was null

 

The cause is that the Event's headers do not contain a timestamp. Solution: set a1.sinks.k1.hdfs.useLocalTimeStamp = true so the sink uses the local timestamp instead.
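A minimal HDFS sink sketch with this setting (the agent/sink names a1/k1 follow the property in the text; the path and other values are placeholders):

```properties
# The time escapes (%y-%m-%d) in hdfs.path require a timestamp
# header on every event; without one the sink throws the
# NullPointerException above. useLocalTimeStamp makes the sink
# stamp events with the local clock instead.
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://ha1/test/flume/%y-%m-%d
a1.sinks.k1.hdfs.useLocalTimeStamp = true
a1.sinks.k1.hdfs.fileType = DataStream
```

An alternative is to add a timestamp interceptor on the source (interceptor type `timestamp`), which sets the header as each event is received rather than at the sink.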
