How to fix "INFO hdfs.DataStreamer: Exception in createBlockOutputStream"
The error output looks like this:
[hadoop3@master ~]$ ls
app bin cnWin10.iso data part-r-00000 test_fenfu2000_1w.json tools zookeeper.out
[hadoop3@master ~]$ hdfs dfs -put cnWin10.iso /windows
2018-07-12 20:08:28,148 INFO hdfs.DataStreamer: Exception in createBlockOutputStream blk_1073741981_1157
java.io.IOException: Got error, status=ERROR, status message , ack with firstBadLink as 192.168.145.212:9866
at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:134)
at org.apache.hadoop.hdfs.protocol.datatransfer.DataTransferProtoUtil.checkBlockOpStatus(DataTransferProtoUtil.java:110)
at org.apache.hadoop.hdfs.DataStreamer.createBlockOutputStream(DataStreamer.java:1778)
at org.apache.hadoop.hdfs.DataStreamer.nextBlockOutputStream(DataStreamer.java:1679)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:716)
2018-07-12 20:08:28,149 WARN hdfs.DataStreamer: Abandoning BP-692736207-192.168.145.200-1530848165201:blk_1073741981_1157
2018-07-12 20:08:28,162 WARN hdfs.DataStreamer: Excluding datanode DatanodeInfoWithStorage[192.168.145.212:9866,DS-f9501a7e-acea-4553-bd5d-a10fa36244e7,DISK]
^C2018-07-12 20:08:47,463 INFO fs.FileSystem: Ignoring failure to deleteOnExit for path hdfs://mycluster/windows/cnWin10.iso._COPYING_
put: java.nio.channels.ClosedChannelException
at org.apache.hadoop.hdfs.ExceptionLastSeen.throwException4Close(ExceptionLastSeen.java:73)
at org.apache.hadoop.hdfs.DFSOutputStream.checkClosed(DFSOutputStream.java:153)
at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:105)
at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:57)
at java.io.DataOutputStream.write(DataOutputStream.java:107)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:94)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:66)
at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:127)
at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:485)
at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:407)
at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:342)
at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:277)
at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:262)
at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:331)
at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:303)
at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:257)
at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:285)
at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:269)
at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:228)
at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:289)
at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:119)
at org.apache.hadoop.fs.shell.Command.run(Command.java:176)
at org.apache.hadoop.fs.FsShell.run(FsShell.java:328)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.hadoop.fs.FsShell.main(FsShell.java:391)
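The key line in the log is the `firstBadLink` address (192.168.145.212:9866, a DataNode's data-transfer port): the client could not complete the write pipeline to that DataNode, which usually means a network or firewall problem rather than an HDFS bug. Before changing any firewall settings, you can confirm the connectivity failure with a short TCP probe. This is a minimal sketch; the host and port below are taken from the log above and should be replaced with your own DataNode's address:

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        # create_connection both resolves the host and attempts the connect
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers connection refused, timeouts, and unreachable-host errors
        return False

if __name__ == "__main__":
    # Address taken from the firstBadLink in the error log above
    print(port_open("192.168.145.212", 9866))
```

If this prints `False` from the client machine while the DataNode process is running, the firewall on the DataNode host is the most likely culprit.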
Solution:
Stop the firewall for the current session:
systemctl stop firewalld.service
Disable the firewall permanently (so it stays off after reboot):
systemctl disable firewalld.service
Disable SELinux permanently:
Edit the file /etc/selinux/config
and set "SELINUX=disabled" (this takes effect after a reboot; run setenforce 0 to switch to permissive mode for the current session).
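Disabling the firewall entirely is fine on an isolated test cluster, but on a shared network a safer variant is to open only the DataNode ports on each node instead. A sketch using firewall-cmd, assuming Hadoop 3.x default port numbers and root privileges:

```shell
# Open the DataNode data-transfer port (9866) plus the other Hadoop 3.x
# DataNode defaults (9864 HTTP, 9867 IPC) rather than stopping firewalld.
firewall-cmd --permanent --add-port=9866/tcp
firewall-cmd --permanent --add-port=9864/tcp
firewall-cmd --permanent --add-port=9867/tcp
firewall-cmd --reload
```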
Running the upload again now succeeds:
[hadoop3@master ~]$ hdfs dfs -put cnWin10.iso /windows
[hadoop3@master ~]$