Sqoop Fails Due to Insufficient HDFS File Permissions

Problem

When using Sqoop to import data from MySQL into HDFS, the job fails with:

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE, inode="/user/root/.staging":hdfs:hdfs:drwxr-xr-x

Solution

1. Check the directory permissions on HDFS. The listing below shows that the owner of /user/root is the hdfs user, so the directory's permissions can be changed by running as hdfs:

[root@hdp1 ~]# hadoop dfs -ls /user/
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.

Found 12 items
drwx------   - accumulo  hdfs            0 2018-04-24 14:55 /user/accumulo
drwxrwx---   - ambari-qa hdfs            0 2018-04-24 15:00 /user/ambari-qa
drwxr-xr-x   - druid     hadoop          0 2018-04-24 14:56 /user/druid
drwxr-xr-x   - hbase     hdfs            0 2018-04-24 14:53 /user/hbase
drwxr-xr-x   - hcat      hdfs            0 2018-04-24 14:56 /user/hcat
drwxr-xr-x   - hdfs      hdfs            0 2018-04-25 18:19 /user/hdfs
drwxrwxrwx   - hive      hdfs            0 2018-05-29 21:26 /user/hive
drwxrwxr-x   - livy      hdfs            0 2018-04-24 14:54 /user/livy
drwxrwxr-x   - oozie     hdfs            0 2018-04-24 14:58 /user/oozie
drwxrwxrwx   - hdfs      hdfs            0 2018-05-29 20:41 /user/root
drwxrwxr-x   - spark     hdfs            0 2018-04-24 14:53 /user/spark
drwxr-xr-x   - zeppelin  hdfs            0 2018-04-24 14:54 /user/zeppelin

2. Change the permissions on the directory:

sudo -u hdfs hadoop fs -chmod -R 777 /user/root/
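Note that `chmod -R 777` opens the directory to every user on the cluster. A narrower fix, assuming the Sqoop job runs as the local `root` user as in the error message, is to hand ownership of the home directory to that user instead; the commands below are a sketch of that alternative, not part of the original post:

```shell
# Make root the owner of its own HDFS home directory
# instead of granting world-writable permissions.
sudo -u hdfs hadoop fs -chown -R root:hdfs /user/root

# Verify the new ownership.
hadoop fs -ls /user/ | grep /user/root
```

Either approach removes the AccessControlException; the chown variant keeps the default drwxr-xr-x mode intact.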

3. Re-run the Sqoop command.
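For context, a minimal Sqoop invocation of the kind that writes to /user/root and triggers this error might look like the following; the host, database, credentials, and table name are placeholders, not values from the original post:

```shell
# Hypothetical import: copies the MySQL table "orders" from database
# "testdb" into HDFS under /user/root/orders with a single mapper.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/testdb \
  --username dbuser \
  -P \
  --table orders \
  --target-dir /user/root/orders \
  -m 1
```

Sqoop stages job files under /user/&lt;username&gt;/.staging before the mappers run, which is why the permission check in the error message points at /user/root/.staging rather than the target directory.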

Reference: https://www.cnblogs.com/hapjin/p/4846853.html

Reposted from blog.csdn.net/wiborgite/article/details/80502781