Copyright notice: this is the blogger's original article; reproduction without the blogger's permission is prohibited. https://blog.csdn.net/EVISWANG/article/details/78962760
一. Problem Description
Running `sqoop import` to pull data from Oracle on one node of the Hadoop cluster kept failing with the following error:
17/12/28 19:16:48 INFO mapreduce.Job: Task Id : attempt_1512535288105_6822_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLException: Io exception: The Network Adapter could not establish the connection
at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:726)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.RuntimeException: java.sql.SQLException: Io exception: The Network Adapter could not establish the connection
at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:165)
... 9 more
Caused by: java.sql.SQLException: Io exception: The Network Adapter could not establish the connection
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:112)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:146)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:255)
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:387)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:414)
at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:165)
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:35)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:801)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:213)
... 10 more
二. Problem Analysis
The first suspect was the Oracle listener service, but everything there checked out, and a standalone connectivity test also succeeded:
[root@SH-hadoop2020 tmp]#
sqoop list-tables --connect jdbc:oracle:thin:@ora-s.chxxa.com:1521:prcl --username xxxsuser --password xxxx
Warning: /usr/local/sqoop-1.4.5-cdh5.3.10/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop-1.4.5-cdh5.3.10/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/12/28 20:40:11 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.10
17/12/28 20:40:11 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/12/28 20:40:11 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
17/12/28 20:40:11 INFO manager.SqlManager: Using default fetchSize of 1000
17/12/28 20:40:11 INFO manager.OracleManager: Time zone has been set to GMT
P2P_DEBT_CONTRACTSIGN
P2P_DEBT_CONTRACTSIGN_APP
PPP
P2P_INVESTMENT_DIVIDEND
P2P_QQ_DEBT
P2P_INVESTOR_ACCOUNT_HD
TMP_HAIDONG_SJ
TMP_HAIDONG_005
TMP_002
P2P_PRODUCT_DEMOTION_REL
TMP_HD_ZJ_003
TMP_D_LX
TMP_D_BJ
TMP_LIXI_11
So sqoop can list the Oracle tables normally, and querying data also works:
[root@SH-hadoop2020 tmp]#
sqoop eval --connect jdbc:oracle:thin:@ora-s.xxxxa.com:1521:orcl --username scott --password xxxxxx --query "SELECT * FROM emp"
Warning: /usr/local/sqoop-1.4.5-cdh5.3.10/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop-1.4.5-cdh5.3.10/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/12/28 20:48:32 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.10
17/12/28 20:48:32 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/12/28 20:48:32 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
17/12/28 20:48:32 INFO manager.SqlManager: Using default fetchSize of 1000
17/12/28 20:48:32 INFO manager.OracleManager: Time zone has been set to GMT
-------------------------------------------------------------------------------------------------------------------------------------------------------
| EMPNO | ENAME | JOB | MGR | HIREDATE | SAL | COMM | DEPTNO |
-------------------------------------------------------------------------------------------------------------------------------------------------------
| 7369 | SMITH | CLERK | 7902 | 1980-12-17 00:00:00.0 | 800 | (null) | 20 |
| 7499 | ALLEN | SALESMAN | 7698 | 1981-02-20 00:00:00.0 | 1600 | 300 | 30 |
| 7521 | WARD | SALESMAN | 7698 | 1981-02-22 00:00:00.0 | 1250 | 500 | 30 |
| 7566 | JONES | MANAGER | 7839 | 1981-04-02 00:00:00.0 | 2975 | (null) | 20 |
| 7654 | MARTIN | SALESMAN | 7698 | 1981-09-28 00:00:00.0 | 1250 | 1400 | 30 |
| 7698 | BLAKE | MANAGER | 7839 | 1981-05-01 00:00:00.0 | 2850 | (null) | 30 |
| 7782 | CLARK | MANAGER | 7839 | 1981-06-09 00:00:00.0 | 2450 | (null) | 10 |
| 7788 | SCOTT | ANALYST | 7566 | 1987-04-19 00:00:00.0 | 3000 | (null) | 20 |
| 7839 | KING | PRESIDENT | (null) | 1981-11-17 00:00:00.0 | 5000 | (null) | 10 |
| 7844 | TURNER | SALESMAN | 7698 | 1981-09-08 00:00:00.0 | 1500 | 0 | 30 |
| 7876 | ADAMS | CLERK | 7788 | 1987-05-23 00:00:00.0 | 1100 | (null) | 20 |
| 7900 | JAMES | CLERK | 7698 | 1981-12-03 00:00:00.0 | 950 | (null) | 30 |
| 7902 | FORD | ANALYST | 7566 | 1981-12-03 00:00:00.0 | 3000 | (null) | 20 |
| 7934 | MILLER | CLERK | 7782 | 1982-01-23 00:00:00.0 | 1300 | (null) | 10 |
-------------------------------------------------------------------------------------------------------------------------------------------------------
Yet importing the data is exactly what fails:
[root@SH-hadoop2020 tmp]#
sqoop import --connect jdbc:oracle:thin:@ora-s.xxxna.com:1521:orcl --username apsuser --password aps --table EMP -m 1 --hive-import --hive-table "crf_investor.emp" --hive-overwrite --null-non-string '' --null-string '' --delete-target-dir --hive-drop-import-delims
Warning: /usr/local/sqoop-1.4.5-cdh5.3.10/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop-1.4.5-cdh5.3.10/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/12/28 21:46:50 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.10
17/12/28 21:46:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/12/28 21:46:50 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
17/12/28 21:46:50 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
17/12/28 21:46:50 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
17/12/28 21:46:50 INFO manager.SqlManager: Using default fetchSize of 1000
17/12/28 21:46:50 INFO tool.CodeGenTool: Beginning code generation
17/12/28 21:46:51 INFO manager.OracleManager: Time zone has been set to GMT
17/12/28 21:46:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM EMP t WHERE 1=0
17/12/28 21:46:51 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop-2.5.2
Note: /tmp/sqoop-root/compile/8ffe4abd7f8676361bedd6311aa09900/EMP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/12/28 21:46:55 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/8ffe4abd7f8676361bedd6311aa09900/EMP.jar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.5.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hbase-0.98.9-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/12/28 21:46:56 INFO tool.ImportTool: Destination directory EMP deleted.
17/12/28 21:46:56 INFO manager.OracleManager: Time zone has been set to GMT
17/12/28 21:46:56 INFO manager.OracleManager: Time zone has been set to GMT
17/12/28 21:46:56 INFO mapreduce.ImportJobBase: Beginning import of EMP
17/12/28 21:46:56 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapre
17/12/28 21:46:58 INFO mapreduce.Job: Running job: job_1512535288105_6835
17/12/28 21:47:03 INFO mapreduce.Job: Job job_1512535288105_6835 running in uber mode : false
17/12/28 21:47:03 INFO mapreduce.Job: map 0% reduce 0%
17/12/28 21:47:11 INFO mapreduce.Job: Task Id : attempt_1512535288105_6835_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: java.sql.SQLRecoverableException: IO Error: The Network Adapter could not establish the connection
at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:167)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:726)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.RuntimeException: java.sql.SQLRecoverableException: IO Error: The Network Adapter could not establish the connection
at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:220)
at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:165)
... 9 more
三. Solution
Every node of the Hadoop cluster needs the network policy in place. The `list-tables` and `eval` tests above open their JDBC connection from the local node, which is why they succeed; but an `import` job runs as MapReduce map tasks that get scheduled onto other nodes of the cluster. A map task landing on a node that cannot reach the Oracle listener fails with "The Network Adapter could not establish the connection". Opening the network path from every cluster node to the Oracle host and listener port fixes the import.
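One way to confirm the diagnosis is to test plain TCP reachability to the Oracle listener from each node that can host a map task, not only from the edge node where the sqoop command is launched. A minimal sketch (the hostname below is a placeholder; substitute the host/port from the `--connect` URL, `jdbc:oracle:thin:@HOST:1521:SID`):

```python
import socket

# Placeholder Oracle listener address: substitute the real values from
# the sqoop --connect URL.
ORACLE_HOST, ORACLE_PORT = "ora-s.example.com", 1521

def can_reach(host, port, timeout=3.0):
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # DNS failure, connection refused, timeout, no route
        return False
```

Running this check (or the bash equivalent, `timeout 3 bash -c '</dev/tcp/HOST/1521'`) on each NodeManager host via ssh reveals exactly which nodes the firewall blocks.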
四. Other Notes
The Sqoop environment variables:
[root@SH-hadoop2020 ~]# echo $SQOOP_HOME
/usr/local/sqoop-1.4.5-cdh5.3.10
[root@SH-hadoop2020 ~]# cat /etc/profile
unset i
unset -f pathmunge
#set JAVA environment
export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_76
export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
#set HADOOP environment
export HADOOP_HOME=/usr/local/hadoop-2.5.2
#set hbase-0.98.9-hadoop2 environment
export HBASE_HOME=/usr/local/hbase-0.98.9-hadoop
#set sqoop-1.4.5 environment
#export SQOOP_HOME=/usr/local/sqoop-1.4.5
export SQOOP_HOME=/usr/local/sqoop-1.4.5-cdh5.3.10
#export SQOOP_HOME=/usr/local/sqoop-1.4.6-cdh5.13.1
# Hive Env
export HIVE_HOME=/usr/local/apache-hive-2.1.0-bin
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$ZOOKEEPER_HOME/bin:$SQOOP_HOME/bin:$HBASE_HOME/bin:$SQOOP_HOME/bin:$HIVE_HOME/bin:$KAFKA_HOME/bin
ulimit -SHn 102400
alias grep='grep --color'
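Since /etc/profile builds PATH from several `*_HOME` variables (two of which, `$ZOOKEEPER_HOME` and `$KAFKA_HOME`, are not exported in the excerpt above), a small sanity check can confirm that each one is set and points at an existing directory. A minimal sketch (the variable list mirrors the exports shown above):

```python
import os

# The *_HOME exports from /etc/profile that Sqoop jobs rely on.
REQUIRED = ["JAVA_HOME", "HADOOP_HOME", "HBASE_HOME", "SQOOP_HOME", "HIVE_HOME"]

def check_homes(env=None):
    """Return {var: problem} for variables that are unset or whose
    directory does not exist; an empty dict means everything checks out."""
    env = os.environ if env is None else env
    problems = {}
    for var in REQUIRED:
        path = env.get(var)
        if not path:
            problems[var] = "not set"
        elif not os.path.isdir(path):
            problems[var] = "directory missing: " + path
    return problems
```

Run once per node; a non-empty result on any node explains jobs that only fail when scheduled there.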
The jars under `$SQOOP_HOME/lib`:
[root@SH-hadoop2020 local]# ls -l $SQOOP_HOME/lib
total 19812
-rw-rw-r-- 1 root root 224277 Apr 13 2016 ant-contrib-1.0b3.jar
-rw-rw-r-- 1 root root 36455 Apr 13 2016 ant-eclipse-1.0-jvm1.2.jar
-rw-r--r-- 1 root root 16761 Jan 3 09:28 APSUSER_TEST1.java
-rw-rw-r-- 1 root root 437649 Apr 13 2016 avro-1.7.6-cdh5.3.10.jar
-rw-rw-r-- 1 root root 180876 Apr 13 2016 avro-mapred-1.7.6-cdh5.3.10-hadoop2.jar
-rw-rw-r-- 1 root root 58160 Apr 13 2016 commons-codec-1.4.jar
-rw-rw-r-- 1 root root 241367 Apr 13 2016 commons-compress-1.4.1.jar
-rw-rw-r-- 1 root root 109043 Apr 13 2016 commons-io-1.4.jar
-rw-rw-r-- 1 root root 267634 Apr 13 2016 commons-jexl-2.1.1.jar
-rw-rw-r-- 1 root root 60686 Apr 13 2016 commons-logging-1.1.1.jar
-rw-r--r-- 1 root root 24520 Dec 28 22:00 EMP.java
-rw-rw-r-- 1 root root 1648200 Apr 13 2016 guava-11.0.2.jar
-rw-rw-r-- 1 root root 706710 Apr 13 2016 hsqldb-1.8.0.10.jar
-rw-rw-r-- 1 root root 35058 Apr 13 2016 jackson-annotations-2.3.0.jar
-rw-rw-r-- 1 root root 197986 Apr 13 2016 jackson-core-2.3.1.jar
-rw-rw-r-- 1 root root 227500 Apr 13 2016 jackson-core-asl-1.8.8.jar
-rw-rw-r-- 1 root root 914311 Apr 13 2016 jackson-databind-2.3.1.jar
-rw-rw-r-- 1 root root 668564 Apr 13 2016 jackson-mapper-asl-1.8.8.jar
-rw-rw-r-- 1 root root 31866 Apr 13 2016 jsr305-2.0.1.jar
-rw-rw-r-- 1 root root 357994 Apr 13 2016 kite-data-core-0.15.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 50941 Apr 13 2016 kite-data-hive-0.15.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 27321 Apr 13 2016 kite-data-mapreduce-0.15.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 25242 Apr 13 2016 kite-hadoop-compatibility-0.15.0-cdh5.3.10.jar
-rw-r--r-- 1 root root 968668 Jun 26 2017 mysql-connector-java-5.1.35.jar
-rw-r--r-- 1 root root 1536554 Dec 27 17:28 ojdbc14.jar.bak
-rw-r--r-- 1 root root 2095932 Dec 28 22:01 ojdbc5-11.2.0.3.jar_bak
-rw-r--r-- 1 root root 1879860 Dec 28 21:59 ojdbc5.jar_qdownload
-rw-r--r-- 1 root root 2739670 Dec 28 21:46 ojdbc6.jar
-rw-rw-r-- 1 root root 19827 Apr 13 2016 opencsv-2.3.jar
-rw-rw-r-- 1 root root 29555 Apr 13 2016 paranamer-2.3.jar
-rw-rw-r-- 1 root root 49344 Apr 13 2016 parquet-avro-1.5.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 871537 Apr 13 2016 parquet-column-1.5.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 20263 Apr 13 2016 parquet-common-1.5.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 277476 Apr 13 2016 parquet-encoding-1.5.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 352114 Apr 13 2016 parquet-format-2.1.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 20384 Apr 13 2016 parquet-generator-1.5.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 182818 Apr 13 2016 parquet-hadoop-1.5.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 914346 Apr 13 2016 parquet-jackson-1.5.0-cdh5.3.10.jar
-rw-rw-r-- 1 root root 26084 Apr 13 2016 slf4j-api-1.7.5.jar
-rw-rw-r-- 1 root root 995968 Apr 13 2016 snappy-java-1.0.4.1.jar
-rw-r--r-- 1 root root 584207 Aug 9 14:21 sqljdbc4-4.0.jar
-rw-rw-r-- 1 root root 94672 Apr 13 2016 xz-1.0.jar
Also check the JDBC driver version: the listing shows several ojdbc jars, of which only ojdbc6.jar is active (the others have been renamed to .bak/_bak/_qdownload).
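Mixing Oracle driver jars is a common source of trouble: the driver has to match both the database version and the JDK installed on every node. One quick sanity check is to read the class-file major version inside a jar to see which minimum JDK it targets. A minimal sketch (the `oracle/jdbc/OracleDriver.class` entry name in the usage example is inferred from the stack trace above; treat the exact path inside a given jar as an assumption):

```python
import io
import zipfile

# Class-file major version -> minimum JDK (standard mapping).
JDK_BY_MAJOR = {49: "Java 5", 50: "Java 6", 51: "Java 7", 52: "Java 8"}

def class_major_version(jar_bytes, class_name):
    """Read `class_name` out of a jar (given as bytes) and return its
    class-file major version (bytes 6-7 of the .class header)."""
    with zipfile.ZipFile(io.BytesIO(jar_bytes)) as jar:
        data = jar.read(class_name)
    if data[:4] != b"\xca\xfe\xba\xbe":  # every class file starts with this magic
        raise ValueError("not a class file: " + class_name)
    return int.from_bytes(data[6:8], "big")
```

For example, `class_major_version(open('ojdbc6.jar', 'rb').read(), 'oracle/jdbc/OracleDriver.class')` should report a version no higher than what the jdk1.7.0_76 configured in /etc/profile supports (major 51).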