[Sqoop] Resolving Common Exceptions

1. Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf$ConfVars

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf$ConfVars
    at org.apache.hive.hcatalog.common.HCatConstants.<clinit>(HCatConstants.java:74)
    at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:297)
    at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:783)
    at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
    at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf$ConfVars
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 14 more

Hmm...

Sqoop's classpath is missing a few of Hive's jars. Copy them over from Hive's lib directory:

cp $HIVE_HOME/lib/hive-shims* $SQOOP_HOME/lib/
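If copying the shims jars alone doesn't clear the error, note that HiveConf itself (including the ConfVars inner class) is packaged in hive-common. A sketch of the copies, assuming HIVE_HOME and SQOOP_HOME are both exported:

```shell
# hive-shims* covers the shim loaders; HiveConf$ConfVars is defined
# in hive-common, so copy that jar family as well.
cp $HIVE_HOME/lib/hive-shims*.jar $SQOOP_HOME/lib/
cp $HIVE_HOME/lib/hive-common*.jar $SQOOP_HOME/lib/

# Sanity check: both jar families should now be in Sqoop's lib.
ls $SQOOP_HOME/lib/ | grep -E 'hive-(shims|common)'
```

On some setups an alternative is to append `$HIVE_HOME/lib/*` to `HADOOP_CLASSPATH` instead of copying jars, since Sqoop runs as a Hadoop client.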

2. ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException

Hmm...

Edit sqoop-env.sh and add the line below ($HIVE_HOME is Hive's absolute install path; if you already export HIVE_HOME as an environment variable, you can write it this way):

export HCAT_HOME=$HIVE_HOME/hcatalog
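For context, a minimal sqoop-env.sh sketch; the /opt/... paths are placeholders for your actual install locations:

```shell
# $SQOOP_HOME/conf/sqoop-env.sh -- example paths, adjust to your cluster
export HADOOP_COMMON_HOME=/opt/hadoop
export HADOOP_MAPRED_HOME=/opt/hadoop
export HIVE_HOME=/opt/hive
# HCatalog ships inside the Hive distribution. If HCAT_HOME is unset,
# Sqoop's hcatalog support can fail with a bare NullPointerException.
export HCAT_HOME=$HIVE_HOME/hcatalog
```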

3. ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: HCat exited with status 1

Hmm...

The INFO output above this exception points to an error log file, but searching that log turns up nothing matching "error", which is infuriating. Unresolved.

If anyone knows the cause and a fix, please share. Thanks!

------------------------------------------------------------------------------------------------------------------------------------------------

The command I ran imports from MySQL and creates an ORC partitioned table in Hive:

sqoop import \
--connect jdbc:mysql://agent:3306/intelligentCoal \
--username root \
--password 123456 \
--table t_user \
--driver com.mysql.jdbc.Driver \
--hcatalog-database intelligentCoal \
--create-hcatalog-table \
--hcatalog-table t_user_orc \
--hcatalog-partition-keys event_month \
--hcatalog-partition-values 202010 \
--hcatalog-storage-stanza 'stored as orc tblproperties ("orc.compress"="SNAPPY")' \
-m 1

-------------------------------------------

If I create the ORC partitioned table first and drop --create-hcatalog-table, the command above imports successfully:

sqoop import \
--connect jdbc:mysql://agent:3306/intelligentCoal \
--username root \
--password 123456 \
--table t_user \
--driver com.mysql.jdbc.Driver \
--hcatalog-database intelligentCoal \
--hcatalog-table t_user_orc \
--hcatalog-partition-keys event_month \
--hcatalog-partition-values 202010 \
--hcatalog-storage-stanza 'stored as orc tblproperties ("orc.compress"="SNAPPY")' \
-m 1
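For the pre-created table, the DDL has to mirror t_user's schema; a hypothetical sketch (the id/name columns are placeholders, substitute the real MySQL columns):

```shell
# Run via the hive CLI (or beeline). Column list is hypothetical --
# it must mirror t_user in MySQL. The partition column matches the
# --hcatalog-partition-keys flag used in the import.
hive -e "
CREATE TABLE intelligentCoal.t_user_orc (
  id   INT,
  name STRING
)
PARTITIONED BY (event_month STRING)
STORED AS ORC
TBLPROPERTIES ('orc.compress'='SNAPPY');
"
```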

4. Caused by: java.sql.SQLIntegrityConstraintViolationException: ORA-01400: cannot insert NULL

This came up when exporting from Hive to Oracle with Sqoop:

org.apache.sqoop.mapreduce.AsyncSqlOutputFormat: Got exception in update thread: java.sql.SQLIntegrityConstraintViolationException: ORA-01400: cannot insert NULL into ("ICI"."TAB_TASK"."TASK_FUNC")

Checking TAB_TASK showed that the TASK_FUNC column has a NOT NULL constraint.

Either drop the NOT NULL constraint on task_func, or backfill the missing values for that column before exporting.
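Before relaxing the constraint, it may be worth ruling out a null-encoding mismatch: Hive's text storage writes NULL as \N, and if Sqoop isn't told about that marker, values can arrive on the Oracle side as NULLs or as the literal string "\N". A sketch of an export with explicit null handling; the connect string, paths, and field delimiter here are assumptions, while --input-null-string and --input-null-non-string are standard Sqoop export options:

```shell
# Hypothetical export of the Hive data behind TAB_TASK to Oracle.
# The \N markers in the export dir are decoded back into SQL NULLs.
sqoop export \
--connect jdbc:oracle:thin:@//oraclehost:1521/orcl \
--username ici \
--password '***' \
--table TAB_TASK \
--export-dir /user/hive/warehouse/tab_task \
--input-fields-terminated-by '\001' \
--input-null-string '\\N' \
--input-null-non-string '\\N' \
-m 1

# Or, on the Oracle side, relax the constraint:
#   ALTER TABLE ICI.TAB_TASK MODIFY (TASK_FUNC NULL);
```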


Reposted from blog.csdn.net/qq_44065303/article/details/109385261