ERROR Could not register mbeans java.security.AccessControlException: access denied ...

Importing data from MySQL into Hive with Sqoop fails with:

ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")


Sqoop had just been installed; running sqoop version prints the version information, but the import command fails. The import statement itself was confirmed to be correct (a sketch of it is below), and the error occurs after the MapReduce job finishes, while the data is being loaded into Hive.
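For reference, the import command was of roughly the following shape. The host, database, and credentials are placeholders; the table name dim_pty, the deleted destination directory, and the single map task are taken from the log:

    sqoop import \
        --connect jdbc:mysql://<mysql-host>:3306/<database> \
        --username <user> \
        --password <password> \
        --table dim_pty \
        --delete-target-dir \
        --hive-import \
        -m 1

The full error log from that run: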

Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/06/22 12:27:39 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
18/06/22 12:27:39 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/06/22 12:27:39 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/06/22 12:27:39 INFO tool.CodeGenTool: Beginning code generation
18/06/22 12:27:39 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `dim_pty` AS t LIMIT 1
18/06/22 12:27:40 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `dim_pty` AS t LIMIT 1
18/06/22 12:27:40 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/hadoop/share/hadoop/mapreduce
Note: /tmp/sqoop-root/compile/0ff35a4244d7276dc8eacb8e1d4bd7fe/dim_pty.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/06/22 12:27:42 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/0ff35a4244d7276dc8eacb8e1d4bd7fe/dim_pty.jar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/sqoop/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hive/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hive/lib/hive-jdbc-2.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/06/22 12:27:43 INFO tool.ImportTool: Destination directory dim_pty deleted.
18/06/22 12:27:43 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/06/22 12:27:43 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/06/22 12:27:43 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/06/22 12:27:43 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/06/22 12:27:43 INFO mapreduce.ImportJobBase: Beginning import of dim_pty
18/06/22 12:27:43 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
18/06/22 12:27:43 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
18/06/22 12:27:43 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to rm1
18/06/22 12:28:09 INFO db.DBInputFormat: Using read commited transaction isolation
18/06/22 12:28:09 INFO mapreduce.JobSubmitter: number of splits:1
18/06/22 12:28:10 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1528944069480_0011
18/06/22 12:28:10 INFO impl.YarnClientImpl: Submitted application application_1528944069480_0011
18/06/22 12:28:10 INFO mapreduce.Job: The url to track the job: http://hadoop002:8088/proxy/application_1528944069480_0011/
18/06/22 12:28:10 INFO mapreduce.Job: Running job: job_1528944069480_0011
18/06/22 12:28:20 INFO mapreduce.Job: Job job_1528944069480_0011 running in uber mode : false
18/06/22 12:28:20 INFO mapreduce.Job:  map 0% reduce 0%
18/06/22 12:28:27 INFO mapreduce.Job:  map 100% reduce 0%
18/06/22 12:28:29 INFO mapreduce.Job: Job job_1528944069480_0011 completed successfully
18/06/22 12:28:29 INFO mapreduce.Job: Counters: 30
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=206437
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=87
                HDFS: Number of bytes written=0
                HDFS: Number of read operations=4
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Other local map tasks=1
                Total time spent by all maps in occupied slots (ms)=3938
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=3938
                Total vcore-seconds taken by all map tasks=3938
                Total megabyte-seconds taken by all map tasks=4032512
        Map-Reduce Framework
                Map input records=0
                Map output records=0
                Input split bytes=87
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=85
                CPU time spent (ms)=1440
                Physical memory (bytes) snapshot=189157376
                Virtual memory (bytes) snapshot=2128355328
                Total committed heap usage (bytes)=147849216
        File Input Format Counters
                Bytes Read=0
        File Output Format Counters
                Bytes Written=0
18/06/22 12:28:29 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 46.1328 seconds (0 bytes/sec)
18/06/22 12:28:29 INFO mapreduce.ImportJobBase: Retrieved 0 records.
18/06/22 12:28:29 INFO mapreduce.ImportJobBase: Publishing Hive/Hcat import job data to Listeners for table dim_pty
18/06/22 12:28:29 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `dim_pty` AS t LIMIT 1
18/06/22 12:28:29 INFO hive.HiveImport: Loading uploaded data into Hive
2018-06-22 12:28:32,398 main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")
        at java.security.AccessControlContext.checkPermission(AccessControlContext.java:472)
        at java.lang.SecurityManager.checkPermission(SecurityManager.java:585)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.checkMBeanTrustPermission(DefaultMBeanServerInterceptor.java:1848)
        at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.registerMBean(DefaultMBeanServerInterceptor.java:322)
        at com.sun.jmx.mbeanserver.JmxMBeanServer.registerMBean(JmxMBeanServer.java:522)
        at org.apache.logging.log4j.core.jmx.Server.register(Server.java:379)
        at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:171)
        at org.apache.logging.log4j.core.jmx.Server.reregisterMBeansAfterReconfigure(Server.java:147)
        at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:457)
        at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:246)
        at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:230)
        at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:140)
        at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:113)
        at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:98)
        at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:156)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:121)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:73)
        at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:54)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:661)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:645)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:331)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

18/06/22 12:28:32 WARN common.LogUtils: hive-site.xml not found on CLASSPATH

Logging initialized using configuration in jar:file:/home/hive/lib/hive-exec-2.0.0.jar!/hive-log4j2.properties
18/06/22 12:28:32 INFO SessionState:
Logging initialized using configuration in jar:file:/home/hive/lib/hive-exec-2.0.0.jar!/hive-log4j2.properties
18/06/22 12:28:32 INFO metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
18/06/22 12:28:32 INFO metastore.ObjectStore: ObjectStore, initialize called
18/06/22 12:28:33 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hive/lib/datanucleus-api-jdo-4.2.1.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sqoop/lib/datanucleus-api-jdo-4.2.1.jar."
18/06/22 12:28:33 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hive/lib/datanucleus-rdbms-4.1.7.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sqoop/lib/datanucleus-rdbms-4.1.7.jar."
18/06/22 12:28:33 WARN DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hive/lib/datanucleus-core-4.1.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/sqoop/lib/datanucleus-core-4.1.6.jar."
18/06/22 12:28:33 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
18/06/22 12:28:33 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
18/06/22 12:28:41 ERROR bonecp.BoneCP: Unable to start/stop JMX

Everything I found online suggested modifying the JDK policy file. I tried that, but it had no effect.
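For completeness, the edit those posts usually suggest (it did not work in this case) adds a grant to $JAVA_HOME/jre/lib/security/java.policy:

    grant {
        // allow log4j2 (and anything else) to register JMX MBeans
        permission javax.management.MBeanTrustPermission "register";
    };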

The fix: copy hive-site.xml into ${SQOOP_HOME}/conf.
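Assuming hive-site.xml is in the usual ${HIVE_HOME}/conf location, that is a one-line copy:

    cp ${HIVE_HOME}/conf/hive-site.xml ${SQOOP_HOME}/conf/

This lines up with the "hive-site.xml not found on CLASSPATH" warning in the log above: the stack trace shows Sqoop launching the Hive CLI (CliDriver) in-process, so Hive's configuration has to be visible on Sqoop's own classpath.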

After the copy, the sqoop import command runs successfully.

That solved the problem, but after re-initializing Hive later on I ran into a new error (you will not hit it if you leave hive-site.xml unchanged after this):

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'hive1.deleteme1529656403521' doesn't exist in engine

The fix for that error is covered in a separate post.


Reposted from blog.csdn.net/weixin_39445556/article/details/80802459