Hadoop HDFS Maven dependencies

<dependencies>
		<dependency>
			<groupId>junit</groupId>
			<artifactId>junit</artifactId>
			<version>RELEASE</version>
		</dependency>
		<dependency>
			<groupId>org.apache.logging.log4j</groupId>
			<artifactId>log4j-core</artifactId>
			<version>2.8.2</version>
		</dependency>
		<!-- Hadoop: keep the version in sync with your cluster -->
		<dependency>
			<groupId>org.apache.hadoop</groupId>
			<artifactId>hadoop-common</artifactId>
			<version>3.2.0</version>
		</dependency>
		<dependency>
			<groupId>org.apache.hadoop</groupId>
			<artifactId>hadoop-client</artifactId>
			<version>3.2.0</version>
		</dependency>
		<dependency>
			<groupId>org.apache.hadoop</groupId>
			<artifactId>hadoop-hdfs</artifactId>
			<version>3.2.0</version>
		</dependency>
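		<!-- tools.jar only exists on JDK 8 and earlier; drop this jdk.tools dependency when building on JDK 9+ -->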
		<dependency>
			<groupId>jdk.tools</groupId>
			<artifactId>jdk.tools</artifactId>
			<version>1.8</version>
			<scope>system</scope>
			<systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
		</dependency>
</dependencies>
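
To confirm that the client libraries above actually match the cluster, one quick sanity check (a minimal sketch; VersionInfo ships with hadoop-common) is to print the client-side version and compare it with the output of hadoop version on a cluster node:

import org.apache.hadoop.util.VersionInfo;

public class PrintHadoopVersion {
	public static void main(String[] args) {
		// Version of the Hadoop client libraries on the classpath;
		// compare with `hadoop version` on the cluster.
		System.out.println("Hadoop client version: " + VersionInfo.getVersion());
	}
}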

log4j.properties configuration

Set it up by following the article linked below, with minor tweaks:
log4j configuration, console and file output (production environment | CentOS server)
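
Since that article is not reproduced here, the following is only a minimal sketch along the same lines (console plus rolling file output, classic Log4j 1.x syntax); the log file path, pattern, and levels are placeholders to adjust. Place it as log4j.properties under src/main/resources (or src/test/resources):

log4j.rootLogger=INFO, console, file

log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=logs/hdfs-client.log
log4j.appender.file.MaxFileSize=10MB
log4j.appender.file.MaxBackupIndex=5
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

Note that the log4j-core 2.8.2 declared in the POM is Log4j 2, which reads log4j2.properties/log4j2.xml instead; the classic log4j.properties above should be picked up by the Log4j 1.x binding that Hadoop 3.2 pulls in transitively.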

Test (upload)

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

@Test
public void testCopyFromLocalFile() throws IOException, InterruptedException, URISyntaxException {

		// 1. Get the file system
		Configuration configuration = new Configuration();
		// configuration.set("dfs.replication", "3"); // set it here, or copy the config files to the classpath
		FileSystem fs = FileSystem.get(new URI("hdfs://namenode:9000"), configuration, "root"); // or skip the user here and pass the JVM argument -DHADOOP_USER_NAME instead

		// 2. Upload the file
		fs.copyFromLocalFile(new Path("e:/test.txt"), new Path("/test.txt"));

		// 3. Close the resource
		fs.close();

		System.out.println("done");
}
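
As the comment above notes, the user can also be supplied as a JVM argument instead of the third FileSystem.get parameter. A minimal sketch of that variant (the test name is illustrative; run it with -DHADOOP_USER_NAME=root, or export HADOOP_USER_NAME=root):

@Test
public void testCopyFromLocalFileWithJvmUser() throws IOException, URISyntaxException {

		// User comes from -DHADOOP_USER_NAME=root (or the HADOOP_USER_NAME environment variable)
		Configuration configuration = new Configuration();
		FileSystem fs = FileSystem.get(new URI("hdfs://namenode:9000"), configuration);

		fs.copyFromLocalFile(new Path("e:/test.txt"), new Path("/test.txt"));
		fs.close();
}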
