If you hit an error caused by a Netty version conflict or a missing Netty class during Spark + Scala development, here is a reference fix:
- In a terminal, change into the project root directory (the one containing pom.xml) and run:
mvn dependency:tree
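(Optional) The full tree can be noisy; dependency:tree also accepts an includes filter, so you can print only the Netty artifacts and the dependency chains that pull them in. Assuming you only care about artifacts under the io.netty groupId:

mvn dependency:tree -Dincludes=io.netty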
- In the output, locate which dependency pulls in netty:
[INFO] | +- org.apache.hadoop:hadoop-hdfs:jar:2.6.0-cdh5.15.1:compile
[INFO] | | +- io.netty:netty:jar:3.10.5.Final:compile
- Add an exclusion to the dependency that pulls it in transitively (here, hadoop-client):
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
    <!-- exclude the conflicting Netty 3.x artifact -->
    <exclusions>
        <exclusion>
            <groupId>io.netty</groupId>
            <artifactId>netty</artifactId>
        </exclusion>
    </exclusions>
</dependency>
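The ${hadoop.version} placeholder assumes a matching property is defined in your POM; a minimal sketch, using the CDH version shown in the tree output above (adjust it to your cluster's version):

<properties>
    <!-- assumed value, taken from the dependency:tree output above -->
    <hadoop.version>2.6.0-cdh5.15.1</hadoop.version>
</properties>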
- Then add a newer Netty explicitly:
<!-- resolves the io.netty.buffer.PooledByteBufAllocator.defaultNumHeapArena()I error -->
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.18.Final</version>
</dependency>
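To verify the fix, re-run the filtered tree; io.netty:netty:jar:3.10.5.Final should no longer appear under hadoop-client, leaving only io.netty:netty-all:jar:4.1.18.Final. If another dependency still pulls in the old Netty, add the same exclusion to it as well:

mvn dependency:tree -Dincludes=io.netty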