Problem scenario
A Spark job is submitted and fails at runtime with the following error:
Caused by: java.io.IOException: Couldn't set up IO streams
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:826)
at org.apache.hadoop.ipc.Client$Connection.access$3000(Client.java:396)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1557)
at org.apache.hadoop.ipc.Client.call(Client.java:1480)
... 16 more
Caused by: java.lang.OutOfMemoryError: unable to create new native thread
at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:717)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:819)
... 19 more
Problem analysis
The Spark job runs on a Hadoop cluster, and the Hadoop RPC layer creates a thread per connection. Once the number of threads owned by the user reaches the node's ulimit -u limit (run ulimit -u on the node's console to see the value), the JVM can no longer create a native thread and reports it as java.lang.OutOfMemoryError: unable to create new native thread. Despite the name, this is the per-user process/thread limit being hit, not heap exhaustion.
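Before changing anything, it helps to confirm the diagnosis by comparing the limit with the user's actual thread count. A minimal sketch (the awk filter on the current user is an illustration; substitute the user that runs the Spark/Hadoop daemons):

```shell
# Show the max-user-processes (nproc) limit for the current session
ulimit -u

# Count the threads currently owned by the current user
# (ps -eLf lists one row per thread); compare against the limit above.
ps -eLf | awk -v u="$(whoami)" '$1 == u' | wc -l
```

If the thread count is close to the ulimit -u value, the OutOfMemoryError above is almost certainly the nproc limit, and raising it as described below should resolve the failure.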
Solution
Add the following configuration on every node in the cluster:
vi /etc/security/limits.conf
# Add the following lines; username is the user the limits should apply to
username soft nproc 100000
username hard nproc 100000
After saving, reboot every node in the cluster and then restart the Hadoop services. (limits.conf is applied by PAM at login, so the daemons must be started from a fresh session for the new nproc limit to take effect.)
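Once the nodes are back up, it is worth verifying that the new limit is actually in effect for the user that launches the daemons (a quick check sketch; "username" is the same placeholder as in limits.conf):

```shell
# From a fresh login session as the affected user, check both limits;
# they should report the values set in /etc/security/limits.conf
# (100000 in the example above).
ulimit -Su   # soft nproc limit
ulimit -Hu   # hard nproc limit
```

If either value still shows the old limit, the session was not started through PAM (e.g. the daemon was launched from an old shell), and the Spark job will hit the same error again.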