Hi
I'm on Spark 1.6.1, and we happen to override the YARN classpath in yarn-site.xml. I have a simple job that reads Avro files using the com.databricks.avro library. When I run the job as follows, it works and reports success:
- ./bin/spark-submit --class com.test.MyJob --verbose --master yarn-cluster --conf spark.yarn.user.classpath.first=true --num-executors 5 /tmp/fat-app.jar
Then I get a warning that spark.yarn.user.classpath.first is deprecated and that I should use spark.{driver,executor}.userClassPathFirst instead. So I change the command to the following, and the application fails with an exception:
- ./bin/spark-submit --class com.test.MyJob --verbose --master yarn-cluster --conf spark.executor.userClassPathFirst=true --conf spark.driver.userClassPathFirst=true --num-executors 5 /tmp/fat-app.jar
- Exception in thread "main" org.apache.spark.SparkException: Application application_1467198279864_0129 finished with failed status
- at org.apache.spark.deploy.yarn.Client.run(Client.scala:1034)
- at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1081)
- at org.apache.spark.deploy.yarn.Client.main(Client.scala)
- at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
- at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
- at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.j
- Max number of executor failures (10) reached
So are those options not interchangeable, or am I doing something wrong here?
The exact exception is here:
- 16/06/29 15:31:17 ERROR server.TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 7914409773016323976
- java.lang.ClassCastException: cannot assign instance of scala.concurrent.duration.FiniteDuration to field org.apache.spark.rpc.RpcTimeout.duration of type scala.concurrent.duration.FiniteDuration in instance of org.apache.spark.rpc.RpcTimeout
- at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2083)
- at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1261)
- at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1996)
- at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
- at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
- at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
- at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
- at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
- at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
- at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
- at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
- at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
- at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
- at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
- at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
- at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
- at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109)
- at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:258)
- at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
- at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:310)
- at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:257)
- at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
- at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:256)
- at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:588)
- at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:570)
- at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:149)
- at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:102)
- at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104)
- at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
- at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
- at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
- at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
- at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
- at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
- at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
- at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86)
- at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
- at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
- at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
- at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
- at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
- at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
- at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
- at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
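For reference, my working theory (just a guess) is that userClassPathFirst puts my fat jar into an isolated child-first classloader, so a class like scala.concurrent.duration.FiniteDuration gets loaded once from my jar and once from Spark's own classpath, and the JVM then treats the two as unrelated types. Here is a minimal, self-contained Java sketch of that effect (ClassLoaderDemo and Dummy are names I made up for illustration, not anything from Spark):

```java
import javax.tools.ToolProvider;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;

public class ClassLoaderDemo {

    // Loads the same compiled class through two independent loaders and
    // reports whether the JVM considers them the same runtime class.
    static boolean sameRuntimeClass() throws Exception {
        // Compile a trivial class into a temp directory (requires a JDK).
        Path dir = Files.createTempDirectory("cl-demo");
        Path src = dir.resolve("Dummy.java");
        Files.write(src, "public class Dummy implements java.io.Serializable {}".getBytes());
        ToolProvider.getSystemJavaCompiler().run(null, null, null, src.toString());

        // Two loaders that both see the same .class file but do not
        // delegate to each other (parent = null) -- analogous to an
        // isolated child-first user classloader vs. Spark's own loader.
        URL[] urls = { dir.toUri().toURL() };
        try (URLClassLoader a = new URLClassLoader(urls, null);
             URLClassLoader b = new URLClassLoader(urls, null)) {
            Class<?> ca = a.loadClass("Dummy");
            Class<?> cb = b.loadClass("Dummy");
            // Same fully-qualified name, but identity depends on the
            // defining loader, so ca == cb is false here.
            return ca == cb;
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("same runtime class: " + sameRuntimeClass());
    }
}
```

Assigning an instance defined by one loader to a field whose type was defined by the other is exactly the "cannot assign instance of FiniteDuration to field ... of type FiniteDuration" message in the trace above, which is why I suspect a classloader split rather than a bug in my job.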
Regards