On the spark-submit error java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
2017-04-23 08:27
Solution up front: change the Scala version to 2.11.8 (the environment is Spark 2.1.0).
When I submitted a Scala app with spark-submit, the code did nothing more than a filter or a map, yet it failed with a long stack trace:
```
17/04/23 08:02:48 INFO DAGScheduler: ResultStage 0 (first at Main.scala:17) failed in 1.981 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 192.168.1.100, executor 0): java.io.IOException: unexpected exception type
	at java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1582)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1154)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2022)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2231)
	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2155)
	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2013)
	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1535)
	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:422)
	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:114)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:80)
	at org.apache.spark.scheduler.Task.run(Task.scala:99)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.lang.invoke.SerializedLambda.readResolve(SerializedLambda.java:230)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at java.io.ObjectStreamClass.invokeReadResolve(ObjectStreamClass.java:1148)
	... 23 more
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
	at Main$.$deserializeLambda$(Main.scala)
	... 33 more
Caused by: java.lang.NoClassDefFoundError: scala/runtime/LambdaDeserialize
```
It looked like a problem with lambda expressions.
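The lambda suspicion is on target: Spark's default JavaSerializer ships the filter/map closure to executors via plain Java serialization, and deserializing it goes through the compiler-generated `$deserializeLambda$` method, which (for code compiled with Scala 2.12) requires `scala.runtime.LambdaDeserialize` on the executor's classpath. That round trip can be sketched outside Spark; the object name `LambdaRoundTrip` is made up for this example:

```scala
import java.io._

object LambdaRoundTrip {
  // Serialize a function value and read it back -- the same path Spark's
  // JavaSerializer takes when shipping a closure to an executor. It only
  // succeeds when the scala-library at runtime matches the compiler that
  // emitted the lambda's deserialization hook.
  def roundTrip(f: Int => Int, x: Int): Int = {
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(f)
    oos.close()
    val in  = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
    val g   = in.readObject().asInstanceOf[Int => Int]
    g(x)
  }

  def main(args: Array[String]): Unit =
    println(roundTrip(_ * 2, 21)) // prints 42 when versions match
}
```

On a cluster, the serialization happens in the driver's JVM and the deserialization in the executor's, which is why the mismatch only surfaces under spark-submit.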
Yet the very same code ran fine in spark-shell.
According to http://blog.csdn.net/u013054888/article/details/54600229 , this is likely a Scala version mismatch. My sbt build specified Scala 2.12.2, while the spark-shell banner shows that Spark 2.1.0 uses Scala 2.11.8. After changing the version in sbt to 2.11.8, everything worked.
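For reference, the fix amounts to two lines in `build.sbt`; this is a minimal sketch (the `"provided"` scope is a common choice for spark-submit deployments, not something the original post specifies):

```scala
// build.sbt -- Spark 2.1.0 is published for Scala 2.11, so the app must match.
scalaVersion := "2.11.8"

// %% appends the Scala binary suffix, resolving spark-core_2.11.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"
```

Because `%%` derives the artifact suffix from `scalaVersion`, setting 2.12.2 there would make sbt look for `spark-core_2.12`, which does not exist for Spark 2.1.0; keeping both lines consistent avoids the mismatch entirely.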