Troubleshooting common Spark problems
2015-10-15 15:14
1. Eclipse reports java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
Download hadoop-common-2.2.0-bin-master.zip from https://github.com/srccodes/hadoop-common-2.2.0-bin, unzip it, and set the HADOOP_HOME environment variable to the unzipped directory.
Reference: http://my.oschina.net/cloudcoder/blog/286234
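On Linux/macOS the variable can be set in the shell; on Windows, set HADOOP_HOME the same way under System Properties → Environment Variables. The path below is a placeholder for wherever the archive was unzipped:

```shell
# Placeholder path: point HADOOP_HOME at the directory the archive was unzipped into
# (on Windows that directory must contain bin\winutils.exe).
export HADOOP_HOME=/opt/hadoop-common-2.2.0-bin-master
# Putting its bin directory on PATH also lets Hadoop's native helpers be found.
export PATH="$HADOOP_HOME/bin:$PATH"
```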
2. Eclipse reports ERROR SparkDeploySchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
Check spark-hadoop-org.apache.spark.deploy.master.Master-1-h1.out under Spark's log directory.
./bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://h1:7077 --executor-memory 3000m --total-executor-cores 100 ./lib/spark-examples-1.4.0-hadoop2.6.0.jar 1000
Replace the IP in the submit command with the hostname.
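"All masters are unresponsive" often comes down to the master host not resolving the same way on every machine. A quick sanity check before submitting (h1 is the master hostname from the command above; the suggested entry uses a placeholder IP):

```shell
# Check that the master hostname from the spark:// URL has an /etc/hosts entry;
# if it is missing, a line like "192.168.x.x  h1" (placeholder IP) must be added.
grep -E '(^|[[:space:]])h1([[:space:]]|$)' /etc/hosts \
  || echo "h1 not found in /etc/hosts -- add a line like '<master-ip> h1'"
```

The same check should pass on the driver machine and on every worker, or the master URL will point different nodes at different addresses.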
3. spark java.lang.StackOverflowError
a. Check for infinite loops.
b. Check for deeply nested recursion.
c. Increase the JVM thread stack size, e.g. -Xss10m.
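On a cluster the -Xss setting from (c) has to reach both the driver and the executor JVMs. A sketch with spark-submit (the class name and jar are placeholders; the two option names are standard Spark settings):

```shell
# --driver-java-options applies -Xss10m to the driver JVM;
# spark.executor.extraJavaOptions applies it to each executor JVM.
./bin/spark-submit \
  --class com.example.MyApp \
  --master spark://h1:7077 \
  --driver-java-options "-Xss10m" \
  --conf "spark.executor.extraJavaOptions=-Xss10m" \
  ./myapp.jar
```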
4.WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
Reduce the resources the job requests so they fit what the cluster actually has free, e.g.:
./bin/run-example SparkPi --spark.executor.memory 3000m
5. Cannot change version of project facet Dynamic Web Module to 2.5. See http://blog.csdn.net/steveguoshao/article/details/38414145
6. Exception in thread "main" java.lang.NoSuchFieldError: DEF_CONTENT_CHARSET
Two versions of httpcore are on the classpath; remove one of them from the pom.
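To see which dependencies are pulling in the conflicting httpcore versions before editing the pom (assumes a Maven project):

```shell
# Prints every dependency path that brings in org.apache.httpcomponents:httpcore;
# add an <exclusion> for the unwanted version under the offending dependency in pom.xml.
mvn dependency:tree -Dincludes=org.apache.httpcomponents:httpcore
```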