
Submitting a Spark job: Requesting 1 new executor because tasks are backlogged (new desired total will be 1)

2017-11-03 14:37 · 766 views
For the past two days, every job I submitted to the cluster failed to start. The driver kept printing the warning below, and unless I killed the job manually it would repeat "check your cluster UI to ensure that workers are registered and have sufficient resources" forever. At first I assumed the cluster was short on resources and tried many fixes found online, none of which worked. Although the log kept complaining about waiting for resources (see the log below; the Spark master UI also showed the application submitted but stuck in WAITING), the YARN web UI revealed that YARN had never received the submission at all. That was where the real problem lay.

The cause: the job was submitted the wrong way. My original command pointed at the standalone master:
./spark-submit --class Wordcount --master spark://10.0.10.29:7077 --num-executors 100 --executor-cores 2 --conf spark.default.parallelism=1000 --conf spark.storage.memoryFraction=0.5 /home/zkp/sparktest.jar sparktest/b.txt

The fix: change the submission command to target YARN instead (note the space between yarn-cluster and --num-executors, which was missing in my first attempt):
./spark-submit --class Wordcount --master yarn-cluster --num-executors 100 --executor-cores 2 --conf spark.default.parallelism=1000 --conf spark.storage.memoryFraction=0.5 /home/zkp/sparktest.jar sparktest/b.txt
After that, the job went through.
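For readability, here is the same corrected submission laid out across multiple lines. The jar path, class name, and address are the ones from my setup; also note that on Spark 2.x and later, `--master yarn-cluster` is deprecated in favor of `--master yarn --deploy-mode cluster`:

```shell
# Submit to YARN rather than the standalone master.
# On Spark 2.x+, prefer: --master yarn --deploy-mode cluster
./spark-submit \
  --class Wordcount \
  --master yarn-cluster \
  --num-executors 100 \
  --executor-cores 2 \
  --conf spark.default.parallelism=1000 \
  --conf spark.storage.memoryFraction=0.5 \
  /home/zkp/sparktest.jar sparktest/b.txt
```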

Requesting 1 new executor because tasks are backlogged (new desired total will be 1)

17/11/03 14:35:35 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

17/11/03 14:35:50 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
17/11/03 14:36:05 WARN scheduler.TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
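When this warning loops forever, it is worth confirming whether the submission ever reached YARN at all, e.g. with `yarn application -list` on the cluster, or by sanity-checking the `--master` flag before submitting. Below is a minimal, purely illustrative sketch of the latter; the `check_master` helper is my own invention, and the two command strings are the ones from this post:

```shell
# Illustrative pre-flight check: is --master pointing at YARN or at the
# standalone master? (check_master is a hypothetical helper, not a Spark tool.)
check_master() {
  case "$1" in
    *"--master spark://"*) echo "standalone master: the job will never reach YARN" ;;
    *"--master yarn"*)     echo "YARN: the job should show up in the YARN web UI" ;;
    *)                     echo "no --master flag: falls back to spark-defaults.conf" ;;
  esac
}

check_master "./spark-submit --class Wordcount --master spark://10.0.10.29:7077 ..."
check_master "./spark-submit --class Wordcount --master yarn-cluster ..."
```

With the first command the job only talks to the standalone master at 10.0.10.29:7077, which is exactly why nothing ever appeared in the YARN UI.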

I'm writing this down in the hope that it helps other beginners like me.
Tags: spark, cluster, java