
PySpark: Permission Denied when calling a bash script through pipe()

2017-07-18 15:31
When running PySpark on a CDH YARN cluster and invoking a bash script through pipe(), you may hit the following error:

File "/usr/lib64/python2.7/subprocess.py", line 1234, in _execute_child
    raise child_exception
OSError: [Errno 13] Permission denied
at org.apache.spark.api.python.PythonRunner$$anon$1.read(PythonRDD.scala:166)
at org.apache.spark.api.python.PythonRunner$$anon$1.<init>(PythonRDD.scala:207)
at org.apache.spark.api.python.PythonRunner.compute(PythonRDD.scala:125)
at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:70)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)


Solution:

The first thing to suspect with this error is that the bash script lacks execute permission.
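The mechanism can be reproduced locally with plain `subprocess` (the same module that raises in the traceback above): executing a file that has no execute bit yields `OSError: [Errno 13]`. A minimal sketch; the temporary script file here is illustrative, not from the original post.

```python
import errno
import os
import subprocess
import tempfile

# Create a throwaway script WITHOUT the execute bit set.
fd, path = tempfile.mkstemp(suffix=".sh")
os.write(fd, b"#!/bin/bash\necho hello\n")
os.close(fd)
os.chmod(path, 0o644)  # read/write only, no execute

try:
    subprocess.check_output([path])
except OSError as e:
    # This is the same OSError the Spark executor's Python worker raises.
    print("errno:", e.errno)  # 13 == errno.EACCES
finally:
    os.unlink(path)
```

On the cluster, this exception is thrown inside the executor's Python worker, which is why it surfaces wrapped in the PythonRDD stack trace.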

Add execute permission to the script:

chmod +x xx.sh
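Once the execute bit is added, the same kind of call succeeds. A small sketch of the equivalent fix done from Python (the temporary script name is illustrative):

```python
import os
import stat
import subprocess
import tempfile

fd, path = tempfile.mkstemp(suffix=".sh")
os.write(fd, b"#!/bin/bash\necho ok\n")
os.close(fd)

# Equivalent of `chmod +x xx.sh`: add the execute bit for user/group/other.
mode = os.stat(path).st_mode
os.chmod(path, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)

out = subprocess.check_output([path])
print(out.decode().strip())  # -> ok
os.unlink(path)
```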

Resubmit the Spark job. If the error persists, the script may also need read (or write) permission; in that case set permissions recursively on the script's directory src:

chmod -R 777 src

(777 grants everyone full access; a more restrictive mode such as 755 is usually sufficient.)
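The recursive chmod can also be expressed in Python with `os.walk`, which makes the effect of `chmod -R` explicit. A sketch; `chmod_recursive` is a hypothetical helper name, not part of any library:

```python
import os

def chmod_recursive(root, mode=0o777):
    """Apply `mode` to `root` and everything beneath it (like `chmod -R`)."""
    os.chmod(root, mode)
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            os.chmod(os.path.join(dirpath, name), mode)
```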