Apache Spark: pipe not working in YARN (java.io.IOException: Cannot run program "XXX.py": error=13, Permission denied)

Tags: apache-spark, pipe, yarn

I am new to Spark programming. I am trying to embed an external program (a folder containing a compiled C program plus bash and Python scripts) using the pipe operator. The code looks like this:

sc.addFile("hdfs://afolder",true)
val infile =  sc.textFile("afile.txt").pipe("afolder/abash.sh").take(3)
abash.sh in turn calls the other scripts and programs to process afile.txt.
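(Side note: since error=13 usually means the executor-local copy of the script lacks the execute bit, a minimal workaround sketch is to launch the script through an interpreter explicitly, so no execute permission is needed on the local copy. RDD.pipe also accepts a Seq[String] command form; everything else is unchanged from the snippet above.)

// Minimal sketch, assuming the same file layout as above: invoking the script
// via /bin/bash sidesteps the execute-permission check on the local copy.
val viaBash = sc.textFile("afile.txt")
  .pipe(Seq("/bin/bash", "afolder/abash.sh"))
  .take(3)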

This code works fine in Spark local mode. However, when I try to deploy it in YARN mode (client or cluster), it fails with the following message:

WARN scheduler.TaskSetManager: Lost task 0.0 in stage 1.0 (TID 4, database): java.io.IOException: Cannot run program "afolder/abash.sh": error=13, Permission denied

All of the folder's subdirectories and files are downloaded successfully into the local Spark tmp directory (in my case, /usr/local/hadoop/spark/). After the first failure, I recursively set 777 permissions on the folder in HDFS. Still, I get the same error.
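(To check whether the 777 permissions actually survive localization, here is a hedged diagnostic sketch; it assumes the recursively added folder keeps its afolder/abash.sh layout under the SparkFiles root on each executor:)

import java.io.File
import org.apache.spark.SparkFiles

// Diagnostic sketch: from inside an executor task, report whether the
// localized copy of the script exists and kept its execute bit.
sc.parallelize(Seq(0), 1).map { _ =>
  val f = new File(SparkFiles.get("afolder/abash.sh"))
  s"exists=${f.exists} canExecute=${f.canExecute} path=${f.getAbsolutePath}"
}.collect().foreach(println)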

Is there any way to solve this? Thanks!

Full error output:

> 16/05/18 16:04:09 INFO storage.MemoryStore: Block broadcast_2 stored
   > as values in memory (estimated size 212.1 KB, free 212.1 KB) 16/05/18
   > 16:04:09 INFO storage.MemoryStore: Block broadcast_2_piece0 stored as
   > bytes in memory (estimated size 19.5 KB, free 231.6 KB) 16/05/18
   > 16:04:09 INFO storage.BlockManagerInfo: Added broadcast_2_piece0 in
   > memory on 210.107.197.201:42777 (size: 19.5 KB, free: 511.1 MB)
   > 16/05/18 16:04:09 INFO spark.SparkContext: Created broadcast 2 from
   > textFile at <console>:27 16/05/18 16:04:09 INFO
   > mapred.FileInputFormat: Total input paths to process : 1 16/05/18
   > 16:04:09 INFO spark.SparkContext: Starting job: take at <console>:27
   > 16/05/18 16:04:09 INFO scheduler.DAGScheduler: Got job 1 (take at
   > <console>:27) with 1 output partitions 16/05/18 16:04:09 INFO
   > scheduler.DAGScheduler: Final stage: ResultStage 1 (take at
   > <console>:27) 16/05/18 16:04:09 INFO scheduler.DAGScheduler: Parents
   > of final stage: List() 16/05/18 16:04:09 INFO scheduler.DAGScheduler:
   > Missing parents: List() 16/05/18 16:04:09 INFO scheduler.DAGScheduler:
   > Submitting ResultStage 1 (PipedRDD[5] at pipe at <console>:27), which
   > has no missing parents 16/05/18 16:04:09 INFO storage.MemoryStore:
   > Block broadcast_3 stored as values in memory (estimated size 3.7 KB,
   > free 235.3 KB) 16/05/18 16:04:09 INFO storage.MemoryStore: Block
   > broadcast_3_piece0 stored as bytes in memory (estimated size 2.2 KB,
   > free 237.5 KB) 16/05/18 16:04:09 INFO storage.BlockManagerInfo: Added
   > broadcast_3_piece0 in memory on 210.107.197.201:42777 (size: 2.2 KB,
   > free: 511.1 MB) 16/05/18 16:04:09 INFO spark.SparkContext: Created
   > broadcast 3 from broadcast at DAGScheduler.scala:1006 16/05/18
   > 16:04:09 INFO scheduler.DAGScheduler: Submitting 1 missing tasks from
   > ResultStage 1 (PipedRDD[5] at pipe at <console>:27) 16/05/18 16:04:09
   > INFO cluster.YarnScheduler: Adding task set 1.0 with 1 tasks 16/05/18
   > 16:04:09 INFO scheduler.TaskSetManager: Starting task 0.0 in stage 1.0
   > (TID 4, database, partition 0,NODE_LOCAL, 2603 bytes) 16/05/18
   > 16:04:11 INFO storage.BlockManagerInfo: Added broadcast_3_piece0 in
   > memory on database:51757 (size: 2.2 KB, free: 511.1 MB) 16/05/18
   > 16:04:11 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 1.0
   > (TID 4, database): java.io.IOException: Cannot run program
   > "afolder/abash.sh": error=13, Permission denied
   >         at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
   >         at org.apache.spark.rdd.PipedRDD.compute(PipedRDD.scala:119)
   >         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
   >         at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
   >         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
   >         at org.apache.spark.scheduler.Task.run(Task.scala:89)
   >         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
   >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
   >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
   >         at java.lang.Thread.run(Thread.java:745) Caused by: java.io.IOException: error=13, Permission denied
   >         at java.lang.UNIXProcess.forkAndExec(Native Method)
   >         at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
   >         at java.lang.ProcessImpl.start(ProcessImpl.java:134)
   >         at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
   >         ... 9 more
   > 
   > 16/05/18 16:04:11 INFO scheduler.TaskSetManager: Starting task 0.1 in
   > stage 1.0 (TID 5, database, partition 0,NODE_LOCAL, 2603 bytes)
   > 16/05/18 16:04:12 INFO storage.BlockManagerInfo: Added
   > broadcast_3_piece0 in memory on database:52395 (size: 2.2 KB, free:
   > 511.1 MB) 16/05/18 16:04:12 INFO scheduler.TaskSetManager: Lost task 0.1 in stage 1.0 (TID 5) on executor database:    java.io.IOException (Cannot run program "afolder/abash.sh": error=13,    Permission denied)
   > [duplicate 1] 16/05/18 16:04:12 INFO scheduler.TaskSetManager:
   > Starting task 0.2 in stage 1.0 (TID 6, database, partition
   > 0,NODE_LOCAL, 2603 bytes) 16/05/18 16:04:12 INFO
   > scheduler.TaskSetManager: Lost task 0.2 in stage 1.0 (TID 6) on
   > executor database: java.io.IOException (Cannot run program
   > "afolder/abash.sh": error=13, Permission denied) [duplicate 2]
   > 16/05/18 16:04:12 INFO scheduler.TaskSetManager: Starting task 0.3 in
   > stage 1.0 (TID 7, database, partition 0,NODE_LOCAL, 2603 bytes)
   > 16/05/18 16:04:12 INFO scheduler.TaskSetManager: Lost task 0.3 in
   > stage 1.0 (TID 7) on executor database: java.io.IOException (Cannot
   > run program "afolder/abash.sh": error=13, Permission denied)
   > [duplicate 3] 16/05/18 16:04:12 ERROR scheduler.TaskSetManager: Task 0
   > in stage 1.0 failed 4 times; aborting job 16/05/18 16:04:12 INFO
   > cluster.YarnScheduler: Removed TaskSet 1.0, whose tasks have all
   > completed, from pool 16/05/18 16:04:12 INFO cluster.YarnScheduler:
   > Cancelling stage 1 16/05/18 16:04:12 INFO scheduler.DAGScheduler:
   > ResultStage 1 (take at <console>:27) failed in 2.955 s 16/05/18
   > 16:04:12 INFO scheduler.DAGScheduler: Job 1 failed: take at
   > <console>:27, took 2.963885 s org.apache.spark.SparkException: Job
   > aborted due to stage failure: Task 0 in stage 1.0 failed 4 times, most
   > recent failure: Lost task 0.3 in stage 1.0 (TID 7, database):
   > java.io.IOException: Cannot run program "afolder/abash.sh": error=13,
   > Permission denied
   >         at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
   >         at org.apache.spark.rdd.PipedRDD.compute(PipedRDD.scala:119)
   >         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
   >         at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
   >         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
   >         at org.apache.spark.scheduler.Task.run(Task.scala:89)
   >         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
   >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
   >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
   >         at java.lang.Thread.run(Thread.java:745) Caused by: java.io.IOException: error=13, Permission denied
   >         at java.lang.UNIXProcess.forkAndExec(Native Method)
   >         at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
   >         at java.lang.ProcessImpl.start(ProcessImpl.java:134)
   >         at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
   >         ... 9 more
   > 
   > Driver stacktrace:
   >         at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
   >         at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
   >         at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
   >         at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
   >         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
   >         at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
   >         at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
   >         at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
   >         at scala.Option.foreach(Option.scala:236)
   >         at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
   >         at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
   >         at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
   >         at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
   >         at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
   >         at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
   >         at org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
   >         at org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
   >         at org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
   >         at org.apache.spark.rdd.RDD$$anonfun$take$1.apply(RDD.scala:1328)
   >         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
   >         at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
   >         at org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
   >         at org.apache.spark.rdd.RDD.take(RDD.scala:1302)
   >         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
   >         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:32)
   >         at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:34)
   >         at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:36)
   >         at $iwC$$iwC$$iwC$$iwC.<init>(<console>:38)
   >         at $iwC$$iwC$$iwC.<init>(<console>:40)
   >         at $iwC$$iwC.<init>(<console>:42)
   >         at $iwC.<init>(<console>:44)
   >         at <init>(<console>:46)
   >         at .<init>(<console>:50)
   >         at .<clinit>(<console>)
   >         at .<init>(<console>:7)
   >         at .<clinit>(<console>)
   >         at $print(<console>)
   >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   >         at java.lang.reflect.Method.invoke(Method.java:498)
   >         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
   >         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
   >         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
   >         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
   >         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
   >         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
   >         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
   >         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
   >         at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
   >         at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
   >         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
   >         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
   >         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
   >         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
   >         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
   >         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
   >         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
   >         at org.apache.spark.repl.Main$.main(Main.scala:31)
   >         at org.apache.spark.repl.Main.main(Main.scala)
   >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   >         at java.lang.reflect.Method.invoke(Method.java:498)
   >         at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
   >         at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
   >         at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
   >         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
   >         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) Caused    by: java.io.IOException: Cannot run program "afolder/abash.sh":
   > error=13, Permission denied
   >         at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
   >         at org.apache.spark.rdd.PipedRDD.compute(PipedRDD.scala:119)
   >         at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:306)
   >         at org.apache.spark.rdd.RDD.iterator(RDD.scala:270)
   >         at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
   >         at org.apache.spark.scheduler.Task.run(Task.scala:89)
   >         at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
   >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
   >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
   >         at java.lang.Thread.run(Thread.java:745) Caused by: java.io.IOException: error=13, Permission denied
   >         at java.lang.UNIXProcess.forkAndExec(Native Method)
   >         at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
   >         at java.lang.ProcessImpl.start(ProcessImpl.java:134)
   >         at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
   >         ... 9 more
>16/05/18 16:04:09信息存储。内存存储:块广播存储
>作为内存中的值(估计大小212.1 KB,可用大小212.1 KB)16/05/18
>16:04:09信息存储。内存存储:块广播\u 2\u片段0存储为
>内存中的字节(估计大小19.5 KB,可用大小231.6 KB)16/05/18
>16:04:09信息存储.BlockManagerInfo:在中添加了广播\u 2\u片段0
>210.107.197.201:42777上的内存(大小:19.5 KB,可用空间:511.1 MB)
>16/05/18 16:04:09信息spark.SparkContext:从创建广播2
>文本文件地址:27 16/05/18 16:04:09信息
>mapred.FileInputFormat:进程的总输入路径:1 16/05/18
>16:04:09信息spark.SparkContext:开始工作:在27
>16/05/18 16:04:09信息调度程序。DAG调度程序:获取作业1(在
>:27)带有1个输出分区16/05/18 16:04:09信息
>scheduler.DAGScheduler:最后阶段:结果阶段1(在
>:27)16/05/18 16:04:09信息调度程序。DAG调度程序:家长
>最后阶段的列表()16/05/18 16:04:09信息调度程序。DAGScheduler:
>缺少父项:列表()16/05/18 16:04:09信息计划程序。DAGScheduler:
>提交结果第1阶段(管道处的管道编号[5]:27),其中
>没有丢失的父项2018年5月16日16:04:09信息存储。内存存储:
>块广播_3存储为内存中的值(估计大小为3.7KB,
>免费235.3 KB)16/05/18 16:04:09信息存储。内存存储:块
>广播片段0以字节形式存储在内存中(估计大小为2.2 KB,
>免费237.5 KB)16/05/18 16:04:09信息存储。BlockManager信息:已添加
>在210.107.197.201:42777(大小:2.2 KB)上的内存中广播0,
>免费:511.1 MB)16/05/18 16:04:09信息spark.SparkContext:已创建
>从DAGScheduler广播3。scala:100616/05/18
>16:04:09 INFO scheduler.DAGScheduler:从提交1个缺少的任务
>结果第1阶段(管道处的管道编号[5]:27)16/05/18 16:04:09
>INFO cluster.YarnScheduler:添加任务集1.0和1个任务18年5月16日
>16:04:09 INFO scheduler.TaskSetManager:在阶段1.0中启动任务0.0
>(TID 4,数据库,分区0,节点本地,2603字节)16/05/18
>16:04:11 INFO storage.BlockManagerInfo:在中添加了广播\u 3\u片段0
>数据库内存:51757(大小:2.2KB,可用空间:511.1MB)16/05/18
>16:04:11 WARN scheduler.TaskSetManager:在阶段1.0中丢失了任务0.0
>(TID 4,数据库):java.io.IOException:无法运行程序
>“afolder/abash.sh”:错误=13,权限被拒绝
>位于java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
>位于org.apache.spark.rdd.PipedRDD.compute(PipedRDD.scala:119)
>在org.apache.spark.rdd.rdd.computeOrReadCheckpoint(rdd.scala:306)上
>位于org.apache.spark.rdd.rdd.iterator(rdd.scala:270)
>位于org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
>位于org.apache.spark.scheduler.Task.run(Task.scala:89)
>位于org.apache.spark.executor.executor$TaskRunner.run(executor.scala:214)
>位于java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>位于java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>在java.lang.Thread.run(Thread.java:745)处,由于以下原因导致:java.io.IOException:error=13,权限被拒绝
>位于java.lang.UNIXProcess.forkAndExec(本机方法)
>位于java.lang.UNIXProcess(UNIXProcess.java:248)
>在java.lang.ProcessImpl.start(ProcessImpl.java:134)处
>位于java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
>         ... 9更多
> 
>18年5月16日16:04:11信息计划程序。TaskSetManager:正在启动中的任务0.1
>阶段1.0(TID 5,数据库,分区0,节点\本地,2603字节)
>18年5月16日16:04:12信息存储。BlockManager信息:已添加
>在数据库52395的内存中广播\u 3\u片段0(大小:2.2 KB,免费:
>511.1 MB)16/05/18 16:04:12 INFO scheduler.TaskSetManager:executor数据库上的阶段1.0(TID 5)中丢失了任务0.1:java.io.IOException(无法运行程序“afolder/abash.sh”:错误=13,权限被拒绝)
>[duplicate 1]16/05/18 16:04:12信息计划程序。任务集管理器:
>在阶段1.0中启动任务0.2(TID 6,数据库,分区
>0,节点_本地,2603字节)16/05/18 16:04:12信息
>scheduler.TaskSetManager:在上的阶段1.0(TID 6)中丢失了任务0.2
>执行器数据库:java.io.IOException(无法运行程序
>“afolder/abash.sh”:错误=13,权限被拒绝[重复2]
>18年5月16日16:04:12信息计划程序。TaskSetManager:正在启动中的任务0.3
>阶段1.0(TID 7,数据库,分区0,节点\本地,2603字节)
>18年5月16日16:04:12信息计划程序。任务集管理器:中丢失了任务0.3
>executor数据库上的阶段1.0(TID 7):java.io.IOException(无法
>运行程序“afolder/abash.sh”:错误=13,权限被拒绝)
>[duplicate 3]16/05/18 16:04:12错误计划程序。TaskSetManager:任务0
>第1.0阶段失败4次;