
Hadoop: AccessDeniedException when executing S3NativeFileSystem.mkdir


I'm running a Hadoop job on AWS EMR. The job fails with the following stack trace:

Job setup failed : com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: F732367C69BCEEE0), S3 Extended Request ID: 46bzoHyEim9YSGPt/M9F+OupUo3kuV6BJPdqW9AXkhNR+eLh5443kikWSjCZhzLmpBrgf3XeNus=
at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:1160)
at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:748)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:467)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:302)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3785)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1472)
at com.amazon.ws.emr.hadoop.fs.s3n.Jets3tNativeFileSystemStore.storeEmptyFile(Jets3tNativeFileSystemStore.java:185)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy40.storeEmptyFile(Unknown Source)
at com.amazon.ws.emr.hadoop.fs.s3n.S3NativeFileSystem.mkdir(S3NativeFileSystem.java:1148)
at com.amazon.ws.emr.hadoop.fs.s3n.S3NativeFileSystem.mkdirs(S3NativeFileSystem.java:1130)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1865)
at com.amazon.ws.emr.hadoop.fs.EmrFileSystem.mkdirs(EmrFileSystem.java:402)
at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.setupJob(FileOutputCommitter.java:291)
at org.apache.hadoop.mapreduce.lib.output.DirectFileOutputCommitter.setupJob(DirectFileOutputCommitter.java:62)
at org.apache.hadoop.mapred.FileOutputCommitter.setupJob(FileOutputCommitter.java:132)
at org.apache.hadoop.mapred.DirectFileOutputCommitter.setupJob(DirectFileOutputCommitter.java:30)
at org.apache.hadoop.mapred.OutputCommitter.setupJob(OutputCommitter.java:233)
at org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler$EventProcessor.handleJobSetup(CommitterEventHandler.java:254)
at org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler$EventProcessor.run(CommitterEventHandler.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)

This isn't very helpful, because I can't tell what request it was trying to make to S3. All I see in the call stack is
S3NativeFileSystem.mkdir
. How can I find out the arguments of that method? What directory is it trying to create?
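One way to answer this is to turn on request-level logging and rerun the job, so that every S3 call (including the bucket, key, and HTTP verb of the denied PUT) appears in the task logs. A sketch of a log4j.properties override, assuming the standard logger names used by the AWS SDK for Java and EMR's S3 filesystem classes:

```properties
# Log every request the AWS SDK sends, including the S3 bucket and object key
log4j.logger.com.amazonaws.request=DEBUG

# Log operations inside EMR's S3 filesystem layer (mkdir, storeEmptyFile, ...)
log4j.logger.com.amazon.ws.emr.hadoop.fs=DEBUG
```

With these loggers at DEBUG, the denied request should show up as a PUT with the full object key, which identifies exactly which "directory" marker mkdir was trying to create.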

My guess is that the job is trying to write its output to S3, and is therefore creating the output directory at

org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.setupJob(FileOutputCommitter.java:291)

Does the role have S3 put permission on the bucket?
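This is the most likely cause: on S3 there are no real directories, so mkdir works by PUTting an empty marker object (the storeEmptyFile call in the trace), and that PUT fails with 403 if the cluster's role lacks write access. A minimal sketch of an IAM policy statement that would allow it, using a hypothetical bucket name you would replace with your output bucket:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowJobOutputWrites",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::my-output-bucket",
        "arn:aws:s3:::my-output-bucket/*"
      ]
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket ARN itself while the object actions apply to the `/*` resource; mixing these up is a common cause of 403s that look like a missing put permission.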