Presto S3 Access Denied when inserting data into a KMS-encryption-enforced bucket on AWS
I have a bucket on AWS S3 that enforces KMS encryption on all objects. I am running Presto on emr-5.2.1 and have external tables on S3 (no data yet). When I run
INSERT INTO hive.s3.new_table
SELECT * FROM src_table
I get an Access Denied error.
I have tested several different options and engaged AWS Support, but with no luck.
If I remove the policy from the bucket, Presto works fine, but the files it creates on S3 are not encrypted.
Presto has no problem reading encrypted external S3 tables, or creating those tables locally on HDFS. I cannot allow unencrypted data.
Example policy:
{
  "Version": "2012-10-17",
  "Id": "PutObjPolicy",
  "Statement": [
    {
      "Sid": "DenyUnEncryptedObjectUploads",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::YourBucket/*",
      "Condition": {
        "StringNotEquals": {
          "s3:x-amz-server-side-encryption": "aws:kms"
        }
      }
    }
  ]
}
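The effect of this policy can be illustrated with a minimal sketch (a hypothetical helper, not the real S3 policy evaluator): a PutObject request is denied unless it carries the `x-amz-server-side-encryption: aws:kms` header, which is exactly why an upload client that never sets that header gets a 403 AccessDenied.

```python
# Minimal simulation of the "DenyUnEncryptedObjectUploads" policy above:
# S3 denies any PutObject whose "x-amz-server-side-encryption" header
# is not "aws:kms". Illustrative sketch only, not AWS's policy engine.

def put_object_allowed(headers: dict) -> bool:
    # The StringNotEquals condition on s3:x-amz-server-side-encryption
    # triggers the Deny for any value other than "aws:kms".
    return headers.get("x-amz-server-side-encryption") == "aws:kms"

# A plain upload (no SSE header) is denied -> 403 AccessDenied
print(put_object_allowed({}))                                           # False
# An SSE-S3 (AES256) upload is also denied by this particular policy
print(put_object_allowed({"x-amz-server-side-encryption": "AES256"}))   # False
# Only SSE-KMS uploads pass
print(put_object_allowed({"x-amz-server-side-encryption": "aws:kms"}))  # True
```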
Presto config (/etc/presto/conf/catalog/hive.properties):
hive.s3.ssl.enabled=true
hive.s3.use-instance-credentials=true
hive.s3.sse.enabled=true
hive.s3.kms-key-id=long_key_id_here
Am I missing something in the configuration, or is Presto simply not using KMS when inserting into the table?
According to Amazon: "All GET and PUT requests for an object protected by AWS KMS will fail if they are not made via SSL or by using SigV4." Presto now supports SSE-KMS via the hive.s3.sse.kms-key-id Hive connector configuration property.

Can you share the CREATE TABLE command for this table? I am specifically interested in the S3 location: are you using s3, s3a, or s3n?

I got a reply from support: "Can I use SSE-KMS with Presto? Unfortunately, no. Presto currently supports SSE-S3 (AES256) or client-side encryption (CSE-KMS). EMR supports SSE-KMS for all applications that use the EMRFS filesystem, such as Hive, Spark, MR, etc. Unfortunately, Presto uses the PrestoFileSystem. That is the reason; any change/improvement would need to be added to Presto directly."
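Given that answer, the encryption options actually available to Presto on emr-5.2.1 appear to be SSE-S3 or client-side encryption (CSE-KMS). A sketch of a CSE-KMS hive.properties, under those assumptions (the key id is a placeholder; note that CSE-KMS uploads do not carry the x-amz-server-side-encryption header, so the Deny policy above would still need to be relaxed for them to succeed):

```properties
hive.s3.ssl.enabled=true
hive.s3.use-instance-credentials=true
# Client-side encryption with a KMS-managed key (CSE-KMS).
# Placeholder key id; do not combine with hive.s3.sse.enabled=true.
hive.s3.kms-key-id=your_kms_key_id_here
```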
AWS Support has filed a ticket for this.
Error:
com.facebook.presto.spi.PrestoException: Error committing write to Hive
at com.facebook.presto.hive.HiveRecordWriter.commit(HiveRecordWriter.java:132)
at com.facebook.presto.hive.HiveWriter.commit(HiveWriter.java:49)
at com.facebook.presto.hive.HivePageSink.doFinish(HivePageSink.java:152)
at com.facebook.presto.hive.authentication.NoHdfsAuthentication.doAs(NoHdfsAuthentication.java:23)
at com.facebook.presto.hive.HdfsEnvironment.doAs(HdfsEnvironment.java:76)
at com.facebook.presto.hive.HivePageSink.finish(HivePageSink.java:144)
at com.facebook.presto.spi.classloader.ClassLoaderSafeConnectorPageSink.finish(ClassLoaderSafeConnectorPageSink.java:49)
at com.facebook.presto.operator.TableWriterOperator.finish(TableWriterOperator.java:156)
at com.facebook.presto.operator.Driver.processInternal(Driver.java:394)
at com.facebook.presto.operator.Driver.processFor(Driver.java:301)
at com.facebook.presto.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:622)
at com.facebook.presto.execution.TaskExecutor$PrioritizedSplitRunner.process(TaskExecutor.java:534)
at com.facebook.presto.execution.TaskExecutor$Runner.run(TaskExecutor.java:670)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: xxxxxx), S3 Extended Request ID: xxxxxxxxxxxxxx+xxx=
at com.facebook.presto.hive.PrestoS3FileSystem$PrestoS3OutputStream.uploadObject(PrestoS3FileSystem.java:1003)
at com.facebook.presto.hive.PrestoS3FileSystem$PrestoS3OutputStream.close(PrestoS3FileSystem.java:967)
at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:74)
at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:108)
at org.apache.hadoop.hive.ql.io.orc.WriterImpl.close(WriterImpl.java:2429)
at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.close(OrcOutputFormat.java:106)
at com.facebook.presto.hive.HiveRecordWriter.commit(HiveRecordWriter.java:129)
... 15 more
Caused by: com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: xxxxxxx)
at com.amazonaws.http.AmazonHttpClient.handleErrorResponse(AmazonHttpClient.java:1387)
at com.amazonaws.http.AmazonHttpClient.executeOneRequest(AmazonHttpClient.java:940)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:715)
at com.amazonaws.http.AmazonHttpClient.doExecute(AmazonHttpClient.java:466)
at com.amazonaws.http.AmazonHttpClient.executeWithTimer(AmazonHttpClient.java:427)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:376)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4039)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1583)
at com.amazonaws.services.s3.AmazonS3EncryptionClient.access$101(AmazonS3EncryptionClient.java:80)
at com.amazonaws.services.s3.AmazonS3EncryptionClient$S3DirectImpl.putObject(AmazonS3EncryptionClient.java:603)
at com.amazonaws.services.s3.internal.crypto.S3CryptoModuleBase.putObjectUsingMetadata(S3CryptoModuleBase.java:175)
at com.amazonaws.services.s3.internal.crypto.S3CryptoModuleBase.putObjectSecurely(S3CryptoModuleBase.java:161)
at com.amazonaws.services.s3.internal.crypto.CryptoModuleDispatcher.putObjectSecurely(CryptoModuleDispatcher.java:108)
at com.amazonaws.services.s3.AmazonS3EncryptionClient.putObject(AmazonS3EncryptionClient.java:483)
at com.amazonaws.services.s3.transfer.internal.UploadCallable.uploadInOneChunk(UploadCallable.java:131)
at com.amazonaws.services.s3.transfer.internal.UploadCallable.call(UploadCallable.java:123)
at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:139)
at com.amazonaws.services.s3.transfer.internal.UploadMonitor.call(UploadMonitor.java:47)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
... 3 more