
Problem uploading files from a local directory to AWS S3 using Python 2.7 and boto 2

Tags: python-2.7, amazon-web-services, file-upload, amazon-s3, boto

I am doing a simple operation: I download gzip files from an S3 bucket to a local directory, extract them into another local directory, and then re-upload them to the same S3 bucket under an archive folder path. While doing this, I want to make sure the set of files I process (f_name in the code below) is the same set I originally downloaded from the S3 bucket. Right now the code below does not upload them back to S3, which is where I am stuck; the download from S3 and the extraction into the local directory both work. Can you help me understand what is wrong with the upload-file function?

from boto.s3.connection import S3Connection
from boto.s3.key import *
import os
import os.path
import re

aws_bucket = "event-logs-dev"  ## S3 bucket name
local_download_directory = "/Users/TargetData/Download/test_queue1/"  ## local directory to download the gzip files from S3.
Target_directory_to_extract = "/Users/TargetData/unzip"  ## local directory to gunzip the downloaded files.
Target_s3_path_to_upload = "event-logs-dev/data/clean/xact/logs/archive/"  ## S3 bucket path to upload the files.

def decompressAllFilesFromNetfiler(self, aws_bucket, local_download_directory, Target_directory_to_extract, Target_s3_path_to_upload):
    zipFiles = [f for f in os.listdir(local_download_directory) if re.match(r'.*\.tar\.gz', f)]
    for f_name in zipFiles:
        if os.path.exists(Target_directory_to_extract + "/" + f_name[:-len('.tar.gz')]) and os.access(Target_directory_to_extract + "/" + f_name[:-len('.tar.gz')], os.R_OK):
            print ('File {} already exists!'.format(f_name))
        else:
            f_name_with_path = os.path.join(local_download_directory, f_name)
            os.system('mkdir -p {} && tar vxzf {} -C {}'.format(Target_directory_to_extract, f_name_with_path, Target_directory_to_extract))
            print ('Extracted file {}'.format(f_name))
            self._uploadFile(aws_bucket, f_name, Target_s3_path_to_upload, Target_directory_to_extract)

def _uploadFile(self, aws_bucket, f_name, Target_s3_path_to_upload, Target_directory_to_extract):
    full_key_name = os.path.expanduser(os.path.join(Target_s3_path_to_upload, f_name))
    path = os.path.expanduser(os.path.join(Target_directory_to_extract, f_name))
    try:
        print "Uploaded extracted file to: %s" % (full_key_name)
        key = aws_bucket.new_key(full_key_name)
        key.set_contents_from_filename(path)
    except:
        if full_key_name is None:
            print "Error uploading"

Currently the output prints
Uploaded extracted file to: event-logs-dev/data/clean/xact/logs/archive/1442235602129200000.tar.gz
but nothing is uploaded to the S3 bucket. Your help is greatly appreciated!! Thanks in advance.

It looks like you have cut and pasted part of your code, and probably lost the formatting along the way, since the code above does not work as pasted. I have taken the liberty of converting it to (mostly) PEP8, but it is still missing some of the code that creates the S3 objects. Since you import the modules, I assume you have that part of the code and just did not paste it.

Here is a cleaned-up version of the code, formatted correctly. I have also added an exception handler to the try: block to print out the error you get. You should update the exception handling to be more specific to the exceptions raised by new_key or set_contents..., but the general exception will get you started. If nothing else this is more readable, but you should also include your S3 connection code, with anything specific to your account (keys, secrets, etc.) removed.
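Since that connection code was not posted, here is a minimal sketch of what the boto 2 boilerplate usually looks like (an assumption on my part: it presumes credentials are picked up from the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables or a ~/.boto config file):

from boto.s3.connection import S3Connection

# Open the connection; with no arguments boto 2 reads credentials from
# the environment or from ~/.boto.
conn = S3Connection()

# Resolve the bucket *name* into a Bucket object; new_key() must be
# called on this object, not on the name string.
bucket = conn.get_bucket('event-logs-dev')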


Thank you so much for taking the time to correct this code; I really appreciate your help. Yes, I already have the code that connects to the S3 bucket. I ran the code above and got the output below. So with this code the file extraction works fine, but it does not upload the files back to S3, and it raises an error. Please advise on next steps:

Extracted file 1440190802314590000.tar.gz
Error : 'unicode' object has no attribute 'new_key'
x 144019080590001/V.data
Extracted file 1440190802314590001.tar.gz
Error : 'unicode' object has no attribute 'new_key'

Unless aws_bucket is a Bucket object, that call will not work.
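That error is consistent with aws_bucket still being the bucket name (a unicode string) when .new_key is called on it. A short sketch of the fix, which the cleaned-up listing below applies inside _uploadFile:

# aws_bucket here is the *name* 'event-logs-dev'; resolve it first.
bucket = S3Connection().get_bucket(aws_bucket)
key = bucket.new_key(full_key_name)
key.set_contents_from_filename(path)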
#!/usr/bin/env python
"""
do some download
some extract
and some upload
"""

from boto.s3.connection import S3Connection
from boto.s3.key import *
import os
import os.path
import re


aws_bucket = 'event-logs-dev'
local_download_directory = '/Users/TargetData/Download/test_queue1/'
Target_directory_to_extract = '/Users/TargetData/unzip'
Target_s3_path_to_upload = 'event-logs-dev/data/clean/xact/logs/archive/'

'''
MUST BE SOME MAGIC HERE TO GET  AN S3 CONNECTION ???
aws_bucket IS NOT A BUCKET OBJECT ...
'''


def decompressAllFilesFromNetfiler(self,
                                   aws_bucket,
                                   local_download_directory,
                                   Target_directory_to_extract,
                                   Target_s3_path_to_upload):
    '''
    decompress stuff
    '''
    zipFiles = [f for f in os.listdir(
        local_download_directory) if re.match(r'.*\.tar\.gz', f)]
    for f_name in zipFiles:
        # Directory the archive extracts to (the name minus '.tar.gz').
        extracted_path = "{}/{}".format(Target_directory_to_extract,
                                        f_name[:-len('.tar.gz')])
        if os.path.exists(extracted_path) and os.access(extracted_path, os.R_OK):
            print ('File {} already exists!'.format(f_name))
        else:
            f_name_with_path = os.path.join(local_download_directory, f_name)
            os.system('mkdir -p {} && tar vxzf {} -C {}'.format(
                Target_directory_to_extract,
                f_name_with_path,
                Target_directory_to_extract))
            print ('Extracted file {}'.format(f_name))
            self._uploadFile(aws_bucket,
                             f_name,
                             Target_s3_path_to_upload,
                             Target_directory_to_extract)


def _uploadFile(self,
                aws_bucket,
                f_name,
                Target_s3_path_to_upload,
                Target_directory_to_extract):
    full_key_name = os.path.expanduser(os.path.join(Target_s3_path_to_upload,
                                                    f_name))
    path = os.path.expanduser(os.path.join(Target_directory_to_extract, f_name))
    try:
        S3CONN = S3Connection()
        BUCKET = S3CONN.get_bucket(aws_bucket)
        key = BUCKET.new_key(full_key_name)
        key.set_contents_from_filename(path)
        print "Uploaded extracted file to: {}".format(full_key_name)
    except Exception as UploadERR:
        if full_key_name is None:
            print 'Error uploading'
        else:
            print "Error : {}".format(UploadERR)