
Python: how do I prevent this error in a moto test?


I am trying to write a test that verifies that register_extracts_by_location can read from an S3 bucket and pick up the files. When I write a moto-mocked test, I get an error saying the bucket does not exist.

Here is the register_extracts_by_location method:

class ProcessTracker:
    # ... other methods and init here.

    def register_extracts_by_location(self, location_path, location_name=None):
        """
        For a given location, find all files and attempt to register them.
        :param location_name: Name of the location
        :param location_path: Path of the location
        :return:
        """
        location = LocationTracker(location_path=location_path, location_name=location_name)

        if location.location_type.location_type_name == "s3":
            s3 = boto3.resource("s3")

            path = location.location_path

            if path.startswith("s3://"):
                path = path[len("s3://"):]  # slice off the scheme; without the colon this indexes a single character

            bucket = s3.Bucket(path)

            for file in bucket.objects.all():
                ExtractTracker(process_run=self
                               , filename=file
                               , location=location
                               , status='ready')
        else:
            for file in os.listdir(location_path):
                ExtractTracker(process_run=self
                               , filename=file
                               , location=location
                               , status='ready')
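One detail worth isolating is the scheme handling: the `s3://` prefix has to be removed with a slice (`[len(prefix):]`), not a single index (`[len(prefix)]`), which would return one character and hand `s3.Bucket()` the name `"t"`. A standalone sketch of the intended behavior (the helper name is mine, not part of the original code):

```python
def strip_s3_scheme(path):
    """Return the path with a leading "s3://" scheme removed."""
    prefix = "s3://"
    if path.startswith(prefix):
        return path[len(prefix):]  # slice from the end of the prefix onward
    return path

print(strip_s3_scheme("s3://test_bucket"))  # test_bucket
print(strip_s3_scheme("plain/dir"))         # plain/dir
```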
The relevant part of the test looks like this:

    def test_register_extracts_by_location_s3(self):
        """
        Testing that when the location is s3, all the extracts are registered and set to 'ready' status.
        The process/extract relationship should also be set to 'ready' since that is the last status the process set
        the extracts to.
        :return:
        """
        process_status = aliased(ExtractStatus)
        extract_status = aliased(ExtractStatus)

        expected_keys = 'test_local_dir_1.csv', 'test_local_dir_2.csv'

        with moto.mock_s3():
            conn = boto3.resource('s3', region_name='us-east-1')
            conn.create_bucket(Bucket='test_bucket')

            for file in expected_keys:
                conn.Object('test_bucket', file)

            self.process_tracker.register_extracts_by_location(location_path='s3://test_bucket')

It looks like boto3 is still making a real connection, but at this point I am not sure. The error received is:

botocore.errorfactory.NoSuchBucket: An error occurred (NoSuchBucket) when calling the ListObjects operation: The specified bucket does not exist


I was able to solve this by creating a mocked S3 bucket and building on it further in the test. Here is the finished test, which I believe works as intended:

    def test_register_extracts_by_location_s3(self):
        """
        Testing that when the location is s3, all the extracts are registered and set to 'ready' status.
        The process/extract relationship should also be set to 'ready' since that is the last status the process set
        the extracts to.
        :return:
        """
        process_status = aliased(ExtractStatus)
        extract_status = aliased(ExtractStatus)
        test_bucket = "test_bucket"

        expected_keys = ["test_local_dir_1.csv", "test_local_dir_2.csv"]

        client = boto3.client(
            "s3",
            region_name="us-east-1",
            aws_access_key_id="fake_access_key",
            aws_secret_access_key="fake_secret_key",
        )
        try:
            s3 = boto3.resource(
                "s3",
                region_name="us-east-1",
                aws_access_key_id="fake_access_key",
                aws_secret_access_key="fake_secret_key",
            )

            s3.meta.client.head_bucket(Bucket=test_bucket)
        except botocore.exceptions.ClientError:
            pass
        else:
            err = "%s should not exist" % test_bucket
            raise EnvironmentError(err)

        client.create_bucket(Bucket=test_bucket)

        current_dir = os.path.dirname(__file__)
        fixtures_dir = os.path.join(current_dir, "fixtures")

        for file in expected_keys:
            key = os.path.join(test_bucket, file)
            file = os.path.join(fixtures_dir, file)
            client.upload_file(Filename=file, Bucket=test_bucket, Key=key)

        self.process_tracker.register_extracts_by_location(
            location_path="s3://test_bucket"
        )

        extracts = (
            self.session.query(
                Extract.extract_filename,
                extract_status.extract_status_name,
                process_status.extract_status_name,
            )
            .join(
                ExtractProcess, Extract.extract_id == ExtractProcess.extract_tracking_id
            )
            .join(
                extract_status,
                Extract.extract_status_id == extract_status.extract_status_id,
            )
            .join(
                process_status,
                ExtractProcess.extract_process_status_id
                == process_status.extract_status_id,
            )
            .filter(
                ExtractProcess.process_tracking_id
                == self.process_tracker.process_tracking_run.process_tracking_id
            )
        )

        given_result = list()

        for extract in extracts:
            given_result.append(
                [
                    extract.extract_filename,
                    extract.extract_status_name,
                    extract.extract_status_name,
                ]
            )

        expected_result = [
            ["test_bucket/test_local_dir_1.csv", "ready", "ready"],
            ["test_bucket/test_local_dir_2.csv", "ready", "ready"],
        ]

        self.assertCountEqual(expected_result, given_result)
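Note why the filenames in `expected_result` carry the `test_bucket/` prefix: the test's upload loop joins the bucket name into each object key, so every object lands under that prefix inside the (identically named) bucket. A minimal illustration (on POSIX systems `os.path.join` uses `/`, matching S3 key separators):

```python
import os

test_bucket = "test_bucket"
expected_keys = ["test_local_dir_1.csv", "test_local_dir_2.csv"]

# The upload loop builds each object key by joining the bucket name and the
# filename, so the objects are stored under a "test_bucket/" key prefix.
keys = [os.path.join(test_bucket, f) for f in expected_keys]
print(keys)
```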

Were you able to solve this? I am facing the same problem! — Yes, I just posted the sample code with some explanation. What I came to realize is that I had to give the mocked S3 bucket exactly the same name as the real S3 bucket on AWS.
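The naming point in that comment follows from how register_extracts_by_location resolves the bucket: it takes the name straight from the location path, so the bucket created under the moto mock has to match that derived name exactly. A minimal sketch of the derivation:

```python
# register_extracts_by_location derives the bucket name from the location
# path, so create_bucket() inside the moto mock must use this exact name.
location_path = "s3://test_bucket"
bucket_name = location_path[len("s3://"):]
print(bucket_name)  # test_bucket
```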