Java: memory footprint of downloading and unzipping a file from S3 - Fatal编程技术网

Java: memory footprint of downloading and unzipping a file from S3


Suppose I have a compressed file stored in S3 that I want to read from. I'm curious about the memory footprint of the following code:

public BufferedReader getFile(String bucketName, String key) throws IOException {
    S3Object zippedFile = s3Client.getObject(new GetObjectRequest(bucketName, key));
    GZIPInputStream gzp = new GZIPInputStream(zippedFile.getObjectContent());
    Reader isr = new InputStreamReader(gzp, StandardCharsets.UTF_8);
    return new BufferedReader(isr);
}


BufferedReader zippedFileStream = getFile(bucketName, key);
//do some processing on zippedFileStream
zippedFileStream.close();

I'm somewhat unfamiliar with how file pointers/streams work in Java. Given that the file I want to decompress is large (on the order of a gigabyte), how soon can I start processing it? What I'd like is to avoid holding large amounts of data in memory as much as possible. Thanks for any help.
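The behavior in question can be demonstrated without S3 at all: `GZIPInputStream` decompresses incrementally as `read()` is called, so only the stream's internal buffers (a few KB each) stay resident, not the whole file. Below is a self-contained sketch of that streaming pipeline; the `ByteArrayInputStream` stands in for `zippedFile.getObjectContent()`, and the class and method names (`GzipStreamDemo`, `gzip`, `countLines`) are illustrative, not part of any library.

```java
import java.io.*;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipStreamDemo {
    // Compress a string to gzip bytes (stands in for the zipped object in S3).
    public static byte[] gzip(String text) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (Writer w = new OutputStreamWriter(new GZIPOutputStream(bos), StandardCharsets.UTF_8)) {
            w.write(text);
        }
        return bos.toByteArray();
    }

    // Read the gzip stream line by line. Decompression happens lazily inside
    // readLine(), so memory usage stays bounded by the buffer sizes regardless
    // of how large the underlying file is.
    public static int countLines(InputStream in) throws IOException {
        int count = 0;
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(new GZIPInputStream(in), StandardCharsets.UTF_8))) {
            while (reader.readLine() != null) {
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = gzip("line one\nline two\nline three\n");
        System.out.println(countLines(new ByteArrayInputStream(payload))); // prints 3
    }
}
```

With a real S3 object the same holds: processing can begin as soon as the first bytes arrive over the network, since nothing in the chain reads ahead of demand. The try-with-resources block also closes the underlying stream, which matters for S3 connections.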