
Node.js RxJS: encrypt large files and send them to AWS S3


I'm fairly new to RxJS and its learning curve is rather steep. I'm streaming files over gRPC, and I want to encrypt them with AES and then store them in an S3 bucket. At the moment I keep the buffers in memory, which will quickly become a problem once I start uploading large files.

I'd like to know how to use RxJS streams to pipe the data through encryption and into the S3 bucket (in other words, I don't want to hold the whole file in memory).

Is this possible, or am I misunderstanding something?

This is my current implementation:

  @GrpcStreamMethod()
  upload(data$: Observable<FileUploadRequest>): Observable<FileUploadResponse> {
    let fileBuffer: Buffer;
    let metaBuffer: Buffer;
    let storageBucket: string;
    let publicKey: string;
    let publicKeyHash: string;

    const response$ = new Subject<FileUploadResponse>();
    let headersDeliverd = false;

    const symetricKeys = this.service.generateEncryptionKeys(); //returns key + iv
    const fileCipher = this.service.getCypher(symetricKeys); //returns cipher
    const metaCipher = this.service.getCypher(symetricKeys); //returns cipher

    data$.subscribe({
      next: (data: FileUploadRequest) => {
        const content = data?.file?.content;
        const meta: Meta = data?.metadata;

        // Metadata has been delivered
        if (meta && !headersDeliverd) {
          headersDeliverd = true;

          publicKey = meta.owner;
          publicKeyHash = this.service.createHash(meta.owner);
          storageBucket = join(FILE_STORE, publicKeyHash);
          const metaBuff = this.service.metaToBuffer(meta);
          metaBuffer = metaCipher.update(metaBuff);
        }

        if (content && !headersDeliverd) {
          console.log('SOME ERROR');
        }

        if (content && headersDeliverd) {
          fileBuffer = fileCipher.update(content);
        }
      },
      complete: async () => {
        const encryptedSymetricKey = await this.service.encryptSymetricKey(
          symetricKeys,
          publicKey,
        );

        fileBuffer = Buffer.concat([fileBuffer, fileCipher.final()]);
        metaBuffer = Buffer.concat([metaBuffer, metaCipher.final()]);

        const documentId = uuidv4();
        const fileName = `${documentId}.file`;
        const metaName = `${documentId}.meta`;
        const keys = `${documentId}.access`;

        await this.service.uploadToAws(
          metaBuffer,
          `${publicKeyHash}/${metaName}`,
        );
        await this.service.uploadToAws(
          fileBuffer,
          `${publicKeyHash}/${fileName}`,
        );
        await this.service.uploadToAws(
          JSON.stringify({
            [publicKeyHash]: encryptedSymetricKey,
          }),
          `${publicKeyHash}/${keys}`,
        );

        response$.next({
          status: Status.SUCCESS,
          path: storageBucket,
        });
        return response$.complete();
      },
    });

    return response$.asObservable();
  }

The solution I came up with is to create a stream and push the data onto it:

// At the top of the module:
import { PassThrough } from 'stream';
import { PutObjectCommand } from '@aws-sdk/client-s3';

uploadToAws(key) {
    const pass = new PassThrough();

    return {
      writeStream: pass,
      promise: this.s3.send(
        new PutObjectCommand({
          Bucket: '...',
          Key: key,
          Body: pass,
          ServerSideEncryption: '...',
          ContentLength: 37, // PutObjectCommand errors without a ContentLength when the body is a stream
        }),
      ),
    };
}
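
As an aside, the ContentLength requirement is awkward here because the ciphertext length is only known once the stream has been consumed. A minimal alternative sketch, assuming the Upload helper from @aws-sdk/lib-storage is available and this.s3 is an S3Client (the bucket name and the uploadToAwsStreaming name are placeholders, not part of the original code); it performs a multipart upload and does not need the length up front:

import { PassThrough } from 'stream';
import { Upload } from '@aws-sdk/lib-storage';

// Sketch: stream a body of unknown length to S3 via the multipart Upload helper.
uploadToAwsStreaming(key: string) {
    const pass = new PassThrough();

    const upload = new Upload({
      client: this.s3, // assumed to be an S3Client instance
      params: {
        Bucket: '...', // placeholder, as in the snippet above
        Key: key,
        Body: pass,    // no ContentLength required here
      },
    });

    return { writeStream: pass, promise: upload.done() };
}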

import { Readable } from 'stream';

const stream = new Readable({
  read() {
    // no-op: data is pushed in from the gRPC observable below
  },
});

data$.subscribe({
    next: (data) => stream.push(data),
    error: (error) => stream.destroy(error),
    complete: () => stream.push(null), // signal end-of-stream so downstream transforms can flush
});

stream.pipe(encryptFunc).pipe(uploadToAws(key).writeStream); // encryptFunc must be a Transform stream, e.g. a cipher

Why not use Node.js streams, which work very well with crypto? For example, the return value of crypto.createCipheriv() is a stream.Transform. Basically, you can pipe the file directly in, through the cipher, and out to S3. RxJS sounds like an obstacle here.

You're right that it's an obstacle. The only real reason I'm using RxJS here is that the framework I'm using relies on RxJS to handle gRPC streams. Can you suggest a way to get rid of RxJS for this?

I've worked out a solution and will post it here once I'm done. Essentially I just create a readable stream, push the chunks onto it, and then pipe it through encryption to AWS. I implemented it like this:
next: (data: FileUploadRequest) => readableStream.push(data)
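
For clarity, here is a minimal end-to-end sketch of that approach, under the same assumptions as the snippets above: data$ is the gRPC observable of FileUploadRequest chunks, uploadToAws returns the { writeStream, promise } pair shown earlier, and publicKeyHash, fileName, response$, storageBucket and Status come from the original implementation. crypto.createCipheriv with a hard-coded aes-256-cbc key/iv stands in for this.service.getCypher, since, as noted in the comments, its return value is a stream.Transform:

import { createCipheriv, randomBytes } from 'crypto';
import { Readable } from 'stream';

// Stand-in for this.service.getCypher(); any cipher from createCipheriv is a Transform stream.
const cipher = createCipheriv('aes-256-cbc', randomBytes(32), randomBytes(16));

// Bridge the RxJS observable into a Node readable stream.
const readableStream = new Readable({ read() {} });

data$.subscribe({
  next: (data: FileUploadRequest) => {
    const content = data?.file?.content;
    if (content) readableStream.push(content);
  },
  error: (err) => readableStream.destroy(err),
  complete: () => readableStream.push(null), // end the stream so the cipher can flush
});

// plaintext chunks -> cipher -> PassThrough consumed by the S3 upload
const { writeStream, promise } = uploadToAws(`${publicKeyHash}/${fileName}`);
readableStream.pipe(cipher).pipe(writeStream);

// Report back on the gRPC response stream once the upload settles.
promise
  .then(() => {
    response$.next({ status: Status.SUCCESS, path: storageBucket });
    response$.complete();
  })
  .catch((err) => response$.error(err));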