
Node.js npm streaming-s3: streaming an image from request.get()


I'm testing streaming-s3 and trying to get the basic tutorial working, uploading via request.

At the moment it just stalls: it never starts, never crashes, and nothing happens.

My code:

var request = require('request');
var StreamingS3 = require('streaming-s3');

var elem = list.shift();
console.log(elem._id + " " + elem.main_img); // elem contains exactly what it should
var rStream = request.get(elem.main_img);
var uploader = new StreamingS3(rStream,
  { accessKeyId: 'XXXXXX', secretAccessKey: 'XXXXXXX', region: 'eu-west-1' },
  {
    Bucket: 'xxxxxx-myimages',
    Key: elem._id,
    ContentType: 'image/jpeg',
    ACL: 'public-read'
  },
  {
    concurrentParts: 2,
    waitTime: 10000,
    retries: 1,
    maxPartSize: 10 * 1024 * 1024
  }
);

  uploader.begin(); // important if callback not provided.

  uploader.on('data', function (bytesRead) {
    console.log(bytesRead, ' bytes read.');
  }).on('part', function (number) {
    console.log('Part ', number, ' uploaded.');
  }).on('uploaded', function (stats) {
    console.log('Upload stats: ', stats);
  }).on('finished', function (resp, stats) {
    console.log('Upload finished: ', resp);
  }).on('error', function (e) {
    console.log('Upload error: ', e);
  });

It's basically a fill-in-the-blanks copy-paste... What could make it neither work nor show any output? I checked the request and it returns a 200 status code. Console.log-ing the uploader shows everything has been filled in correctly...

Are you sure everything in the list is set? Here is my console output:

Here is how I modified your file: basically I added a static element and changed the access information to come from the shell:

var list = [{main_img: 'http://cdn.sstatic.net/stackoverflow/img/sprites.png?v=3c6263c3453b', _id: '123'}];

var request = require('request');
var StreamingS3 = require('streaming-s3');
var elem = list.shift();
console.log(elem._id+" "+elem.main_img); //elem contains exactly what it should
var rStream = request.get(elem.main_img);
var uploader = new StreamingS3(rStream,
    { accessKeyId: process.env.S3_ACCESS_KEY, secretAccessKey: process.env.S3_SECRET, region: process.env.S3_REGION },
    {
        Bucket: process.env.S3_BUCKET,
        Key: elem._id,
        ContentType: 'image/jpeg',
        ACL:'public-read'
    },
    {
        concurrentParts: 2,
        waitTime: 10000,
        retries: 1,
        maxPartSize: 10*1024*1024,
    }
);

uploader.begin(); // important if callback not provided.

uploader.on('data', function (bytesRead) {
    console.log(bytesRead, ' bytes read.');
}).on('part', function (number) {
    console.log('Part ', number, ' uploaded.');
}).on('uploaded', function (stats) {
    console.log('Upload stats: ', stats);
}).on('finished', function (resp, stats) {
    console.log('Upload finished: ', resp);
}).on('error', function (e) {
    console.log('Upload error: ', e);
});

Are you on Windows or some other system?

No, Unix... Strangely, I ran into a similar problem with the Knox S3 library, and both problems disappeared when I used the http module instead of request... streaming http.get to S3 instead of request.get.

What version of request is it?

Hmm, I got the latest from npm. Does my modified program work for you? Maybe try it, then change it back line by line to see where it breaks. Set the environment variables first, like:

S3_BUCKET=mybucket S3_ACCESS_KEY=xxxxxxx S3_SECRET=xxxxxxx S3_REGION=us-east-1 node script.js