
Javascript NodeJs csv module - pause and resume


I am using the nodeJs csv module to parse a large csv file of 10,000 records, and for each record, i.e. in .on('record', func), I have to run some time-consuming logic. But the parser does not wait for that logic to finish before emitting the next record. How do I handle this? The documentation mentions pause() and resume(), but where do I use them?

    var csv = require('csv');
    var fs = require('fs');
    csv()
    .from.stream(fs.createReadStream(__dirname+'/sample.in'))
    .to.path(__dirname+'/sample.out')
    .transform( function(row){
      row.unshift(row.pop());
      return row;
    })
    .on('record', function(row,index){
              //Time consuming logic goes here
    })
    .on('end', function(count){
      console.log('Number of lines: '+count);
    })
    .on('error', function(error){
      console.log(error.message);
    });

I finally got it working. Thanks to the node-csv developers.

var csv = require('csv');
var fs = require('fs');

// Read through the csv file one row at a time
csv()
    .from(__dirname+'/data.csv', { columns: true })
    .transform(function(row, index, next){
         setTimeout(function(){
             console.log(row);
             next();
         }, 3000); // replace setTimeout with any async fn and call next() in its callback
    }, {parallel: 1}) // parallel: 1 ensures rows are transformed one at a time
    .on('end', function (count) {
      var finalcount = count;
      // errorcount, errorArray, successcount and config are defined elsewhere in the full script
      var logtext = 'SUMMARY: Total count: ' + finalcount + '\n'
                    + 'Error count: ' + errorcount + ' (' + errorArray + ')'
                    + '\n' + 'Success count: ' + successcount;
      fs.appendFile(config.logfilename, '\n' + logtext, function (err) {
        if (err) console.error(err);
        console.log(logtext);
      });

      console.log('done', count);
    });

What is your Node version? Node versions before 0.10.x have a different pause and resume implementation for readable streams.