
Node.js: pushing messages from a Kafka consumer to MongoDB


I've created a Kafka consumer using 'kafka-node'; inside the message event it connects to MongoDB and inserts into a collection:

consumer.on('message', () => {
    // connect to MongoDB and insert into a collection
})
mongo.js, which creates the connection to Mongo and returns the object:

const MongoClient = require('mongodb').MongoClient,
    assert = require('assert');

const url = 'mongodb://root:****@ds031257.mlab.com:31257/kafka-node';

let _db;

const connectDB = (callback) => {
    try {
        // with the 3.x driver, the second callback argument is the MongoClient
        MongoClient.connect(url, { useNewUrlParser: true }, (err, database) => {
            console.log('message' + database);
            _db = database.db('kafka-node');
            return callback(err);
        });
    } catch (e) {
        throw e;
    }
};

const getDB = () => _db;

const close = () => _db.close();

module.exports = { connectDB, getDB, close };

consumer.js, which creates the consumer and pushes messages to MongoDB:

let kafka = require('kafka-node');
let MongoDB = require('./mongo');
let Consumer = kafka.Consumer,
    // The client connects directly to the Kafka brokers listed in
    // kafkaHost (KafkaClient does not go through ZooKeeper)
    client = new kafka.KafkaClient({ kafkaHost: 'localhost:9093, localhost:9094, localhost:9095' });
// The consumer object specifies the client and the topic(s) it subscribes to
let consumer = new Consumer(
    client, [{ topic: 'infraTopic', partitions: 3 }], { autoCommit: false });


consumer.on('ready', function () {
    console.log('consumer is ready');
});

consumer.on('error', function (err) {
    console.log('consumer is in error state');
    console.log(err);
})
client.refreshMetadata(['infraTopic'], (err) => {
    if (err) {
        console.warn('Error refreshing kafka metadata', err);
    }
});
consumer.on('message', function (message) {
    // grab the main content from the Kafka message
    console.log(message);
    MongoDB.connectDB((err) => {
        if (err) throw err
        // Load db & collections
        const db = MongoDB.getDB();
        const collectionKafka = db.collection('sampleCollection');
        try {
            collectionKafka.insertOne(
                {
                    timestamp: message.value,
                    topic: message.topic
                },
                function (err, res) {
                    if (err) {
                        MongoDB.close();
                        return console.log(err);
                    }
                    // Success
                }
            )
        } catch (e) {
            throw e
        }
    })
});
Is this the right way to push messages from a Kafka consumer to MongoDB? With this setup it works until all the messages have been written, but once it reaches EOL it throws "Cannot read property 'db' of null".
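That error is consistent with the err argument never being checked in connectDB: if MongoClient.connect fails (for example when opening a new connection on every message exhausts the pool), database is null and database.db('kafka-node') throws. A minimal sketch of one possible rework, connecting once at startup and only attaching the message handler once the client is ready; this is an illustration, not the original code:

// mongo.js (sketch)
const MongoClient = require('mongodb').MongoClient;

const url = 'mongodb://root:****@ds031257.mlab.com:31257/kafka-node';
let _db;

const connectDB = (callback) => {
    MongoClient.connect(url, { useNewUrlParser: true }, (err, client) => {
        if (err) return callback(err); // never touch `client` when connect failed
        _db = client.db('kafka-node');
        callback(null);
    });
};

// consumer.js (sketch): connect once, then start consuming
MongoDB.connectDB((err) => {
    if (err) throw err;
    consumer.on('message', (message) => {
        MongoDB.getDB()
            .collection('sampleCollection')
            .insertOne(
                { timestamp: message.value, topic: message.topic },
                (err) => { if (err) console.error(err); }
            );
    });
});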

Is this the right way to push messages from a Kafka consumer to MongoDB?

I guess it's one way, but I wouldn't say it's the right way :)

A better approach would be to use Kafka Connect. It's part of Apache Kafka, and it's designed to do exactly what you're trying to do: stream data from Kafka into a target system (you can also use it to stream data from other systems into Kafka).

There's a connector for MongoDB that does exactly what you're trying to do.
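As an illustration only (the answer doesn't include a config; the connector class and the connection details below are assumptions based on the MongoDB Kafka sink connector), a sink configuration could look roughly like this:

# mongo-sink.properties (sketch; assumes the MongoDB Kafka sink connector is installed)
name=mongo-sink
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
tasks.max=1
topics=infraTopic
connection.uri=mongodb://root:****@ds031257.mlab.com:31257/kafka-node
database=kafka-node
collection=sampleCollection

Run with a standalone worker, e.g. bin/connect-standalone.sh config/connect-standalone.properties mongo-sink.properties, and Kafka Connect takes over the consuming, batching, and retrying that the hand-rolled consumer code has to manage itself.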

If you need to process the data before writing it, the pattern to follow is to do the processing with Kafka Streams, KSQL, or whatever processing tool you prefer, but write the result back to a Kafka topic. Kafka Connect then reads that topic and streams it into the target. This way you decouple the responsibilities, and the system becomes simpler, more resilient, and more scalable.
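For example, a KSQL pipeline along these lines (the stream and column names here are hypothetical) keeps the processing inside Kafka and leaves the MongoDB write to the connector:

-- declare a stream over the source topic (columns are assumptions)
CREATE STREAM infra (ts VARCHAR, host VARCHAR)
  WITH (KAFKA_TOPIC='infraTopic', VALUE_FORMAT='JSON');

-- process it and write the result back to a new Kafka topic
CREATE STREAM infra_processed WITH (KAFKA_TOPIC='infraProcessed') AS
  SELECT ts, host FROM infra WHERE host IS NOT NULL;

The sink connector's topics setting would then point at infraProcessed instead of infraTopic.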